Releases: explosion/spacy-llm

v0.4.3: Llama 2, Claude 2, fixes for Falcon & readme

25 Jul 14:02
40248a1

✨ New features and improvements

  • NEW: Support for Llama 2 via HuggingFace (#225)
  • NEW: Support for Anthropic's Claude 2 (#231)
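
Both models plug into the shared model registry. A minimal config sketch, assuming the registry names spacy.Llama2.v1 and spacy.Claude-2.v1 with their default model variants (check the spacy-llm docs for the exact identifiers):

```ini
# Llama 2 via Hugging Face (weights are downloaded and run locally):
[components.llm.model]
@llm_models = "spacy.Llama2.v1"

# Alternatively, Claude 2 via Anthropic's REST API
# (requires the ANTHROPIC_API_KEY environment variable):
# [components.llm.model]
# @llm_models = "spacy.Claude-2.v1"
```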

🔴 Bug fixes

  • Corrects how runtime arguments are passed to Falcon (#222)
  • Fixes incorrect capitalization in example in readme (#227)

📖 Documentation and examples

We've moved most of our documentation to the main spaCy docs.

👥 Contributors

@honnibal, @ines, @koaning, @rmitsch, @svlandeg, @victorialslocum

v0.4.2: Guard REL against entity index errors, warn instead of error for REST auth verification

14 Jul 09:47
32e6be5

🔴 Bug fixes

  • Guard REL against entity indices higher than the total number of entities available (#219)
  • Warn instead of raising an error when API credentials are wrong for REST-based models (#218)

👥 Contributors

@honnibal, @ines, @rmitsch, @svlandeg

v0.4.1: Authenticate models at init time, fix incorrect model names

11 Jul 11:36
cb1594b

✨ New features and improvements

  • Verify authentication details at init (instead of at run) time for Anthropic and Cohere models (#206)

🔴 Bug fixes

  • Update OpenLLaMA model names after updates on HuggingFace (#209)
  • Fix incorrectly spelled model names for OpenAI models in migration guide (#210)

👥 Contributors

@honnibal, @ines, @rmitsch, @svlandeg

v0.4.0: Falcon, sentiment analysis, summarization, backend refactoring

06 Jul 12:16
95db73a

✨ New features and improvements

  • NEW: Refactored from a backend-centric to a model-centric architecture. Note: this is a breaking change; you'll need to adjust your configs (#176)
  • NEW: Support for Falcon via HuggingFace (#179)
  • NEW: Extract prompt examples from component initialization (#163)
  • NEW: Summary task spacy.Summary.v1 (#181)
  • NEW: Sentiment analysis task spacy.Sentiment.v1 (#200)
  • More thorough check for label inconsistencies in span-related tasks NER, REL, SpanCat, TextCat (#183)
  • Update langchain pin (#196)
  • Make fewshot file reader more robust w.r.t. file formats (#184)

⚠️ Backwards incompatibilities

  • Built-in support for MiniChain was dropped (#176)
  • The switch from a backend- to a model-centric architecture (#176) requires small adjustments to your config. See the migration guide for how to update it
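
As a sketch of the migration, assuming an OpenAI GPT-3.5 setup (registry and model names here are illustrative; the migration guide has the authoritative mapping):

```ini
# Before v0.4.0: backend-centric
[components.llm.backend]
@llm_backends = "spacy.REST.v1"
api = "OpenAI"
config = {"model": "gpt-3.5-turbo"}

# From v0.4.0 on: model-centric
[components.llm.model]
@llm_models = "spacy.GPT-3-5.v1"
```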

👥 Contributors

@bdura, @honnibal, @ines, @kabirkhan, @koaning, @rmitsch, @shadeMe, @svlandeg, @vin-ivar

v0.3.2: Fixes for `nlp.pipe(..., as_tuples=True)` and caching of docs with LLM output

26 Jun 12:24

🔴 Bug fixes

  • Use doc._context to ensure that nlp.pipe(..., as_tuples=True) works (#188)
  • Fix issue with caching that prevented the last doc in a cache batch from being cached with its LLM I/O data (i.e. raw prompt and LLM response) (#191)

👥 Contributors

@honnibal, @ines, @kabirkhan, @rmitsch

v0.3.1: Make type validation optional, fix `pipe_labels()`

23 Jun 13:22
de89eb7

✨ New features and improvements

  • Make type validation optional with the new validate_types flag (#178)
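
A minimal sketch of opting out of validation, assuming the flag lives on the component config:

```ini
[components.llm]
factory = "llm"
validate_types = false
```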

🔴 Bug fixes

  • Fix nlp.pipe_labels() not working for the llm component (#175)

👥 Contributors

@honnibal, @ines, @kabirkhan, @rmitsch

v0.3.0: Cohere, Anthropic, OpenLLaMa, StableLM + prompt & response logging + streamlit demo + lemmatization

14 Jun 09:42
6f800a5

✨ New features and improvements

  • NEW: Optional storing of prompts and responses in Doc objects (#127)
  • NEW: Optional logging of prompts and responses (#80)
  • NEW: Streamlit demo (#102)
  • NEW: Support for Cohere in backend spacy.REST.v1 (#165)
  • NEW: Support for Anthropic in backend spacy.REST.v1 (#157)
  • NEW: Support for OpenLLaMA via HuggingFace with the backend spacy.OpenLLaMa_HF.v1 (#151)
  • NEW: Support for StableLM via HuggingFace with the backend spacy.StableLM_HF.v1 (#141)
  • NEW: Lemmatization task spacy.Lemma.v1 (#164)
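
For storing prompts and responses on processed docs, a hedged sketch assuming the component flag is called save_io (see #127 for the actual name):

```ini
[components.llm]
factory = "llm"
# Store raw prompts and LLM responses on each Doc (flag name assumed):
save_io = true
```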

🔴 Bug fixes

  • Fix bug with sending empty prompts if all Doc objects are cached (#166)
  • Fix issue with LangChain model creation due to updated argument name (#162)

👥 Contributors

@adrianeboyd, @bdura, @honnibal, @ines, @kabirkhan, @ljvmiranda921, @rmitsch, @svlandeg, @victorialslocum, @vin-ivar

v0.2.1: Scoring support & bug fixes

05 Jun 14:21
55c0ae1

✨ New features and improvements

  • NEW: llm component supports scoring, like other spaCy components (#135)
  • Labels for spacy.NER.v2, spacy.REL.v1, spacy.SpanCat.v2, spacy.TextCat.v2 can be specified as a list (#137)
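
For example, a task config could now read (a sketch; the label set is illustrative):

```ini
[components.llm.task]
@llm_tasks = "spacy.NER.v2"
labels = ["PERSON", "ORG", "LOC"]
```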

🔴 Bug fixes

  • Fix type comparison in type checks failing on some platforms (#158)
  • Fix failing example 3 in the readme (#137)

👥 Contributors

@adrianeboyd, @bdura, @honnibal, @ines, @kabirkhan, @KennethEnevoldsen, @rmitsch, @svlandeg, @vin-ivar

v0.2.0: New tasks for REL and spancat, reading prompt templates from file, and various improvements and bug fixes

30 May 16:19
1c1aef6

✨ New features and improvements

  • NEW: New relation extraction task spacy.REL.v1 (#114)
  • NEW: New spancat task spacy.SpanCat.v1 for entity recognition with overlapping spans (#101)
  • NEW: Set prompt templates as a string or read them from files (e.g. Jinja templates) using spacy.FileReader.v1 (#95)
  • Improved prompt for NER task spacy.NER.v2 (#99)
  • Ability to describe labels in tasks spacy.NER.v2, spacy.SpanCat.v1 (#84)
  • Improved error handling and retry mechanics in REST backend spacy.REST.v1 (#110)
  • spacy-llm can now be installed with its optional dependencies (#119):
    • MiniChain or LangChain with spacy-llm[minichain] or spacy-llm[langchain] respectively
    • locally running models on GPU with spacy-llm[transformers]
  • Improved type checking to ensure tasks and backends play nicely with each other (#83)
  • Use a Task abstraction instead of the previous two functions template and parse, making the config easier to read (#91)
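
A sketch of reading a prompt template from a file, assuming spacy.FileReader.v1 is registered under @misc and using a hypothetical template path:

```ini
[components.llm.task]
@llm_tasks = "spacy.NER.v2"
labels = "PERSON,ORG"

[components.llm.task.template]
@misc = "spacy.FileReader.v1"
path = "templates/ner.jinja"
```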

🔴 Bug fixes

  • Fix improper doc identity check in caching (#104)
  • Fix multiprocessing support for .pipe() (#117)

👥 Contributors

@adrianeboyd, @bdura, @honnibal, @ines, @kabirkhan, @ljvmiranda921, @rmitsch, @svlandeg

v0.1.2: Bugfix for binary textcat task

12 May 13:01
53220c5
  • Fix processing of LLM response for binary textcat