Issues: explosion/spacy-llm
Support for China's large model API?
Labels: feat/model (Feature: models), feat/request (Requests for new features)
#322 opened Oct 10, 2023 by tianchiguaixia
Support for Amazon Bedrock Titan models
Labels: feat/model (Feature: models), feat/new (New feature), third party (Third-party software (integrations etc.))
#338 opened Oct 24, 2023 by viveksilimkhan1
There is a significant difference between few-shot and zero-shot
Labels: feat/task (Feature: tasks)
#371 opened Nov 16, 2023 by tianchiguaixia
Anthropic error: API failed, prompt must start with "\n\nHuman:" turn
Labels: bug (Something isn't working), feat/model (Feature: models)
#386 opened Nov 29, 2023 by kaminosekai54
Inconsistent output on Dolly NER
Labels: feat/model (Feature: models), usage (How to use `spacy-llm`)
#393 opened Dec 1, 2023 by nxitik
'<' not supported between instances of 'str' and 'int'
Labels: usage (How to use `spacy-llm`)
#411 opened Jan 8, 2024 by BaptisteLoquette
FileNotFoundError: [Errno 2] No such file or directory: 'local-ner-cache/9963044417883968883.spacy'
Labels: bug (Something isn't working), feat/cache (Feature: caching)
#414 opened Jan 18, 2024 by nikolaysm
[Warning] the current text generation call will exceed the model's predefined maximum length (4096).
Labels: feat/model (Feature: models), usage (How to use `spacy-llm`)
#423 opened Jan 21, 2024 by yileitu
Working dummy example for custom LLM endpoint integration
Labels: usage (How to use `spacy-llm`)
#436 opened Jan 29, 2024 by borhenryk
How to surpass BERT with large models
Labels: usage (How to use `spacy-llm`)
#442 opened Feb 17, 2024 by tianchiguaixia
Many of the returned results are not what I want
Labels: usage (How to use `spacy-llm`)
#443 opened Feb 18, 2024 by tianchiguaixia
Potential REL sharding issue
Labels: bug (Something isn't working), feat/sharding (Everything related to sharding/map-reduce), feat/task (Feature: tasks)
#450 opened Mar 8, 2024 by peter-axion
`transformers` > 4.38 causes bug in inference for HF models
Labels: bug (Something isn't working)
#463 opened Apr 24, 2024 by rmitsch
Support a unified API (such as LiteLLM) for all LLM providers / models
#470 opened May 17, 2024 by omri374
Update test suite to avoid crashing when `en_core_web_md` is unavailable
Labels: tests (Everything related to the test suite)
#471 opened May 17, 2024 by svlandeg