Issues: BerriAI/litellm
#7292 · [Bug]: Function Calling Not Working with New o1 Model via lit... · by mvrodrig · closed Dec 19, 2024
#7305 · [Bug]: Team Model doesn't use the specified model_name · bug · opened Dec 19, 2024 by superpoussin22
#7303 · [Bug]: Taking too long to call fallback · bug · opened Dec 19, 2024 by DanielAmorimAraujo
#7298 · [Feature]: Proxy server should fix OpenTelemetry span kind · enhancement, mlops user request · opened Dec 18, 2024 by stevapple
#7294 · [Feature]: Gemini 2.0 realtime endpoint support · enhancement · opened Dec 18, 2024 by krrishdholakia
#7289 · [Bug]: Unclear error message with expired enterprise license · bug, mlops user request · opened Dec 18, 2024 by parkerkain-8451
#7288 · [Bug]: Async acompletion only works once with Vertex AI models · bug, mlops user request · opened Dec 18, 2024 by 1greentangerine
#7287 · [Bug]: Valid tokens no longer work after 1.52.14 · bug · opened Dec 18, 2024 by chymian
#7286 · [Feature]: Is there a model mapping file? e.g. gpt-4o maps to gpt-4o-2024-08-06 · enhancement · opened Dec 18, 2024 by gziz
#7269 · [Bug]: Gemini 1.5 Flash 8B error in model config · bug · opened Dec 17, 2024 by raunakdoesdev
#7262 · [Bug]: Retry policy: specified number of retries is incremented by 3 · awaiting: user response, bug, mlops user request · opened Dec 17, 2024 by dbczumar
#7260 · [Bug]: Success callback not called for async streaming · bug · opened Dec 16, 2024 by boosh
#7259 · [Bug]: Context Window Exceeded Error gets wrapped in BadRequestError · bug, mlops user request · opened Dec 16, 2024 by enyst
#7258 · [Bug]: No billing reported when using the rerank endpoint with Amazon Bedrock · bug · opened Dec 16, 2024 by ladrians
#7256 · [Bug]: Add model definition command-r7b-12-2024 from Cohere · bug · opened Dec 16, 2024 by ladrians
#7255 · [Bug]: Background errors/warnings during peak volume handling · bug, mlops user request · opened Dec 16, 2024 by suresiva
#7254 · [Bug]: LiteLLM Azure TTS streaming issue · bug · opened Dec 16, 2024 by yigitkonur
#7248 · [Bug]: Drop params not dropping logprobs for Gemini · bug, mlops user request · opened Dec 15, 2024 by pbarker
#7246 · [Feature]: Support Infinity Reranker (custom reranking models) · enhancement · opened Dec 15, 2024 by haoshan98
#7245 · [Bug]: Context/Citations/Intent object is missing in Azure response model when using chat extensions like AzureSearchChatDataSource · bug · opened Dec 15, 2024 by mkassm
#7244 · [Feature]: Add request_stream_timeout setting for default stream_timeout value of all models · enhancement · opened Dec 15, 2024 by jeromeroussin
#7242 · [Bug]: Nova tool calling doesn't support tool choice · bug · opened Dec 15, 2024 by LorenzoBoccaccia
#7238 · [Feature]: Integrating user information generated by LiteLLM with Langfuse · enhancement · opened Dec 15, 2024 by vrvrv