
litellm.BadRequestError: DeepseekException #1288

Open

wuhq7 opened this issue Feb 12, 2025 · 3 comments

wuhq7 commented Feb 12, 2025

An error occurred during question recommendation generation: litellm.BadRequestError: DeepseekException - Failed to deserialize the JSON body into the target type: response_format: response_format.type json_schema is unavailable now at line 1 column 16335

Attachments: config.txt, wrenai-wren-ai-service-1 log.txt
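For context: the message is DeepSeek's API rejecting the request body that litellm forwards, because DeepSeek currently only accepts a `response_format` of type `json_object`, not the OpenAI-style `json_schema` structured-output type. A rough sketch of the offending fragment of the request (shown as YAML for readability; the `name` and `schema` contents are hypothetical):

```yaml
# Hypothetical fragment of the chat-completion request body.
# DeepSeek rejects the json_schema type; json_object is the supported one.
response_format:
  type: json_schema             # what the service sends; unavailable on DeepSeek
  json_schema:                  # hypothetical schema attached by the pipeline
    name: question_recommendation
    schema:
      type: object
```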

cyyeh (Member) commented Feb 12, 2025

@wuhq7 please check this config example for deepseek

https://github.com/Canner/WrenAI/blob/main/wren-ai-service/docs/config_examples/config.deepseek.yaml

[image]
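For readers who can't see the screenshot: a minimal sketch of the llm block from that example, assuming the litellm-style layout Wren AI configs use. All field names and values below are illustrative; the linked config.deepseek.yaml is authoritative.

```yaml
# Sketch only; check the linked example for the real definitions.
type: llm
provider: litellm_llm
models:
  - model: deepseek/deepseek-chat          # litellm model route for DeepSeek
    api_base: https://api.deepseek.com/v1  # assumed DeepSeek endpoint
    timeout: 120
    kwargs:
      n: 1
      temperature: 0
```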

wuhq7 (Author) commented Feb 12, 2025

Initially I followed this config.deepseek.yaml example, then I followed the latest version of the yaml you posted.

cyyeh (Member) commented Feb 12, 2025

> Initially I followed this config.deepseek.yaml example, then I followed the latest version of the yaml you posted.

The latest version of config.example.yaml is only there to make sure the pipe definitions are up to date; for the llm and embedding model definitions, please follow the deepseek config.
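In other words (a hedged reading of the advice): take the pipe definitions from the latest config.example.yaml, but keep the llm and embedder blocks from config.deepseek.yaml. Since DeepSeek does not offer an embedding API, the embedder block presumably points at a different provider; a sketch with assumed values:

```yaml
# Sketch of an embedder block, same caveats as above: values are
# illustrative, and DeepSeek itself has no embedding endpoint,
# so an OpenAI-compatible embedder is assumed here.
type: embedder
provider: litellm_embedder
models:
  - model: text-embedding-3-large     # assumed OpenAI embedding model
    api_base: https://api.openai.com/v1
    timeout: 120
```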
