We need to remove the temperature param when OpenAI o1 is selected.
An unexpected error occurred:

```
CompletionException - com.devoxx.genie.service.exception.ProviderUnavailableException:
dev.ai4j.openai4j.OpenAiHttpException: {
  "error": {
    "message": "Unsupported parameter: 'temperature' is not supported with this model.",
    "type": "invalid_request_error",
    "param": "temperature",
    "code": "unsupported_parameter"
  }
}
Caused by: ProviderUnavailableException - dev.ai4j.openai4j.OpenAiHttpException: {
  "error": {
    "message": "Unsupported parameter: 'temperature' is not supported with this model.",
    "type": "invalid_request_error",
    "param": "temperature",
    "code": "unsupported_parameter"
  }
}
```

Please check the IDE log for more details.
LangChain4j sets the default temperature to 0.7 when the value is null, so there is currently no way to omit the parameter from the request.
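Until the LangChain4j fix lands, the client-side workaround is to not send `temperature` at all for reasoning models. Here is a minimal sketch of that guard; the helper names and the model-name prefix check are hypothetical illustrations, not DevoxxGenie's actual code:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class TemperatureGuard {

    // OpenAI reasoning models (o1/o3 family) reject the 'temperature'
    // parameter with an "unsupported_parameter" error. Prefix check is
    // an assumption for illustration.
    static boolean supportsTemperature(String modelName) {
        return !(modelName.startsWith("o1") || modelName.startsWith("o3"));
    }

    // Build request parameters, including temperature only when the
    // selected model supports it.
    static Map<String, Object> requestParams(String modelName, Double temperature) {
        Map<String, Object> params = new LinkedHashMap<>();
        params.put("model", modelName);
        if (temperature != null && supportsTemperature(modelName)) {
            params.put("temperature", temperature);
        }
        return params;
    }

    public static void main(String[] args) {
        System.out.println(requestParams("gpt-4o", 0.7));  // temperature kept
        System.out.println(requestParams("o1-mini", 0.7)); // temperature dropped
    }
}
```

The same conditional would sit around the `.temperature(...)` call on the model builder once LangChain4j stops substituting a 0.7 default for null.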
A fix will be included in a very near future release of LC4J 🤩
stephanj changed the title from "OpenAi O1 doesn't support temperature" to "[FEATURE] OpenAi O1 doesn't support temperature, waiting for LangChain4J upgrade" on Jan 28, 2025.
"Try out @openai's latest o3-mini model using #LangChain4j 1.0.0-alpha2-SNAPSHOT version (the official release version will be available early next week)"