Add temperature and top-p #77
Conversation
temperature: float = 1.0,
top_p: float = 1.0,
Are these the right defaults to use? Should they not be None?
https://docs.litellm.ai/docs/completion/input:
temperature: Optional[float] = None,
If it is None, I assume litellm doesn't pass the temperature, and the provider will use whatever default it has configured.
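A minimal sketch of the pattern being suggested: default the sampling parameters to None and only forward them when explicitly set, so the provider's own defaults apply otherwise. The helper name `build_completion_kwargs` is hypothetical, not part of litellm's API.

```python
from typing import Optional


def build_completion_kwargs(
    model: str,
    temperature: Optional[float] = None,
    top_p: Optional[float] = None,
) -> dict:
    """Build kwargs for a completion call, omitting any sampling
    parameter left as None so the provider's default takes effect."""
    kwargs = {"model": model}
    if temperature is not None:
        kwargs["temperature"] = temperature
    if top_p is not None:
        kwargs["top_p"] = top_p
    return kwargs
```

With this shape, `build_completion_kwargs("gpt-4")` carries no sampling keys at all, while `build_completion_kwargs("gpt-4", temperature=0.7)` forwards only the temperature.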
Fixing this!
LGTM!
Tested with temperature.
Doing this since Negin requires it for UnnaturalInstructions.
It also gives an idea of the best way to approach #62 and #74.
This is just a stopgap before I get to those two issues.