
Achieving Deterministic Output #228

Open
antonkratz opened this issue Apr 21, 2024 · 0 comments
(I posted a very similar question in the ollama repo. This question is about how, or whether, this can be achieved with the inference engine code that ships with Code Llama.)

For a research project, I am interested in exploring the effect of different prompts. The problem is that when I change the prompt even slightly and get a different result, I cannot tell how much of the change in the output was caused by the changed prompt, and how much was caused by the random and pseudo-random effects of sampling mechanisms such as top-k, top-p, and temperature.

Is it possible, in principle, to get deterministic output? And is it technically possible in practice with the code provided with Code Llama?

Basically, I want responses such that the same prompt always generates the same output, at any temperature. Pseudo-randomness can and should be present, but I must be able to fix the seed, so that the only changes in the output are those caused by the prompt. Is that possible with Code Llama?
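
To make the question concrete, here is a minimal sketch of what I have in mind, assuming the example API from this repo (`Llama.build` / `text_completion`). The checkpoint paths are placeholders, and the exact seeding behavior inside `Llama.build` may differ between versions:

```python
import torch
from llama import Llama

# Fix the global PyTorch seed up front. (Assumption: depending on the
# repo version, Llama.build may also set a seed of its own internally.)
torch.manual_seed(42)

generator = Llama.build(
    ckpt_dir="CodeLlama-7b/",  # placeholder checkpoint directory
    tokenizer_path="CodeLlama-7b/tokenizer.model",  # placeholder path
    max_seq_len=512,
    max_batch_size=1,
)

# Option 1: greedy decoding. With temperature=0 the generation loop
# takes the argmax token at every step, so the output is deterministic
# regardless of any seed.
results = generator.text_completion(
    ["def fibonacci(n):"],
    max_gen_len=128,
    temperature=0.0,
)

# Option 2: seeded sampling. With temperature > 0, re-seeding before
# each call should reproduce the same sampled tokens, as long as the
# underlying kernels themselves are deterministic.
torch.manual_seed(42)
results = generator.text_completion(
    ["def fibonacci(n):"],
    max_gen_len=128,
    temperature=0.7,
    top_p=0.95,
)
print(results[0]["generation"])
```

Is option 2 expected to be fully reproducible with this code, or can hardware-level nondeterminism still change the output between runs?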
