LLM Explorer
The Chat UI in LLM Explorer offers an experience similar to the OpenAI chat sandbox for writing zero-shot and multi-shot prompts for use in agents. The following generation settings can be adjusted; a sketch of how they might map to a sampling configuration follows the list.
- System Message: Defines the system message sent to the model(s). It is formatted and prepended to the input/output examples.
- Top K: Adjust the top K parameter (default is 50).
- Top P: Adjust the top P parameter (default is 0.9).
- Min P: Adjust the minimum P parameter (default is 0.05).
- Temperature: Adjust the temperature setting (default is 1).
- Mirostat: Turn the mirostat setting on or off (default is off).
- Seed: Set the seed for generation (default is -1, random).
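
The settings above behave like a sampling configuration handed to whatever backend serves the model. As a rough illustration only (the field names below are assumptions, not LLM Explorer's actual request format), the defaults could be expressed as:

```python
# Illustrative only: the Chat UI defaults bundled into a sampling configuration
# for a text-generation backend. Field names are assumptions, not LLM Explorer's API.
sampling_defaults = {
    "top_k": 50,        # Top K: keep only the 50 most likely next tokens
    "top_p": 0.9,       # Top P: nucleus sampling over 90% cumulative probability
    "min_p": 0.05,      # Min P: drop tokens far less likely than the top token
    "temperature": 1.0, # Temperature: 1.0 leaves the distribution unscaled
    "mirostat": 0,      # Mirostat: 0 = off
    "seed": -1,         # Seed: -1 = random seed each generation
}
```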
The plus and minus buttons in the lower left-hand corner allow input/output example pairs to be added to or removed from the prompt fed to the model(s).
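
As a rough sketch (not LLM Explorer's actual implementation), the system message and the example pairs might be combined into a chat-style, multi-shot prompt before the real input is appended:

```python
# Hypothetical helper: prepend the system message, interleave the example pairs,
# then append the user's actual input as the final message.
def build_messages(system_message, example_pairs, user_input):
    messages = [{"role": "system", "content": system_message}]
    for example_input, example_output in example_pairs:
        messages.append({"role": "user", "content": example_input})
        messages.append({"role": "assistant", "content": example_output})
    messages.append({"role": "user", "content": user_input})
    return messages

messages = build_messages(
    "Answer with a single word.",
    [("Capital of France?", "Paris")],  # one example pair = a one-shot prompt
    "Capital of Japan?",
)
```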
Test local models side by side, or against OpenAI APIs.
When this checkbox is selected, the input/output example is not sent to the model. Combined with the Copy to Output button, this allows validation examples to be saved for testing new models.
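
A hypothetical sketch of how such held-out examples could be used: examples flagged for validation are excluded from the prompt, and a candidate model's completion is compared to the saved expected output (`run_model` below is a placeholder, not a real API call):

```python
# Examples marked as validation are held out of the prompt and used as a test set.
examples = [
    {"input": "Capital of France?", "output": "Paris", "validation": False},
    {"input": "Capital of Japan?",  "output": "Tokyo", "validation": True},
]

prompt_examples = [e for e in examples if not e["validation"]]
validation_set  = [e for e in examples if e["validation"]]

def run_model(prompt_examples, user_input):
    # Placeholder for a real generation call against the model under test.
    return "Tokyo"

for case in validation_set:
    completion = run_model(prompt_examples, case["input"])
    result = "PASS" if completion.strip() == case["output"].strip() else "FAIL"
    print(result, case["input"])
```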
Each model that produces a generation reports its generation speed along with its input and completion token counts.
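
Those figures are enough for a quick throughput comparison between models; a back-of-the-envelope example (numbers made up for illustration):

```python
# Throughput from the stats each model reports: completion tokens / generation time.
completion_tokens = 128
generation_seconds = 3.2
print(f"{completion_tokens / generation_seconds:.1f} tokens/sec")  # 40.0 tokens/sec
```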