LLM Explorer
noco-ai edited this page Feb 16, 2024
The Chat UI in LLM Explorer offers an experience similar to the OpenAI chat sandbox for writing zero- and multi-shot prompts for use in agents.
The chat interface can be used for zero-shot testing of models.
The plus and minus buttons in the lower left-hand corner allow for the addition and removal of input/output example pairs to feed into the model(s).
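Under the hood, input/output example pairs like these typically map onto the few-shot message format used by OpenAI-style chat APIs. A minimal sketch of that mapping, assuming a simple system prompt plus alternating user/assistant pairs (the function and variable names here are illustrative, not part of LLM Explorer's code):

```python
def build_messages(system_prompt, examples, user_input):
    """Assemble an OpenAI-style chat message list from few-shot examples.

    Each (input, output) pair becomes a user/assistant message pair,
    mirroring the example pairs added with the plus button in the UI.
    """
    messages = [{"role": "system", "content": system_prompt}]
    for example_in, example_out in examples:
        messages.append({"role": "user", "content": example_in})
        messages.append({"role": "assistant", "content": example_out})
    messages.append({"role": "user", "content": user_input})
    return messages

msgs = build_messages(
    "Classify the sentiment as positive or negative.",
    [("I loved it", "positive"), ("Terrible service", "negative")],
    "The food was great",
)
# 1 system message + 2 example pairs + 1 final user message = 6 messages
```

With zero example pairs this degenerates to a zero-shot prompt: just the system message and the user's input.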
Test local models side by side, or against the OpenAI APIs.
When this checkbox is selected, the input/output example is not sent to the model. Combined with the Copy to Output button, this allows saving validation examples for testing new models.