Add support for DEFAULT_LLM_MODEL environment variable #637
Summary
This PR introduces support for a new environment variable, DEFAULT_LLM_MODEL, to configure the default LLM model used by the web-ui. This eliminates the need to repeatedly set the model manually.

Changes Made
Added support for DEFAULT_LLM_MODEL in the executable script.
Updated the code to fetch the model from the environment variable (a sketch of the lookup is shown below).
Ensures a consistent and convenient way to define the default model through environment variables.
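
A minimal sketch of the kind of lookup described above; the Gradio dropdown wiring, the choice list, and the placeholder default are illustrative assumptions, not the exact lines from the diff:

```python
import os

import gradio as gr

# Prefer an explicit model name from the environment; the placeholder used
# here stands in for the existing fallback logic described under Notes.
default_model = os.getenv("DEFAULT_LLM_MODEL", "gpt-4o")

# Illustrative dropdown: the environment value becomes the preselected model.
model_dropdown = gr.Dropdown(
    choices=["gpt-4o", "gpt-4", "claude-3-5-sonnet-20241022"],
    value=default_model,
    label="Model Name",
    allow_custom_value=True,
)
```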
How to Test
Set the environment variable before running:
export DEFAULT_LLM_MODEL=gpt-4
Start the web-ui.
The selected model should default to the value set in DEFAULT_LLM_MODEL without requiring manual selection.

Notes
If DEFAULT_LLM_MODEL is not set, the fallback logic still uses the default model from config.model_names based on the DEFAULT_LLM provider, or defaults to "openai" (see the sketch below).
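
A sketch of that fallback order, assuming config.model_names maps provider names to lists of model names; the helper name resolve_default_model is hypothetical:

```python
import os

def resolve_default_model(config) -> str:
    """Resolve the default model: env override first, then provider default."""
    # An explicit model name from the environment wins.
    model = os.getenv("DEFAULT_LLM_MODEL")
    if model:
        return model
    # Otherwise use the first model registered for the DEFAULT_LLM provider,
    # defaulting to the "openai" provider when DEFAULT_LLM is unset.
    provider = os.getenv("DEFAULT_LLM", "openai")
    models = config.model_names.get(provider, [])
    return models[0] if models else ""
```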
Summary by cubic
Added support for the DEFAULT_LLM_MODEL environment variable to set the default LLM model in the web UI.