
feat(2690): make llm models configurable #2716

Merged
22 commits merged into tailcallhq:main on Aug 26, 2024

Conversation

@beelchester (Contributor) commented Aug 16, 2024

fixes: #2690
/claim #2690

Example

• Using Gemini: set `TAILCALL_LLM_API_KEY` to your Gemini API key (see the sketch after this list for how the secret template resolves).

      "llm": {
        "model": "gemini-1.5-flash-latest",
        "secret": "{{.env.TAILCALL_LLM_API_KEY}}"
      }

• Using Ollama: no secret is required.

      "llm": {
        "model": "gemma2"
      }
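
For illustration, here is a minimal Rust sketch of how a `{{.env.TAILCALL_LLM_API_KEY}}`-style secret could be resolved from the process environment. This is an assumption for clarity, not the template engine tailcall actually uses; the helper name `resolve_env_template` is hypothetical.

```rust
use std::env;

/// Hypothetical helper: resolve a `{{.env.NAME}}` template against the
/// process environment. Tailcall's real template resolution is more general;
/// this only illustrates the idea behind the `secret` field above.
fn resolve_env_template(value: &str) -> Option<String> {
    let name = value.strip_prefix("{{.env.")?.strip_suffix("}}")?;
    env::var(name).ok()
}

fn main() {
    // Assumes the key was exported beforehand, e.g.:
    //   export TAILCALL_LLM_API_KEY=...
    match resolve_env_template("{{.env.TAILCALL_LLM_API_KEY}}") {
        Some(secret) => println!("resolved a secret of length {}", secret.len()),
        None => println!("TAILCALL_LLM_API_KEY is not set"),
    }
}
```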

Models

Models for OpenAI:
["gpt-4o", "gpt-4o-mini", "gpt-4-turbo", "gpt-4", "gpt-3.5-turbo"]

Models for Gemini:
["gemini-1.5-pro", "gemini-1.5-flash", "gemini-1.0-pro", "gemini-1.5-flash-latest"]

Models for Anthropic:
["claude-3-5-sonnet-20240620", "claude-3-opus-20240229", "claude-3-sonnet-20240229", "claude-3-haiku-20240307"]

Models for Groq:
["llama-3.1-405b-reasoning", "llama-3.1-70b-versatile", "llama-3.1-8b-instant", "mixtral-8x7b-32768", "gemma-7b-it", "gemma2-9b-it", "llama3-groq-70b-8192-tool-use-preview", "llama3-groq-8b-8192-tool-use-preview", "llama3-8b-8192", "llama3-70b-8192"]

Models for Cohere:
["command-r-plus", "command-r", "command", "command-nightly", "command-light", "command-light-nightly"]

Any other model name is treated as an Ollama model; refer to https://ollama.com/library for the available models.
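
To make the fallback behaviour concrete, below is a minimal sketch of how a model name could be mapped to a provider, with anything unrecognized falling back to Ollama. The `Provider` enum and `provider_for` function are illustrative names based on the lists above, not the actual code added in this PR.

```rust
/// Illustrative provider enum; the real implementation in the PR may differ.
#[derive(Debug, PartialEq)]
enum Provider {
    OpenAI,
    Gemini,
    Anthropic,
    Groq,
    Cohere,
    Ollama,
}

/// Map a configured model name to its provider.
/// Any model not in the known lists is treated as an Ollama model.
fn provider_for(model: &str) -> Provider {
    const OPEN_AI: &[&str] = &["gpt-4o", "gpt-4o-mini", "gpt-4-turbo", "gpt-4", "gpt-3.5-turbo"];
    const GEMINI: &[&str] = &["gemini-1.5-pro", "gemini-1.5-flash", "gemini-1.0-pro", "gemini-1.5-flash-latest"];
    const ANTHROPIC: &[&str] = &["claude-3-5-sonnet-20240620", "claude-3-opus-20240229", "claude-3-sonnet-20240229", "claude-3-haiku-20240307"];
    const GROQ: &[&str] = &["llama-3.1-405b-reasoning", "llama-3.1-70b-versatile", "llama-3.1-8b-instant", "mixtral-8x7b-32768", "gemma-7b-it", "gemma2-9b-it", "llama3-groq-70b-8192-tool-use-preview", "llama3-groq-8b-8192-tool-use-preview", "llama3-8b-8192", "llama3-70b-8192"];
    const COHERE: &[&str] = &["command-r-plus", "command-r", "command", "command-nightly", "command-light", "command-light-nightly"];

    if OPEN_AI.contains(&model) {
        Provider::OpenAI
    } else if GEMINI.contains(&model) {
        Provider::Gemini
    } else if ANTHROPIC.contains(&model) {
        Provider::Anthropic
    } else if GROQ.contains(&model) {
        Provider::Groq
    } else if COHERE.contains(&model) {
        Provider::Cohere
    } else {
        Provider::Ollama
    }
}

fn main() {
    assert_eq!(provider_for("gemini-1.5-flash-latest"), Provider::Gemini);
    // Unknown names fall back to Ollama.
    assert_eq!(provider_for("gemma2"), Provider::Ollama);
}
```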

@github-actions github-actions bot added the type: feature Brand new functionality, features, pages, workflows, endpoints, etc. label Aug 16, 2024

codecov bot commented Aug 16, 2024

Codecov Report

Attention: Patch coverage is 51.61290% with 15 lines in your changes missing coverage. Please review.

Project coverage is 86.66%. Comparing base (648c018) to head (7e3646c).

Files                             Patch %   Lines
src/cli/generator/generator.rs    0.00%     9 Missing ⚠️
src/cli/llm/infer_type_name.rs    0.00%     4 Missing ⚠️
src/cli/generator/config.rs       94.11%    1 Missing ⚠️
src/cli/llm/wizard.rs             0.00%     1 Missing ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##             main    #2716      +/-   ##
==========================================
+ Coverage   86.31%   86.66%   +0.34%     
==========================================
  Files         256      255       -1     
  Lines       24999    25002       +3     
==========================================
+ Hits        21579    21669      +90     
+ Misses       3420     3333      -87     


@tusharmath tusharmath marked this pull request as draft August 17, 2024 10:14
@beelchester beelchester marked this pull request as ready for review August 17, 2024 11:37
Signed-off-by: Sahil Yeole <[email protected]>
@tusharmath tusharmath marked this pull request as draft August 17, 2024 13:33
Signed-off-by: Sahil Yeole <[email protected]>
@beelchester beelchester marked this pull request as ready for review August 17, 2024 13:41
@tusharmath tusharmath changed the title feat(2690): make llm configurable feat(2690): make llm models configurable Aug 17, 2024
@tusharmath tusharmath enabled auto-merge (squash) August 17, 2024 14:03
@tusharmath tusharmath marked this pull request as draft August 17, 2024 14:45
auto-merge was automatically disabled August 17, 2024 14:45

Pull request was converted to draft

@beelchester beelchester marked this pull request as ready for review August 21, 2024 05:39
Signed-off-by: Sahil Yeole <[email protected]>
Signed-off-by: Sahil Yeole <[email protected]>
Signed-off-by: Sahil Yeole <[email protected]>

Action required: PR inactive for 5 days.
Status update or closure in 10 days.

@github-actions github-actions bot added state: inactive No current action needed/possible; issue fixed, out of scope, or superseded. and removed state: inactive No current action needed/possible; issue fixed, out of scope, or superseded. labels Aug 26, 2024
@tusharmath tusharmath enabled auto-merge (squash) August 26, 2024 15:11
@tusharmath tusharmath merged commit f66fc89 into tailcallhq:main Aug 26, 2024
30 of 31 checks passed
Labels: 🙋 Bounty claim; type: feature (Brand new functionality, features, pages, workflows, endpoints, etc.)
Development

Successfully merging this pull request may close these issues.

Make the LLM model configurable via config
4 participants