
feat(llm): Make LLM model configurable via config #2704

Closed · wants to merge 3 commits

Conversation

mobley-trent (Contributor)

Summary:

  • Adds a ModelKind enum that enumerates the supported LLM models.

Issue Reference(s):
Fixes #2690
/claim #2690
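The summary describes a ModelKind enum listing the supported models. A minimal sketch of what such an enum might look like in Rust is below; the variant names, model identifier strings, and parsing approach are assumptions for illustration, not the PR's actual code in src/cli/llm/model.rs:

```rust
use std::str::FromStr;

// Hypothetical ModelKind enum: one variant per supported model.
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
pub enum ModelKind {
    Gpt4o,
    Gpt4oMini,
    Gemini15Flash,
}

impl ModelKind {
    // Canonical model identifier as it would appear in a config file.
    pub fn as_str(&self) -> &'static str {
        match self {
            ModelKind::Gpt4o => "gpt-4o",
            ModelKind::Gpt4oMini => "gpt-4o-mini",
            ModelKind::Gemini15Flash => "gemini-1.5-flash",
        }
    }
}

impl FromStr for ModelKind {
    type Err = String;

    // Parse a config-supplied model name, rejecting unsupported values.
    fn from_str(s: &str) -> Result<Self, Self::Err> {
        match s.to_ascii_lowercase().as_str() {
            "gpt-4o" => Ok(ModelKind::Gpt4o),
            "gpt-4o-mini" => Ok(ModelKind::Gpt4oMini),
            "gemini-1.5-flash" => Ok(ModelKind::Gemini15Flash),
            other => Err(format!("unsupported model: {other}")),
        }
    }
}

fn main() {
    // A configurable model string (e.g. from a config file) is validated
    // at parse time rather than failing later at request time.
    let kind: ModelKind = "gpt-4o".parse().expect("known model");
    assert_eq!(kind.as_str(), "gpt-4o");
    assert!("some-unknown-model".parse::<ModelKind>().is_err());
    println!("parsed model: {:?}", kind);
}
```

Centralizing the supported models in one enum means the wizard and type-name inference code can accept any ModelKind value while invalid config entries are rejected with a clear error up front.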

Build & Testing:

  • I ran cargo test successfully.
  • I have run ./lint.sh --mode=fix to fix all linting issues raised by ./lint.sh --mode=check.

Checklist:

  • I have added relevant unit & integration tests.
  • I have updated the documentation accordingly.
  • I have performed a self-review of my code.
  • PR follows the naming convention of <type>(<optional scope>): <title>

@github-actions github-actions bot added the type: feature Brand new functionality, features, pages, workflows, endpoints, etc. label Aug 15, 2024

codecov bot commented Aug 15, 2024

Codecov Report

Attention: Patch coverage is 0% with 27 lines in your changes missing coverage. Please review.

Project coverage is 86.22%. Comparing base (c7c0b23) to head (b7e8109).
Report is 5 commits behind head on main.

Files                            Patch %   Lines
src/cli/llm/model.rs             0.00%     19 Missing ⚠️
src/cli/llm/wizard.rs            0.00%     5 Missing ⚠️
src/cli/llm/infer_type_name.rs   0.00%     3 Missing ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##             main    #2704      +/-   ##
==========================================
+ Coverage   86.12%   86.22%   +0.10%     
==========================================
  Files         255      255              
  Lines       24770    24698      -72     
==========================================
- Hits        21333    21297      -36     
+ Misses       3437     3401      -36     

☔ View full report in Codecov by Sentry.

Labels
🙋 Bounty claim type: feature Brand new functionality, features, pages, workflows, endpoints, etc.

Successfully merging this pull request may close these issues.

Make the LLM model configurable via config