
More details for configuring Ollama #252


Merged 3 commits into crmne:main on Jul 17, 2025

Conversation

jslag (Contributor) commented on Jun 17, 2025

There didn't appear to be any existing docs on what `ollama_api_base` should look like.

What this does

Updates docs/configuration.md to make it clearer how to configure use with ollama

Type of change

  • [ ] Bug fix
  • [ ] New feature
  • [ ] Breaking change
  • [x] Documentation
  • [ ] Performance improvement

Scope check

  • [x] I read the Contributing Guide
  • [x] This aligns with RubyLLM's focus on LLM communication
  • [x] This isn't application-specific logic that belongs in user code
  • [x] This benefits most users, not just my specific use case

Quality check

  • [n/a] I ran overcommit --install and all hooks pass
  • [x] I tested my changes thoroughly
  • [x] I updated documentation if needed
  • [x] I didn't modify auto-generated files manually (models.json, aliases.json)

API changes

  • [ ] Breaking change
  • [ ] New public methods/classes
  • [ ] Changed method signatures
  • [x] No API changes

Related issues


There didn't appear to be any existing docs on what `ollama_api_base`
should look like.
* `bedrock_api_key`, `bedrock_secret_key`, `bedrock_region`, `bedrock_session_token` (See AWS documentation for standard credential methods if not set explicitly).

## Provider API Base
crmne (Owner):

This section should be called

# Ollama API Base (`ollama_api_base`)

Comment on lines 111 to 115

When configuring a local model running via Ollama, you will configure the URL to the Ollama server rather than an API key. A typical value is `http://localhost:11434/v1`.

* `ollama_api_base`

crmne (Owner):

I'd rephrase this to:

When using a local model running via Ollama, set `ollama_api_base` to the URL of your Ollama server, e.g. `http://localhost:11434/v1`.
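Put into practice, the suggested wording corresponds to a configuration sketch like the one below. It assumes RubyLLM's standard `configure` block as described in its configuration docs; the model name `llama3` and the `provider:` keyword are illustrative, so check the gem's documentation for the exact options your version supports.

```ruby
require "ruby_llm"

RubyLLM.configure do |config|
  # Point RubyLLM at the local Ollama server's OpenAI-compatible endpoint
  # (no API key is needed for a local Ollama instance).
  config.ollama_api_base = "http://localhost:11434/v1"
end

# Then target an Ollama-served model; "llama3" is a placeholder for
# whatever model you have pulled locally.
chat = RubyLLM.chat(model: "llama3", provider: :ollama)
```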

@crmne crmne added the documentation Improvements or additions to documentation label Jul 16, 2025
@jslag jslag requested a review from crmne July 16, 2025 22:13

Verified: this commit was created on GitHub.com and signed with GitHub's verified signature.
@crmne crmne merged commit f2fffc3 into crmne:main Jul 17, 2025