
Using Ollama as a server no longer works as of 3.2.42 #916

Open
TheOnlyThing opened this issue Dec 7, 2024 · 3 comments

Comments

@TheOnlyThing

I'm using the Custom API (OpenAI format).

I'm using the same configuration with the same values in both versions:

(screenshot: Custom API configuration)

After I updated to 3.2 it no longer works, and I get these errors when I try to send a message:

```
Uncaught (in promise) TypeError: Cannot read properties of undefined (reading 'startsWith')
    at SmartChatModelRequestAdapter.to_openai (plugin:smart-connections:11045:20)
    at SmartChatModelRequestAdapter.to_platform (plugin:smart-connections:11029:17)
    at SmartChatModelCustomAdapter.complete (plugin:smart-connections:10787:33)
    at SmartChatModel.invoke_adapter_method (plugin:smart-connections:9947:38)
    at SmartChatModel.complete (plugin:smart-connections:10087:23)
    at SmartThread.complete (plugin:smart-connections:15033:46)
    at async SmartMessage.init (plugin:smart-connections:15632:7)
```
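The trace indicates that `to_openai` calls `startsWith` on a value that is undefined when the Custom API settings are incomplete. A minimal sketch of that failure mode with a defensive guard is below; the function shape and the `model` field name are assumptions for illustration, not the plugin's actual implementation:

```javascript
// Hypothetical reproduction of the failure mode: calling startsWith on an
// undefined settings field throws the exact TypeError in the stack trace.
function to_openai(config) {
  // Guard: surface a missing model name as a clear configuration error
  // instead of letting startsWith blow up on undefined.
  if (typeof config.model !== "string") {
    throw new Error("Custom API adapter: 'model' setting is missing");
  }
  const is_openai_model = config.model.startsWith("gpt-");
  return { model: config.model, openai_compatible: !is_openai_model };
}
```

With the guard, an empty Custom API configuration fails with an actionable message rather than an uncaught `TypeError` deep in the adapter.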

@brianpetro
Owner

There is a new Ollama option that should work without all the configuration. Is there a reason you're not using that instead of the Custom API?

(screenshot: Ollama option in the model settings)

🌴

@TheOnlyThing
Author

Yes, I'm not running it locally, so that option doesn't work for me (though if it had a custom hostname option, I would find that helpful).

@brianpetro
Copy link
Owner

@TheOnlyThing thanks for clarifying that.

This commit brianpetro/jsbrains@6a488bb will fix the error in your first post when the next version is released.

🌴

brianpetro pushed a commit to brianpetro/jsbrains that referenced this issue Jan 31, 2025
…apter support (e.g. external Ollama server) brianpetro/obsidian-smart-connections#916

- Added flexible `adapters_map` to support multiple API adapter configurations
- Implemented dynamic request and response adapter selection via `api_adapter` setting
- Enhanced `settings_config` with new `api_adapter` dropdown option
- Improved endpoint generation and added method to retrieve available adapters
- Updated default request adapter to support custom model name configuration
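The commit description above can be sketched as a small adapter registry selected by an `api_adapter` setting. The registry contents and function names here are illustrative assumptions, not the actual jsbrains code:

```javascript
// Illustrative adapter registry keyed by an `api_adapter` setting.
// Each entry knows how to build its own request endpoint.
const adapters_map = {
  openai: { endpoint: () => "https://api.openai.com/v1/chat/completions" },
  // An Ollama entry built from a user-supplied hostname enables
  // external (non-localhost) Ollama servers, per this issue.
  ollama: { endpoint: (settings) => `${settings.hostname}/api/chat` },
};

// Dynamic adapter selection, falling back to the OpenAI-format adapter.
function get_adapter(settings) {
  return adapters_map[settings.api_adapter] || adapters_map.openai;
}

// Available adapters, e.g. to populate a settings dropdown.
function available_adapters() {
  return Object.keys(adapters_map);
}
```

Selecting `api_adapter: "ollama"` with a `hostname` like `http://my-server:11434` would then route requests to the remote Ollama server instead of requiring the Custom API configuration.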