Can't select model in Obsidian plugin dropdown #53
Comments
@danraymond At the moment, Ollama is missing support for model listing (at least when using the OpenAI-compatible connection).
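For context, Ollama's native REST API does expose a model-listing route (`/api/tags`), while older Ollama builds did not return models on the OpenAI-compatible `/v1/models` route that an OpenAI-style connection would query. A minimal sketch contrasting the two (TypeScript, Node 18+ `fetch`; the default local URL `http://localhost:11434` is assumed):

```typescript
// Compare Ollama's native model listing with the OpenAI-compatible route.
// Assumes Ollama is running locally on its default port (11434).
const BASE = "http://localhost:11434";

async function listNative(): Promise<string[]> {
  // Native endpoint: returns { models: [{ name: "llama3:latest", ... }, ...] }
  const res = await fetch(`${BASE}/api/tags`);
  const body = await res.json() as { models: { name: string }[] };
  return body.models.map((m) => m.name);
}

async function listOpenAICompatible(): Promise<string[]> {
  // OpenAI-compatible endpoint: returns { data: [{ id: "llama3:latest", ... }, ...] }
  // On older Ollama versions this route may be missing or empty, which would
  // leave an OpenAI-style model dropdown with nothing to show.
  const res = await fetch(`${BASE}/v1/models`);
  if (!res.ok) return [];
  const body = await res.json() as { data: { id: string }[] };
  return body.data.map((m) => m.id);
}

console.log("native:", await listNative());
console.log("openai-compatible:", await listOpenAICompatible());
```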
Is there a way to get it to work in Obsidian until the next update?
I had a similar problem; I removed the plugin folder after I uninstalled, then reinstalled, and it worked after that. I also found that if I had filled in a REST API entry (not OAI), it would make other models disappear.
v1.8.8 @danraymond - Let me know if keriati's PR resolves this issue. Thanks
The latest update responds with "model not found" in the chat box. However, I still can't set the model in the plugin parameters - is that the issue? How then do I set the model? Ollama is happily running for other plugins.
Hmm, I'm not sure if fetching via the Windows instance of Ollama is different from the Linux/macOS version. I will set up Ollama on a Windows machine to see if I can replicate the issue you are facing.
Hi longy2k,
Hi @danraymond, I just did a fresh install of Obsidian/Ollama on Windows 11 and was able to get it working. There are two potential issues that I can think of: The
v2.1.0 I also ran into the same issue of not being able to select a model in the dropdown at times. I have added a reload button next to the dropdown under 'General > Model' - let me know if this resolves your issue. Thanks
Hi Longy2k,
Please update to v2.1.1 and make sure your Ollama server handles the CORS restriction. You can follow the instructions here: https://github.com/longy2k/obsidian-bmo-chatbot/wiki/How-to-setup-with-Ollama Let me know if you get it working, thanks!
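As a quick way to confirm the CORS setting took effect outside of Obsidian, here is a rough sketch (TypeScript, Node 18+ `fetch`). It assumes Obsidian's in-app requests carry the `app://obsidian.md` origin, which is what the wiki's `OLLAMA_ORIGINS` value is meant to allow, and that Ollama is on its default local port:

```typescript
// Probe Ollama with the origin Obsidian would send and inspect the CORS response.
// Assumes OLLAMA_ORIGINS has been set per the wiki before starting the server.
const BASE = "http://localhost:11434";

const res = await fetch(`${BASE}/api/tags`, {
  // Node's fetch passes this header through; a browser would not let you set it.
  headers: { Origin: "app://obsidian.md" },
});

// If the origin is allowed, the server should echo it back (or "*") here;
// if the header is absent or the request is rejected, the plugin's in-app
// request will be blocked by CORS even though the server itself is healthy.
const allowed = res.headers.get("access-control-allow-origin");
console.log("HTTP", res.status, "| Access-Control-Allow-Origin:", allowed ?? "(none)");
```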
Hi! I am running Ollama on PopOS (Ubuntu), and I read that your tutorial is for MacOS CORS. Is it the same for Linux? (I wouldn't want to break something...) Regards, |
Option 2 might be the easiest:
If you run into port conflicts (e.g
Hi longy2k. |
For option 2, is the server successfully running? Also, check if you are on the latest Ollama version (v0.1.38). If you are on the latest Ollama version, the Ollama server is running successfully, and no models are loaded, try uninstalling and re-installing the plugin.
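To check those first two points without guessing, a tiny health-check sketch along these lines could help (TypeScript, Node 18+ `fetch`; default local URL assumed):

```typescript
// Quick health check: is the Ollama server reachable, what version is it,
// and does it report any locally installed models at all?
const BASE = "http://localhost:11434";

const version = await (await fetch(`${BASE}/api/version`)).json() as { version: string };
const tags = await (await fetch(`${BASE}/api/tags`)).json() as { models: unknown[] };

console.log(`Ollama ${version.version} is up; ${tags.models.length} model(s) installed.`);
// If the model count is 0, pull one first (e.g. `ollama pull llama3`)
// before expecting the plugin's dropdown to show anything.
```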
Hi Longy2k. The server is running properly, as I can chat with LLMs via Open WebUI. My version was .37, but I've just updated it to the latest (.39). I have also done the plugin uninstall + reinstall, but it still doesn't work... I'm sorry for the time you are spending on this... I don't know why it doesn't list the models in the dropdown despite the "Models reloaded" message... Regards,
I'm having the same problem. Using the WSL versions of Stable Diffusion and Ollama.
I'm having what looks to be the same issue. I inserted the CORS line on the remote (in-network) server where Ollama is running. I know (or think I know) that the server is accessible, since I can reach Open WebUI from my Windows machine. Nevertheless, when I plug in the Ollama REST API URL (http://ollama.server.IP.address:11434), I get no loaded models, regardless of whether Ollama has a model loaded at that moment or not. I also followed the troubleshooting instructions here and here, no dice.
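For the remote-server case, it may be worth checking that the Ollama port is reachable from the machine running Obsidian itself, not just from the box hosting Open WebUI. A rough sketch (TypeScript, Node 18+ `fetch`; `http://ollama.server.IP.address:11434` is the same placeholder as in the comment above and stands in for the real address):

```typescript
// Run this from the Windows machine that runs Obsidian, not from the Ollama host.
// Ollama answers with plain "Ollama is running" at its root URL when reachable.
const BASE = "http://ollama.server.IP.address:11434"; // replace with the real address

try {
  const root = await fetch(BASE);
  console.log("Root check:", root.status, await root.text());

  const tags = await (await fetch(`${BASE}/api/tags`)).json() as { models: { name: string }[] };
  console.log("Models visible from this machine:", tags.models.map((m) => m.name));
} catch (err) {
  // A network error here usually means the port is not reachable from this machine,
  // e.g. Ollama is bound only to localhost on the server or a firewall blocks 11434.
  console.error("Could not reach Ollama from this machine:", err);
}
```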
I am running a Windows instance of Ollama, and it works with other Ollama Obsidian plugins. With BMO, though, it won't show me any models to select in the dropdown list - it's empty. The interface works in Obsidian in the right pane, but it's not responding at all.
I am assuming it's because I can't select a model? What can I do?