Is RamaLama a drop-in replacement for Ollama? I'm trying to use RamaLama with the JetBrains AI Assistant, but it fails to connect.
Logs:
request: GET / 192.168.***.*** 200
It seems ramalama returned `gzip is not supported by this browser` to the JetBrains AI Assistant.

Steps to reproduce:

ramalama --image localhost/ramalama/rocm-gfx9:latest serve qwen2.5-coder:7b
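For what it's worth, a quick way to check whether the gzip error comes from the served endpoint itself (rather than the IDE) is to hit it directly with and without a gzip Accept-Encoding header. This is only a diagnostic sketch: the port 8080 and the header being the trigger are assumptions on my side, not something confirmed above.

```
# Diagnostic sketch (assumptions: ramalama serve listens on port 8080 and the
# error is triggered by the client asking for gzip-compressed responses).

# A plain request should return 200, matching the log line above:
curl -sv http://127.0.0.1:8080/

# Repeating the request while advertising gzip support should reproduce the
# "gzip is not supported by this browser" response if the server rejects it:
curl -sv -H 'Accept-Encoding: gzip' http://127.0.0.1:8080/
```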
@nikAizuddin Is https://www.jetbrains.com/pycharm/ a valid IDE to test with? I've never used JetBrains; I'd like to give it a try.
Yup, PyCharm Community Edition should be okay. The plugin is installed from Settings → Plugins, and configured under Settings → Tools → AI Assistant.
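In case it helps with testing, this is roughly how the server can be started on an explicit port so the IDE has a stable URL to point at. The `--port` flag and the 8080 value are assumptions here; please check `ramalama serve --help` on your install.

```
# Sketch: serve the model on a fixed port so the AI Assistant can be pointed
# at e.g. http://<host>:8080. The --port flag is assumed; confirm it exists
# in your ramalama version before relying on it.
ramalama --image localhost/ramalama/rocm-gfx9:latest serve --port 8080 qwen2.5-coder:7b
```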