Bug Description
When using gp.nvim with Ollama as a provider, streaming responses fail with a Lua error at dispatcher.lua:328. The plugin is unable to parse Ollama's Server-Sent Events (SSE) response format.
Error Message
Error executing callback:
...nvim-data/lazy/gp.nvim/lua/gp/dispatcher.lua:328: attempt to call method 'match' (a nil value)
stack traceback:
...nvim-data/lazy/gp.nvim/lua/gp/dispatcher.lua:328: in function 'process_lines'
...nvim-data/lazy/gp.nvim/lua/gp/dispatcher.lua:362: in function 'out_reader'
...nvim-data/lazy/gp.nvim/lua/gp/tasker.lua:166: in function <...nvim-data/lazy/gp.nvim/lua/gp/tasker.lua:158>
Additionally:
Gp.nvim: ollama response is empty:
'data: {"id":"chatcmpl-223","object":"chat.completion.chunk","created":1770957764,"model":"qwen2.5-coder:14b",...}'
Configuration
{
  "robitx/gp.nvim",
  config = function()
    require("gp").setup({
      providers = {
        openai = {
          disable = true,
        },
        ollama = {
          endpoint = "http://localhost:11434/v1/chat/completions",
          secret = "ollama",
        },
      },
      agents = {
        {
          name = "Clanker",
          provider = "ollama",
          chat = true,
          command = true,
          model = { model = "qwen2.5-coder:14b", temperature = 0.2, top_p = 1 },
          system_prompt = "...",
        },
      },
      default_chat_agent = "Clanker",
      default_command_agent = "Clanker",
    })
  end,
}
Root Cause
Ollama's OpenAI-compatible API returns streaming responses in SSE format with a data: prefix:
data: {"id":"chatcmpl-223","object":"chat.completion.chunk",...}
At line 328 in dispatcher.lua, the code calls line:match() without first checking that line is a string, which produces the error above; it also never strips the data: prefix, so even well-formed chunks are not recognized and the response is reported as empty.
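For reference, the prefix problem can be reproduced outside Neovim with plain Lua string operations (a minimal sketch; the SSE line is abbreviated from the log above, and the safe_match helper is purely illustrative, not part of gp.nvim):

```lua
-- A chunk as emitted by Ollama's OpenAI-compatible endpoint (abbreviated)
local line = 'data: {"object":"chat.completion.chunk","model":"qwen2.5-coder:14b"}'

-- The raw line is not valid JSON: it starts with the SSE field name, not "{"
assert(line:sub(1, 6) == "data: ")

-- Stripping the prefix yields the JSON payload that vim.json.decode expects
local payload = line:gsub("^data: ", "")
assert(payload:sub(1, 1) == "{")

-- Guard against non-string input before calling string methods,
-- which is what triggers the "attempt to call method 'match'" error
local function safe_match(l, pat)
  if type(l) ~= "string" then return nil end
  return l:match(pat)
end
print(safe_match(payload, '"model":"([^"]+)"'))  -- qwen2.5-coder:14b
```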
Current Code (line 328)
if qt.provider == "ollama" then
  if line:match('"message":') and line:match('"content":') then
    local success, decoded = pcall(vim.json.decode, line)
    if success and decoded.message and decoded.message.content then
      content = decoded.message.content
    end
  end
end
Proposed Fix
if qt.provider == "ollama" then
  -- Strip the "data: " prefix that Ollama's SSE stream adds to each chunk
  if type(line) == "string" then
    line = line:gsub("^data: ", "")
  end
  if type(line) == "string" and line:match('"content":') then
    local success, decoded = pcall(vim.json.decode, line)
    if success and decoded then
      if decoded.message and decoded.message.content then
        -- native /api/chat format
        content = decoded.message.content
      elseif decoded.choices and decoded.choices[1]
          and decoded.choices[1].delta and decoded.choices[1].delta.content then
        -- OpenAI-compatible /v1/chat/completions chunk format
        content = decoded.choices[1].delta.content
      end
    end
  end
end
Environment
- Neovim version: v0.11.6
- gp.nvim version: latest (main branch)
- Ollama version: 0.16.0
- OS: Windows 11
Steps to Reproduce
- Install and configure Ollama with any model (e.g., qwen2.5-coder:14b)
- Configure gp.nvim with the Ollama provider as shown above
- Attempt to use :GpChatNew and send a message
- Observe the error
Expected Behavior
Ollama streaming responses should be parsed correctly and displayed in the chat buffer.
Actual Behavior
An error is thrown and the response is reported as empty, even though Ollama successfully returns data.
Workaround
Manually edit dispatcher.lua line 328 with the proposed fix above.
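Note that the chunk logged above uses the OpenAI chat-completion streaming format, where the text lives under choices[1].delta.content rather than message.content. The extraction step can be sanity-checked outside Neovim with a plain Lua table standing in for the vim.json.decode result (a minimal sketch; the field names follow the chunk format shown in the error message):

```lua
-- Decoded chunk from the OpenAI-compatible endpoint, modeled as a Lua table
local decoded = {
  object = "chat.completion.chunk",
  choices = { { delta = { content = "Hello" } } },
}

-- Prefer the native message.content, then fall back to the
-- OpenAI-style choices[1].delta.content
local content
if decoded.message and decoded.message.content then
  content = decoded.message.content
elseif decoded.choices and decoded.choices[1]
    and decoded.choices[1].delta and decoded.choices[1].delta.content then
  content = decoded.choices[1].delta.content
end
print(content)  -- Hello
```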