"API call didn't return a message" error repeating very frequently in v5.5, with openai. #2255
Comments
Are you able to reproduce this error in the latest version?
Should I try with the latest version? This should have been fixed well before v5.5, so I am not sure.
If you're able to reproduce with the latest version, that would be great and would help us debug the ticket faster (if the bug is still happening), whereas we're not able to actively debug issues for older versions (e.g. v0.5.5).
For example, I don't think this should be happening when we use structured outputs, which should be on by default for
I have the exact same issue with version 0.6.5 for existing agents; new agents work fine.
Tested 0.6.6 and the issue still remains.
I suspect this error arises because we are trying to make OpenAI call send_message, which is a custom function we have provided, but the system message, agent persona, and human persona added to the context of the current message (which may include 2-3 function calls) fill up OpenAI's context limit, so send_message never gets called. Just a thought.
I don't think this is the case since OpenAI throws a special context overflow related error when this happens, which Letta has a special catch for (
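For reference, a context-window overflow from the OpenAI API comes back with a distinct error code (`context_length_exceeded`), which is what makes it distinguishable from a response that simply lacks a tool call. A minimal sketch of such a check, using hand-written error payloads rather than a live API call:

```python
def is_context_overflow(error_body: dict) -> bool:
    """Return True if an OpenAI API error payload signals a context-window overflow."""
    err = error_body.get("error") or {}
    return err.get("code") == "context_length_exceeded"

# Example payloads in the shape the OpenAI API returns errors.
overflow = {
    "error": {
        "message": "This model's maximum context length is 128000 tokens...",
        "type": "invalid_request_error",
        "code": "context_length_exceeded",
    }
}
other = {"error": {"message": "Rate limit reached", "code": "rate_limit_exceeded"}}

print(is_context_overflow(overflow))  # True
print(is_context_overflow(other))     # False
```

If the agent were actually overflowing the context window, the request would fail with the overflow payload above rather than succeeding with a tool-call-free message.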
@AliSayyah interesting - so you get this error on old agents, but not new agents. Are you able to share more details about the agent payloads? For example, it would really help debug if you're able to share the output of
@cpacker does your latest PR solve the issue? Can you explain how you solved it?
@HG2407 / @AliSayyah yep, I think the PR should fix the bug! Basically, there was a bug in the interpretation of some API responses that were missing tool calls, which turned the entire message choice into null instead of just the tool call section. It will be part of the next release, which should be out ASAP! If you want to try out the fix now, you can pull main or use the nightly builds.
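To illustrate the kind of fix described (the names below are hypothetical, not the actual Letta code): when a choice's `message` has no `tool_calls`, only that field should be treated as empty, rather than discarding the whole message:

```python
def parse_choice(choice: dict) -> dict:
    """Parse one chat-completion choice, tolerating a missing tool_calls field."""
    message = choice.get("message")
    if message is None:
        raise ValueError("API call didn't return a message")
    return {
        "content": message.get("content"),
        # An absent tool_calls becomes an empty list instead of nulling the message.
        "tool_calls": message.get("tool_calls") or [],
    }

# A choice with a tool call, and one with plain content only.
with_tool = {"message": {"content": None,
                         "tool_calls": [{"function": {"name": "send_message",
                                                      "arguments": '{"message": "hi"}'}}]}}
without_tool = {"message": {"content": "Thinking out loud...", "tool_calls": None}}

print(parse_choice(with_tool)["tool_calls"][0]["function"]["name"])  # send_message
print(parse_choice(without_tool)["tool_calls"])  # []
```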
@cpacker I tried the nightly build and the issue remains, unfortunately.
@AliSayyah @HG2407 are either of you able to provide me a quick way to reproduce the issue on the latest docker or source? For example, something like:
@cpacker I haven't found any particular pattern to reproduce it, but I'll try again today.
@cpacker this issue usually arises when the agent needs to call two or three functions in one go, sequentially. When that happens, usually 2-3 functions are called and then the request ends.
On the latest version (0.6.7) I get this error message. Maybe a different issue?
interesting - @AliSayyah is it possible to share the payload (request) going to OpenAI here? Feel free to replace any sensitive strings - I'm more curious how this payload ended up happening. Is it a
Also - is the agent here effectively "bricked"? E.g. does every step/message cause this error? Or is it stochastic?
Do I need to edit the source code to do this? Is there an easier way?
Describe the bug
The "API call didn't return a message" error is repeating very frequently with letta v5.5. I am using OpenAI's GPT-4o. The problem is that the inner monologue is displayed every time on the UI, but no actual user response is given. I have been able to debug the issue: in a normal response, when everything is correct, message.choices contains a tool call for send_message, but whenever this issue arises, message.choices doesn't contain any function call. This should have been fixed according to a previously closed bug, but it is still there and is very frustrating. I am attaching the logs as well.
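The failure mode described above can be sketched as a check on the raw response (`extract_send_message` is a hypothetical helper for illustration, not Letta's actual code): the assistant message carries inner-monologue content but no send_message tool call, which is what produces the error:

```python
import json

def extract_send_message(response: dict) -> str:
    """Pull the user-facing text out of a send_message tool call, if present."""
    message = response["choices"][0]["message"]
    for call in message.get("tool_calls") or []:
        if call["function"]["name"] == "send_message":
            return json.loads(call["function"]["arguments"])["message"]
    # Inner monologue came back, but no send_message call: the reported bug.
    raise ValueError("API call didn't return a message")

good = {"choices": [{"message": {
    "content": "User greeted me; I should reply warmly.",
    "tool_calls": [{"function": {"name": "send_message",
                                 "arguments": json.dumps({"message": "Hello!"})}}]}}]}
bad = {"choices": [{"message": {
    "content": "User greeted me; I should reply warmly.",
    "tool_calls": None}}]}

print(extract_send_message(good))  # Hello!
```

On the `bad` payload, the helper raises the same "API call didn't return a message" error even though inner-monologue content is present, matching the behavior seen in the UI.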
Please describe your setup
- How did you install letta? (pip install letta / pip install letta-nightly / git clone)
- How are you running letta? (cmd.exe / PowerShell / Anaconda Shell / Terminal)
Screenshots
correct response:
error response:
Additional context
I have also modified the parameters of _get_ai_reply a little:
Letta Config
Please attach your ~/.letta/config file or copy-paste it below. If you're not using OpenAI, please provide additional information on your local LLM setup:
Local LLM details
If you are trying to run Letta with local LLMs, please provide the following information:
dolphin-2.1-mistral-7b.Q6_K.gguf