[BUG] #416
Status: Open
Labels: bug (Something isn't working)
Description
Describe the bug
While running the Strix app, I eventually get this error: LLM Request Failed
Details: litellm.BadRequestError: AnthropicException
{"type":"error","error":{"type":"invalid_request_error","message":"This model does not support assistant message prefill. The conversation must end with a user message."}}
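For context, the Anthropic API is rejecting the request because the message list sent by the agent ends with an assistant message. A minimal client-side workaround sketch, assuming the message list is accessible before the litellm call (the helper name and the placeholder text are hypothetical, not part of Strix or litellm):

```python
def ensure_ends_with_user(messages, placeholder="Continue."):
    """Return a copy of `messages` guaranteed to end with a user message.

    Anthropic models that do not support assistant-message prefill
    reject conversations whose final message has role "assistant".
    Appending a neutral user turn avoids the invalid_request_error.
    """
    fixed = list(messages)
    if fixed and fixed[-1].get("role") == "assistant":
        fixed.append({"role": "user", "content": placeholder})
    return fixed


# Example: a conversation that would trigger the error as-is.
msgs = [
    {"role": "user", "content": "Scan this app."},
    {"role": "assistant", "content": "Partial findings so far..."},
]
safe_msgs = ensure_ends_with_user(msgs)
# safe_msgs can then be passed as messages= to litellm.completion()
```

This is only a sketch of the shape of a fix; the real change presumably belongs wherever Strix assembles the conversation before dispatching it to the provider.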
To Reproduce
Steps to reproduce the behavior:
- Use Anthropic provider
- Use Sonnet 4.6
- Run against a large python app
Expected behavior
The request should succeed without this error.
System Information:
- OS: Ubuntu 22.04
- Strix Version or Commit: Latest, I think (Docker image: ghcr.io/usestrix/strix-sandbox:0.1.10)
- Python Version: 3.12
- LLM Used: Claude Sonnet 4.6