
[BUG] #416

@jarildson

Description


Describe the bug
While running the Strix app, I eventually get this error:

LLM Request Failed
Details: litellm.BadRequestError: AnthropicException
{"type":"error","error":{"type":"invalid_request_error","message":"This model does not support assistant message prefill. The conversation must end with a user message."}}
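For context, the error means Anthropic's Messages API rejected the request because the conversation ended with an assistant-role message, which only models that support assistant-message prefill accept. A minimal sketch of a client-side guard (the helper name and the nudge text are my own illustration, not part of Strix):

```python
def ensure_ends_with_user(messages, nudge="Please continue."):
    """Pad a trailing assistant turn with a neutral user turn so the
    conversation ends with a user message, as Anthropic requires for
    models that do not support assistant-message prefill."""
    if messages and messages[-1]["role"] == "assistant":
        # Appending a user turn avoids the invalid_request_error
        # reported in this issue.
        messages = messages + [{"role": "user", "content": nudge}]
    return messages

# A conversation shaped like the one that triggers the error:
history = [
    {"role": "user", "content": "Scan this app."},
    {"role": "assistant", "content": "Starting scan..."},
]
safe = ensure_ends_with_user(history)
print(safe[-1]["role"])  # user
```

Whether Strix should apply a guard like this or simply never emit a trailing assistant message is a design question for the maintainers; the sketch only demonstrates the API constraint named in the error.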

To Reproduce
Steps to reproduce the behavior:

  1. Use Anthropic provider
  2. Use Sonnet 4.6
  3. Run against a large python app

Expected behavior
The request should complete without this error.

Screenshots

(Screenshot attached showing the error output.)

System Information:

  • OS: Ubuntu 22.04
  • Strix Version or Commit: latest (I believe); the run pulled Docker image ghcr.io/usestrix/strix-sandbox:0.1.10
  • Python Version: 3.12
  • LLM Used: Claude Sonnet 4.6


Metadata

Labels: bug (Something isn't working)
Assignees: none
Milestone: none