Labels: bug (Something is broken or behaving incorrectly), needs-triage (Issue needs maintainer review and initial categorization)
Description
Before submitting
- I searched existing issues and did not find a duplicate.
- I included enough detail to reproduce or investigate the problem.
Area
apps/web
Steps to reproduce
- Start the app
- Start a new thread for an existing project
- Select "Claude" -> "Claude Opus 4.6"
- T3 Code requires a login (although via the Claude Code CLI, everything works without any login requirement): "Not logged in · Please run /login"
Another way (since LiteLLM exposes an OpenAI-compatible API):
- Start the app
- Go to Settings -> Models -> Codex -> Custom model slug -> `claude-opus-4-6`
- Start a new thread for an existing project
- Select "Codex" -> `claude-opus-4-6`
- Error: `Provider turn start failed - Error: Provider adapter request failed (claudeAgent) for thread.turn.start: Thread '89e787be-e659-46bf-b18b-935b6c49...`
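For context on why the same token works for both CLIs: a LiteLLM proxy key is typically accepted under both the Anthropic-style `x-api-key` header and the OpenAI-style `Authorization: Bearer` header. A hypothetical sketch (illustration only, not the app's actual code) of the two conventions the single `REDACTED` token should satisfy:

```python
def proxy_auth_headers(token: str, style: str) -> dict:
    """Build auth headers for a LiteLLM proxy request.

    Hypothetical helper for illustration; 'style' picks the header convention.
    """
    if style == "anthropic":
        # Anthropic-compatible endpoints expect x-api-key plus a version header.
        return {"x-api-key": token, "anthropic-version": "2023-06-01"}
    # OpenAI-compatible endpoints expect a Bearer token.
    return {"Authorization": f"Bearer {token}"}

print(proxy_auth_headers("REDACTED", "anthropic"))
print(proxy_auth_headers("REDACTED", "openai"))
```

Either way, no interactive `/login` should be needed when such a token is already configured.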
Expected behavior
Everything would work as it does in the Claude Code CLI: no errors and no login required.
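Concretely, the expectation is that the app resolves its endpoint and credentials from the environment the same way the CLI does. A minimal sketch of that resolution order (hypothetical function, assuming the standard `ANTHROPIC_BASE_URL` / `ANTHROPIC_AUTH_TOKEN` variables):

```python
import os

def resolve_claude_endpoint(env=None):
    # Hypothetical sketch of CLI-style resolution: prefer ANTHROPIC_BASE_URL
    # and ANTHROPIC_AUTH_TOKEN from the environment, fall back to the public
    # API (where a missing token would mean interactive login is required).
    env = dict(os.environ) if env is None else env
    base = env.get("ANTHROPIC_BASE_URL", "https://api.anthropic.com").rstrip("/")
    token = env.get("ANTHROPIC_AUTH_TOKEN")
    return base, token

base, token = resolve_claude_endpoint({
    "ANTHROPIC_BASE_URL": "http://localhost:36253",
    "ANTHROPIC_AUTH_TOKEN": "REDACTED",
})
print(base, token)  # http://localhost:36253 REDACTED
```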
Actual behavior
Requires a login or shows an error
Impact
Blocks work completely
Version or commit
0.0.13 (2a237c2)
Environment
macOS 15.7.4 (24G517)
Logs or stack traces
Our company uses LiteLLM to provide access to different models via the "Google Vertex AI Platform".
I have an internal tool running on http://localhost:36253 (this is a local LiteLLM proxy, I assume).
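The custom slug can be checked against the proxy's OpenAI-compatible `GET /v1/models` listing (the same shape as the payload further below). A small hypothetical sketch of that check, using an inline sample instead of a live request:

```python
def model_available(listing: dict, slug: str) -> bool:
    # An OpenAI-compatible /v1/models response looks like
    # {"object": "list", "data": [{"id": "...", ...}, ...]}.
    return any(m.get("id") == slug for m in listing.get("data", []))

sample = {"object": "list", "data": [{"id": "claude-opus-4-6", "object": "model"}]}
print(model_available(sample, "claude-opus-4-6"))  # True
print(model_available(sample, "missing-model"))    # False
```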
I just have these envs set in my ~/.zshrc, and the Claude Code CLI works without any login prompts:

# Claude Code
export ANTHROPIC_BASE_URL="http://localhost:36253"
export ANTHROPIC_AUTH_TOKEN="REDACTED"
export CLAUDE_CODE_DISABLE_EXPERIMENTAL_BETAS="1"
export DISABLE_TELEMETRY="1"
export DISABLE_ERROR_REPORTING="1"

# Codex CLI
export OPENAI_BASE_URL="http://localhost:36253"
export OPENAI_API_KEY="REDACTED"  # same value as ANTHROPIC_AUTH_TOKEN

Here's additional metadata about the available LiteLLM models (from an internal tool API call):
{
"data": [
{
"id": "gpt-5-mini",
"object": "model",
"created": 1761865200,
"owned_by": "openai"
},
{
"id": "gpt-5-2-medium",
"object": "model",
"created": 1761865200,
"owned_by": "openai"
},
{
"id": "gpt-5-codex-medium",
"object": "model",
"created": 1761865200,
"owned_by": "openai"
},
{
"id": "claude-opus-4-5-20251101",
"object": "model",
"created": 1761865200,
"owned_by": "openai"
},
{
"id": "claude-opus-4-6",
"object": "model",
"created": 1761865200,
"owned_by": "openai"
},
{
"id": "gemini-3-flash-preview",
"object": "model",
"created": 1761865200,
"owned_by": "openai"
},
{
"id": "gpt-5-1-low",
"object": "model",
"created": 1761865200,
"owned_by": "openai"
},
{
"id": "gemini-3-pro-preview",
"object": "model",
"created": 1761865200,
"owned_by": "openai"
},
{
"id": "gemini-embedding-001",
"object": "model",
"created": 1761865200,
"owned_by": "openai"
},
{
"id": "gpt-5-1-high",
"object": "model",
"created": 1761865200,
"owned_by": "openai"
},
{
"id": "gpt-5-high",
"object": "model",
"created": 1761865200,
"owned_by": "openai"
},
{
"id": "gpt-5-codex-low",
"object": "model",
"created": 1761865200,
"owned_by": "openai"
},
{
"id": "gpt-5-1-codex-high",
"object": "model",
"created": 1761865200,
"owned_by": "openai"
},
{
"id": "gpt-5-4-low",
"object": "model",
"created": 1761865200,
"owned_by": "openai"
},
{
"id": "claude-3-5-haiku-20241022",
"object": "model",
"created": 1761865200,
"owned_by": "openai"
},
{
"id": "gpt-5-1-medium",
"object": "model",
"created": 1761865200,
"owned_by": "openai"
},
{
"id": "gpt-5-1-codex-low",
"object": "model",
"created": 1761865200,
"owned_by": "openai"
},
{
"id": "gemini-2-5-pro",
"object": "model",
"created": 1761865200,
"owned_by": "openai"
},
{
"id": "gpt-5-low",
"object": "model",
"created": 1761865200,
"owned_by": "openai"
},
{
"id": "gpt-5-4-high",
"object": "model",
"created": 1761865200,
"owned_by": "openai"
},
{
"id": "claude-3-7-sonnet-20250219",
"object": "model",
"created": 1761865200,
"owned_by": "openai"
},
{
"id": "claude-opus-4-1-20250805",
"object": "model",
"created": 1761865200,
"owned_by": "openai"
},
{
"id": "gpt-5-1-codex-medium",
"object": "model",
"created": 1761865200,
"owned_by": "openai"
},
{
"id": "gpt-5-3-codex-low",
"object": "model",
"created": 1761865200,
"owned_by": "openai"
},
{
"id": "claude-haiku-4-5-20251001",
"object": "model",
"created": 1761865200,
"owned_by": "openai"
},
{
"id": "gemini-3-1-pro-preview",
"object": "model",
"created": 1761865200,
"owned_by": "openai"
},
{
"id": "claude-opus-4-20250514",
"object": "model",
"created": 1761865200,
"owned_by": "openai"
},
{
"id": "claude-sonnet-4-5-20250929",
"object": "model",
"created": 1761865200,
"owned_by": "openai"
},
{
"id": "gpt-5-medium",
"object": "model",
"created": 1761865200,
"owned_by": "openai"
},
{
"id": "gpt-5-2-high",
"object": "model",
"created": 1761865200,
"owned_by": "openai"
},
{
"id": "claude-sonnet-4-6",
"object": "model",
"created": 1761865200,
"owned_by": "openai"
},
{
"id": "gpt-5-3-codex-high",
"object": "model",
"created": 1761865200,
"owned_by": "openai"
},
{
"id": "gpt-4o-mini",
"object": "model",
"created": 1761865200,
"owned_by": "openai"
},
{
"id": "gpt-5-4-medium",
"object": "model",
"created": 1761865200,
"owned_by": "openai"
},
{
"id": "gpt-5-2-low",
"object": "model",
"created": 1761865200,
"owned_by": "openai"
},
{
"id": "gpt-5-3-codex-medium",
"object": "model",
"created": 1761865200,
"owned_by": "openai"
},
{
"id": "gemini-2-5-flash",
"object": "model",
"created": 1761865200,
"owned_by": "openai"
},
{
"id": "gpt-5-codex-high",
"object": "model",
"created": 1761865200,
"owned_by": "openai"
},
{
"id": "claude-sonnet-4-20250514",
"object": "model",
"created": 1761865200,
"owned_by": "openai"
}
],
"object": "list"
}

Here's an example of an opencode.json config that works just fine as well:
{
"$schema": "https://opencode.ai/config.json",
"mcp": {
"devinfra": {
"type": "local",
"command": ["dp-devinfra", "mcp"],
"enabled": true
}
},
"provider": {
"litellm-proxy": {
"npm": "@ai-sdk/openai-compatible",
"name": "LiteLLM Proxy",
"options": {
"baseURL": "http://localhost:36253/v1",
"apiKey": "cloudflare"
},
"models": {
"claude-opus-4-6": {
"name": "Claude Opus 4.6 (via LiteLLM)",
"limit": {
"context": 200000,
"output": 32768
}
},
"claude-sonnet-4-6": {
"name": "Claude Sonnet 4.6 (via LiteLLM)",
"limit": {
"context": 200000,
"output": 64000
}
},
"claude-haiku-4-5-20251001": {
"name": "Claude Haiku 4.5 (via LiteLLM)",
"limit": {
"context": 200000,
"output": 8192
}
},
"gpt-5-2-high": {
"name": "GPT-5.2 High (via LiteLLM)",
"limit": {
"context": 272000,
"output": 128000
}
},
"gpt-5-high": {
"name": "GPT-5 High (via LiteLLM)",
"limit": {
"context": 272000,
"output": 128000
}
},
"gpt-5-mini": {
"name": "GPT-5 Mini (via LiteLLM)",
"limit": {
"context": 272000,
"output": 128000
}
},
"gemini-3-1-pro-preview": {
"name": "Gemini 3.1 Pro (via LiteLLM)",
"limit": {
"context": 1048576,
"output": 65535
}
},
"gemini-2-5-flash": {
"name": "Gemini 2.5 Flash (via LiteLLM)",
"limit": {
"context": 1048576,
"output": 65535
}
}
}
}
}
}

Screenshots, recordings, or supporting files
Workaround
No response