Problem
When using a custom provider configured via opencode.jsonc with @ai-sdk/openai-compatible, the UI does not track or display $ Spent. The cost remains at $0.00 regardless of token usage.
Built-in providers get their pricing data from models.dev and cost tracking works for those. For custom providers, the cost config fields are available in the schema but do not appear to have any effect on the UI.
Steps to Reproduce
- Configure a custom provider in `opencode.jsonc` with `cost` fields specified per the config schema (see config below).
- Start a session using the custom provider model.
- Send messages and observe the cost display in the UI.
Expected Behavior
The UI calculates and displays a running cost using the cost values defined in the model config and the token usage from the response.
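For reference, a minimal sketch of the expected calculation: multiply each token count from the response by the corresponding per-token price from the model's `cost` config. The `ModelCost`/`Usage` shapes and the `messageCost` helper below are hypothetical illustrations, not opencode's actual implementation.

```typescript
// Hypothetical sketch of the expected per-message cost calculation,
// assuming the `cost` fields are USD prices per token.
interface ModelCost {
  input: number;
  output: number;
  cache_read?: number;
  cache_write?: number;
}

interface Usage {
  inputTokens: number;
  outputTokens: number;
  cacheReadTokens?: number;
  cacheWriteTokens?: number;
}

function messageCost(cost: ModelCost, usage: Usage): number {
  return (
    usage.inputTokens * cost.input +
    usage.outputTokens * cost.output +
    (usage.cacheReadTokens ?? 0) * (cost.cache_read ?? 0) +
    (usage.cacheWriteTokens ?? 0) * (cost.cache_write ?? 0)
  );
}

// Using the cost values from the config in this report:
const cost: ModelCost = {
  input: 0.0000055,
  output: 0.0000275,
  cache_read: 5.5e-7,
  cache_write: 0.000006875,
};
const usage: Usage = { inputTokens: 1000, outputTokens: 500 };
console.log(messageCost(cost, usage).toFixed(6)); // 0.019250
```

With custom providers, this per-message amount never accumulates into the "$ Spent" display, even though the token counts themselves are reported correctly.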
Actual Behavior
$ Spent remains at $0.00 for the entire session. The cost config fields on custom provider models have no observable effect.
Environment
- Custom provider backed by a LiteLLM proxy via `@ai-sdk/openai-compatible`
- Multiple models tested; the issue is consistent across all custom provider models
- `cost` structure matches the config JSON schema at https://opencode.ai/config.json, including the `input`, `output`, `cache_read`, `cache_write`, and `context_over_200k` sub-fields
The `opencode.jsonc` provider config used to reproduce:

```jsonc
{
  "provider": {
    "my-provider": {
      "name": "My Provider",
      "npm": "@ai-sdk/openai-compatible",
      "models": {
        "my-model": {
          "name": "My Model",
          "cost": {
            "input": 0.0000055,
            "output": 0.0000275,
            "cache_read": 5.5e-7,
            "cache_write": 0.000006875
          },
          "limit": {
            "context": 200000,
            "output": 64000,
            "input": 200000
          },
          "tool_call": true
        }
      },
      "options": {
        "baseURL": "https://my-proxy.example.com"
      }
    }
  }
}
```