[Feature]: Support include_usage for bedrock #4407
Labels: enhancement (New feature or request)

Comments
This already works @Manouchehri, e.g. the response from Predibase. Can you share a case where it didn't work? We can file that as an issue.
It's missing from Bedrock:

```shell
curl -v "${OPENAI_API_BASE}/chat/completions" \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $OPENAI_API_KEY" \
  -d '{
    "model": "claude-3-5-sonnet-20240620",
    "max_tokens": 10,
    "seed": 4242,
    "stream": true,
    "temperature": 0.0,
    "messages": [
      {
        "role": "user",
        "content": "Hello"
      }
    ],
    "stream_options": {
      "include_usage": true
    }
  }'
```
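For reference, when `include_usage` is supported, the OpenAI streaming format delivers token counts in one extra terminal chunk whose `choices` list is empty, while all earlier chunks carry `"usage": null`. A minimal sketch of pulling that usage object out of a stream (the helper name and the sample chunks are illustrative, not from LiteLLM):

```python
# Sketch: extract the usage object from an OpenAI-style stream when
# "stream_options": {"include_usage": true} is set. Per the streaming
# format, only the final chunk (with an empty "choices" list) carries
# a non-null "usage" object.
from typing import Iterable, Optional


def extract_stream_usage(chunks: Iterable[dict]) -> Optional[dict]:
    """Return the usage dict from the terminal chunk of a stream, if any."""
    usage = None
    for chunk in chunks:
        # Earlier chunks have "usage": null; only the last one is populated.
        if chunk.get("usage") is not None:
            usage = chunk["usage"]
    return usage


# Example chunks shaped like chat.completion.chunk objects (illustrative).
sample_stream = [
    {"choices": [{"delta": {"content": "Hel"}}], "usage": None},
    {"choices": [{"delta": {"content": "lo"}}], "usage": None},
    {"choices": [], "usage": {"prompt_tokens": 8, "completion_tokens": 2, "total_tokens": 10}},
]
print(extract_stream_usage(sample_stream))
# {'prompt_tokens': 8, 'completion_tokens': 2, 'total_tokens': 10}
```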
I think it's missing from Azure OpenAI as well, though I haven't confirmed that yet.
Confirmed, it is also missing/not working for Azure OpenAI requests.
It's missing from Anthropic (directly) too.
The Feature

It'd be really nice if `include_usage` worked on providers other than just OpenAI. I think LiteLLM should be able to do this, since we already calculate the cost elsewhere?

Motivation, pitch

It's really useful for users to know how many tokens they've spent on each streaming request.
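Since the proxy already sees the full prompt and the accumulated completion text, one way it could emulate `include_usage` for providers without native support is to count tokens itself and append a synthetic final usage chunk. A hedged sketch of that idea; `count_tokens` is a whitespace-splitting placeholder, not a real tokenizer, and `with_usage` is a hypothetical helper, not LiteLLM's actual implementation:

```python
# Sketch: wrap an upstream stream and append a synthesized usage chunk,
# mimicking OpenAI's include_usage behavior (final chunk: empty "choices",
# populated "usage"). Assumptions: chunks are OpenAI-shaped dicts.
from typing import Iterable, Iterator


def count_tokens(text: str) -> int:
    # Placeholder: a real implementation would use the model's tokenizer
    # or the provider's reported token counts.
    return len(text.split())


def with_usage(chunks: Iterable[dict], prompt: str) -> Iterator[dict]:
    """Yield chunks unchanged, then append a synthetic usage chunk."""
    completion_text = ""
    for chunk in chunks:
        for choice in chunk.get("choices", []):
            completion_text += choice.get("delta", {}).get("content", "")
        yield chunk
    prompt_tokens = count_tokens(prompt)
    completion_tokens = count_tokens(completion_text)
    yield {
        "choices": [],
        "usage": {
            "prompt_tokens": prompt_tokens,
            "completion_tokens": completion_tokens,
            "total_tokens": prompt_tokens + completion_tokens,
        },
    }


upstream = [{"choices": [{"delta": {"content": "Hello there"}}], "usage": None}]
final_chunk = list(with_usage(upstream, "Say hello"))[-1]
print(final_chunk["usage"])
# {'prompt_tokens': 2, 'completion_tokens': 2, 'total_tokens': 4}
```

This keeps the client-facing stream byte-compatible with OpenAI's format, so existing SDKs that already understand `include_usage` work unchanged.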
Twitter / LinkedIn details
https://www.linkedin.com/in/davidmanouchehri/