
LiteLLM Minor Fixes & Improvements (01/16/2025) - p2 #7828

Open

wants to merge 14 commits into base: main
Conversation

krrishdholakia
Contributor

Title

Relevant issues

Type

🆕 New Feature
🐛 Bug Fix
🧹 Refactoring
📖 Documentation
🚄 Infrastructure
✅ Test

Changes

[REQUIRED] Testing - Attach a screenshot of any new tests passing locally

If UI changes, send a screenshot/GIF of working UI fixes

krrishdholakia and others added 8 commits January 16, 2025 10:46
* Update azure.py

Added optional parameter azure ad token provider

* Added parameter to main.py

* Found token provider arg location

* Fixed embeddings

* Fixed ad token provider

---------

Co-authored-by: Krish Dholakia <[email protected]>
get v0 out for sync azure gpt route to begin with
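(Context on the first commit in this batch, which adds an optional Azure AD token provider: a rough sketch of how that parameter is typically wired up with azure-identity. This assumes the new keyword is exposed on litellm.completion as azure_ad_token_provider; the endpoint, deployment name, and API version below are placeholders.)

# Sketch only: assumes litellm.completion accepts the `azure_ad_token_provider`
# keyword added by the commit above; endpoint/deployment/version are placeholders.
from azure.identity import DefaultAzureCredential, get_bearer_token_provider
import litellm

token_provider = get_bearer_token_provider(
    DefaultAzureCredential(),  # picks up managed identity / CLI / env credentials
    "https://cognitiveservices.azure.com/.default",
)

response = litellm.completion(
    model="azure/my-gpt-4o-deployment",            # hypothetical deployment name
    messages=[{"role": "user", "content": "Hello, how are you?"}],
    api_base="https://my-endpoint.openai.azure.com",
    api_version="2024-02-15-preview",
    azure_ad_token_provider=token_provider,        # instead of api_key / a static azure_ad_token
)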
vercel bot commented Jan 17, 2025

The latest updates on your projects. Learn more about Vercel for Git ↗︎

litellm: ✅ Ready · Updated (UTC): Jan 29, 2025 1:29am

@krrishdholakia changed the title from "Litellm dev 01 16 2025 p1" to "LiteLLM Minor Fixes & Improvements (01/16/2025) - p2" on Jan 17, 2025
model does not support it
@sterankin

Does this PR allow calling a Bedrock model via a REST endpoint with a Bearer token?

@krrishdholakia
Contributor Author

Does this PR allow calling a Bedrock model via a REST endpoint with a Bearer token?

I'm confused @sterankin, that's always been possible: https://docs.litellm.ai/docs/providers/bedrock#3-test-it
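(Not from the thread, but for context: the linked docs section tests a Bedrock model through the LiteLLM proxy, where the proxy holds the AWS credentials and the caller only sends a Bearer-style key. A minimal sketch of that flow as a raw REST call; the proxy URL, key, and model alias are placeholders.)

# Sketch: calling a Bedrock model through a running LiteLLM proxy with a Bearer token.
# "bedrock-nova" is a hypothetical alias configured on the proxy; the proxy itself
# holds the AWS credentials.
import requests

resp = requests.post(
    "http://localhost:4000/v1/chat/completions",    # LiteLLM proxy endpoint (placeholder)
    headers={"Authorization": "Bearer sk-1234"},     # proxy virtual key sent as a Bearer token
    json={
        "model": "bedrock-nova",
        "messages": [{"role": "user", "content": "Hello, how are you?"}],
    },
)
print(resp.json())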

google ai studio raises errors
@sterankin

sterankin commented Jan 24, 2025

@krrishdholakia - that seems to require an AWS Account ID and secret. However, I have my Bedrock models behind an API Gateway with an authorizer, so I need to be able to call the endpoint with a Bearer Token - we are not hitting Bedrock directly.

API Gateway -> Lambda -> Bedrock

Is it possible to use LiteLLM to call a custom URL and add a custom header as well as a Bearer token for auth?

For example, like this:

import os
import litellm
from litellm import completion

litellm.set_verbose = True  # 👈 SEE RAW REQUEST

response = completion(
    model="bedrock/converse/us.amazon.nova-pro-v1:0",
    messages=[{"content": "Hello, how are you?", "role": "user"}],
    aws_access_key_id="",
    aws_secret_access_key="",
    aws_region_name="",
    aws_bedrock_runtime_endpoint="https://my-apigateway-url/bedrock",
    extra_headers={"user_id": "1234567", "authorization": "BEARER TOKEN GOES HERE"},
)

This fails with:

litellm.exceptions.AuthenticationError: litellm.AuthenticationError: BedrockException Invalid Authentication - Unable to locate credentials

@krrishdholakia
Contributor Author

krrishdholakia commented Jan 24, 2025

Yes, that should already work @sterankin

I believe we have a test for this as well. Let me add the flow to our docs.

I think there's a minor change you need to make in your example

@sterankin

Thanks for the help @krrishdholakia - I look forward to seeing what change is needed. I couldn't see it in the docs, but maybe they need updating.

@sterankin

Hi @krrishdholakia, did you find out how my example should be changed?

@krrishdholakia
Contributor Author

Does this solve your problem? @sterankin

@sterankin

@krrishdholakia Thank you for updating the docs here:

https://docs.litellm.ai/docs/providers/bedrock#calling-via-proxy

However in the code you have the line:

client=client

What should client be set to in the example?

Thanks

@krrishdholakia
Contributor Author

@sterankin that's from my test - forgot to clean it up, done now.

It was from this test -

async def test_bedrock_extra_headers():
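(For anyone wondering what the stray client did there: in litellm's test suite a custom HTTP client is usually injected so the test can inspect the outgoing request instead of hitting AWS. A rough sketch of that pattern, assuming litellm's HTTPHandler and the client kwarg behave here as in other provider tests; the details may differ from the actual test_bedrock_extra_headers.)

# Sketch of the mock-client test pattern (assumed, not the actual litellm test).
from unittest.mock import patch

import litellm
from litellm.llms.custom_httpx.http_handler import HTTPHandler

client = HTTPHandler()

with patch.object(client, "post") as mock_post:
    try:
        litellm.completion(
            model="bedrock/converse/us.amazon.nova-pro-v1:0",
            messages=[{"role": "user", "content": "hi"}],
            aws_access_key_id="fake-key-for-signing-only",   # dummy creds so SigV4 signing succeeds
            aws_secret_access_key="fake-secret",
            aws_region_name="us-east-1",
            aws_bedrock_runtime_endpoint="https://my-apigateway-url/bedrock",
            extra_headers={"test": "hello world"},
            client=client,  # inject the handler so the request never leaves the test
        )
    except Exception:
        pass  # the mocked response can't be parsed; we only care about the request
    mock_post.assert_called_once()
    print(mock_post.call_args.kwargs["headers"])  # the extra headers should appear here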

@sterankin

I keep getting this error:

Traceback (most recent call last):
  File "/myenv/lib/python3.13/site-packages/litellm/main.py", line 2538, in completion
    response = bedrock_converse_chat_completion.completion(
        model=model,
    ...<13 lines>...
        api_base=api_base,
    )
  File "/myenv/lib/python3.13/site-packages/litellm/llms/bedrock/chat/converse_handler.py", line 413, in completion
    prepped = self.get_request_headers(
        credentials=credentials,
    ...<4 lines>...
        headers=headers,
    )
  File "/myenv/lib/python3.13/site-packages/litellm/llms/bedrock/base_aws_llm.py", line 497, in get_request_headers
    sigv4.add_auth(request)
    ~~~~~~~~~~~~~~^^^^^^^^^
  File "/myenv/lib/python3.13/site-packages/botocore/auth.py", line 423, in add_auth
    raise NoCredentialsError()
botocore.exceptions.NoCredentialsError: Unable to locate credentials

Why is it trying to use AWS auth and boto3 when I am passing a Bearer token to the API for auth?
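(Context, not from the thread: the traceback shows litellm SigV4-signing the outgoing request via botocore before it is sent, so it needs some AWS credentials even when the gateway only checks the Bearer header. A minimal sketch of that failure path, assuming standard botocore behaviour.)

# Minimal reproduction of the signing step that fails in the traceback above.
# SigV4Auth.add_auth raises NoCredentialsError when no credentials are supplied,
# regardless of any Authorization header already set on the request.
import botocore.auth
import botocore.awsrequest

request = botocore.awsrequest.AWSRequest(
    method="POST",
    url="https://my-apigateway-url/bedrock",
    headers={"authorization": "BEARER TOKEN GOES HERE"},
)
signer = botocore.auth.SigV4Auth(
    credentials=None, service_name="bedrock", region_name="us-east-1"
)
signer.add_auth(request)  # raises botocore.exceptions.NoCredentialsError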

@krrishdholakia
Contributor Author

krrishdholakia commented Jan 29, 2025

@sterankin can we track this on a separate ticket? I believe we currently assume you have credentials; you just need to route the call via the proxy.

In your case, does your proxy contain the AWS credentials?
