feat(adapters): Support additional headers for OpenAI backend #533
base: main
Conversation
Force-pushed from 84f8989 to 65e8b01
Signed-off-by: Nigel Jones <[email protected]>
Thanks... just a couple of comments. Your thoughts are appreciated.
```
@@ -30,6 +31,21 @@ def provider_id(self) -> ProviderName:
    def __init__(self, model_id: str | None = None, settings: dict | None = None) -> None:
        _settings = settings.copy() if settings is not None else {}

        # Extra headers -- ignored if invalid
        extra_headers = None
        extra_headers_json = _settings.get("extra_headers", os.getenv("OPENAI_EXTRA_HEADERS"))
```
`extra_headers` is supported across multiple services, e.g.,
- https://docs.litellm.ai/docs/providers/anthropic#passing-extra-headers-to-anthropic-api
- https://docs.litellm.ai/docs/providers/bedrock#passing-extra-headers--custom-api-endpoints

To that end, should this really be `OPENAI_EXTRA_HEADERS`, and should it be in the OpenAI adapter? I would think we want this in the base LiteLLM adapter so it could be applicable across all adapters (and replace the `OPENAI_` prefix with something less service-specific).
Yes, I'd agree - if we can do this at the litellm level it will be much more flexible
```
            pass

        if extra_headers:
            _settings["headers"] = extra_headers
```
It probably should be `extra_headers` so that we just augment what is already there?

I can look at this tomorrow and refactor to make it more generic. (If urgent, it's OK to merge, then update after.)
```
                if isinstance(parsed_headers, dict):
                    extra_headers = parsed_headers
            except json.JSONDecodeError:
                pass
```
I would say that more adapters will use it, so I would extract it into a separate file (maybe `utils/strings.py`).
Rather than adding multiple environment variables and managing precedence across inheritance, are we ok to just add a single one?

Previously we had `OPENAI_EXTRA_HEADERS`. If this is set at the LiteLLM adapter level it might be … It might be more generic in future if we use …

My thought is to just have a single variable, to name it …
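Combining the "single variable" proposal with the earlier suggestion to augment existing headers rather than overwrite them, the approach could be sketched like this. The `LLM_EXTRA_HEADERS` variable name and `apply_extra_headers` helper are purely illustrative assumptions, not names from the PR:

```python
import json
import os


def apply_extra_headers(settings: dict, env_var: str = "LLM_EXTRA_HEADERS") -> dict:
    """Merge JSON headers from the environment into settings["extra_headers"].

    Headers already present in settings take precedence over the env var,
    and invalid JSON is silently ignored, matching the PR's behaviour.
    """
    merged = dict(settings)
    env_headers: dict = {}
    raw = os.getenv(env_var)
    if raw:
        try:
            parsed = json.loads(raw)
            if isinstance(parsed, dict):
                env_headers = parsed
        except json.JSONDecodeError:
            pass  # ignored if invalid
    existing = merged.get("extra_headers") or {}
    if env_headers or existing:
        # Explicit settings win over environment-supplied headers.
        merged["extra_headers"] = {**env_headers, **existing}
    return merged
```

A single provider-neutral variable like this would let every LiteLLM-based adapter pick up the same headers without per-provider environment variables or precedence rules spread across the inheritance chain.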
Which issue(s) does this pull-request address?
Closes: #532
Description
Adds support for the OpenAI provider to pass additional headers.
To test also set:
Currently draft - will be testing more
Note also that #527 may update some parameter values/usage to align more closely with typescript version
Tested ok
Checklist
- `yarn lint` or `yarn lint:fix`
- `yarn format` or `yarn format:fix`
- `yarn test:unit`
- `yarn test:e2e`