
feat(adapters): Support additional headers for OpenAI backend #533

Open. Wants to merge 1 commit into base: main.

Conversation

@planetf1 (Contributor) commented Mar 7, 2025:

Which issue(s) does this pull-request address?

Closes: #532

Description

Adds support for the OpenAI provider to accept additional HTTP headers, supplied as a JSON object:

export OPENAI_EXTRA_HEADERS='{"MY_API_KEY": "ab592c9f3e001b"}'

To test, also set:

OPENAI_API_KEY=my-api-key
OPENAI_CHAT_MODEL=my-model
OPENAI_API_BASE=https://my-endpoint/
OPENAI_EXTRA_HEADERS={"MY_API_KEY": "ab592c9f3e001b"}
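The intended behaviour can be illustrated with a short, hypothetical sketch (the env var name is from the PR; the exact decoding shown here is an assumption based on the diff fragments later in this thread):

```python
import json
import os

# Hypothetical illustration of the feature: OPENAI_EXTRA_HEADERS holds a
# JSON object that the adapter decodes into a dict of extra HTTP headers.
os.environ["OPENAI_EXTRA_HEADERS"] = '{"MY_API_KEY": "ab592c9f3e001b"}'

raw = os.getenv("OPENAI_EXTRA_HEADERS")
extra_headers = json.loads(raw) if raw else {}
print(extra_headers)  # {'MY_API_KEY': 'ab592c9f3e001b'}
```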

Currently a draft; more testing to follow.

Note also that #527 may update some parameter values/usage to align more closely with the TypeScript version.

Tested ok

Checklist

  • I have read the contributor guide
  • I have signed off on my commit
  • Linting passes: yarn lint or yarn lint:fix
  • Formatting is applied: yarn format or yarn format:fix
  • Unit tests pass: yarn test:unit
  • E2E tests pass: yarn test:e2e
  • Tests are included
  • Documentation is changed or added
  • Commit messages and PR title follow conventional commits

@planetf1 planetf1 added the python Python related functionality label Mar 7, 2025
@planetf1 planetf1 force-pushed the extraheader branch 3 times, most recently from 84f8989 to 65e8b01 Compare March 10, 2025 14:23
@planetf1 planetf1 marked this pull request as ready for review March 10, 2025 15:17
@planetf1 planetf1 requested a review from a team as a code owner March 10, 2025 15:17
@vabarbosa (Contributor) left a comment:

Thanks... just a couple of comments; your thoughts are appreciated.

@@ -30,6 +31,21 @@ def provider_id(self) -> ProviderName:
def __init__(self, model_id: str | None = None, settings: dict | None = None) -> None:
_settings = settings.copy() if settings is not None else {}

# Extra headers -- ignored if invalid
extra_headers = None
extra_headers_json = _settings.get("extra_headers", os.getenv("OPENAI_EXTRA_HEADERS"))
@vabarbosa (Contributor) commented Mar 10, 2025:
extra_headers is supported across multiple services (example links omitted).

To that end, should this really be OPENAI_EXTRA_HEADERS, and should it live in the OpenAI adapter? I would think we want this in the base LiteLLM adapter so it is applicable across all adapters (and replace the OPENAI_ prefix with something less service-specific).

@planetf1 (Author) replied:

Yes, I'd agree - if we can do this at the LiteLLM level it will be much more flexible.

pass

if extra_headers:
_settings["headers"] = extra_headers
@vabarbosa (Contributor) commented:

LiteLLM also accepts an extra_headers param (screenshot omitted).

So do we have to copy it over to headers? We end up with both headers and extra_headers in the _settings.

@planetf1 (Author) replied:

It probably should be extra_headers, so that we just augment what is already there?

@planetf1 (Author) commented:
I can look at this tomorrow and refactor to make it more generic. (If urgent, it's OK to merge now and update after.)

if isinstance(parsed_headers, dict):
extra_headers = parsed_headers
except json.JSONDecodeError:
pass
A contributor commented:

I would say that more adapters will use it, so I would extract it into a separate file (maybe utils/strings.py).

@planetf1 (Author) commented:
Rather than adding multiple environment variables and managing precedence across inheritance, are we ok to just add a single one?

Previously we had OPENAI_EXTRA_HEADERS (in the above working code)

If this is set at the LiteLLM adapter level it might be LITELLM_EXTRA_HEADERS

It might be more generic in future if we use BEEAI_EXTRA_HEADERS or BEEAI_LLM_EXTRA_HEADERS ... ? But the implementation would then differ, and potentially be incomplete.

My thought is to have a single variable, named BEEAI_LLM_EXTRA_HEADERS, but implemented only in the LiteLLM adapter.
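A hedged sketch of that proposal (the BEEAI_LLM_EXTRA_HEADERS name comes from the comment above; placing the read in a shared LiteLLM adapter is hypothetical):

```python
import json
import os

def read_llm_extra_headers() -> dict:
    """Sketch: one generic env var, read once in the shared LiteLLM adapter,
    so every backend inherits the extra headers."""
    raw = os.getenv("BEEAI_LLM_EXTRA_HEADERS")
    if not raw:
        return {}
    try:
        parsed = json.loads(raw)
        return parsed if isinstance(parsed, dict) else {}
    except json.JSONDecodeError:
        return {}  # ignore invalid JSON, matching the OpenAI-specific code

os.environ["BEEAI_LLM_EXTRA_HEADERS"] = '{"X-Tenant": "acme"}'
print(read_llm_extra_headers())  # {'X-Tenant': 'acme'}
```

A single variable avoids the precedence questions raised above, at the cost of per-provider overrides.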

Labels
python Python related functionality

Successfully merging this pull request may close these issues:
Add support for extra headers to be sent using openai backend

3 participants