Conversation


@AlbertDeFusco AlbertDeFusco commented Oct 24, 2025

Title

Python entry-point for CustomLLM subclasses

Relevant issues

Fixes #7733

Pre-Submission checklist

Please complete all items before asking a LiteLLM maintainer to review your PR

  • I have added testing in the tests/litellm/ directory. Adding at least 1 test is a hard requirement - see details
  • I have added a screenshot of my new test passing locally
  • My PR passes all unit tests on make test-unit
  • My PR's scope is as isolated as possible, it only solves 1 specific problem

Type

🆕 New Feature

Changes

This feature allows 3rd party packages to declare custom LLM providers in their pyproject.toml files using entry-point loading from importlib.metadata. Providers are instantiated and registered at runtime automatically, without having to manually populate litellm.custom_providers_map.

As an example, I may develop a 3rd party Python package called litellm-parrot. In src/litellm_parrot/provider.py I define my subclass of CustomLLM:

# src/litellm_parrot/provider.py
from litellm.llms.custom_llm import CustomLLM

class Parrot(CustomLLM):
    # implementation for Parrot provider
    ...

And then this section in my pyproject.toml file ensures that the provider is registered. (More than one provider can be defined here if desired.)

[project.entry-points.litellm]
parrot = "litellm_parrot.provider:Parrot"
#  ^                                ^
#  |                                |  this specific object is imported from the module 
#  |                                |  and instantiated to become "custom_handler"
#  |
#  | - - this becomes "provider" in the litellm.custom_providers_map entry

When the 3rd party package litellm-parrot is installed in the same environment as litellm, a user need only request inference from the parrot provider:

import litellm

response = litellm.completion(
    model='parrot/pining-model',
    messages=[{"role": "user", "content": "Hello, Polly!!!"}]
)

New test

I've added a test that mocks the importlib.metadata.entry_points function to simulate reading the entry-point configuration from a pyproject.toml file.
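For reference, a mock like that can be built with a synthetic EntryPoint object (a hypothetical sketch, not the PR's actual test code):

```python
# Hypothetical sketch of mocking importlib.metadata.entry_points in a test;
# the PR's actual test may differ.
import importlib.metadata
from importlib.metadata import EntryPoint
from unittest.mock import patch

# A synthetic entry point, as if declared in a package's
# [project.entry-points.litellm] table in pyproject.toml.
fake_ep = EntryPoint(name="parrot", value="litellm_parrot.provider:Parrot", group="litellm")

with patch("importlib.metadata.entry_points", return_value=[fake_ep]):
    # Inside the patch, any code calling entry_points() sees only the fake.
    eps = importlib.metadata.entry_points()
    assert [ep.name for ep in eps] == ["parrot"]
```

This keeps the test hermetic: no 3rd party package needs to be installed for the discovery path to be exercised.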

[Screenshot: new test passing locally, 2025-10-23]

@vercel

vercel bot commented Oct 24, 2025

@AlbertDeFusco is attempting to deploy a commit to the CLERKIEAI Team on Vercel.

A member of the Team first needs to authorize it.

@CLAassistant

CLAassistant commented Oct 24, 2025

CLA assistant check
All committers have signed the CLA.

@AlbertDeFusco
Author

One unit test fails for me, but I don't believe it was caused by my changes:

$ make test-unit
...

>           await handle_bedrock_passthrough_router_model(
                model=model,
                endpoint=endpoint,
                request=mock_request,
                request_body=mock_request_body,
                llm_router=mock_llm_router,
            )
E           TypeError: handle_bedrock_passthrough_router_model() missing 11 required positional arguments: 'user_api_key_dict', 'proxy_logging_obj', 'general_settings', 'proxy_config', 'select_data_generator', 'user_model', 'user_temperature', 'user_request_timeout', 'user_max_tokens', 'user_api_base', and 'version'

tests/test_litellm/proxy/pass_through_endpoints/test_llm_pass_through_endpoints.py:1046: TypeError

...

FAILED tests/test_litellm/proxy/pass_through_endpoints/test_llm_pass_through_endpoints.py::TestBedrockLLMProxyRoute::test_bedrock_error_handling_returns_actual_error - TypeError: handle_bedrock_passthrough_router_model() missing 11 required positional arguments: 'user_api_key_dict', 'pro...
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! stopping after 1 failures !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! xdist.dsession.Interrupted: stopping after 1 failures !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
============================ 1 failed, 943 passed, 11 skipped, 472 warnings in 313.85s (0:05:13) ============================
make: *** [test-unit] Error 2



Development

Successfully merging this pull request may close these issues.

[Feature]: Allow extending litellm.custom_provider_map using entrypoints