
Provide Protocol/Interface to implement custom methods #313

Open
baggiponte opened this issue Jun 8, 2024 · 4 comments
Labels
Feature Request New feature or request

Comments

@baggiponte

Description

Following up on this Twitter thread, I would like to suggest providing a protocol class, e.g. a CustomLLMCall and CustomLLMParams, to arbitrarily extend mirascope to any model.

The use case is this: my client offers models via custom endpoints, and I would like a quick way to integrate inference against them to build a PoC with mirascope.

There are two ways to go about this:

  1. Abstract Base Class (ABC). If I remember correctly, a subclass must implement every abstract method, otherwise instantiating it raises a TypeError at runtime.
  2. Protocols. More Pythonic, and they work with static typing as well: the type checker emits diagnostics if the end user does not implement the appropriate methods (including return types). A rough sketch of this approach follows the list.
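
For concreteness, here is a minimal sketch of the Protocol approach. The `CustomLLMCall` name and method signatures are illustrative only, not an existing mirascope API:

```python
from typing import Iterator, Protocol, runtime_checkable


@runtime_checkable
class CustomLLMCall(Protocol):
    """Illustrative protocol, not an existing mirascope class."""

    def call(self, prompt: str) -> str: ...

    def stream(self, prompt: str) -> Iterator[str]: ...


class MyEndpointCall:
    """No inheritance needed; a static type checker verifies conformance,
    and @runtime_checkable also allows isinstance() checks (these verify
    method presence only, not signatures)."""

    def call(self, prompt: str) -> str:
        return "stubbed response from the custom endpoint"

    def stream(self, prompt: str) -> Iterator[str]:
        yield "stubbed chunk"


def answer(llm: CustomLLMCall, prompt: str) -> str:
    # mypy/pyright flag any argument whose methods don't match the protocol
    return llm.call(prompt)


print(answer(MyEndpointCall(), "hello"))
```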

What do you think? 😊

baggiponte added the Feature Request label Jun 8, 2024
@willbakst
Contributor

willbakst commented Jun 8, 2024

So all of our call classes, e.g. OpenAICall, AnthropicCall, etc., extend BaseCall, which is an ABC that defines all of the expected methods.

While not necessarily trivial, you can extend BaseCall similarly for any CustomLLMCall class -- as you mentioned, you'll just need to implement the abstract methods (otherwise instantiating the class raises a TypeError). You can then take advantage of any of the conveniences provided by BaseCall (and also BasePrompt) while implementing the interface.
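
In outline, that could look something like this (the import path and exact method signatures below are assumptions; check BaseCall in the mirascope source for the real interface):

```python
from mirascope.base import BaseCall  # assumed import path; verify in the source


class CustomLLMCall(BaseCall):
    """Sketch of a custom provider call; fill each method in with your
    SDK/API. Leaving any abstract method unimplemented makes
    instantiation raise a TypeError."""

    prompt_template = "..."

    def call(self, **kwargs):
        ...  # synchronous completion via your endpoint

    async def call_async(self, **kwargs):
        ...  # async counterpart

    def stream(self, **kwargs):
        ...  # yield response chunks

    async def stream_async(self, **kwargs):
        ...  # async streaming counterpart
```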

What I'm curious about here is what exactly the "custom endpoints" look like in terms of usage. Is it simply a raw API? Or is there an SDK that you're working with?

@baggiponte
Author

> What I'm curious about here is what exactly the "custom endpoints" look like in terms of usage. Is it simply a raw API? Or is there an SDK that you're working with?

In my case, it can be both! We maintain an internal Python SDK to interact with the APIs. There's an integrations module that ships classes interoperable with LangChain and LlamaIndex (see here and here).

@willbakst
Contributor

Yeah, taking a look at the links you provided, I think this would just be implementing your own CustomLLM(BaseCall) class that uses your SDK/API for the call and stream methods (and their async counterparts).
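
Assuming the CustomLLMCall sketch above and mirascope's usual prompt_template/template-variable convention, usage could then mirror the built-in call classes:

```python
# Hypothetical usage built on the CustomLLMCall sketch above; the
# prompt_template/field convention mirrors mirascope's built-in calls.
class BookRecommender(CustomLLMCall):
    prompt_template = "Recommend a {genre} book."

    genre: str


response = BookRecommender(genre="fantasy").call()
print(response)
```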

@baggiponte
Author

Cool! I'll try that when I have time and contribute a small tutorial to the docs, if you like the idea 😊
