Provide Protocol/Interface to implement custom methods #313
Comments
So all of our call classes, e.g. …, extend …. While not necessarily trivial, you can extend … yourself. What I'm curious about here is what exactly the "custom endpoints" look like in terms of usage. Is it simply a raw API? Or is there an SDK that you're working with?
In my case, it can be both! We maintain an internal Python SDK to interact with the APIs. There's an …
Yeah, taking a look at the links you provided, I think this would just be implementing your …
Cool! Will try that when I have time and contribute a small tutorial in the docs, if you like the idea 😊
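For anyone following along, the subclassing route discussed above could look roughly like the sketch below. Note that `BaseCall`, `InternalEndpointCall`, and the endpoint URL are stand-ins invented for illustration, not mirascope's real interface:

```python
from abc import ABC, abstractmethod


class BaseCall(ABC):
    """Simplified stand-in for a library base call class (illustrative only)."""

    @abstractmethod
    def call(self, prompt: str) -> str:
        """Run inference and return the completion text."""


class InternalEndpointCall(BaseCall):
    """Extends the base class to target a hypothetical internal endpoint."""

    endpoint = "https://llm.internal.example/v1/complete"  # placeholder URL

    def call(self, prompt: str) -> str:
        # A real implementation would send `prompt` to `self.endpoint`
        # via the internal SDK; here we just echo for demonstration.
        return f"completion for: {prompt}"


print(InternalEndpointCall().call("ping"))
```

The point of the abstract base is that every provider-specific (or custom-endpoint) call class shares one `call` signature, so downstream code doesn't care which backend it talks to.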
Description
Following up on this Twitter thread, I would like to suggest providing protocol classes, e.g. a `CustomLLMCall` and `CustomLLMParams`, to arbitrarily extend mirascope to any model.

The use case for this: my client offers models via custom endpoints, and I would like a quick way to integrate inference with those to build a PoC with mirascope.
There are two ways to go about this:
What do you think? 😊
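A minimal sketch of what such protocol classes might look like, using `typing.Protocol` for structural (duck-typed) checking. All names, fields, and method signatures here are hypothetical illustrations for this proposal, not part of mirascope's actual API:

```python
from typing import Optional, Protocol, runtime_checkable


class CustomLLMParams:
    """Hypothetical parameter container; fields are placeholders."""

    def __init__(self, model: str = "my-internal-model", temperature: float = 0.7):
        self.model = model
        self.temperature = temperature


@runtime_checkable
class CustomLLMCall(Protocol):
    """Hypothetical protocol a user-supplied call class would satisfy."""

    def call(self, prompt: str) -> str:
        """Run inference against the custom endpoint and return the completion."""
        ...


class InternalSDKCall:
    """Satisfies CustomLLMCall structurally -- no inheritance required."""

    def __init__(self, params: Optional[CustomLLMParams] = None):
        self.params = params or CustomLLMParams()

    def call(self, prompt: str) -> str:
        # A real implementation would delegate to the internal SDK here.
        return f"[{self.params.model}] echo: {prompt}"


impl = InternalSDKCall()
assert isinstance(impl, CustomLLMCall)  # structural typing check
print(impl.call("hello"))
```

The appeal of a `Protocol` over an abstract base class is that existing SDK wrappers can conform without importing or inheriting anything from mirascope.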