Feature Description

Support Cerebras.ai as a provider. Their inference is extremely fast and cheap, which would unlock a lot of new applications in our use of the Vercel AI SDK.

https://inference.cerebras.ai/

And their API is largely OpenAI-compatible: https://inference-docs.cerebras.ai/openai
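For concreteness, here is a rough sketch of what that compatibility looks like using the plain OpenAI Node client pointed at their endpoint. The base URL, model id, and CEREBRAS_API_KEY env var below are my assumptions for illustration, not taken from this issue; check their docs for the real values.

```ts
// Sketch only: calling Cerebras through the standard OpenAI Node client,
// relying purely on their OpenAI-compatible API surface.
import OpenAI from 'openai';

const client = new OpenAI({
  apiKey: process.env.CEREBRAS_API_KEY, // assumed env var name
  baseURL: 'https://api.cerebras.ai/v1', // assumed OpenAI-compatible endpoint
});

const completion = await client.chat.completions.create({
  model: 'llama3.1-8b', // assumed model id
  messages: [{ role: 'user', content: 'Say hello in one sentence.' }],
});

console.log(completion.choices[0].message.content);
```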
Use Cases

General inference and function calling in use cases that require high throughput.

Additional context

No response
If they are OpenAI-compatible, you can already use them with the AI SDK: https://sdk.vercel.ai/providers/openai-compatible-providers#setup
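Something along these lines should work with the openai-compatible provider; the base URL and model id are assumptions on my part, so double-check the Cerebras docs for the actual values.

```ts
// Sketch of the suggestion above: wiring Cerebras into the Vercel AI SDK
// via the generic openai-compatible provider, no dedicated package needed.
import { createOpenAICompatible } from '@ai-sdk/openai-compatible';
import { generateText } from 'ai';

const cerebras = createOpenAICompatible({
  name: 'cerebras',
  baseURL: 'https://api.cerebras.ai/v1', // assumed endpoint
  apiKey: process.env.CEREBRAS_API_KEY, // assumed env var name
});

const { text } = await generateText({
  model: cerebras('llama3.1-70b'), // assumed model id
  prompt: 'Summarize why low-latency inference matters for agent workloads.',
});

console.log(text);
```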