What Would You Like to See with the Gateway?
Support for provider batch APIs, including a route to retrieve a batch's output, something like /v1/batches/:batchId/output.
Requirements:
- This will require a considerable amount of changes in the gateway, so for v1, support the following providers: OpenAI, Bedrock, and Cohere.
- Bedrock requires uploading to S3 in chunks (multipart upload), since streaming a file to S3 without knowing its Content-Length is not allowed; please make provisions for that.
- File uploads should not increase memory consumption in worker environments: even when transforming files, only 100 KB chunks should be read at a time.
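The chunked-upload requirement has one subtlety worth noting: S3 multipart uploads require every part except the last to be at least 5 MiB, so the 100 KB reads from the request body must be accumulated into larger parts before each UploadPart call. A minimal sketch of that accumulator (names like `accumulateParts` are assumptions, not gateway code; the surrounding AWS SDK calls are omitted):

```typescript
// Buffer small streamed reads into S3-sized parts without ever holding the
// whole file in memory. The source is read in small (e.g. 100 KiB) chunks;
// parts are yielded once they reach the S3 minimum part size. Only the
// current part's buffer is resident at any time.

const MIN_PART_SIZE = 5 * 1024 * 1024; // S3 minimum part size (all but the last part)

function concat(chunks: Uint8Array[], total: number): Uint8Array {
  const out = new Uint8Array(total);
  let offset = 0;
  for (const c of chunks) { out.set(c, offset); offset += c.length; }
  return out;
}

async function* accumulateParts(
  source: AsyncIterable<Uint8Array>,
  minPartSize: number = MIN_PART_SIZE,
): AsyncGenerator<Uint8Array> {
  let pending: Uint8Array[] = [];
  let pendingBytes = 0;
  for await (const chunk of source) {
    pending.push(chunk);
    pendingBytes += chunk.length;
    if (pendingBytes >= minPartSize) {
      yield concat(pending, pendingBytes);
      pending = [];
      pendingBytes = 0;
    }
  }
  // The final part may be smaller than the minimum; S3 allows that.
  if (pendingBytes > 0) yield concat(pending, pendingBytes);
}
```

Each yielded part would then be sent with an UploadPart request (and finished with CompleteMultipartUpload); since every part's Content-Length is known when it is sent, this sidesteps the restriction on streaming a body of unknown total length to S3.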
Context for your Request
No response
Your Twitter/LinkedIn
No response
Hi @narengogi, when do you think this feature will be available?
I'm really looking forward to it :)
Also, will Google's Vertex AI and Azure OpenAI's batch APIs be supported?