Anthropic batch mode not available #225

Open
RyanMarten opened this issue Dec 7, 2024 · 6 comments
RyanMarten self-assigned this Dec 7, 2024
@RyanMarten (Contributor, Author) commented:

Example (from the batch console welcome page)

import anthropic

client = anthropic.Anthropic()

message_batch = client.beta.messages.batches.create(
    requests=[
        {
            "custom_id": "first-prompt-in-my-batch",
            "params": {
                "model": "claude-3-5-haiku-20241022",
                "max_tokens": 100,
                "messages": [
                    {
                        "role": "user",
                        "content": "Hey Claude, tell me a short fun fact about video games!",
                    }
                ],
            },
        },
        {
            "custom_id": "second-prompt-in-my-batch",
            "params": {
                "model": "claude-3-5-sonnet-20241022",
                "max_tokens": 100,
                "messages": [
                    {
                        "role": "user",
                        "content": "Hey Claude, tell me a short fun fact about bees!",
                    }
                ],
            },
        },
    ]
)
print(message_batch)

Stdout

BetaMessageBatch(id='msgbatch_01XWYEcAqybHAWXqyinUyp8K', archived_at=None, cancel_initiated_at=None, created_at=datetime.datetime(2024, 12, 10, 21, 30, 23, 225753, tzinfo=datetime.timezone.utc), ended_at=None, expires_at=datetime.datetime(2024, 12, 11, 21, 30, 23, 225753, tzinfo=datetime.timezone.utc), processing_status='in_progress', request_counts=BetaMessageBatchRequestCounts(canceled=0, errored=0, expired=0, processing=2, succeeded=0), results_url=None, type='message_batch')

Batch Output

{"custom_id":"first-prompt-in-my-batch","result":{"type":"succeeded","message":{"id":"msg_014KfxurNm3n65CGkqUNTkCk","type":"message","role":"assistant","model":"claude-3-5-haiku-20241022","content":[{"type":"text","text":"Here's a fun video game fact: The first video game Easter egg was hidden in the Atari 2600 game Adventure in 1979. Created by programmer Warren Robinett, it was a hidden room with his name that players could only access through a secret sequence of actions."}],"stop_reason":"end_turn","stop_sequence":null,"usage":{"input_tokens":20,"output_tokens":64}}}}
{"custom_id":"second-prompt-in-my-batch","result":{"type":"succeeded","message":{"id":"msg_01DLmwptRqXuVMJsdzgR4Ntp","type":"message","role":"assistant","model":"claude-3-5-sonnet-20241022","content":[{"type":"text","text":"Here's a fun fact: Bees can recognize human faces! Scientists have discovered that honey bees can be trained to remember and distinguish between different human facial features, despite having a brain about the size of a grass seed. They do this using a technique called \"configural processing\" - the same way humans process faces!"}],"stop_reason":"end_turn","stop_sequence":null,"usage":{"input_tokens":20,"output_tokens":69}}}}

@RyanMarten (Contributor, Author) commented Dec 10, 2024:

Details / meaningful differences from OpenAI

https://docs.anthropic.com/en/docs/build-with-claude/message-batches

Different limits

A Message Batch is limited to either 10,000 Message requests or 32 MB in size, whichever is reached first.

List instead of file content

Each request in the list has:
A unique custom_id for identifying the Messages request
A params object with the standard Messages API parameters
You can create a batch by passing this list into the requests parameter (as in the example above).

Different batch statuses

When a batch is first created, the response will have a processing_status of in_progress; it is updated to ended once all the requests in the batch have finished processing and results are ready.
Possible statuses: in_progress, canceling, ended

Different request statuses

Once batch processing has ended, each Messages request in the batch will have a result. There are 4 result types: succeeded, errored, canceled, and expired. The batch's request_counts field shows how many requests reached each of these four states.

Recommend streaming finished requests instead of downloading all of them

Results of the batch are available for download both in the Console and at the results_url on the Message Batch. Because of the potentially large size of the results, it’s recommended to stream results back rather than download them all at once.
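A minimal sketch of that streaming pattern with the Python SDK, assuming the beta results() helper (the batch id is a placeholder, and results() only works once processing has ended):

import anthropic

client = anthropic.Anthropic()

# Stream decoded results one at a time instead of downloading the whole file.
# "MESSAGE_BATCH_ID" is a placeholder for a real id, e.g. "msgbatch_01XWYE...".
for entry in client.beta.messages.batches.results("MESSAGE_BATCH_ID"):
    if entry.result.type == "succeeded":
        print(entry.custom_id, entry.result.message.content[0].text)
    elif entry.result.type == "errored":
        print(entry.custom_id, "error:", entry.result.error)
    else:
        # "canceled" and "expired" requests carry no message body.
        print(entry.custom_id, entry.result.type)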

Different errors

If your result has an error, its result.error will be set to our standard error shape.

@RyanMarten (Contributor, Author) commented Dec 10, 2024:

Examples: https://docs.anthropic.com/en/api/messages-batch-examples

The polling example shows an interval of 60 seconds; a sketch of that loop is below.
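A sketch of that polling loop (placeholder batch id; uses the beta retrieve() helper):

import time

import anthropic

client = anthropic.Anthropic()

# Poll every 60 seconds until the batch has finished processing.
# "MESSAGE_BATCH_ID" is a placeholder for a real batch id.
while True:
    message_batch = client.beta.messages.batches.retrieve("MESSAGE_BATCH_ID")
    if message_batch.processing_status == "ended":
        break
    print(f"Batch {message_batch.id} is still processing...")
    time.sleep(60)

print(message_batch.request_counts)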

When retrieving results: not sure if we want to use streaming or not...

Cancelling a batch
Cancelled batches also have partial results

Immediately after cancellation, a batch’s processing_status will be canceling. You can use the same polling for batch completion technique to poll for when cancellation is finalized as canceled batches also end up ended and may contain results.
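A sketch of cancel-then-poll, assuming the beta cancel() and retrieve() helpers (placeholder batch id):

import time

import anthropic

client = anthropic.Anthropic()

# Request cancellation; processing_status switches to "canceling" right away.
message_batch = client.beta.messages.batches.cancel("MESSAGE_BATCH_ID")
print(message_batch.processing_status)  # "canceling"

# Canceled batches still finish as "ended" and may contain partial results.
while message_batch.processing_status != "ended":
    time.sleep(60)
    message_batch = client.beta.messages.batches.retrieve("MESSAGE_BATCH_ID")

print(message_batch.request_counts)  # includes a canceled count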

@RyanMarten (Contributor, Author) commented:

API Reference, notable differences from OpenAI
https://docs.anthropic.com/en/api/creating-message-batches

System prompt is a parameter not a message

Note that if you want to include a system prompt, you can use the top-level system parameter — there is no "system" role for input messages in the Messages API.
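So in a batch request the system prompt sits next to messages inside params, e.g. (minimal sketch, reusing the model from the example above):

request = {
    "custom_id": "prompt-with-system",
    "params": {
        "model": "claude-3-5-haiku-20241022",
        "max_tokens": 100,
        # Top-level system parameter; there is no {"role": "system"} message.
        "system": "You are a concise assistant.",
        "messages": [
            {"role": "user", "content": "Hey Claude, tell me a short fun fact about bees!"}
        ],
    },
}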

How is structured output done? Through tool use? @CharlieJCJ will provide the details based on the litellm work
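For reference, one possible shape for tool-use-based structured output (not confirmed in this thread; the tool name and schema below are made up): force a single tool call whose input_schema is the desired output schema.

params = {
    "model": "claude-3-5-sonnet-20241022",
    "max_tokens": 200,
    # Hypothetical tool whose input_schema doubles as the output schema.
    "tools": [
        {
            "name": "record_fact",
            "description": "Record a structured fun fact.",
            "input_schema": {
                "type": "object",
                "properties": {
                    "topic": {"type": "string"},
                    "fact": {"type": "string"},
                },
                "required": ["topic", "fact"],
            },
        }
    ],
    # Force the model to answer via the tool so the reply is structured JSON.
    "tool_choice": {"type": "tool", "name": "record_fact"},
    "messages": [
        {"role": "user", "content": "Tell me a fun fact about bees."}
    ],
}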

@RyanMarten (Contributor, Author) commented:

We can't store metadata in the batch, so we will need to store a map of request_file to batch_id
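For example, something as simple as an append-only sidecar file would do (the file name and helpers here are hypothetical):

import json

BATCH_MAP_PATH = "batch_map.jsonl"  # hypothetical sidecar file

def record_batch(request_file: str, batch_id: str) -> None:
    # Append one request_file -> batch_id mapping at submission time.
    with open(BATCH_MAP_PATH, "a") as f:
        f.write(json.dumps({"request_file": request_file, "batch_id": batch_id}) + "\n")

def load_batch_map() -> dict:
    # Rebuild the request_file -> batch_id map when polling or downloading results.
    mapping = {}
    with open(BATCH_MAP_PATH) as f:
        for line in f:
            entry = json.loads(line)
            mapping[entry["request_file"]] = entry["batch_id"]
    return mapping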

@RyanMarten (Contributor, Author) commented:

They just increased the limits significantly for batch:

100,000 Message requests or 256 MB

https://docs.anthropic.com/en/docs/build-with-claude/message-batches#batch-limitations
