Commit

feat(api)!: messages is generally available (#287)
This is a breaking change: the `beta` namespace has been removed from
the Messages API. To migrate, remove all `.beta` references; everything
else stays the same.
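The migration is mechanical. A minimal before/after sketch; the `MessagesClient` type below is a hypothetical structural stand-in so the snippet is self-contained, and in real code `client` is simply a `new Anthropic()` instance:

```typescript
// Hypothetical structural stand-in for the client, so this sketch is
// self-contained; in real code `client` is a `new Anthropic()` instance.
type MessagesClient = {
  messages: { create: (params: object) => Promise<{ content: unknown }> };
};

// Before: `await client.beta.messages.create({ ... })`
// After:  drop `.beta`; request params and response shape are unchanged.
async function migrated(client: MessagesClient) {
  const message = await client.messages.create({
    model: 'claude-2.1',
    max_tokens: 1024,
    messages: [{ role: 'user', content: 'Hello, Claude' }],
  });
  return message.content;
}

// Deep imports move too:
//   '@anthropic-ai/sdk/resources/beta/messages' -> '@anthropic-ai/sdk/resources/messages'
```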
stainless-bot committed Feb 13, 2024
1 parent 19a1451 commit 57b7135
Showing 16 changed files with 121 additions and 164 deletions.
110 changes: 48 additions & 62 deletions README.md
@@ -25,15 +25,17 @@ The full API of this library can be found in [api.md](api.md).
import Anthropic from '@anthropic-ai/sdk';

const anthropic = new Anthropic({
apiKey: 'my api key', // defaults to process.env["ANTHROPIC_API_KEY"]
apiKey: process.env['ANTHROPIC_API_KEY'], // This is the default and can be omitted
});

async function main() {
const completion = await anthropic.completions.create({
const message = await anthropic.messages.create({
max_tokens: 1024,
messages: [{ role: 'user', content: 'How does a court case get to the supreme court?' }],
model: 'claude-2.1',
max_tokens_to_sample: 300,
prompt: `${Anthropic.HUMAN_PROMPT} how does a court case get to the Supreme Court?${Anthropic.AI_PROMPT}`,
});

console.log(message.content);
}

main();
@@ -48,14 +50,14 @@ import Anthropic from '@anthropic-ai/sdk';

const anthropic = new Anthropic();

const stream = await anthropic.completions.create({
prompt: `${Anthropic.HUMAN_PROMPT} Your prompt here${Anthropic.AI_PROMPT}`,
const stream = await anthropic.messages.create({
max_tokens: 1024,
messages: [{ role: 'user', content: 'your prompt here' }],
model: 'claude-2.1',
stream: true,
max_tokens_to_sample: 300,
});
for await (const completion of stream) {
console.log(completion.completion);
for await (const messageStreamEvent of stream) {
console.log(messageStreamEvent.type);
}
```

@@ -71,16 +73,16 @@ This library includes TypeScript definitions for all request params and response
import Anthropic from '@anthropic-ai/sdk';

const anthropic = new Anthropic({
apiKey: 'my api key', // defaults to process.env["ANTHROPIC_API_KEY"]
apiKey: process.env['ANTHROPIC_API_KEY'], // This is the default and can be omitted
});

async function main() {
const params: Anthropic.CompletionCreateParams = {
prompt: `${Anthropic.HUMAN_PROMPT} how does a court case get to the Supreme Court?${Anthropic.AI_PROMPT}`,
max_tokens_to_sample: 300,
const params: Anthropic.MessageCreateParams = {
max_tokens: 1024,
messages: [{ role: 'user', content: 'Where can I get a good coffee in my neighbourhood?' }],
model: 'claude-2.1',
};
const completion: Anthropic.Completion = await anthropic.completions.create(params);
const message: Anthropic.Message = await anthropic.messages.create(params);
}

main();
@@ -104,7 +106,7 @@ import Anthropic from '@anthropic-ai/sdk';
const anthropic = new Anthropic();

async function main() {
const stream = anthropic.beta.messages
const stream = anthropic.messages
.stream({
model: 'claude-2.1',
max_tokens: 1024,
@@ -126,9 +128,9 @@ async function main() {
main();
```

Streaming with `client.beta.messages.stream(...)` exposes [various helpers for your convenience](helpers.md) including event handlers and accumulation.
Streaming with `client.messages.stream(...)` exposes [various helpers for your convenience](helpers.md) including event handlers and accumulation.

Alternatively, you can use `client.beta.messages.create({ ..., stream: true })` which only returns an async iterable of the events in the stream and thus uses less memory (it does not build up a final message object for you).
Alternatively, you can use `client.messages.create({ ..., stream: true })` which only returns an async iterable of the events in the stream and thus uses less memory (it does not build up a final message object for you).
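Conceptually, the raw event stream from `create({ ..., stream: true })` carries typed events that the `.stream()` helper folds into a final message for you. A simplified sketch of that accumulation; the local `StreamEvent` type here is a pared-down stand-in, not the SDK's full `MessageStreamEvent` union:

```typescript
// Simplified stand-ins for a few Messages streaming event shapes.
type StreamEvent =
  | { type: 'message_start' }
  | { type: 'content_block_delta'; delta: { type: 'text_delta'; text: string } }
  | { type: 'message_stop' };

// Accumulate text deltas roughly the way the `.stream()` helper does
// internally (conceptually; the real helper tracks much more state).
function accumulateText(events: Iterable<StreamEvent>): string {
  let text = '';
  for (const event of events) {
    if (event.type === 'content_block_delta' && event.delta.type === 'text_delta') {
      text += event.delta.text;
    }
  }
  return text;
}

// Canned events standing in for a real stream:
const events: StreamEvent[] = [
  { type: 'message_start' },
  { type: 'content_block_delta', delta: { type: 'text_delta', text: 'Hello' } },
  { type: 'content_block_delta', delta: { type: 'text_delta', text: ', world' } },
  { type: 'message_stop' },
];
console.log(accumulateText(events)); // -> 'Hello, world'
```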

## Handling errors

@@ -139,10 +141,10 @@ a subclass of `APIError` will be thrown:
<!-- prettier-ignore -->
```ts
async function main() {
const completion = await anthropic.completions
const message = await anthropic.messages
.create({
prompt: `${Anthropic.HUMAN_PROMPT} Your prompt here${Anthropic.AI_PROMPT}`,
max_tokens_to_sample: 300,
max_tokens: 1024,
messages: [{ role: 'user', content: 'your prompt here' }],
model: 'claude-2.1',
})
.catch((err) => {
@@ -188,16 +190,9 @@ const anthropic = new Anthropic({
});

// Or, configure per-request:
await anthropic.completions.create(
{
prompt: `${Anthropic.HUMAN_PROMPT} Can you help me effectively ask for a raise at work?${Anthropic.AI_PROMPT}`,
max_tokens_to_sample: 300,
model: 'claude-2.1',
},
{
maxRetries: 5,
},
);
await anthropic.messages.create({ max_tokens: 1024, messages: [{ role: 'user', content: 'Can you help me effectively ask for a raise at work?' }], model: 'claude-2.1' }, {
maxRetries: 5,
});
```

### Timeouts
@@ -212,16 +207,9 @@ const anthropic = new Anthropic({
});

// Override per-request:
await anthropic.completions.create(
{
prompt: `${Anthropic.HUMAN_PROMPT} Where can I get a good coffee in my neighbourhood?${Anthropic.AI_PROMPT}`,
max_tokens_to_sample: 300,
model: 'claude-2.1',
},
{
timeout: 5 * 1000,
},
);
await anthropic.messages.create({ max_tokens: 1024, messages: [{ role: 'user', content: 'Where can I get a good coffee in my neighbourhood?' }], model: 'claude-2.1' }, {
timeout: 5 * 1000,
});
```

On timeout, an `APIConnectionTimeoutError` is thrown.
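A sketch of branching on that failure mode. The `APIConnectionTimeoutError` class below is a local stand-in so the snippet is self-contained; with the SDK installed you would check against the SDK's own error class instead:

```typescript
// Local stand-in for the SDK's timeout error class (illustration only).
class APIConnectionTimeoutError extends Error {}

// Distinguish timeouts from other failures, e.g. to retry or fall back.
// With the real SDK, prefer `err instanceof Anthropic.APIConnectionTimeoutError`.
function isTimeoutError(err: unknown): boolean {
  return err instanceof Error && err.constructor.name === 'APIConnectionTimeoutError';
}

console.log(isTimeoutError(new APIConnectionTimeoutError('timed out'))); // true
console.log(isTimeoutError(new Error('connection refused')));            // false
```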
@@ -241,11 +229,11 @@ import Anthropic from '@anthropic-ai/sdk';

const anthropic = new Anthropic();

const completion = await anthropic.completions.create(
const message = await anthropic.messages.create(
{
max_tokens_to_sample: 300,
max_tokens: 1024,
messages: [{ role: 'user', content: 'Where can I get a good coffee in my neighbourhood?' }],
model: 'claude-2.1',
prompt: `${Anthropic.HUMAN_PROMPT} Where can I get a good coffee in my neighbourhood?${Anthropic.AI_PROMPT}`,
},
{ headers: { 'anthropic-version': 'My-Custom-Value' } },
);
@@ -263,19 +251,25 @@ You can also use the `.withResponse()` method to get the raw `Response` along wi
```ts
const anthropic = new Anthropic();

const response = await anthropic.completions
const response = await anthropic.messages
.create({
prompt: `${Anthropic.HUMAN_PROMPT} Can you help me effectively ask for a raise at work?${Anthropic.AI_PROMPT}`,
max_tokens_to_sample: 300,
max_tokens: 1024,
messages: [{ role: 'user', content: 'Where can I get a good coffee in my neighbourhood?' }],
model: 'claude-2.1',
})
.asResponse();
console.log(response.headers.get('X-My-Header'));
console.log(response.raw.statusText); // access the underlying Response object
console.log(response.statusText); // access the underlying Response object

// parses the response body, returning an object if the API responds with JSON
const completion: Completions.Completion = await response.parse();
console.log(completion.completion);
const { data: message, response: raw } = await anthropic.messages
.create({
max_tokens: 1024,
messages: [{ role: 'user', content: 'Where can I get a good coffee in my neighbourhood?' }],
model: 'claude-2.1',
})
.withResponse();
console.log(raw.headers.get('X-My-Header'));
console.log(message.content);
```

## Customizing the fetch client
@@ -325,7 +319,6 @@ If you would like to disable or customize this behavior, for example to use the
<!-- prettier-ignore -->
```ts
import http from 'http';
import Anthropic from '@anthropic-ai/sdk';
import HttpsProxyAgent from 'https-proxy-agent';

// Configure the default for all requests:
Expand All @@ -334,17 +327,10 @@ const anthropic = new Anthropic({
});

// Override per-request:
await anthropic.completions.create(
{
prompt: `${Anthropic.HUMAN_PROMPT} How does a court case get to the Supreme Court?${Anthropic.AI_PROMPT}`,
max_tokens_to_sample: 300,
model: 'claude-2.1',
},
{
baseURL: 'http://localhost:8080/test-api',
httpAgent: new http.Agent({ keepAlive: false }),
},
);
await anthropic.messages.create({ max_tokens: 1024, messages: [{ role: 'user', content: 'Where can I get a good coffee in my neighbourhood?' }], model: 'claude-2.1' }, {
baseURL: 'http://localhost:8080/test-api',
httpAgent: new http.Agent({ keepAlive: false }),
});
```

## Semantic Versioning
36 changes: 17 additions & 19 deletions api.md
@@ -10,28 +10,26 @@ Methods:

- <code title="post /v1/complete">client.completions.<a href="./src/resources/completions.ts">create</a>({ ...params }) -> Completion</code>

# Beta

## Messages
# Messages

Types:

- <code><a href="./src/resources/beta/messages.ts">ContentBlock</a></code>
- <code><a href="./src/resources/beta/messages.ts">ContentBlockDeltaEvent</a></code>
- <code><a href="./src/resources/beta/messages.ts">ContentBlockStartEvent</a></code>
- <code><a href="./src/resources/beta/messages.ts">ContentBlockStopEvent</a></code>
- <code><a href="./src/resources/beta/messages.ts">Message</a></code>
- <code><a href="./src/resources/beta/messages.ts">MessageDeltaEvent</a></code>
- <code><a href="./src/resources/beta/messages.ts">MessageDeltaUsage</a></code>
- <code><a href="./src/resources/beta/messages.ts">MessageParam</a></code>
- <code><a href="./src/resources/beta/messages.ts">MessageStartEvent</a></code>
- <code><a href="./src/resources/beta/messages.ts">MessageStopEvent</a></code>
- <code><a href="./src/resources/beta/messages.ts">MessageStreamEvent</a></code>
- <code><a href="./src/resources/beta/messages.ts">TextBlock</a></code>
- <code><a href="./src/resources/beta/messages.ts">TextDelta</a></code>
- <code><a href="./src/resources/beta/messages.ts">Usage</a></code>
- <code><a href="./src/resources/messages.ts">ContentBlock</a></code>
- <code><a href="./src/resources/messages.ts">ContentBlockDeltaEvent</a></code>
- <code><a href="./src/resources/messages.ts">ContentBlockStartEvent</a></code>
- <code><a href="./src/resources/messages.ts">ContentBlockStopEvent</a></code>
- <code><a href="./src/resources/messages.ts">Message</a></code>
- <code><a href="./src/resources/messages.ts">MessageDeltaEvent</a></code>
- <code><a href="./src/resources/messages.ts">MessageDeltaUsage</a></code>
- <code><a href="./src/resources/messages.ts">MessageParam</a></code>
- <code><a href="./src/resources/messages.ts">MessageStartEvent</a></code>
- <code><a href="./src/resources/messages.ts">MessageStopEvent</a></code>
- <code><a href="./src/resources/messages.ts">MessageStreamEvent</a></code>
- <code><a href="./src/resources/messages.ts">TextBlock</a></code>
- <code><a href="./src/resources/messages.ts">TextDelta</a></code>
- <code><a href="./src/resources/messages.ts">Usage</a></code>

Methods:

- <code title="post /v1/messages">client.beta.messages.<a href="./src/resources/beta/messages.ts">create</a>({ ...params }) -> Message</code>
- <code>client.beta.messages.<a href="./src/resources/beta/messages.ts">stream</a>(body, options?) -> MessageStream</code>
- <code title="post /v1/messages">client.messages.<a href="./src/resources/messages.ts">create</a>({ ...params }) -> Message</code>
- <code>client.messages.<a href="./src/resources/messages.ts">stream</a>(body, options?) -> MessageStream</code>
2 changes: 1 addition & 1 deletion examples/streaming.ts
@@ -5,7 +5,7 @@ import Anthropic from '@anthropic-ai/sdk';
const client = new Anthropic(); // gets API Key from environment variable ANTHROPIC_API_KEY

async function main() {
const stream = client.beta.messages
const stream = client.messages
.stream({
messages: [
{
6 changes: 3 additions & 3 deletions helpers.md
@@ -3,14 +3,14 @@
## Streaming Responses

```ts
anthropic.beta.messages.stream({ … }, options?): MessageStream
anthropic.messages.stream({ … }, options?): MessageStream
```
`anthropic.beta.messages.stream()` returns a `MessageStream`, which emits events, has an async
`anthropic.messages.stream()` returns a `MessageStream`, which emits events, has an async
iterator, and exposes helper methods to accumulate stream events into a convenient shape and make it easy to reason
about the conversation.
Alternatively, you can use `anthropic.beta.messages.create({ stream: true, … })` which returns an async
Alternatively, you can use `anthropic.messages.create({ stream: true, … })` which returns an async
iterable of the chunks in the stream and uses less memory (most notably, it does not accumulate a message
object for you).
2 changes: 1 addition & 1 deletion packages/vertex-sdk/README.md
@@ -28,7 +28,7 @@ import { AnthropicVertex } from '@anthropic-ai/vertex-sdk';
const client = new AnthropicVertex();

async function main() {
const result = await client.beta.messages.create({
const result = await client.messages.create({
messages: [
{
role: 'user',
2 changes: 1 addition & 1 deletion packages/vertex-sdk/examples/vertex.ts
@@ -7,7 +7,7 @@ import { AnthropicVertex } from '@anthropic-ai/vertex-sdk';
const client = new AnthropicVertex();

async function main() {
const result = await client.beta.messages.create({
const result = await client.messages.create({
messages: [
{
role: 'user',
@@ -4,7 +4,7 @@ const pkgJson = require('../dist/package.json');
for (const dep in pkgJson.dependencies) {
// ensure we point to NPM instead of a local directory
if (dep === '@anthropic-ai/sdk') {
pkgJson.dependencies[dep] = '^0';
pkgJson.dependencies[dep] = '^0.14';
}
}
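The tightened pin above works because of npm's caret-range semantics: for 0.x versions the minor number is the breaking-change boundary, so `^0` admits any 0.y.z while `^0.14` admits only 0.14.z. A rough self-contained sketch of that rule (the authoritative implementation is node-semver):

```typescript
// Simplified check for caret ranges on 0.x versions (real rules: node-semver).
// For 0.x, the leftmost non-zero component is the breaking-change boundary:
// `^0` accepts any 0.y.z, while `^0.14` accepts only 0.14.z.
function caretAllowsZeroX(range: string, version: string): boolean {
  const parts = range.slice(1).split('.').map(Number); // '^0.14' -> [0, 14]
  const v = version.split('.').map(Number);
  if (v[0] !== 0 || parts[0] !== 0) return false; // sketch handles 0.x only
  return parts.length < 2 || v[1] === parts[1];
}

console.log(caretAllowsZeroX('^0', '0.20.1'));    // true
console.log(caretAllowsZeroX('^0.14', '0.20.1')); // false
console.log(caretAllowsZeroX('^0.14', '0.14.3')); // true
```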

2 changes: 1 addition & 1 deletion packages/vertex-sdk/src/client.ts
@@ -70,7 +70,7 @@ export class AnthropicVertex extends Core.APIClient {
this._authClientPromise = this._auth.getClient();
}

beta: Resources.Beta = new Resources.Beta(this);
messages: Resources.Messages = new Resources.Messages(this);

protected override defaultQuery(): Core.DefaultQuery | undefined {
return this._options.defaultQuery;
22 changes: 20 additions & 2 deletions src/index.ts
@@ -121,7 +121,7 @@ export class Anthropic extends Core.APIClient {
}

completions: API.Completions = new API.Completions(this);
beta: API.Beta = new API.Beta(this);
messages: API.Messages = new API.Messages(this);

protected override defaultQuery(): Core.DefaultQuery | undefined {
return this._options.defaultQuery;
@@ -236,7 +236,25 @@ export namespace Anthropic {
export import CompletionCreateParamsNonStreaming = API.CompletionCreateParamsNonStreaming;
export import CompletionCreateParamsStreaming = API.CompletionCreateParamsStreaming;

export import Beta = API.Beta;
export import Messages = API.Messages;
export import ContentBlock = API.ContentBlock;
export import ContentBlockDeltaEvent = API.ContentBlockDeltaEvent;
export import ContentBlockStartEvent = API.ContentBlockStartEvent;
export import ContentBlockStopEvent = API.ContentBlockStopEvent;
export import Message = API.Message;
export import MessageDeltaEvent = API.MessageDeltaEvent;
export import MessageDeltaUsage = API.MessageDeltaUsage;
export import MessageParam = API.MessageParam;
export import MessageStartEvent = API.MessageStartEvent;
export import MessageStopEvent = API.MessageStopEvent;
export import MessageStreamEvent = API.MessageStreamEvent;
export import TextBlock = API.TextBlock;
export import TextDelta = API.TextDelta;
export import Usage = API.Usage;
export import MessageCreateParams = API.MessageCreateParams;
export import MessageCreateParamsNonStreaming = API.MessageCreateParamsNonStreaming;
export import MessageCreateParamsStreaming = API.MessageCreateParamsStreaming;
export import MessageStreamParams = API.MessageStreamParams;
}

export default Anthropic;
2 changes: 1 addition & 1 deletion src/lib/MessageStream.ts
@@ -8,7 +8,7 @@ import {
MessageParam,
MessageCreateParams,
MessageStreamParams,
} from '@anthropic-ai/sdk/resources/beta/messages';
} from '@anthropic-ai/sdk/resources/messages';
import { type ReadableStream } from '@anthropic-ai/sdk/_shims/index';
import { Stream } from '@anthropic-ai/sdk/streaming';
