21 changes: 18 additions & 3 deletions python/packages/anthropic/AGENTS.md
@@ -4,9 +4,24 @@ Integration with Anthropic's Claude API.

## Main Classes

- **`AnthropicClient`** - Full-featured chat client for Anthropic Claude models (includes middleware, telemetry, and function invocation support)
- **`RawAnthropicClient`** - Low-level chat client without middleware, telemetry, or function invocation layers. Use this only when you need to compose custom layers manually.
- **`AnthropicChatOptions`** - Options TypedDict for Anthropic-specific parameters

## Client Architecture

`AnthropicClient` composes the standard public layer stack around `RawAnthropicClient`:

```
AnthropicClient
└─ FunctionInvocationLayer ← owns the tool/function calling loop
└─ ChatMiddlewareLayer ← applies chat middleware per model call
└─ ChatTelemetryLayer ← per-call telemetry (inside middleware)
└─ RawAnthropicClient ← raw Anthropic API calls
```

Most users should use `AnthropicClient`. Use `RawAnthropicClient` only if you need to apply a custom subset of layers.
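The composition in the diagram above can be sketched as nested wrappers, each delegating to the next inner client. This is an illustrative toy only: the class names mirror the diagram, but the constructor signatures and method names are assumptions, not the framework's actual API.

```python
# Illustrative sketch of the layer stack: each layer wraps the next inner
# client and delegates get_response to it. Names mirror the diagram; the
# real framework's signatures differ.

class RawClient:
    """Stands in for RawAnthropicClient: makes the raw API call."""
    def get_response(self, messages):
        return f"model-reply-to:{messages[-1]}"

class Layer:
    """Base wrapper: holds and delegates to the inner client."""
    def __init__(self, inner):
        self.inner = inner
    def get_response(self, messages):
        return self.inner.get_response(messages)

class ChatTelemetryLayer(Layer):
    def get_response(self, messages):
        # would record a telemetry span around this single model call
        return super().get_response(messages)

class ChatMiddlewareLayer(Layer):
    def get_response(self, messages):
        # would run chat middleware around each inner model call
        return super().get_response(messages)

class FunctionInvocationLayer(Layer):
    def get_response(self, messages):
        # would loop while the model requests tool calls; single pass here
        return super().get_response(messages)

# Public client = the standard stack composed around the raw client
client = FunctionInvocationLayer(ChatMiddlewareLayer(ChatTelemetryLayer(RawClient())))
print(client.get_response(["Hello"]))  # model-reply-to:Hello
```

Composing a "custom subset of layers" then just means wrapping the raw client in fewer (or different) layers before use.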

## Usage

```python
# … (lines collapsed in diff) …
response = await client.get_response("Hello")
```
## Import Path

```python
from agent_framework.anthropic import AnthropicClient, RawAnthropicClient
# or directly:
from agent_framework_anthropic import AnthropicClient, RawAnthropicClient
```
24 changes: 24 additions & 0 deletions python/packages/core/AGENTS.md
@@ -129,6 +129,30 @@

```python
class LoggingMiddleware(AgentMiddleware): ...

agent = Agent(..., middleware=[LoggingMiddleware()])
```

### Chat Client Layer Architecture

Public chat clients (e.g., `OpenAIChatClient`, `AnthropicClient`) compose a standard stack of mixin layers on top of a raw/base client. The layer ordering from outermost to innermost is:

```
PublicClient (e.g., OpenAIChatClient)
└─ FunctionInvocationLayer ← owns the tool/function calling loop; routes function middleware
└─ ChatMiddlewareLayer ← applies chat middleware per inner model call (outside telemetry)
└─ ChatTelemetryLayer ← per-call OpenTelemetry spans (inside chat middleware)
└─ Raw/BaseChatClient ← raw provider API calls
```
**Key behaviors:**
- **Chat middleware runs per inner model call** — within the function calling loop, so middleware sees each individual LLM call rather than only the outer request.
- **Chat middleware is outside telemetry** — middleware latency does not skew per-call telemetry timings.
- **Per-call middleware** can be passed via `client_kwargs={"middleware": [...]}` on `get_response()`. Mixed chat and function middleware is automatically categorized and routed to the appropriate layer.
**Raw vs Public clients:**
- **Raw clients** (e.g., `RawOpenAIChatClient`, `RawAnthropicClient`) only extend `BaseChatClient` — no middleware, telemetry, or function invocation support.
- **Public clients** compose all standard layers around the raw client and are what most users should use.
- Use raw clients only when you need to compose a custom subset of layers.
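The first key behavior above, middleware wrapping each inner model call inside the tool loop rather than the outer request, can be demonstrated with a toy function-calling loop. All names here are illustrative stand-ins, not the framework's API.

```python
# Toy demonstration: because the middleware layer sits *inside* the function
# invocation loop, middleware fires once per LLM call, not once per request.

calls_seen_by_middleware = []

def chat_middleware(messages, next_call):
    # record each inner model call this middleware observes
    calls_seen_by_middleware.append(list(messages))
    return next_call(messages)

def raw_model_call(messages):
    # pretend the model asks for a tool first, then answers with text
    if not any(m.startswith("tool:") for m in messages):
        return {"tool_call": "get_time"}
    return {"text": "It is noon."}

def function_invocation_loop(messages):
    # the outer loop owns tool calling; middleware wraps each inner call
    while True:
        reply = chat_middleware(messages, raw_model_call)
        if "tool_call" in reply:
            messages = messages + [f"tool:{reply['tool_call']}-result"]
        else:
            return reply["text"]

answer = function_invocation_loop(["What time is it?"])
print(answer)                          # It is noon.
print(len(calls_seen_by_middleware))   # 2 — one per inner model call
```

Had the middleware sat outside the loop, it would have observed only one call for this request instead of two.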

### Custom Chat Client

```python