
Conversation

@mschristensen
Contributor

Description

AIT DOCS INTEGRATION BRANCH
Not (yet) intended to merge, but opened to create review apps.

Checklist

coderabbitai bot commented Dec 17, 2025

Important

Review skipped

Auto reviews are disabled on this repository.

Please check the settings in the CodeRabbit UI or the .coderabbit.yaml file in this repository. To trigger a single review, invoke the @coderabbitai review command.

You can disable this status message by setting the reviews.review_status to false in the CodeRabbit configuration file.



@mschristensen added the review-app (Create a Heroku review app) label Dec 17, 2025

When publishing tokens, don't await the `channel.appendMessage()` call. Ably rolls up acknowledgments and debounces them for efficiency, which means awaiting each append would unnecessarily slow down your token stream. Messages are still published in the order that `appendMessage()` is called, so delivery order is not affected.
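The fire-and-forget pattern described above can be sketched as follows. This is a minimal, runnable sketch: the `channel` stub below stands in for a real Ably channel (its `appendMessage()` signature is assumed from the surrounding docs), and `streamToken` and the serial value are illustrative names, not part of any SDK.

```javascript
// Sketch of the fire-and-forget append pattern described above. The
// `appendMessage()` method is assumed from the docs; the stub below
// stands in for a real Ably channel so the sketch runs on its own.
let appended = 0;

const channel = {
  // Stub: resolves like a rolled-up publish acknowledgment would.
  appendMessage: async (serial, data) => {
    appended += 1;
    return { serial, data };
  },
};

function streamToken(serial, token) {
  // Do NOT await: Ably rolls up and debounces acks, so awaiting each
  // append would throttle the token stream. Attaching .catch() surfaces
  // a failed publish instead of silently swallowing the rejection.
  channel
    .appendMessage(serial, { text: token })
    .catch((err) => console.error(`append failed for ${serial}:`, err));
}

for (const token of ['Hello', ', ', 'world']) {
  streamToken('msg-001', token);
}
```

Handling the returned promise with `.catch()` rather than `await` is one way to observe publish failures without slowing the stream; whether richer failure handling is needed is an open question (see the review thread below).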
Member

How do we suggest that clients check for the success or failure of the publish?

Contributor Author

I have created a separate ticket to address this: https://ably.atlassian.net/browse/AIT-238

@matt423 matt423 force-pushed the AIT-129-AIT-Docs-release-branch branch from 400eb09 to f8056cb Compare December 23, 2025 10:41
@matt423 added and removed the review-app (Create a Heroku review app) label Dec 23, 2025
@ably-ci ably-ci temporarily deployed to ably-docs-ait-129-ait-d-5wu2wt December 23, 2025 10:48 Inactive

This pattern is useful when clients only care about the most recent part of a response and you are happy to treat the channel history as a short sliding window rather than a full conversation log. For example:

- **Backend-stored responses**: The backend writes complete responses to a database and clients load those full responses from there, while Ably is used only to deliver live tokens for the current in-progress response.
Contributor

I don't think we want to promote this pattern for backend-stored responses. Backend storage solves the efficient hydration problem, but it does not solve the problem of missing tokens for the current response (assuming the response is not written to the database until it has been completely streamed, which is the pattern most existing customers use and struggle with). It would be simpler for customers to load history from their own database and then use message-per-response to catch up with any in-progress response.
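The simpler flow suggested here (load completed responses from the customer's own database, then catch up with the in-progress response over the channel) could be sketched as below. The stubs stand in for the real backend API and Ably channel; names like `fetchCompletedResponses` and the message shape are illustrative assumptions, not an actual API.

```javascript
// Sketch of the suggested hydration flow, with stubs standing in for
// the customer's backend and the Ably channel. All names illustrative.
const backend = {
  // Stub: completed responses as stored in the customer's own database.
  fetchCompletedResponses: async () => [
    { responseId: 'r1', text: 'First full response.' },
  ],
};

const channel = {
  // Stub: delivers the tokens of the current in-progress response.
  subscribe: (handler) => {
    for (const token of ['Second ', 'response ', 'so far']) {
      handler({ data: { responseId: 'r2', text: token } });
    }
  },
};

async function hydrate() {
  // 1. Load full, completed responses from the backend database.
  const responses = await backend.fetchCompletedResponses();
  // 2. Use the channel only to catch up with the in-progress response.
  const inProgress = { responseId: null, text: '' };
  channel.subscribe((msg) => {
    inProgress.responseId = msg.data.responseId;
    inProgress.text += msg.data.text;
  });
  return { responses, inProgress };
}
```

The design point is the split of responsibilities: the database is the source of truth for completed responses, while the channel carries only the live tail, so neither has to cover the other's job.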

@mschristensen mschristensen mentioned this pull request Jan 5, 2026
3 tasks
@matt423 added and removed the review-app (Create a Heroku review app) label Jan 7, 2026
GregHolmes and others added 19 commits January 7, 2026 11:41
Link to the pending `/ai-transport` overview page.
Add intro describing the pattern, its properties, and use cases.
Includes continuous token streams, correlating tokens for distinct
responses, and explicit start/end events.
Splits each token streaming approach into distinct patterns and shows
both the publish and subscribe side behaviour alongside one another.
Includes hydration with rewind and hydration with persisted history +
untilAttach. Describes the pattern for handling in-progress live
responses with complete responses loaded from the database.
Add doc explaining streaming tokens with appendMessage and update
compaction allowing message-per-response history.
Unifies the token streaming nav for token streaming after rebase.
Refines the intro copy in message-per-response to have structural
similarity with the message-per-token page.
Refine the Publishing section of the message-per-response docs.

- Include anchor tags on title
- Describe the `serial` identifier
- Align with stream pattern used in message-per-token docs
- Remove duplicate example
Refine the Subscribing section of the message-per-response docs.

- Add anchor tag to heading
- Describes each action upfront
- Uses RANDOM_CHANNEL_NAME
Refine the rewind section of the message-per-response docs.

- Include description of allowed rewind parameters
- Tweak copy
Refines the history section for the message-per-response docs.

- Adds anchor to heading
- Uses RANDOM_CHANNEL_NAME
- Use message serial in code snippet instead of ID
- Tweaks copy
Fix the hydration of in-progress responses via rewind by using the responseId in the extras to correlate messages with completed responses loaded from the database.
Fix the hydration of in-progress responses using history by obtaining
the timestamp of the last completed response loaded from the database
and paginating history forwards from that point.
Removes the headers/metadata section, as this covers the specific
semantics of extras.headers handling with appends, which is better
addressed by the (upcoming) message append pub/sub docs. Instead, a
callout is used to describe header mixin semantics in the appropriate
place insofar as it relates to the discussion at hand.
mschristensen and others added 6 commits January 7, 2026 11:41
Update the token streaming with message per token docs to include a
callout describing resume behaviour in case of transient disconnection.
Fix the message per token docs headers to include anchors and align with
naming in the message per response page.
Adds an overview page for a Sessions & Identity section which describes the channel-oriented session model and its benefits over the traditional connection-oriented model.

Describes how identity relates to session management and how this works in the context of channel-oriented sessions.

Shows how to use identified clients to assign a trusted identity to users and obtain this identity from the agent side.

Shows how to use Ably capabilities to control which operations
authenticated users can perform on which channels.

Shows how to use authenticated user claims to associate a role or other attribute with a user.

Updates the docs to describe how to handle authentication, capabilities, identity and roles/attributes for agents separately from end users.

Describes how to use presence to mark users and agents as online/offline. Includes description of synthetic leaves in the event of abrupt disconnection.

Describe how to subscribe to presence to see who is online, and take action when a user is offline across all devices.

Add docs for resuming user and agent sessions, linking to hydration patterns for different token streaming approaches for user resumes and describing agent resume behaviour with message catch up.
Adds a guide for using the OpenAI SDK to consume streaming events from
the Responses API and publish them over Ably using the message per token
pattern.
@mschristensen mschristensen force-pushed the AIT-129-AIT-Docs-release-branch branch from aebe2c1 to ea0ac8d Compare January 7, 2026 11:48
- Uses a further-reading callout instead of note
- Removes repeated code initialising Ably client (OpenAI client already
  instantiated)
Adds an anchor tag to the "Client hydration" heading
Similar to the OpenAI message per token guide, but using the message
per response pattern with appends.
@mschristensen added and removed the review-app (Create a Heroku review app) label Jan 8, 2026


9 participants