Conversation

@starpit
Member

@starpit starpit commented Nov 14, 2025

This introduces the enum variant and provides a client-side implementation that simply unrolls the batch into n separate Generates. We were already doing this kind of client-side expansion for Bulk::Repeat.

TODO:

  • shift this into the executor backend, to expose server-side support for both forms of bulk execution
  • do we need some kind of capabilities system, so that we fall back to client-side expansion when the backend does not support this form of bulk? For now these are standard OpenAI features, so there are no use cases for client-side expansion at the moment... hmm
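For reference, a minimal sketch of the idea in Rust. All names here (`Generate`, the `Bulk` variant fields, `expand`) are illustrative stand-ins, not the crate's actual API:

```rust
// Hypothetical sketch; names are illustrative, not the project's real types.

/// Stand-in for a single generation request.
#[derive(Clone, Debug)]
struct Generate {
    prompt: String,
}

/// The two forms of bulk execution: `Repeat` runs one request n times;
/// `Map` runs one request per input in a batch.
enum Bulk {
    Repeat { request: Generate, n: usize },
    Map { inputs: Vec<String> },
}

/// Client-side expansion: unroll a bulk request into n separate Generates,
/// for backends that lack server-side bulk support.
fn expand(bulk: Bulk) -> Vec<Generate> {
    match bulk {
        // One request cloned n times.
        Bulk::Repeat { request, n } => vec![request; n],
        // One request per batched input.
        Bulk::Map { inputs } => inputs
            .into_iter()
            .map(|prompt| Generate { prompt })
            .collect(),
    }
}
```

A capabilities check, if added, would sit in front of `expand`: if the executor backend advertises server-side bulk for the given variant, forward the `Bulk` as-is; otherwise expand client-side as above.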

Signed-off-by: Nick Mitchell <[email protected]>
@starpit starpit changed the title feat: initial support for Bulk::Batch feat: initial support for Bulk::Map Nov 14, 2025