
Conversation

@kieranklaassen
Contributor

Problem

When using the Rails integration (acts_as_chat), persistence callbacks are registered on the underlying RubyLLM::Chat instance. However, because the original implementation of the @on callbacks in Chat simply replaced the existing proc, a user setting their own callback (e.g., chat.to_llm.on_new_message { ... }) would silently overwrite the persistence logic, resulting in lost chat history.
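A minimal sketch of the failure mode, using a simplified, hypothetical stand-in for the original single-proc callback store (NaiveChat and trigger_new_message are illustrative names, not RubyLLM's actual internals):

```ruby
# Simplified model of the old behavior: one proc per event,
# so each on_new_message call replaces the previous callback.
class NaiveChat
  def initialize
    @on = {} # event name => single proc
  end

  def on_new_message(&block)
    @on[:new_message] = block # replaces any earlier callback
    self
  end

  def trigger_new_message
    @on[:new_message]&.call
  end
end

calls = []
chat = NaiveChat.new
chat.on_new_message { calls << :persistence } # set up by acts_as_chat
chat.on_new_message { calls << :user }        # user callback overwrites it
chat.trigger_new_message
calls # => [:user] — the persistence callback never ran
```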

Solution

This PR implements a Callback Fanout pattern directly in RubyLLM::Chat.

  1. CallbackFanout: A new internal class that acts like a hash but stores an array of callables for each key. When invoked, it calls all registered callbacks in order.
  2. Simplified Integration: The ActiveRecord integration code (ChatMethods and ActsAsLegacy) was previously trying to manually wrap and chain callbacks to prevent this issue. This brittle logic has been removed in favor of the robust native support in Chat.
  3. Behavior: Calling on_new_message (and the other on_* registrars) now appends the new callback rather than replacing the existing one, so system callbacks (persistence) and user callbacks both run.
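The fanout described above can be sketched as follows. The class and method names (CallbackFanout, on_new_message) follow the PR description, but the internals are an assumption, not the merged implementation:

```ruby
# Hash-like store that keeps an array of callables per event key
# and fans out to all of them, in registration order, when invoked.
class CallbackFanout
  def initialize
    @callbacks = Hash.new { |h, k| h[k] = [] }
  end

  # Reading a key yields one callable that invokes every registered callback.
  def [](key)
    chain = @callbacks[key]
    ->(*args) { chain.each { |cb| cb.call(*args) } }
  end

  # Writing a key appends — it never replaces earlier callbacks.
  def []=(key, callable)
    @callbacks[key] << callable
  end
end

# Minimal chat using the fanout; emit_new_message is illustrative.
class Chat
  def initialize
    @on = CallbackFanout.new
  end

  def on_new_message(&block)
    @on[:new_message] = block
    self
  end

  def emit_new_message
    @on[:new_message].call
  end
end

calls = []
chat = Chat.new
chat.on_new_message { calls << :persistence } # system callback survives...
chat.on_new_message { calls << :user }        # ...when the user adds theirs
chat.emit_new_message
calls # => [:persistence, :user]
```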

Testing

  • Added a regression test in spec/ruby_llm/active_record/acts_as_spec.rb specifically checking that chaining callbacks on to_llm preserves persistence.
  • Validated that existing tests continue to pass.

kieranklaassen and others added 4 commits September 11, 2025 15:03
The migration template uses RubyLLM.models.resolve which requires
the instance to delegate to the class method. This commit adds the
missing delegation and comprehensive tests using TDD approach.

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <[email protected]>
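The commit above mentions delegating an instance method to its class method so the migration template's RubyLLM.models.resolve call works. A hypothetical sketch of that pattern (class name, method, and return value are illustrative, not RubyLLM's actual API):

```ruby
# Instance-to-class delegation: the instance method simply
# forwards its arguments to the class-level implementation.
class Models
  def self.resolve(id)
    "resolved:#{id}" # placeholder for the real lookup
  end

  def resolve(*args)
    self.class.resolve(*args)
  end
end

Models.new.resolve("gpt-4o") # same result as Models.resolve("gpt-4o")
```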
- Implements `CallbackFanout` in `RubyLLM::Chat` to allow multiple callbacks per event.
- Removes brittle wrapper workarounds in ActiveRecord integrations.
- Ensures persistence callbacks are not overwritten when users define custom callbacks on the underlying LLM object.
- Adds regression test for direct callback chaining.