Problem Statement
The current `debounce` concurrency strategy resets the timer on every incoming message, but it only processes the final message in the burst. Earlier messages are superseded and never reach the handler.
That works for "only care about the latest correction" cases, but it does not work well for chat apps where users send a burst of short messages that all matter:
- "hey"
- "how do I reset my password?"
- "can you help with that?"
In that flow, I want debounce timing, but I do not want message loss.
`queue` is also not a fit, because it processes the first message immediately and only batches the follow-ups while the handler is already running. What is missing is a mode that combines debounce timing with queue-style message preservation.
Proposed Solution
Add a new built-in concurrency mode, or an option on `debounce`, with semantics like:
- Every message starts/resets the debounce timer
- All messages received during the burst are preserved
- When the debounce window elapses, the handler runs once
- The handler receives:
  - the last message as the primary `message`
  - all earlier messages in the burst as `context.skipped` or a new `context.burst`
- The bot response is anchored to the thread of the last message in the burst
In practice, this would behave like "debounced queue drain" instead of "final-message-only debounce".
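To make the intended semantics concrete, here is a minimal standalone sketch of a "debounced queue drain" buffer. It is not the library's actual API; the `DebounceAll` class and its method names are hypothetical, and a real implementation would also anchor the response to the last message's thread:

```typescript
// Hypothetical sketch of "debounce-all" semantics: every push resets the
// quiet-period timer, nothing is dropped, and one flush delivers the last
// message as primary plus the earlier ones as `skipped`.
type BurstHandler<T> = (last: T, skipped: T[]) => void;

class DebounceAll<T> {
  private buffer: T[] = [];
  private timer: ReturnType<typeof setTimeout> | undefined;

  constructor(private debounceMs: number, private handler: BurstHandler<T>) {}

  push(message: T): void {
    this.buffer.push(message); // preserve every message in the burst
    if (this.timer !== undefined) clearTimeout(this.timer); // reset the timer
    this.timer = setTimeout(() => this.flush(), this.debounceMs);
  }

  // Runs once per burst, after the quiet period elapses.
  flush(): void {
    if (this.timer !== undefined) clearTimeout(this.timer);
    this.timer = undefined;
    if (this.buffer.length === 0) return;
    const burst = this.buffer;
    this.buffer = [];
    // Last message is the primary; earlier ones become the skipped context.
    this.handler(burst[burst.length - 1], burst.slice(0, -1));
  }
}
```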
Possible API shapes:
```typescript
concurrency: {
  strategy: "debounce-all",
  debounceMs: 10_000,
}
```
or
```typescript
concurrency: {
  strategy: "debounce",
  debounceMs: 10_000,
  preserveBurstMessages: true,
}
```
Expected handler behavior:
```typescript
bot.onNewMessage(/.+/i, async (thread, message, context) => {
  const allMessages = [...(context?.skipped ?? []), message];
  await thread.post(`Responding to ${allMessages.length} messages`);
});
```
This would fill the gap between the current `queue` and `debounce` modes:
- `queue`: preserves messages, but no quiet-period debounce before first processing
- `debounce`: quiet-period debounce, but drops earlier messages
- requested mode: quiet-period debounce and preserves all messages
This would be especially useful for Slack, WhatsApp, Telegram, customer support bots, and AI assistants where users often split one thought across multiple rapid messages.
Alternatives Considered
No response
Use Case
Priority
Important
Contribution
Additional Context
No response