feat: enable message queuing in ask mode #366
Remove agent-only restrictions so users can queue messages and use stop-and-send while a response is streaming in ask mode too.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
No actionable comments were generated in the recent review. 🎉

ℹ️ Recent review info
⚙️ Run configuration
Configuration used: defaults
Review profile: CHILL
Plan: Pro
Run ID:
📒 Files selected for processing (1)
✅ Files skipped from review due to trivial changes (1)
📝 Walkthrough

Decouples agent-only gating from queue and streaming submission logic so submissions and queued-message processing/display occur across all chat modes; minor wording/logging tweaks were also made.
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant User
    participant ChatInput
    participant ChatController
    participant QueueManager
    participant StreamProcessor
    User->>ChatInput: submit message
    ChatInput->>ChatController: handleSubmit(status, chatMode)
    ChatController->>QueueManager: enqueue or route (queue / stop-and-send)
    alt status == "streaming"
        ChatController->>StreamProcessor: allow submit during streaming
        StreamProcessor->>QueueManager: process queued items
    else status == "ready"
        QueueManager->>StreamProcessor: normal send flow
    end
    QueueManager->>ChatInput: update messageQueue
    ChatInput->>User: render QueuedMessagesPanel (when queue length > 0)
```
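As a hedged sketch of what "decoupling agent-only gating" could look like in the submit path — the type and function names below are illustrative, not the app's actual hook code:

```typescript
// Hypothetical before/after of the submit gating predicate; names are
// illustrative and may not match the real useChatHandlers implementation.
type ChatMode = "ask" | "agent";
type ChatStatus = "ready" | "streaming";

// Before this PR: queue/stop-and-send was only reachable in agent mode.
function canQueueBefore(status: ChatStatus, mode: ChatMode): boolean {
  return status === "streaming" && mode === "agent";
}

// After this PR: any chat mode may queue while a response is streaming.
function canQueueAfter(status: ChatStatus, _mode: ChatMode): boolean {
  return status === "streaming";
}

console.log(canQueueBefore("streaming", "ask")); // false
console.log(canQueueAfter("streaming", "ask")); // true
```

The behavioral change is exactly the dropped `mode === "agent"` conjunct: ask mode now takes the same streaming-time branch agent mode always did.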
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~45 minutes
🚥 Pre-merge checks: ✅ Passed checks (3 passed)
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (2)
app/hooks/useChatHandlers.ts (2)
200-228: ⚠️ Potential issue | 🟠 Major

Validate first, then await `stopActiveStream()` for stop-and-send.

This branch now runs in ask mode too, but it stops the active response before the validation block below. An invalid replacement prompt can cancel the in-flight answer and then be rejected. It also bypasses `stopActiveStream()`, so cancel/save are fire-and-forget and local Codex chats fall into the temp-stream cancel path instead of the local-provider no-op path.

Suggested change:
```diff
+let shouldStopAndSend = false;
+
 if (status === "streaming") {
   const validFiles = uploadedFiles.filter(
     (file) => file.uploaded && file.url && file.fileId,
   );
   if (queueBehavior === "queue") {
     // Queue the message - will auto-send after current response completes
     queueMessage(
       input,
       validFiles.map((f) => ({
         file: f.file,
         fileId: f.fileId! as Id<"files">,
         url: f.url!,
       })),
     );
     clearInput();
     clearUploadedFiles();
     return;
   } else if (queueBehavior === "stop-and-send") {
-    // Immediately stop current stream and send right away
-    stop();
-
-    // Cancel the stream in database and save current message state
-    if (
-      !temporaryChatsEnabledRef.current &&
-      !isCodexLocal(selectedModel)
-    ) {
-      cancelStreamMutation({ chatId }).catch((error) => {
-        console.error("Failed to cancel stream:", error);
-      });
-
-      const lastMessage = messages[messages.length - 1];
-      if (lastMessage && lastMessage.role === "assistant") {
-        saveAssistantMessage({
-          id: lastMessage.id,
-          chatId,
-          role: lastMessage.role,
-          parts: lastMessage.parts,
-        }).catch((error) => {
-          console.error("Failed to save message on stop:", error);
-        });
-      }
-    } else {
-      // Temporary chats: signal cancel via temp stream coordination
-      cancelTempStreamMutation({ chatId }).catch(() => {});
-    }
-    // Continue to send the new message immediately below (don't return)
+    shouldStopAndSend = true;
   }
 }

 // Check token limit before sending based on user plan
 const tokenCount = countInputTokens(input, uploadedFiles);
 const maxTokens = getMaxTokensForSubscription(subscription, {
   mode: chatMode,
 });
+
+if (shouldStopAndSend) {
+  await stopActiveStream();
+}
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@app/hooks/useChatHandlers.ts` around lines 200 - 228, The "stop-and-send" branch currently calls stop() and performs fire-and-forget cancel/save before validating the new prompt; change it to validate first, then await stopActiveStream() to gracefully stop the in-flight response; replace the direct stop() call with await stopActiveStream() and ensure cancelStreamMutation/saveAssistantMessage/cancelTempStreamMutation are awaited (or handled inside stopActiveStream) so local Codex chats use the local-provider no-op path via isCodexLocal(selectedModel) and temporaryChatsEnabledRef.current is respected rather than falling into the temp-stream cancel path.
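The ordering the finding asks for can be reduced to a small sketch. The names below (`validate`, `stopActiveStream`, `send`) are hypothetical stand-ins for the hook's real token/file validation and its awaited cancel/save wrapper:

```typescript
// Hypothetical sketch: validate the replacement prompt BEFORE stopping
// the in-flight response, so a rejected prompt never cancels anything.
async function handleStopAndSend(
  input: string,
  validate: (input: string) => boolean,
  stopActiveStream: () => Promise<void>,
  send: (input: string) => Promise<void>,
): Promise<"rejected" | "sent"> {
  // Validate first: an invalid prompt must not cancel the active answer.
  if (!validate(input)) {
    return "rejected";
  }
  // Only after validation passes do we stop the active response...
  await stopActiveStream();
  // ...and then send the replacement message.
  await send(input);
  return "sent";
}
```

Because `stopActiveStream()` is awaited, the cancel/save side effects complete before the new send begins, instead of racing with it as fire-and-forget calls do.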
182-200: ⚠️ Potential issue | 🟠 Major

Queued ask-mode messages now skip token enforcement.

With Line 182 no longer gated to agent mode, the `queueBehavior === "queue"` path returns before the ask-mode file limit and `maxTokens` checks on Lines 237-260. Those queued items are later submitted by `handleSendNow` without revalidation, and the auto-dequeue flow in `app/components/chat.tsx` also runs regardless of mode, so oversized ask-mode requests can be enqueued and then sent anyway.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@app/hooks/useChatHandlers.ts` around lines 182 - 200, The current early return in the status === "streaming" block lets the queueBehavior === "queue" path enqueue messages without running ask-mode validations (file count and maxTokens) before queueMessage is called; update the logic in useChatHandlers so that either (A) the queueBehavior === "queue" branch only executes for agent mode, or (B) run the ask-mode validation checks (the same file limit and maxTokens logic used later) before calling queueMessage and returning; adjust the queueMessage call site and/or handleSendNow to ensure queued ask-mode items are validated (use the existing symbols queueMessage, handleSendNow, queueBehavior, and uploadedFiles) so oversized ask-mode requests cannot be enqueued.
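Option (B) from the finding — sharing one validation gate between the enqueue and send paths — can be sketched as follows. The `Draft` shape, limits, and token counter are illustrative, not the app's actual values:

```typescript
// Hypothetical sketch: the queue path runs the same limit checks the
// send path runs, so a message that would be rejected on send can
// never be enqueued and auto-sent later.
interface Draft {
  text: string;
  fileCount: number;
}

function validateDraft(
  draft: Draft,
  limits: { maxTokens: number; maxFiles: number },
  countTokens: (text: string) => number,
): string | null {
  if (draft.fileCount > limits.maxFiles) return "too many files";
  if (countTokens(draft.text) > limits.maxTokens) return "over token limit";
  return null; // valid
}

function tryEnqueue(
  draft: Draft,
  queue: Draft[],
  validate: (d: Draft) => string | null,
): string | null {
  const error = validate(draft);
  if (error !== null) return error; // rejected before it can be queued
  queue.push(draft);
  return null;
}
```

With this shape, `handleSendNow` needs no revalidation: nothing invalid ever reaches the queue in the first place.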
ℹ️ Review info
⚙️ Run configuration
Configuration used: defaults
Review profile: CHILL
Plan: Pro
Run ID: bae2346f-cd6e-40a0-b5d2-7d18d878fc8d
📒 Files selected for processing (3)
- app/components/ChatInput/ChatInput.tsx
- app/components/chat.tsx
- app/hooks/useChatHandlers.ts
💤 Files with no reviewable changes (1)
- app/components/chat.tsx
…f reset time

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
This is an unrecoverable condition (stale WebSocket on long local streams) that doesn't affect the user — logging as error is noise.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
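The logging change described above reduces to picking a lower severity for an expected failure. A minimal sketch, assuming a hypothetical condition flag and log levels (the real code's check and logger may differ):

```typescript
// Hypothetical sketch: an expected, unrecoverable condition (a stale
// WebSocket on a long local stream) is demoted below error level so it
// does not read as actionable noise in the logs.
type LogLevel = "error" | "debug";

function levelForStreamFailure(isStaleWebSocket: boolean): LogLevel {
  // Stale-socket failures are harmless to the user: log quietly.
  return isStaleWebSocket ? "debug" : "error";
}
```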
Summary
- `QueuedMessagesPanel` to render in all chat modes

Test plan
🤖 Generated with Claude Code