fix(chat): support background streaming when navigating away and back (#1160) #1196
fbricon wants to merge 1 commit into kortex-hub:main from
Conversation
I noticed we're losing the time spent reasoning when going back and forth. I'm still pondering whether I care or not.
b3f1eea to 83f79a8 (Compare)
📝 Walkthrough

Adds per-chat persistent streaming state in the preload layer, exposes reconnect/disconnect/get APIs via contextBridge, replays buffered chunks to the renderer on reconnection, updates the renderer transport and UI to rehydrate in-progress streams, and adds a Playwright smoke test for background streaming across navigation.
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant User as User / UI
    participant ChatComp as Chat Component
    participant WindowAPI as Window (contextBridge)
    participant Preload as Preload IPC Layer
    participant Backend as Backend
    Note over Backend,Preload: Backend produces an ongoing stream
    Backend->>Preload: onChunk(chunk, chatId)
    Preload->>Preload: append chunk to activeStreamsByChatId[chatId].bufferedChunks
    Preload->>WindowAPI: invoke onChunk callback (if registered)
    WindowAPI->>ChatComp: deliver onChunk
    User->>ChatComp: Navigate away (callbacks detached)
    User->>ChatComp: Return to chat
    ChatComp->>WindowAPI: inferenceGetActiveStream(chatId)
    WindowAPI->>Preload: read activeStreamsByChatId[chatId]
    Preload-->>WindowAPI: { onDataId, bufferedChunks, isComplete }
    WindowAPI-->>ChatComp: snapshot
    ChatComp->>WindowAPI: inferenceReconnectToStream(chatId, onChunk, onError, onEnd)
    WindowAPI->>Preload: register callbacks for onDataId
    Preload-->>WindowAPI: return { bufferedChunks, onDataId }
    WindowAPI-->>ChatComp: replay bufferedChunks
    Note over ChatComp: reconstruct in-progress assistant message from chunks
    Backend->>Preload: more onChunk / onEnd
    Preload->>WindowAPI: forward to registered callbacks
    WindowAPI->>ChatComp: deliver live chunks / onEnd
```
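The buffer-then-replay flow in the diagram can be sketched as a small in-memory registry. The helper names below (`startStream`, `handleChunk`, `reconnectToStream`) mirror the diagram but are illustrative simplifications, not the PR's actual preload code:

```typescript
// Illustrative sketch of a per-chat stream registry with buffered replay.

type Chunk = { type: string; delta?: string };

interface ActiveStream {
  onDataId: number;
  bufferedChunks: Chunk[];
  isComplete: boolean;
  onChunk?: (chunk: Chunk) => void;
  onEnd?: () => void;
}

const activeStreamsByChatId = new Map<string, ActiveStream>();

function startStream(chatId: string, onDataId: number): void {
  activeStreamsByChatId.set(chatId, { onDataId, bufferedChunks: [], isComplete: false });
}

// Every chunk is buffered; it is forwarded to the renderer only when a
// callback is currently registered (i.e. the chat view is mounted).
function handleChunk(chatId: string, chunk: Chunk): void {
  const stream = activeStreamsByChatId.get(chatId);
  if (!stream) return;
  stream.bufferedChunks.push(chunk);
  stream.onChunk?.(chunk);
}

// On returning to the chat, re-register callbacks and hand back a copy of
// the buffered chunks so the UI can replay them before live delivery resumes.
function reconnectToStream(
  chatId: string,
  onChunk: (chunk: Chunk) => void,
  onEnd: () => void,
): { bufferedChunks: Chunk[]; onDataId: number } | undefined {
  const stream = activeStreamsByChatId.get(chatId);
  if (!stream) return undefined;
  stream.onChunk = onChunk;
  stream.onEnd = onEnd;
  return { bufferedChunks: [...stream.bufferedChunks], onDataId: stream.onDataId };
}
```

The snapshot is returned as a copy so that replay in the renderer is decoupled from chunks that keep arriving while the replay runs.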
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~50 minutes

Possibly related PRs
🚥 Pre-merge checks: ✅ Passed checks (5 passed)
Actionable comments posted: 1
🧹 Nitpick comments (3)
tests/playwright/src/specs/provider-specs/chat-smoke.spec.ts (1)
489: Consider replacing hard-coded delays with more deterministic waits.

The `page.waitForTimeout(250)` calls (lines 489, 499, 503, 509) introduce timing sensitivity. While understandable for streaming scenarios, Playwright best practices recommend using explicit conditions where possible. For example, after navigating to the Flows page, the `await flowsPage.waitForLoad()` already provides a deterministic wait. The additional delays may be unnecessary or could be replaced with specific element-visibility checks.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@tests/playwright/src/specs/provider-specs/chat-smoke.spec.ts` at line 489, The test defines a hard-coded delay helper const delay = (): Promise<void> => page.waitForTimeout(250) which creates timing sensitivity; replace uses of delay with deterministic waits: where sequencing follows navigation or page load prefer awaiting flowsPage.waitForLoad() or awaiting specific element states (e.g., element.waitFor({ state: 'visible' }) / page.waitForSelector) or use Playwright's expect(...).toBeVisible() for the target elements; search for the delay symbol and remove its usages, replacing each with the appropriate explicit wait tied to the next action or UI element instead of page.waitForTimeout.

packages/preload/src/index.ts (2)
1548-1559: Verify the 30-second cleanup timeout is appropriate for all use cases.

The 30-second delay before removing buffered stream state (line 1556) allows reconnection after navigation. However, if a user navigates away for longer than 30 seconds, the stream state (including any buffered chunks) will be lost, and reconnection will fail silently. Consider whether this timeout should be configurable or documented, especially for scenarios with slow network conditions or extended user absence.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/preload/src/index.ts` around lines 1548 - 1559, The hardcoded 30_000ms cleanup in the activeStreamsByChatId loop can drop buffered stream state too quickly for some users; make this timeout configurable and referenced by a named constant (e.g., STREAM_BUFFER_TTL_MS) or by reading a preload config/env var, update the deletion logic that uses setTimeout(() => activeStreamsByChatId.delete(chatId), 30000) to use that constant/setting, and ensure any relevant docs mention the configurable STREAM_BUFFER_TTL_MS so callers can increase it for slow networks or longer navigation gaps; keep the existing streamState.isComplete and callbackId logic unchanged.
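The reviewer's `STREAM_BUFFER_TTL_MS` suggestion could be sketched as below, using a `completedAt` timestamp checked on demand rather than a bare `setTimeout`; the constant name and eviction strategy are the reviewer's proposal, not the PR's actual code:

```typescript
// Illustrative sketch: TTL-based eviction of completed stream buffers.

const STREAM_BUFFER_TTL_MS = 30_000; // could be read from config or an env var

interface StreamState {
  bufferedChunks: unknown[];
  isComplete: boolean;
  completedAt?: number;
}

const streams = new Map<string, StreamState>();

function markComplete(chatId: string, now: number = Date.now()): void {
  const state = streams.get(chatId);
  if (!state) return;
  state.isComplete = true;
  state.completedAt = now;
}

// Evict completed streams whose buffers have outlived the TTL; called
// opportunistically (e.g. on stream start or on a reconnect attempt).
function evictExpired(now: number = Date.now()): void {
  for (const [chatId, state] of streams) {
    if (
      state.isComplete &&
      state.completedAt !== undefined &&
      now - state.completedAt > STREAM_BUFFER_TTL_MS
    ) {
      streams.delete(chatId);
    }
  }
}
```

A timestamp check also makes the expiry behavior unit-testable, since the "current time" can be injected instead of waiting on real timers.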
1508-1521: Consider memory implications for very long streaming responses.

The `bufferedChunks` array grows unbounded during streaming. For extended responses (e.g., code generation or long explanations), this could accumulate significant memory, especially when the user has navigated away and chunks are only buffered, not rendered. For a future improvement, consider implementing a maximum buffer size with oldest-chunk eviction, or compacting text deltas into larger chunks periodically.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/preload/src/index.ts` around lines 1508 - 1521, The handler registered with ipcRenderer.on for 'inference:streamText-onChunk' currently pushes every chunk into streamState.bufferedChunks unbounded; update this to enforce a maximum buffer length (e.g., MAX_BUFFERED_CHUNKS) and evict oldest entries when exceeded, or perform periodic compaction of small text-delta chunks into larger ones before pushing; modify the logic that references activeStreamsByChatId and streamState.bufferedChunks so that when pushing a new chunk it checks size, drops/compacts oldest chunk(s) as needed, and preserves delivery to the active callback via onDataCallbacksStreamText and callback.onChunk.
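The bounded-buffer idea from the comment above could look like the sketch below; the cap value and the merge-adjacent-text-deltas strategy are illustrative assumptions, not the PR's implementation:

```typescript
// Illustrative sketch: cap the chunk buffer, preferring compaction over loss.

const MAX_BUFFERED_CHUNKS = 1_000;

interface BufferedChunk {
  type: string;
  delta?: string;
}

function pushBounded(buffer: BufferedChunk[], chunk: BufferedChunk): void {
  buffer.push(chunk);
  if (buffer.length <= MAX_BUFFERED_CHUNKS) return;
  // Prefer compaction over eviction: merging two adjacent text deltas
  // shrinks the buffer without losing any streamed content.
  for (let i = 0; i < buffer.length - 1; i++) {
    if (buffer[i].type === 'text-delta' && buffer[i + 1].type === 'text-delta') {
      buffer[i].delta = (buffer[i].delta ?? '') + (buffer[i + 1].delta ?? '');
      buffer.splice(i + 1, 1);
      return;
    }
  }
  // No adjacent text deltas to merge: drop the oldest chunk as a last resort.
  buffer.shift();
}
```

Compacting deltas keeps reconstruction of the in-progress message lossless for text, while still bounding the array length for pathological streams.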
ℹ️ Review info
⚙️ Run configuration
Configuration used: Organization UI
Review profile: CHILL
Plan: Pro
Run ID: 1b79261f-2c2c-4246-a385-54f65ab05708
📒 Files selected for processing (5)
- packages/preload/src/index.ts
- packages/renderer/src/lib/chat/components/chat.svelte
- packages/renderer/src/lib/chat/components/ipc-chat-transport.ts
- packages/renderer/src/lib/chat/components/multimodal-input.svelte
- tests/playwright/src/specs/provider-specs/chat-smoke.spec.ts
```ts
} else if (chunk.type === 'text-end' && chunk.messageId === reconnectedMessageId) {
  const parts: Array<{ type: string; text: string; state?: string }> = [];
  if (reconnectedReasoning) {
    parts.push({ type: 'reasoning', text: reconnectedReasoning, state: 'done' });
  }
  if (reconnectedText) {
    parts.push({ type: 'text', text: reconnectedText, state: 'done' });
  }
  reconnectedStreamingMessage = {
    id: reconnectedMessageId,
    role: 'assistant',
    parts,
    content: reconnectedText,
    createdAt: new Date(),
    experimental_attachments: [],
  };
}
```
Reconnected message is not cleared after stream completion.
When the stream completes (via the onEnd callback at line 257-260), activeStreamOnDataId is set to undefined, but reconnectedStreamingMessage remains set. This causes the message to continue displaying in the UI even after completion.
Since the stream is complete, the chat history should be refetched and reconnectedStreamingMessage should be cleared to avoid showing a stale/duplicate message.
🛠️ Proposed fix

```diff
 (): void => {
   console.log('Reconnected stream completed');
   activeStreamOnDataId = undefined;
+  reconnectedStreamingMessage = undefined;
 },
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@packages/renderer/src/lib/chat/components/chat.svelte` around lines 228 -
244, The reconnectedStreamingMessage is left set after the stream completes,
causing a stale assistant message to remain visible; in the stream onEnd handler
(the callback that currently sets activeStreamOnDataId = undefined) also clear
reconnectedStreamingMessage (set it to undefined/null) and trigger a chat
history refetch (call your existing refetch/fetch conversation function) so the
finalized message comes from the canonical history instead of the temporary
reconnectedStreamingMessage.
Codecov Report
❌ Patch coverage is
Actionable comments posted: 1
♻️ Duplicate comments (1)
packages/renderer/src/lib/chat/components/chat.svelte (1)
257-260: ⚠️ Potential issue | 🟠 Major

Refresh canonical chat state when the reconnected stream ends.

Reconnected streams bypass `chatClient.onFinish`, so this callback is the only place that can reconcile the synthetic message with `chatClient.messages`. Right now it only clears `activeStreamOnDataId`; the temporary message stays rendered, and follow-up sends/exports still use stale history until the user navigates again.

Suggested fix
```diff
 (): void => {
   console.log('Reconnected stream completed');
   activeStreamOnDataId = undefined;
+  void chatHistory
+    .refetch()
+    .then(() => {
+      reconnectedStreamingMessage = undefined;
+    })
+    .catch(console.error);
 },
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/renderer/src/lib/chat/components/chat.svelte` around lines 257 - 260, The reconnected-stream completion callback currently only clears activeStreamOnDataId, leaving the synthetic message out of sync with chatClient.messages; update that callback (the anonymous completion handler) to also reconcile the synthetic message with the canonical chat state by invoking the existing chat client finish/reconciliation path (e.g., call chatClient.onFinish(...) with the synthetic message or call the local reconcile function that merges the temporary message into chatClient.messages), then clear activeStreamOnDataId as before and trigger any UI/message list refresh so follow-up sends/exports use the updated history.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@packages/renderer/src/lib/chat/components/chat.svelte`:
- Around line 201-210: When handling a 'text-delta' chunk for
reconnectedMessageId, you are rebuilding reconnectedStreamingMessage with only a
text part and thus dropping any previously buffered reasoning parts (e.g.,
'reasoning-delta'); instead, merge the incoming text into reconnectedText and
update reconnectedStreamingMessage by preserving existing parts (copy existing
reconnectedStreamingMessage.parts if present, replacing or appending the text
part) and keeping other fields (experimental_attachments, createdAt, content)
intact; specifically modify the branch that constructs
reconnectedStreamingMessage so it locates any existing text part to update its
text/state or appends a new text part while retaining any prior reasoning-delta
parts and metadata.
ℹ️ Review info
⚙️ Run configuration
Configuration used: Organization UI
Review profile: CHILL
Plan: Pro
Run ID: 256eee66-0272-438e-a810-42e0d0575909
📒 Files selected for processing (5)
- packages/preload/src/index.ts
- packages/renderer/src/lib/chat/components/chat.svelte
- packages/renderer/src/lib/chat/components/ipc-chat-transport.ts
- packages/renderer/src/lib/chat/components/multimodal-input.svelte
- tests/playwright/src/specs/provider-specs/chat-smoke.spec.ts
✅ Files skipped from review due to trivial changes (1)
- packages/preload/src/index.ts
🚧 Files skipped from review as they are similar to previous changes (2)
- tests/playwright/src/specs/provider-specs/chat-smoke.spec.ts
- packages/renderer/src/lib/chat/components/ipc-chat-transport.ts
…kortex-hub#1160)

Buffer streaming chunks in the preload layer so that when the user navigates away from a chat mid-stream and returns, the response continues rendering. Implements reconnectToStream in IPCChatTransport and adds an E2E test.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Signed-off-by: Fred Bricon <fbricon@gmail.com>
83f79a8 to b082171 (Compare)
🧹 Nitpick comments (2)
packages/renderer/src/lib/chat/components/chat.svelte (1)
244-250: Consider calling `chatHistory.refetch()` on reconnected stream completion.

The normal streaming flow calls `chatHistory.refetch()` in `onFinish` (lines 118-120) to sync the completed message. The reconnection path doesn't, which could leave chat history stale until the user performs another action.

Add refetch on reconnected stream completion
```diff
 (): void => {
   console.log('Reconnected stream completed');
   activeStreamOnDataId = undefined;
+  chatHistory.refetch().catch(console.error);
   // Keep reconnectedStreamingMessage visible — the Chat SDK doesn't have this
   // message in its state. It will be replaced when the user sends a new message
   // or navigates away and back (which reloads messages from the database).
 },
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/renderer/src/lib/chat/components/chat.svelte` around lines 244 - 250, The reconnection completion handler (the block that logs "Reconnected stream completed" and clears activeStreamOnDataId) needs to trigger chatHistory.refetch() to sync the rebuilt message state; update the handler that references activeStreamOnDataId and reconnectedStreamingMessage to call chatHistory.refetch() (same as the existing onFinish flow) after clearing activeStreamOnDataId so the UI reloads the authoritative messages.

packages/renderer/src/lib/chat/components/ipc-chat-transport.ts (1)
91-93: Type mismatch: `error` parameter is `unknown` but preload expects `string`.

The `onError` callback signature here uses `unknown`, but per `packages/preload/src/index.ts:1471`, the preload defines it as `onError: (error: string) => void`. This is a minor discrepancy that doesn't cause runtime issues, but aligning types improves consistency.

Align type to match preload signature
```diff
-(error: unknown) => {
-  console.error('Error during reconnected stream:', error);
+(error: string) => {
+  console.error('Error during reconnected stream:', error);
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/renderer/src/lib/chat/components/ipc-chat-transport.ts` around lines 91 - 93, The onError callback in ipc-chat-transport.ts currently types its parameter as unknown; change its signature to accept a string to match the preload definition (onError: (error: string) => void). Update the anonymous handler (the (error) => { ... } passed to the reconnected stream) to declare error: string and pass that string into controller.error (or cast/convert to string) and into console.error so the types align with the preload contract and avoid the unknown type.
ℹ️ Review info
⚙️ Run configuration
Configuration used: Organization UI
Review profile: CHILL
Plan: Pro
Run ID: d252dfa4-e0ae-4c78-b85d-728973c8c4fb
📒 Files selected for processing (5)
- packages/preload/src/index.ts
- packages/renderer/src/lib/chat/components/chat.svelte
- packages/renderer/src/lib/chat/components/ipc-chat-transport.ts
- packages/renderer/src/lib/chat/components/multimodal-input.svelte
- tests/playwright/src/specs/provider-specs/chat-smoke.spec.ts
🚧 Files skipped from review as they are similar to previous changes (2)
- tests/playwright/src/specs/provider-specs/chat-smoke.spec.ts
- packages/preload/src/index.ts
Buffer streaming chunks in the preload layer so that when the user navigates
away from a chat mid-stream and returns, the response continues rendering.
Implements reconnectToStream in IPCChatTransport and adds an E2E test.
background-streaming.mp4
Fixes #1160