
fix(chat): support background streaming when navigating away and back (#1160)#1196

Open

fbricon wants to merge 1 commit into kortex-hub:main from fbricon:worktree-background-chat-streaming

Conversation

Contributor

@fbricon fbricon commented Mar 27, 2026

Buffer streaming chunks in the preload layer so that when the user navigates
away from a chat mid-stream and returns, the response continues rendering.
Implements reconnectToStream in IPCChatTransport and adds an E2E test.

background-streaming.mp4

Fixes #1160

@fbricon fbricon requested a review from a team as a code owner March 27, 2026 17:01
@fbricon fbricon requested review from benoitf and bmahabirbu and removed request for a team March 27, 2026 17:01
Contributor Author

fbricon commented Mar 27, 2026

I noticed we're losing the time spent reasoning when going back and forth. I'm still pondering whether I care or not.

@fbricon fbricon force-pushed the worktree-background-chat-streaming branch from b3f1eea to 83f79a8 Compare March 27, 2026 17:03

coderabbitai bot commented Mar 27, 2026

📝 Walkthrough

Adds per-chat persistent streaming state in preload, exposes reconnect/disconnect/get APIs via contextBridge, replays buffered chunks to the renderer for reconnection, updates renderer transport and UI to rehydrate in-progress streams, and adds a Playwright smoketest for background streaming across navigation.

Changes

Cohort / File(s): Summary

• Preload: stream state & IPC APIs (packages/preload/src/index.ts)
  Added activeStreamsByChatId tracking { onDataId, bufferedChunks, isComplete, error }. inferenceStreamText registers per-chat state and records errors. New contextBridge APIs: inferenceGetActiveStream, inferenceReconnectToStream, inferenceDisconnectFromStream. IPC handlers now append chunks, mark errors/completion, and TTL-delete state after 30s.

• Renderer: chat reconnection UI & logic (packages/renderer/src/lib/chat/components/chat.svelte)
  Added reconnection state (activeStreamOnDataId, reconnectedStreamingMessage, hasActiveStream). On chat mount/effect, queries inferenceGetActiveStream and calls inferenceReconnectToStream to replay buffered chunks and receive live updates; reconstructs an in-progress assistant message from chunk types and manages lifecycle/cleanup. Updates messages/loading rendering and passes props to the input.

• Renderer: transport reconnect implementation (packages/renderer/src/lib/chat/components/ipc-chat-transport.ts)
  Implemented async reconnectToStream(...) returning a `ReadableStream`.

• Renderer: input UI props & stop handling (packages/renderer/src/lib/chat/components/multimodal-input.svelte)
  Added optional props hasActiveStream?: boolean and activeStreamOnDataId?: number. loading now includes hasActiveStream. The stop handler also calls window.inferenceStopStream(activeStreamOnDataId) when present; minor event/formatting updates.

• Tests: background streaming smoke test (tests/playwright/src/specs/provider-specs/chat-smoke.spec.ts)
  Added serial test [CHAT-18] Background streaming continues when returning to chat, which starts a long response, navigates away and back, verifies buffered/streamed response presence, and asserts stop/send visibility and proper cancellation handling.
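The preload-side buffering described above can be sketched roughly as follows. All names here (StreamChunk, registerStream, appendChunk, completeStream) are illustrative stand-ins, not the PR's actual identifiers; the real code lives in packages/preload/src/index.ts and wires these operations to IPC events and contextBridge.

```typescript
// Sketch of per-chat stream buffering in the preload layer (illustrative names).

interface StreamChunk {
  type: string; // e.g. 'text-delta', 'reasoning-delta', 'text-end'
  [key: string]: unknown;
}

interface StreamState {
  onDataId: number;
  bufferedChunks: StreamChunk[];
  isComplete: boolean;
  error?: string;
}

const activeStreamsByChatId = new Map<string, StreamState>();

// Called when a stream starts for a chat.
function registerStream(chatId: string, onDataId: number): void {
  activeStreamsByChatId.set(chatId, { onDataId, bufferedChunks: [], isComplete: false });
}

// Called for every chunk arriving over IPC: buffer it so a
// reconnecting renderer can replay what it missed while unmounted.
function appendChunk(chatId: string, chunk: StreamChunk): void {
  const state = activeStreamsByChatId.get(chatId);
  if (state) state.bufferedChunks.push(chunk);
}

// Called when the stream ends; keep state around briefly (30s in the PR)
// so a user who navigated away can still reconnect and see the result.
function completeStream(chatId: string, ttlMs = 30_000): void {
  const state = activeStreamsByChatId.get(chatId);
  if (!state) return;
  state.isComplete = true;
  setTimeout(() => activeStreamsByChatId.delete(chatId), ttlMs);
}

// Snapshot API; the real code exposes this to the renderer via contextBridge.
function getActiveStream(chatId: string): StreamState | undefined {
  return activeStreamsByChatId.get(chatId);
}
```

On reconnect, the renderer first reads the snapshot to learn whether a stream exists and what was already produced, then re-registers live callbacks under the same onDataId.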

Sequence Diagram(s)

sequenceDiagram
    participant User as User / UI
    participant ChatComp as Chat Component
    participant WindowAPI as Window (contextBridge)
    participant Preload as Preload IPC Layer
    participant Backend as Backend

    Note over Backend,Preload: Backend produces an ongoing stream
    Backend->>Preload: onChunk(chunk, chatId)
    Preload->>Preload: append chunk to activeStreamsByChatId[chatId].bufferedChunks
    Preload->>WindowAPI: invoke onChunk callback (if registered)
    WindowAPI->>ChatComp: deliver onChunk

    User->>ChatComp: Navigate away (callbacks detached)
    User->>ChatComp: Return to chat
    ChatComp->>WindowAPI: inferenceGetActiveStream(chatId)
    WindowAPI->>Preload: read activeStreamsByChatId[chatId]
    Preload-->>WindowAPI: { onDataId, bufferedChunks, isComplete }
    WindowAPI-->>ChatComp: snapshot

    ChatComp->>WindowAPI: inferenceReconnectToStream(chatId,onChunk,onError,onEnd)
    WindowAPI->>Preload: register callbacks for onDataId
    Preload-->>WindowAPI: return { bufferedChunks, onDataId }
    WindowAPI-->>ChatComp: replay bufferedChunks
    Note over ChatComp: reconstruct in-progress assistant message from chunks

    Backend->>Preload: more onChunk / onEnd
    Preload->>WindowAPI: forward to registered callbacks
    WindowAPI->>ChatComp: deliver live chunks / onEnd
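The replay-then-forward step in the diagram can be modeled as a ReadableStream that first enqueues the buffered chunks and then pipes live ones until the stream ends. This is a simplified model, not the PR's actual reconnectToStream: the subscribe callback here stands in for the contextBridge reconnect API.

```typescript
// Simplified model of stream reconnection: replay buffered chunks first,
// then forward live chunks until the producer signals the end.

type Chunk = { type: string; [key: string]: unknown };

function reconnectToStream(
  bufferedChunks: Chunk[],
  subscribe: (onChunk: (c: Chunk) => void, onEnd: () => void) => void,
): ReadableStream<Chunk> {
  return new ReadableStream<Chunk>({
    start(controller) {
      // 1. Replay everything the renderer missed while unmounted.
      for (const chunk of bufferedChunks) controller.enqueue(chunk);
      // 2. Then forward live chunks as they arrive.
      subscribe(
        chunk => controller.enqueue(chunk),
        () => controller.close(),
      );
    },
  });
}
```

Because replay happens inside start(), a consumer reading the stream sees buffered and live chunks in order, with no gap or duplication at the handoff point.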

Estimated code review effort

🎯 4 (Complex) | ⏱️ ~50 minutes

Possibly related PRs

🚥 Pre-merge checks: ✅ 5 passed

• Title check: ✅ Passed. The title 'fix(chat): support background streaming when navigating away and back' clearly and specifically describes the main change: preserving streaming responses during navigation.
• Description check: ✅ Passed. The description accurately explains the solution (buffering chunks in the preload layer), mentions the reconnectToStream implementation, and references the fixed issue.
• Linked Issues check: ✅ Passed. The PR fully addresses issue #1160 by buffering streaming chunks, implementing reconnection, preventing update loss, and restoring UI indicators for active streaming.
• Out of Scope Changes check: ✅ Passed. All changes are directly aligned with the objective of supporting background streaming: preload-layer buffering, reconnection APIs, UI updates, IPC transport implementation, and E2E testing.
• Docstring Coverage: ✅ Passed. Docstring coverage is 100.00%, which is sufficient; the required threshold is 80.00%.


@coderabbitai coderabbitai bot left a comment
Actionable comments posted: 1

🧹 Nitpick comments (3)
tests/playwright/src/specs/provider-specs/chat-smoke.spec.ts (1)

489-489: Consider replacing hard-coded delays with more deterministic waits.

The page.waitForTimeout(250) calls (lines 489, 499, 503, 509) introduce timing sensitivity. While understandable for streaming scenarios, Playwright best practices recommend using explicit conditions where possible.

For example, after navigating to Flows page, the await flowsPage.waitForLoad() already provides a deterministic wait. The additional delays may be unnecessary or could be replaced with specific element visibility checks.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@tests/playwright/src/specs/provider-specs/chat-smoke.spec.ts` at line 489,
The test defines a hard-coded delay helper const delay = (): Promise<void> =>
page.waitForTimeout(250) which creates timing sensitivity; replace uses of delay
with deterministic waits: where sequencing follows navigation or page load
prefer awaiting flowsPage.waitForLoad() or awaiting specific element states
(e.g., element.waitFor({ state: 'visible' }) / page.waitForSelector) or use
Playwright’s expect(...).toBeVisible() for the target elements; search for the
delay symbol and remove its usages, replacing each with the appropriate explicit
wait tied to the next action or UI element instead of page.waitForTimeout.
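The general pattern the bot is suggesting, polling for a condition instead of sleeping a fixed interval, can be sketched as a generic helper like the one below. In Playwright itself you would normally reach for built-in waits such as expect(locator).toBeVisible() or locator.waitFor() instead; this standalone version just illustrates why condition-based waits beat fixed delays.

```typescript
// Condition-based wait: resolves as soon as the predicate is true,
// and fails with a timeout instead of always sleeping a fixed 250ms.
async function waitForCondition(
  predicate: () => boolean,
  timeoutMs = 5_000,
  pollMs = 50,
): Promise<void> {
  const deadline = Date.now() + timeoutMs;
  while (!predicate()) {
    if (Date.now() > deadline) throw new Error('waitForCondition: timed out');
    await new Promise(resolve => setTimeout(resolve, pollMs));
  }
}
```

A fixed waitForTimeout(250) is both too slow when the condition is already met and too fast when it is not; tying the wait to the actual condition removes both failure modes.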
packages/preload/src/index.ts (2)

1548-1559: Verify 30-second cleanup timeout is appropriate for all use cases.

The 30-second delay before removing buffered stream state (line 1556) allows reconnection after navigation. However, if a user navigates away for longer than 30 seconds, the stream state (including any buffered chunks) will be lost, and reconnection will fail silently.

Consider whether this timeout should be configurable or documented, especially for scenarios with slow network conditions or extended user absence.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@packages/preload/src/index.ts` around lines 1548 - 1559, The hardcoded
30_000ms cleanup in the activeStreamsByChatId loop can drop buffered stream
state too quickly for some users; make this timeout configurable and referenced
by a named constant (e.g., STREAM_BUFFER_TTL_MS) or by reading a preload
config/env var, update the deletion logic that uses setTimeout(() =>
activeStreamsByChatId.delete(chatId), 30000) to use that constant/setting, and
ensure any relevant docs mention the configurable STREAM_BUFFER_TTL_MS so
callers can increase it for slow networks or longer navigation gaps; keep the
existing streamState.isComplete and callbackId logic unchanged.

1508-1521: Consider memory implications for very long streaming responses.

The bufferedChunks array grows unbounded during streaming. For extended responses (e.g., code generation or long explanations), this could accumulate significant memory, especially when the user has navigated away and chunks are only buffered, not rendered.

For a future improvement, consider implementing a maximum buffer size with oldest-chunk eviction, or compacting text deltas into larger chunks periodically.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@packages/preload/src/index.ts` around lines 1508 - 1521, The handler
registered with ipcRenderer.on for 'inference:streamText-onChunk' currently
pushes every chunk into streamState.bufferedChunks unbounded; update this to
enforce a maximum buffer length (e.g., MAX_BUFFERED_CHUNKS) and evict oldest
entries when exceeded, or perform periodic compaction of small text-delta chunks
into larger ones before pushing; modify the logic that references
activeStreamsByChatId and streamState.bufferedChunks so that when pushing a new
chunk it checks size, drops/compacts oldest chunk(s) as needed, and preserves
delivery to the active callback via onDataCallbacksStreamText and
callback.onChunk.

ℹ️ Review info
⚙️ Run configuration

Configuration used: Organization UI

Review profile: CHILL

Plan: Pro

Run ID: 1b79261f-2c2c-4246-a385-54f65ab05708

📥 Commits

Reviewing files that changed from the base of the PR and between 757516e and b3f1eea.

📒 Files selected for processing (5)
  • packages/preload/src/index.ts
  • packages/renderer/src/lib/chat/components/chat.svelte
  • packages/renderer/src/lib/chat/components/ipc-chat-transport.ts
  • packages/renderer/src/lib/chat/components/multimodal-input.svelte
  • tests/playwright/src/specs/provider-specs/chat-smoke.spec.ts

Comment on lines +228 to +244
      } else if (chunk.type === 'text-end' && chunk.messageId === reconnectedMessageId) {
        const parts: Array<{ type: string; text: string; state?: string }> = [];
        if (reconnectedReasoning) {
          parts.push({ type: 'reasoning', text: reconnectedReasoning, state: 'done' });
        }
        if (reconnectedText) {
          parts.push({ type: 'text', text: reconnectedText, state: 'done' });
        }
        reconnectedStreamingMessage = {
          id: reconnectedMessageId,
          role: 'assistant',
          parts,
          content: reconnectedText,
          createdAt: new Date(),
          experimental_attachments: [],
        };
      }

⚠️ Potential issue | 🟡 Minor

Reconnected message is not cleared after stream completion.

When the stream completes (via the onEnd callback at line 257-260), activeStreamOnDataId is set to undefined, but reconnectedStreamingMessage remains set. This causes the message to continue displaying in the UI even after completion.

Since the stream is complete, the chat history should be refetched and reconnectedStreamingMessage should be cleared to avoid showing a stale/duplicate message.

🛠️ Proposed fix
       (): void => {
         console.log('Reconnected stream completed');
         activeStreamOnDataId = undefined;
+        reconnectedStreamingMessage = undefined;
       },
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@packages/renderer/src/lib/chat/components/chat.svelte` around lines 228 -
244, The reconnectedStreamingMessage is left set after the stream completes,
causing a stale assistant message to remain visible; in the stream onEnd handler
(the callback that currently sets activeStreamOnDataId = undefined) also clear
reconnectedStreamingMessage (set it to undefined/null) and trigger a chat
history refetch (call your existing refetch/fetch conversation function) so the
finalized message comes from the canonical history instead of the temporary
reconnectedStreamingMessage.

codecov bot commented Mar 27, 2026

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 1

♻️ Duplicate comments (1)
packages/renderer/src/lib/chat/components/chat.svelte (1)

257-260: ⚠️ Potential issue | 🟠 Major

Refresh canonical chat state when the reconnected stream ends.

Reconnected streams bypass chatClient.onFinish, so this callback is the only place that can reconcile the synthetic message with chatClient.messages. Right now it only clears activeStreamOnDataId; the temporary message stays rendered and follow-up sends/exports still use stale history until the user navigates again.

Suggested fix
       (): void => {
         console.log('Reconnected stream completed');
         activeStreamOnDataId = undefined;
+        void chatHistory
+          .refetch()
+          .then(() => {
+            reconnectedStreamingMessage = undefined;
+          })
+          .catch(console.error);
       },
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@packages/renderer/src/lib/chat/components/chat.svelte` around lines 257 -
260, The reconnected-stream completion callback currently only clears
activeStreamOnDataId, leaving the synthetic message out of sync with
chatClient.messages; update that callback (the anonymous completion handler) to
also reconcile the synthetic message with the canonical chat state by invoking
the existing chat client finish/reconciliation path (e.g., call
chatClient.onFinish(...) with the synthetic message or call the local reconcile
function that merges the temporary message into chatClient.messages), then clear
activeStreamOnDataId as before and trigger any UI/message list refresh so
follow-up sends/exports use the updated history.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In `@packages/renderer/src/lib/chat/components/chat.svelte`:
- Around line 201-210: When handling a 'text-delta' chunk for
reconnectedMessageId, you are rebuilding reconnectedStreamingMessage with only a
text part and thus dropping any previously buffered reasoning parts (e.g.,
'reasoning-delta'); instead, merge the incoming text into reconnectedText and
update reconnectedStreamingMessage by preserving existing parts (copy existing
reconnectedStreamingMessage.parts if present, replacing or appending the text
part) and keeping other fields (experimental_attachments, createdAt, content)
intact; specifically modify the branch that constructs
reconnectedStreamingMessage so it locates any existing text part to update its
text/state or appends a new text part while retaining any prior reasoning-delta
parts and metadata.


ℹ️ Review info
⚙️ Run configuration

Configuration used: Organization UI

Review profile: CHILL

Plan: Pro

Run ID: 256eee66-0272-438e-a810-42e0d0575909

📥 Commits

Reviewing files that changed from the base of the PR and between b3f1eea and 83f79a8.

📒 Files selected for processing (5)
  • packages/preload/src/index.ts
  • packages/renderer/src/lib/chat/components/chat.svelte
  • packages/renderer/src/lib/chat/components/ipc-chat-transport.ts
  • packages/renderer/src/lib/chat/components/multimodal-input.svelte
  • tests/playwright/src/specs/provider-specs/chat-smoke.spec.ts
✅ Files skipped from review due to trivial changes (1)
  • packages/preload/src/index.ts
🚧 Files skipped from review as they are similar to previous changes (2)
  • tests/playwright/src/specs/provider-specs/chat-smoke.spec.ts
  • packages/renderer/src/lib/chat/components/ipc-chat-transport.ts

…kortex-hub#1160)

Buffer streaming chunks in the preload layer so that when the user navigates
away from a chat mid-stream and returns, the response continues rendering.
Implements reconnectToStream in IPCChatTransport and adds an E2E test.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Signed-off-by: Fred Bricon <fbricon@gmail.com>
@fbricon fbricon force-pushed the worktree-background-chat-streaming branch from 83f79a8 to b082171 Compare March 27, 2026 18:19
@coderabbitai coderabbitai bot left a comment

🧹 Nitpick comments (2)
packages/renderer/src/lib/chat/components/chat.svelte (1)

244-250: Consider calling chatHistory.refetch() on reconnected stream completion.

The normal streaming flow calls chatHistory.refetch() in onFinish (line 118-120) to sync the completed message. The reconnection path doesn't, which could leave chat history stale until the user performs another action.

Add refetch on reconnected stream completion
       (): void => {
         console.log('Reconnected stream completed');
         activeStreamOnDataId = undefined;
+        chatHistory.refetch().catch(console.error);
         // Keep reconnectedStreamingMessage visible — the Chat SDK doesn't have this
         // message in its state. It will be replaced when the user sends a new message
         // or navigates away and back (which reloads messages from the database).
       },
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@packages/renderer/src/lib/chat/components/chat.svelte` around lines 244 -
250, The reconnection completion handler (the block that logs "Reconnected
stream completed" and clears activeStreamOnDataId) needs to trigger
chatHistory.refetch() to sync the rebuilt message state; update the handler that
references activeStreamOnDataId and reconnectedStreamingMessage to call
chatHistory.refetch() (same as the existing onFinish flow) after clearing
activeStreamOnDataId so the UI reloads the authoritative messages.
packages/renderer/src/lib/chat/components/ipc-chat-transport.ts (1)

91-93: Type mismatch: error parameter is unknown but preload expects string.

The onError callback signature here uses unknown, but per packages/preload/src/index.ts:1471, the preload defines it as onError: (error: string) => void. This is a minor discrepancy that doesn't cause runtime issues, but aligning types improves consistency.

Align type to match preload signature
-          (error: unknown) => {
-            console.error('Error during reconnected stream:', error);
+          (error: string) => {
+            console.error('Error during reconnected stream:', error);
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@packages/renderer/src/lib/chat/components/ipc-chat-transport.ts` around lines
91 - 93, The onError callback in ipc-chat-transport.ts currently types its
parameter as unknown; change its signature to accept a string to match the
preload definition (onError: (error: string) => void). Update the anonymous
handler (the (error) => { ... } passed to the reconnected stream) to declare
error: string and pass that string into controller.error (or cast/convert to
string) and into console.error so the types align with the preload contract and
avoid the unknown type.

ℹ️ Review info
⚙️ Run configuration

Configuration used: Organization UI

Review profile: CHILL

Plan: Pro

Run ID: d252dfa4-e0ae-4c78-b85d-728973c8c4fb

📥 Commits

Reviewing files that changed from the base of the PR and between 83f79a8 and b082171.

📒 Files selected for processing (5)
  • packages/preload/src/index.ts
  • packages/renderer/src/lib/chat/components/chat.svelte
  • packages/renderer/src/lib/chat/components/ipc-chat-transport.ts
  • packages/renderer/src/lib/chat/components/multimodal-input.svelte
  • tests/playwright/src/specs/provider-specs/chat-smoke.spec.ts
🚧 Files skipped from review as they are similar to previous changes (2)
  • tests/playwright/src/specs/provider-specs/chat-smoke.spec.ts
  • packages/preload/src/index.ts



Successfully merging this pull request may close these issues.

Streaming updates lost when switching between chats