
feat(Lightspeed): Add stop button to interrupt a streaming conversation#2587

Open
karthikjeeyar wants to merge 3 commits into redhat-developer:main from karthikjeeyar:add-stop-button

Conversation

@karthikjeeyar
Member

Hey, I just made a Pull Request!

Fixes:

https://redhat.atlassian.net/browse/RHIDP-12490
https://redhat.atlassian.net/browse/RHDHBUGS-2745

This PR contains the following changes:

  1. Add interrupt endpoint to stop the streaming conversation.
  2. Add Stop button in the UI to interrupt.
  3. Avoid sending a message if the user simply presses Enter with an empty string.
Lightspeed_stop_button.mov

How to test:

  1. Pull changes from lightspeed-stack main branch (contains the topic_summary generation for interrupt flow).
  2. Checkout the LCORE PR - LCORE-1514: persist conversation on stream interrupt with async topic summary lightspeed-core/lightspeed-stack#1378 and run the lcore and llamastack.
  3. Run this PR changes to see the stop button in the UI.

✔️ Checklist

  • A changeset describing the change and affected packages. (more info)
  • Added or Updated documentation
  • Tests for new functionality and regression tests for bug fixes
  • Screenshots attached (for UI changes)


rhdh-gh-app bot commented Mar 23, 2026

Changed Packages

| Package Name | Package Path | Changeset Bump | Current Version |
| --- | --- | --- | --- |
| @red-hat-developer-hub/backstage-plugin-lightspeed-backend | workspaces/lightspeed/plugins/lightspeed-backend | patch | v1.4.0 |
| @red-hat-developer-hub/backstage-plugin-lightspeed | workspaces/lightspeed/plugins/lightspeed | patch | v1.4.0 |

@rhdh-qodo-merge

Review Summary by Qodo

Add stop button to interrupt streaming conversations

✨ Enhancement


Walkthroughs

Description
• Add interrupt endpoint to stop streaming conversations
• Implement stop button UI with request tracking
• Add auto-refetch for conversations with pending topic summaries
• Prevent sending empty messages on Enter key press
Diagram
flowchart LR
  UI["Stop Button in UI"]
  Hook["useStopConversation Hook"]
  Client["LightspeedApiClient.stopMessage"]
  Backend["Backend /v1/query/interrupt"]
  Core["Lightspeed-Core Server"]
  
  UI -- "triggers" --> Hook
  Hook -- "calls" --> Client
  Client -- "POST request" --> Backend
  Backend -- "forwards" --> Core
  Core -- "interrupts stream" --> Backend
  Backend -- "returns success" --> Client


File Changes

1. workspaces/lightspeed/plugins/lightspeed-backend/__fixtures__/lcsHandlers.ts 🧪 Tests +4/-0

Add mock interrupt endpoint handler



2. workspaces/lightspeed/plugins/lightspeed-backend/src/service/router.test.ts 🧪 Tests +26/-0

Add tests for interrupt endpoint authorization



3. workspaces/lightspeed/plugins/lightspeed-backend/src/service/router.ts ✨ Enhancement +45/-1

Implement interrupt endpoint with auth checks



4. workspaces/lightspeed/plugins/lightspeed/src/api/LightspeedApiClient.ts ✨ Enhancement +19/-0

Add stopMessage method to API client



5. workspaces/lightspeed/plugins/lightspeed/src/api/__tests__/LightspeedApiClient.test.ts 🧪 Tests +32/-0

Add tests for stopMessage API method



6. workspaces/lightspeed/plugins/lightspeed/src/api/api.ts ✨ Enhancement +1/-0

Add stopMessage to LightspeedAPI interface



7. workspaces/lightspeed/plugins/lightspeed/src/hooks/index.ts ✨ Enhancement +1/-0

Export new useStopConversation hook



8. workspaces/lightspeed/plugins/lightspeed/src/hooks/useConversationMessages.ts ✨ Enhancement +53/-12

Track request ID and handle interrupted events



9. workspaces/lightspeed/plugins/lightspeed/src/hooks/useConversations.ts ✨ Enhancement +6/-0

Add auto-refetch for pending topic summaries



10. workspaces/lightspeed/plugins/lightspeed/src/hooks/useCreateCoversationMessage.ts Formatting +1/-7

Refactor to use CreateMessageVariables type



11. workspaces/lightspeed/plugins/lightspeed/src/hooks/useStopConversation.ts ✨ Enhancement +39/-0

New hook for stopping conversations



12. workspaces/lightspeed/.changeset/many-toys-sing.md 📝 Documentation +6/-0

Document stop button feature addition



13. workspaces/lightspeed/plugins/lightspeed/src/components/LightSpeedChat.tsx ✨ Enhancement +28/-1

Add stop button UI and empty message validation



14. workspaces/lightspeed/plugins/lightspeed/src/components/__tests__/LightspeedChat.test.tsx 🧪 Tests +1/-0

Add stopMessage mock to test setup



15. workspaces/lightspeed/plugins/lightspeed/src/hooks/__tests__/useConversationMessages.test.tsx 🧪 Tests +65/-0

Add test for interrupted event handling



16. workspaces/lightspeed/plugins/lightspeed/src/hooks/__tests__/useConversations.test.tsx 🧪 Tests +152/-0

Add tests for conversation refetch logic



17. workspaces/lightspeed/plugins/lightspeed/src/hooks/__tests__/useStopConversation.test.tsx 🧪 Tests +92/-0

Add comprehensive tests for stop hook



18. workspaces/lightspeed/plugins/lightspeed/src/utils/lightspeed-chatbox-utils.tsx 🐞 Bug fix +6/-2

Handle null topic summaries and add spinner





rhdh-qodo-merge bot commented Mar 23, 2026

Code Review by Qodo

🐞 Bugs (2) 📘 Rule violations (0) 📎 Requirement gaps (0) 📐 Spec deviations (0)



Action required

1. Interrupt endpoint sends twice 📎 Requirement gap ⛯ Reliability
Description
The new interrupt route can attempt to write two responses for a single request when the upstream
returns a non-OK status, risking ERR_HTTP_HEADERS_SENT and a broken stop flow. This can prevent
stopping from ending gracefully.
Code

workspaces/lightspeed/plugins/lightspeed-backend/src/service/router.ts[R226-233]

+      if (!fetchResponse.ok) {
+        const errorBody = await fetchResponse.json();
+        const errormsg = `Error from lightspeed-core server: ${errorBody.error?.message || errorBody?.detail?.cause || 'Unknown error'}`;
+        logger.error(errormsg);
+        response.status(500).json({ error: errormsg });
+      }
+      response.status(fetchResponse.status).json(await fetchResponse.json());
+    } catch (error) {
Evidence
PR Compliance ID 2 requires stopping streaming to end gracefully without errors. In the new
/v1/query/interrupt handler, when !fetchResponse.ok the code sends
response.status(500).json(...) but then continues to send another JSON response, which can trigger
runtime errors and break the stop/cancel flow.

Stopping streaming ends gracefully and communicates status to the user
workspaces/lightspeed/plugins/lightspeed-backend/src/service/router.ts[226-233]

Agent prompt
The issue below was found during a code review. Follow the provided context and guidance below and implement a solution

## Issue description
`/v1/query/interrupt` may send two responses when the upstream returns a non-OK response.

## Issue Context
This can throw `ERR_HTTP_HEADERS_SENT` and prevent the stop action from completing gracefully.

## Fix Focus Areas
- workspaces/lightspeed/plugins/lightspeed-backend/src/service/router.ts[226-233]

ⓘ Copy this prompt and use it to remediate the issue with your preferred AI generation tools
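A minimal sketch of the early-return fix, assuming an Express-style response object. The `MockResponse` and `UpstreamResponse` types below are illustrative stand-ins for the real Express and fetch types, not the plugin's actual code:

```typescript
// MockResponse stands in for Express's Response; real code would use that type.
type MockResponse = {
  statusCode: number;
  body: unknown;
  sent: boolean;
  status(code: number): MockResponse;
  json(payload: unknown): MockResponse;
};

function makeResponse(): MockResponse {
  return {
    statusCode: 200,
    body: undefined,
    sent: false,
    status(code: number) {
      this.statusCode = code;
      return this;
    },
    json(payload: unknown) {
      // Express raises ERR_HTTP_HEADERS_SENT on a second send; mimic that here.
      if (this.sent) throw new Error('ERR_HTTP_HEADERS_SENT');
      this.sent = true;
      this.body = payload;
      return this;
    },
  };
}

type UpstreamResponse = {
  ok: boolean;
  status: number;
  json: () => Promise<any>;
};

// Handler body reduced to the branching logic under review: the added
// `return` guarantees only one response is ever written.
async function handleInterrupt(
  fetchResponse: UpstreamResponse,
  response: MockResponse,
): Promise<void> {
  if (!fetchResponse.ok) {
    const errorBody = await fetchResponse.json();
    const errormsg = `Error from lightspeed-core server: ${
      errorBody.error?.message || errorBody?.detail?.cause || 'Unknown error'
    }`;
    response.status(500).json({ error: errormsg });
    return; // without this, the line below would send a second response
  }
  response.status(fetchResponse.status).json(await fetchResponse.json());
}
```

With the `return` in place, the non-OK path sends exactly one 500 response, and the happy path forwards the upstream status unchanged.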


2. No stop confirmation message 📎 Requirement gap ⛯ Reliability
Description
After Stop is pressed, the UI clears state but does not present any user-visible indication that
streaming was canceled/stopped, leaving chat state ambiguous. This violates the requirement to
communicate stop status to the user after canceling streaming.
Code

workspaces/lightspeed/plugins/lightspeed/src/components/LightSpeedChat.tsx[R684-694]

+  const handleStopButton = () => {
+    if (requestId) {
+      stopConversation(requestId);
+      setRequestId('');
+    }
+    setIsSendButtonDisabled(false);
+    setAnnouncement('');
+    setDraftMessage('');
+    setFileContents([]);
+    setUploadError({ message: null });
+  };
Evidence
PR Compliance ID 2 requires a user-facing message indicating the response was stopped/canceled. The
added handleStopButton resets UI state (setAnnouncement(''), clears draft/attachments) but does
not add any message/toast/banner communicating that streaming was stopped.

Stopping streaming ends gracefully and communicates status to the user
workspaces/lightspeed/plugins/lightspeed/src/components/LightSpeedChat.tsx[684-694]

Agent prompt
The issue below was found during a code review. Follow the provided context and guidance below and implement a solution

## Issue description
Clicking Stop cancels streaming but provides no user-visible confirmation that the response was stopped.

## Issue Context
Compliance requires canceling a stream to end gracefully and communicate stop/cancel status to the user.

## Fix Focus Areas
- workspaces/lightspeed/plugins/lightspeed/src/components/LightSpeedChat.tsx[684-694]

ⓘ Copy this prompt and use it to remediate the issue with your preferred AI generation tools
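One possible shape for the missing confirmation, as a hedged sketch: the message text and the idea of reusing the chatbot's announcement state are assumptions, not the plugin's current behavior.

```typescript
// Hypothetical stop notice; the exact wording and surfacing mechanism
// (announcement region, toast, banner) would be a product decision.
const STOP_NOTICE = 'Response stopped. You can send a new message.';

// Pure helper so the notice is easy to unit-test and localize later:
// announce a stop only when there was actually an in-flight request.
function stopAnnouncement(hadActiveRequest: boolean): string {
  return hadActiveRequest ? STOP_NOTICE : '';
}

// Inside handleStopButton (names from the snippet above), the sketch
// changes one call:
//   setAnnouncement(stopAnnouncement(Boolean(requestId))); // was: setAnnouncement('')
```

Keeping the notice in a pure function also makes the "no request in flight" case explicit instead of leaving the chat state ambiguous.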


3. topic_summary null mismatch 🐞 Bug ✓ Correctness
Description
The PR adds logic that treats missing topic_summary as expected (polling until it exists and
showing a spinner), but ConversationSummary still types topic_summary as string and some
consumers assume it is always a string, which can break UI state/value handling when it is actually
null. This can cause runtime warnings/errors (e.g., setting a TextField value from null) and
undermines type safety.
Code

workspaces/lightspeed/plugins/lightspeed/src/hooks/useConversations.ts[R33-38]

+    refetchInterval: query => {
+      const data = query.state.data;
+      if (!data?.length) return false;
+      const hasNullSummary = data.some(c => !c.topic_summary);
+      return hasNullSummary ? 2000 : false;
+    },
Evidence
useConversations now polls based on falsy topic_summary, and getCategorizeMessages shows a
spinner when it is missing; the new hook test explicitly uses topic_summary: null. However, the
shared type defines topic_summary: string, and RenameConversationModal assigns
conversation.topic_summary directly into a string state, which is unsafe if null is now possible
at runtime.

workspaces/lightspeed/plugins/lightspeed/src/hooks/useConversations.ts[24-40]
workspaces/lightspeed/plugins/lightspeed/src/utils/lightspeed-chatbox-utils.tsx[178-227]
workspaces/lightspeed/plugins/lightspeed/src/types.ts[108-116]
workspaces/lightspeed/plugins/lightspeed/src/components/RenameConversationModal.tsx[55-66]
workspaces/lightspeed/plugins/lightspeed/src/hooks/tests/useConversations.test.tsx[68-82]

Agent prompt
The issue below was found during a code review. Follow the provided context and guidance below and implement a solution

### Issue description
The code now expects `topic_summary` to sometimes be missing/null, but the TypeScript type still requires `string`, and some UI paths assign it into string-only state without a null guard.

### Issue Context
- Polling uses `!c.topic_summary`.
- UI renders a spinner when `topic_summary` is falsy.
- Tests pass `topic_summary: null`.
- Types still define `topic_summary: string`.

### Fix Focus Areas
- workspaces/lightspeed/plugins/lightspeed/src/types.ts[108-116]
- workspaces/lightspeed/plugins/lightspeed/src/hooks/useConversations.ts[24-40]
- workspaces/lightspeed/plugins/lightspeed/src/utils/lightspeed-chatbox-utils.tsx[178-227]
- workspaces/lightspeed/plugins/lightspeed/src/components/RenameConversationModal.tsx[55-66]

### Implementation notes
- Update `ConversationSummary.topic_summary` to `string | null` (or `topic_summary?: string | null` if optional).
- Normalize in UI where a string is required (e.g., `setChatName(conversation.topic_summary ?? '')`).
- Consider whether polling should specifically check for `null`/`undefined` rather than empty string if empty is a valid value.

ⓘ Copy this prompt and use it to remediate the issue with your preferred AI generation tools
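A sketch of the type and guard changes from the implementation notes. `ConversationSummary` is abridged to the fields under discussion, and the helper names (`displaySummary`, `needsRefetch`) are illustrative:

```typescript
// Abridged: the real ConversationSummary has more fields.
type ConversationSummary = {
  conversation_id: string;
  topic_summary: string | null; // was: string
};

// Null-safe string for consumers that need one, e.g. the rename
// modal's text field state: setChatName(displaySummary(conversation)).
function displaySummary(c: ConversationSummary): string {
  return c.topic_summary ?? '';
}

// Poll only while a summary is genuinely absent; an empty string is
// treated as a present (if empty) summary, per the implementation note
// about not conflating '' with "still generating".
function needsRefetch(conversations: ConversationSummary[]): boolean {
  if (!conversations.length) return false;
  return conversations.some(c => c.topic_summary == null);
}
```

`needsRefetch` then slots into the `refetchInterval` callback: return `2000` when it is true, `false` otherwise.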



Remediation recommended

4. Unvalidated request_id forwarded 🐞 Bug ⛯ Reliability
Description
useConversationMessages forwards data?.request_id from the untyped SSE start event to
onRequestIdReady without validating it is a non-empty string, so the stop-button state can be set
to an invalid value and the stop action becomes silently ineffective. This is a reliability gap when
upstream events are malformed or missing fields.
Code

workspaces/lightspeed/plugins/lightspeed/src/hooks/useConversationMessages.ts[R289-292]

              if (event === 'start') {
+                requestId = data?.request_id;
+                onRequestIdReady?.(requestId);
+
Evidence
The hook assigns requestId = data?.request_id from parsed JSON (untyped) and immediately calls
onRequestIdReady?.(requestId). The receiver in LightspeedChat stores this into useState<string>
and the stop handler gates on truthiness, so a non-string/missing request_id makes Stop do nothing
without any surfaced error.

workspaces/lightspeed/plugins/lightspeed/src/hooks/useConversationMessages.ts[283-297]
workspaces/lightspeed/plugins/lightspeed/src/components/LightSpeedChat.tsx[288-296]
workspaces/lightspeed/plugins/lightspeed/src/components/LightSpeedChat.tsx[682-689]

Agent prompt
The issue below was found during a code review. Follow the provided context and guidance below and implement a solution

### Issue description
`request_id` from the SSE `start` event is untyped and may be missing or not a string, but it is forwarded directly to the UI state used to trigger interrupts.

### Issue Context
- `useConversationMessages` does: `requestId = data?.request_id; onRequestIdReady?.(requestId);`
- The UI expects a string requestId.

### Fix Focus Areas
- workspaces/lightspeed/plugins/lightspeed/src/hooks/useConversationMessages.ts[283-297]
- workspaces/lightspeed/plugins/lightspeed/src/components/LightSpeedChat.tsx[288-296]

### Implementation notes
- Guard before calling the callback, e.g.:
 - `const rid = data?.request_id; if (typeof rid === 'string' && rid.trim()) onRequestIdReady?.(rid);`
- Optionally clear requestId on `end`/`interrupted` so the Stop button doesn’t target stale IDs.

ⓘ Copy this prompt and use it to remediate the issue with your preferred AI generation tools
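The guard from the implementation notes, sketched as a small helper (the name `extractRequestId` is illustrative):

```typescript
// Validate request_id from the untyped SSE `start` payload before it
// reaches UI state: it must be a non-empty, non-whitespace string.
function extractRequestId(data: unknown): string | undefined {
  const rid = (data as { request_id?: unknown } | null | undefined)?.request_id;
  return typeof rid === 'string' && rid.trim() !== '' ? rid : undefined;
}

// In the hook's `start` branch the sketch becomes:
//
//   if (event === 'start') {
//     const rid = extractRequestId(data);
//     if (rid) onRequestIdReady?.(rid);
//   }
```

With the guard, a malformed or missing `request_id` never reaches the Stop button's state, so the stop action can't silently target an invalid value.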


5. Raw error object returned 🐞 Bug ⛨ Security
Description
The new interrupt handler returns the raw caught error object in the 500 JSON response, which can
leak internal details and may not serialize cleanly. This makes responses inconsistent with other
endpoints that return a string message.
Code

workspaces/lightspeed/plugins/lightspeed-backend/src/service/router.ts[R233-240]

+    } catch (error) {
+      const errormsg = `Error while interrupting query: ${error}`;
+      logger.error(errormsg);
+      if (error instanceof NotAllowedError) {
+        response.status(403).json({ error: error.message });
+      } else {
+        response.status(500).json({ error: error });
+      }
Evidence
In the catch block for /v1/query/interrupt, the code logs a formatted string but sends `{ error: error }` for non-NotAllowed errors. Other handlers in the same file typically return `{ error: errormsg }`, which avoids serializing arbitrary thrown objects and reduces information exposure.

workspaces/lightspeed/plugins/lightspeed-backend/src/service/router.ts[233-241]
workspaces/lightspeed/plugins/lightspeed-backend/src/service/router.ts[193-202]

Agent prompt
The issue below was found during a code review. Follow the provided context and guidance below and implement a solution

### Issue description
The interrupt endpoint returns a raw exception object in the HTTP response body for 500 errors.

### Issue Context
- Current behavior: `response.status(500).json({ error: error });`
- Safer/consistent behavior elsewhere: return a string message.

### Fix Focus Areas
- workspaces/lightspeed/plugins/lightspeed-backend/src/service/router.ts[233-241]

### Implementation notes
- Change to `response.status(500).json({ error: errormsg });` (or `String(error)`), while logging the full error separately via `logger.error` with structured fields if needed.

ⓘ Copy this prompt and use it to remediate the issue with your preferred AI generation tools
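A sketch of the safer catch block; `toErrorMessage` is an illustrative helper. `String(error)` renders Error instances as `Error: <message>` without exposing stacks or arbitrary object internals:

```typescript
// Build the string message once, use it for both logging and the body.
function toErrorMessage(error: unknown): string {
  return `Error while interrupting query: ${String(error)}`;
}

// The catch block would then send a string rather than the raw object:
//
//   } catch (error) {
//     const errormsg = toErrorMessage(error);
//     logger.error(errormsg);
//     if (error instanceof NotAllowedError) {
//       response.status(403).json({ error: error.message });
//     } else {
//       response.status(500).json({ error: errormsg }); // was: { error: error }
//     }
//   }
```

This also keeps the 500 body consistent with the other handlers in router.ts that already return a string message.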



@karthikjeeyar karthikjeeyar changed the title Add stop button feat(Lightspee): Add stop button to interrupt streaming conversation Mar 23, 2026
@karthikjeeyar karthikjeeyar changed the title feat(Lightspee): Add stop button to interrupt streaming conversation feat(Lightspeed): Add stop button to interrupt a streaming conversation Mar 23, 2026
@aprilma419

The topic summary LGTM!

Two cents:

  1. Just wondering if users can re-enter the prompt while the streaming is stopped?
  2. Instead of showing "you interrupted the request", can we just show the unfinished response and add a hint below it: "Response stopped by the user"?
Screenshot 2026-03-23 at 22 08 59


@karthikjeeyar
Member Author

@aprilma419 I am now retaining the last used query if the conversation is interrupted.
retain_last_used_query

For the partial response, we will have a followup PR in LCORE to support it.

Contributor

@Jdubrick Jdubrick left a comment


small nit, generally lgtm though

Comment on lines +85 to +86
'/v1/feedback',
'/v1/query/interrupt',
Contributor


Suggested change
- '/v1/feedback',
- '/v1/query/interrupt',
+ '/v1/query/interrupt',
+ '/v1/feedback',

nit: can we keep them ordered?

Contributor

@HusneShabbir HusneShabbir left a comment


Screen.Recording.2026-03-26.at.2.36.51.PM.mov

Works as expected, but one small observation: when a conversation gets interrupted, the side panel shows a loading icon. It feels a bit odd; I would like more clarity or feedback there.

Even after you continue the conversation in the same chat, it remains the same in the side panel
Screenshot 2026-03-26 at 2 49 54 PM

cc: @rohitkrai03 @aprilma419

@aprilma419

To @HusneShabbir's point, instead of the loader icon, if feasible, we can show a skeleton loading state.

Contributor

@yangcao77 yangcao77 left a comment


generally lgtm from the backend perspective. I will hold off on the lgtm label and wait for frontend-side approval as well.
