
feat: issue 2097 momentum features #2108

Open

embire2 wants to merge 1 commit into stackblitz-labs:main from embire2:feat/issue-2097-momentum

Conversation


@embire2 embire2 commented Feb 5, 2026

Summary

  • add project memory per chat and inject it into prompts
  • add plan mode that creates/updates PLAN.md before changes
  • allow document attachments (text/PDF) with previews
  • add quick lock/unlock in the editor header
  • auto optimize prompts for small models
  • add issue tracker in docs/issue-2097-tracker.txt

Testing

  • pnpm typecheck
  • pnpm lint

Refs #2097

@embire2 embire2 mentioned this pull request Feb 5, 2026
@Stijnus Stijnus self-assigned this Feb 5, 2026
@Stijnus Stijnus (Collaborator) left a comment

Code Review — PR #2108

Thanks for the contribution! The individual ideas here are solid, but there are several issues that should be addressed before merging.


1. Split into separate PRs

This PR bundles 5 unrelated features into a single commit (project memory, plan mode, document uploads, file lock/unlock, auto prompt optimization). Each is independently useful and should be reviewed/merged separately. This makes it easier to review, revert if needed, and bisect issues.


2. Document attachments are incomplete (text/PDF content never reaches the LLM)

In uploadUtils.ts, all files (including .txt, .pdf, .json) are read via readFileAsDataUrl(), but for non-image files the data URL is discarded and imageData is set to '':

uploads.push({ file, imageData: isImageFile(file) ? dataUrl : '' });

The actual text content of these documents is never extracted or sent to the LLM. The UI shows a file preview card, but the content is silently dropped. This makes the document upload feature misleading — users will think their PDF/text files are being sent to the model when they aren't.


3. Race condition in project settings persistence

In Chat.client.tsx, two useEffect hooks fire when currentChatId changes:

// Effect 1: Load settings when chatId changes
useEffect(() => {
  const settings = getProjectSettings(currentChatId);
  setProjectMemory(settings.memory);
  setPlanMode(settings.planMode);
}, [currentChatId]);

// Effect 2: Save settings when memory/planMode changes
useEffect(() => {
  setProjectSettings(currentChatId, { memory: projectMemory, planMode });
}, [currentChatId, projectMemory, planMode]);

When currentChatId changes, both effects run. Effect 2 may execute with stale state (previous chat's values) before Effect 1 loads the new chat's settings — overwriting the new chat's saved settings with wrong data. This needs a guard (e.g. a skipNextSave ref or combining the effects).
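The guard logic, stripped of React for clarity (all names hypothetical): a load marks the next save as synthetic so the stale in-memory state never overwrites the newly selected chat's stored settings.

```typescript
type Settings = { memory: string; planMode: boolean };

// Simulates the load/save pair of effects: load() corresponds to Effect 1,
// save() to Effect 2. The skipNextSave flag swallows the one save that is
// triggered by the load itself rather than by a user edit.
class ProjectSettingsSync {
  private skipNextSave = false;
  private store = new Map<string, Settings>();

  load(chatId: string): Settings {
    this.skipNextSave = true; // the next save comes from this load, not an edit
    return this.store.get(chatId) ?? { memory: '', planMode: false };
  }

  save(chatId: string, settings: Settings): void {
    if (this.skipNextSave) {
      this.skipNextSave = false; // drop the stale save for the new chat id
      return;
    }
    this.store.set(chatId, { ...settings });
  }
}
```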


4. Prompt injection risk in Project Memory

In stream-text.ts, user-provided memory is directly interpolated into the system prompt without any sanitization:

systemPrompt = `${systemPrompt}

    PROJECT MEMORY (USER-PROVIDED):
    ${trimmedMemory}
    ---
    `;

While the user is technically "attacking their own session," this is still poor practice. The memory content should be clearly fenced or placed in user-role messages instead to prevent accidental system prompt override.
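One possible fencing approach (helper and delimiter names are assumptions, not the project's API): wrap the memory in explicit markers and strip any lines that mimic them, so the content reads as data rather than as system-prompt instructions.

```typescript
// Hypothetical sketch: fence user-provided memory before interpolation.
function fenceProjectMemory(memory: string): string {
  const OPEN = '<<<PROJECT_MEMORY';
  const CLOSE = 'PROJECT_MEMORY>>>';
  // Drop any user lines that imitate the delimiters themselves.
  const sanitized = memory
    .split('\n')
    .filter((line) => !line.includes(OPEN) && !line.includes(CLOSE))
    .join('\n');
  return `${OPEN}\n${sanitized}\n${CLOSE}`;
}
```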


5. No localStorage cleanup for project settings

projectSettings.ts stores entries as bolt_project_settings:{chatId}. These are never cleaned up when a chat is deleted, leading to unbounded localStorage accumulation over time.
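A cleanup sketch using the key format described above (function names and the SettingsStore interface are hypothetical; window.localStorage satisfies the interface structurally):

```typescript
const PROJECT_SETTINGS_PREFIX = 'bolt_project_settings:';

// Minimal Storage-like surface so the helpers stay testable outside the browser.
interface SettingsStore {
  readonly length: number;
  key(index: number): string | null;
  getItem(key: string): string | null;
  setItem(key: string, value: string): void;
  removeItem(key: string): void;
}

// Remove the settings entry when a single chat is deleted.
function deleteProjectSettings(chatId: string, storage: SettingsStore): void {
  storage.removeItem(`${PROJECT_SETTINGS_PREFIX}${chatId}`);
}

// Defensive sweep: drop entries for chat ids that no longer exist.
// Iterates backwards because removal shifts key indices.
function pruneProjectSettings(liveChatIds: Set<string>, storage: SettingsStore): void {
  for (let i = storage.length - 1; i >= 0; i--) {
    const key = storage.key(i);
    if (
      key !== null &&
      key.startsWith(PROJECT_SETTINGS_PREFIX) &&
      !liveChatIds.has(key.slice(PROJECT_SETTINGS_PREFIX.length))
    ) {
      storage.removeItem(key);
    }
  }
}
```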


6. Remove docs/issue-2097-tracker.txt

This tracking file doesn't belong in the repository — it should be tracked in the GitHub issue itself.


What's done well

  • uploadUtils.ts extraction is well-structured with proper MIME type and extension validation
  • ProjectMemoryDialog component is clean and follows existing UI patterns
  • Auto prompt optimization for small models is a nice idea with a reasonable threshold (16K tokens)
  • File lock/unlock reuses existing workbenchStore infrastructure properly
  • PanelHeaderButton title prop is a clean, non-breaking addition

Suggested path forward

  1. Split into separate PRs (one per feature)
  2. Fix document upload to actually extract and send text content to the LLM
  3. Fix the project settings race condition
  4. Remove the tracker file
  5. Add localStorage cleanup when chats are deleted

Happy to re-review once these are addressed!


@embire2 embire2 (Author) commented Feb 6, 2026

Thanks for the review — I’ve split out the document upload fix into its own PR: #2114

It extracts text/PDF content and ensures it’s sent to the LLM (ATTACHMENTS block), keeps attachment text aligned with uploads/screenshots, and includes PDF parsing via pdfjs-dist. Tests: typecheck, lint, and a Playwright e2e smoke verifying the chat payload contains the attachment text.

I’ll follow up shortly with the separate PRs for project memory and plan mode (with the race-condition + localStorage cleanup fixes).

@embire2 embire2 (Author) commented Feb 6, 2026

Follow‑up: the remaining split PRs are up as well.

Each includes the race‑condition guard + localStorage cleanup, and both have typecheck/lint + Playwright e2e smoke tests noted in the PR bodies.

@embire2 embire2 (Author) commented Feb 6, 2026

More splits are up:

Both include typecheck/lint + Playwright e2e smoke tests documented in the PR bodies.

3 participants