diff --git a/AGENTS.md b/AGENTS.md index bebab20fd..1c2e219d7 100644 --- a/AGENTS.md +++ b/AGENTS.md @@ -2,454 +2,150 @@ All docs must be canonical, with no past commentary, only live state. -## Project Summary -CodexMonitor is a Tauri app that orchestrates Codex agents across local workspaces. - -- Frontend: React + Vite -- Backend (app): Tauri Rust process -- Backend (daemon): `src-tauri/src/bin/codex_monitor_daemon.rs` -- Shared backend domain logic: `src-tauri/src/shared/*` - -## Backend Architecture - -The backend separates shared domain logic from environment wiring. - -- Shared domain/core logic: `src-tauri/src/shared/*` -- App wiring and platform concerns: feature folders + adapters -- Daemon wiring and transport concerns: `src-tauri/src/bin/codex_monitor_daemon.rs` and `src-tauri/src/bin/codex_monitor_daemon/*` - -## Feature Folders - -### Codex - -- `src-tauri/src/codex/mod.rs` -- `src-tauri/src/codex/args.rs` -- `src-tauri/src/codex/home.rs` -- `src-tauri/src/codex/config.rs` - -### Files - -- `src-tauri/src/files/mod.rs` -- `src-tauri/src/files/io.rs` -- `src-tauri/src/files/ops.rs` -- `src-tauri/src/files/policy.rs` - -### Dictation - -- `src-tauri/src/dictation/mod.rs` -- `src-tauri/src/dictation/real.rs` -- `src-tauri/src/dictation/stub.rs` - -### Workspaces - -- `src-tauri/src/workspaces/*` +## Scope -### Shared Core Layer +This file is the agent contract for how to work in this repo. +Detailed navigation/runbooks live in: -- `src-tauri/src/shared/*` +- `docs/codebase-map.md` (task-oriented file map: "if you need X, edit Y") +- `README.md` (setup, build, release, and broader project docs) -Root-level single-file features remain at `src-tauri/src/*.rs` (for example: `menu.rs`, `prompts.rs`, `terminal.rs`). -Module-folder features remain in dedicated folders (for example: `src-tauri/src/remote_backend/*`, `src-tauri/src/tailscale/*`). +## Project Snapshot -## Shared Core Modules (Source of Truth) - -Shared logic that must work in both the app and the daemon lives under `src-tauri/src/shared/`. 
- -- `src-tauri/src/shared/codex_core.rs` - - Threads, approvals, login/cancel, account, skills, config model -- `src-tauri/src/shared/codex_aux_core.rs` - - Codex helper logic used by app and daemon adapters -- `src-tauri/src/shared/codex_update_core.rs` - - Codex update and version comparison helpers -- `src-tauri/src/shared/workspaces_core.rs` - - Workspace/worktree operations, persistence, sorting, git command helpers -- `src-tauri/src/shared/settings_core.rs` - - App settings load/update, Codex config path -- `src-tauri/src/shared/files_core.rs` - - File read/write logic -- `src-tauri/src/shared/git_core.rs` - - Git command helpers and remote/branch logic -- `src-tauri/src/shared/git_ui_core.rs` - - Git/GitHub UI-facing backend logic shared across app and daemon -- `src-tauri/src/shared/local_usage_core.rs` - - Local usage snapshots and workspace usage aggregation -- `src-tauri/src/shared/orbit_core.rs` - - Orbit connectivity and auth flow helpers -- `src-tauri/src/shared/process_core.rs` - - Process spawn and command execution helper utilities -- `src-tauri/src/shared/prompts_core.rs` - - Prompt CRUD/listing helpers for global and workspace prompts -- `src-tauri/src/shared/worktree_core.rs` - - Worktree naming/path helpers and clone destination helpers -- `src-tauri/src/shared/account.rs` - - Account helper utilities and tests - -## App/Daemon Pattern - -Use this mental model when changing backend code: +CodexMonitor is a Tauri app that orchestrates Codex agents across local workspaces. -1. Put shared logic in a shared core module. -2. Keep app and daemon code as thin adapters. -3. Pass environment-specific behavior via closures or small adapter helpers. +- Frontend: React + Vite (`src/`) +- Backend app: Tauri Rust process (`src-tauri/src/lib.rs`) +- Backend daemon: JSON-RPC process (`src-tauri/src/bin/codex_monitor_daemon.rs`) +- Shared backend source of truth: `src-tauri/src/shared/*` -The app and daemon do not re-implement domain logic. +## Non-Negotiable Architecture Rules -## Daemon Module Wrappers +1. Put shared/domain backend logic in `src-tauri/src/shared/*` first. +2. Keep app and daemon as thin adapters around shared cores. +3. Do not duplicate logic between app and daemon. +4. Keep JSON-RPC method names and payload shapes stable unless intentionally changing contracts. +5. Keep frontend IPC contracts in sync with backend command surfaces. -The daemon defines wrapper modules named `codex` and `files` inside `src-tauri/src/bin/codex_monitor_daemon.rs`. +## Backend Routing Rules -These wrappers re-export the daemon’s local modules: +For backend behavior changes, follow this order: -- Codex: `codex_args`, `codex_home`, `codex_config` -- Files: `file_io`, `file_ops`, `file_policy` +1. Shared core (`src-tauri/src/shared/*`) when behavior is cross-runtime. +2. App adapter and Tauri command surface (`src-tauri/src/lib.rs` + adapter module). +3. Frontend IPC wrapper (`src/services/tauri.ts`). +4. Daemon RPC surface (`src-tauri/src/bin/codex_monitor_daemon/rpc.rs` + `rpc/*`). -Shared cores use `crate::codex::*` and `crate::files::*` paths. The daemon wrappers satisfy those paths without importing app-only modules. +If you add a backend command, update all relevant layers and tests. -## Key Paths +## Frontend Routing Rules -### Frontend +- Keep `src/App.tsx` as composition/wiring root. +- Move stateful orchestration into: + - `src/features/app/hooks/*` + - `src/features/app/bootstrap/*` + - `src/features/app/orchestration/*` +- Keep presentational UI in feature components. 
+- Keep Tauri calls in `src/services/tauri.ts` only. +- Keep event subscription fanout in `src/services/events.ts`. -- Composition root: `src/App.tsx` -- Feature slices: `src/features/` -- Tauri IPC wrapper: `src/services/tauri.ts` -- Tauri event hub: `src/services/events.ts` -- Shared UI types: `src/types.ts` -- Thread item normalization: `src/utils/threadItems.ts` -- Styles: `src/styles/` +## Import Aliases -### Backend (App) +Use project aliases for frontend imports: -- Tauri command registry: `src-tauri/src/lib.rs` -- Codex adapters: `src-tauri/src/codex/*` -- Files adapters: `src-tauri/src/files/*` -- Dictation adapters: `src-tauri/src/dictation/*` -- Workspaces adapters: `src-tauri/src/workspaces/*` -- Shared core layer: `src-tauri/src/shared/*` -- Git feature: `src-tauri/src/git/mod.rs` +- `@/*` -> `src/*` +- `@app/*` -> `src/features/app/*` +- `@settings/*` -> `src/features/settings/*` +- `@threads/*` -> `src/features/threads/*` +- `@services/*` -> `src/services/*` +- `@utils/*` -> `src/utils/*` -### Backend (Daemon) +## Key File Anchors +- Frontend composition root: `src/App.tsx` +- Frontend IPC wrapper: `src/services/tauri.ts` +- Frontend event hub: `src/services/events.ts` +- App command registry: `src-tauri/src/lib.rs` - Daemon entrypoint: `src-tauri/src/bin/codex_monitor_daemon.rs` -- Daemon RPC handler and method router: `src-tauri/src/bin/codex_monitor_daemon/rpc.rs` -- Daemon transport wiring: `src-tauri/src/bin/codex_monitor_daemon/transport.rs` -- Daemon imports shared cores via `#[path = "../shared/mod.rs"] mod shared;` - -## Architecture Guidelines - -### Frontend Guidelines - -- Composition root: keep orchestration in `src/App.tsx`. -- Components: presentational only. Props in, UI out. No Tauri IPC. -- Hooks: own state, side effects, and event wiring. -- Utils: pure helpers only in `src/utils/`. -- Services: all Tauri IPC goes through `src/services/`. -- Types: shared UI types live in `src/types.ts`. -- Styles: one CSS file per UI area under `src/styles/`. - -Keep `src/App.tsx` lean: - -- Keep it to wiring: hook composition, layout, and assembly. -- Move stateful logic/effects into hooks under `src/features/app/hooks/`. -- Keep Tauri IPC, menu listeners, and subscriptions out of `src/App.tsx`. - -### Design System Usage - -Use the design-system layer for shared UI shells and tokenized styling. 
- -- Primitive component locations: - - `src/features/design-system/components/modal/ModalShell.tsx` - - `src/features/design-system/components/toast/ToastPrimitives.tsx` - - `src/features/design-system/components/panel/PanelPrimitives.tsx` - - `src/features/design-system/components/popover/PopoverPrimitives.tsx` - - Toast sub-primitives: `ToastHeader`, `ToastActions`, `ToastError` (in `ToastPrimitives.tsx`) - - Panel sub-primitives: `PanelMeta`, `PanelSearchField`, `PanelNavList`, `PanelNavItem` (in `PanelPrimitives.tsx`) - - Popover sub-primitives: `PopoverMenuItem` (in `PopoverPrimitives.tsx`) -- Diff theming and style bridge: - - `src/features/design-system/diff/diffViewerTheme.ts` -- DS token/style locations: - - `src/styles/ds-tokens.css` - - `src/styles/ds-modal.css` - - `src/styles/ds-toast.css` - - `src/styles/ds-panel.css` - - `src/styles/ds-popover.css` - - `src/styles/ds-diff.css` +- Daemon RPC router: `src-tauri/src/bin/codex_monitor_daemon/rpc.rs` +- Shared workspaces core: `src-tauri/src/shared/workspaces_core.rs` + `src-tauri/src/shared/workspaces_core/*` +- Shared git UI core: `src-tauri/src/shared/git_ui_core.rs` + `src-tauri/src/shared/git_ui_core/*` +- Threads reducer entrypoint: `src/features/threads/hooks/useThreadsReducer.ts` +- Threads reducer slices: `src/features/threads/hooks/threadReducer/*` -Naming conventions: +For broader path maps, use `docs/codebase-map.md`. -- DS CSS classes use `.ds-*` prefixes. -- DS CSS variables use `--ds-*` prefixes. -- DS React primitives use `PascalCase` component names (`ModalShell`, `ToastCard`, `ToastHeader`, `ToastActions`, `ToastError`, `PanelFrame`, `PanelHeader`, `PanelMeta`, `PanelSearchField`, `PanelNavList`, `PanelNavItem`, `PopoverSurface`, `PopoverMenuItem`). -- Feature CSS should keep feature-prefixed classes (`.worktree-*`, `.update-*`) for content/layout specifics. - -Do: - -- Use DS primitives first for shared shells (modal wrappers, toast cards/viewports, panel shells/headers, popover/dropdown surfaces). -- Pull shared visual tokens from `--ds-*` variables. -- Keep feature styles focused on feature-specific layout/content, not duplicated shell chrome. -- Centralize shared animation/chrome in DS stylesheets when used by multiple feature families. +## App/Daemon Parity Checklist -Don't: - -- Recreate fixed modal backdrops/cards in feature CSS when `ModalShell` is used. -- Duplicate toast card chrome (background/border/shadow/padding/enter animation) per toast family. -- Duplicate panel shell layout/header alignment in feature styles when `PanelFrame`/`PanelHeader` already provide it. -- Recreate popover/dropdown shell chrome in feature CSS when `PopoverSurface`/`PopoverMenuItem` already provide it. -- Add new non-DS color constants for shared shells; add/extend DS tokens instead. +When changing backend behavior that can run remotely: -Migration guidance for new/updated components: +1. Shared core logic updated (or explicitly app-only/daemon-only). +2. App surface updated (`src-tauri/src/lib.rs` + adapter). +3. Frontend IPC updated (`src/services/tauri.ts`) when needed. +4. Daemon RPC updated (`rpc.rs` + `rpc/*`) when needed. +5. Contract/test coverage updated. -1. Start by wrapping UI in the closest DS primitive. -2. Migrate shared shell styles into DS CSS (`ds-*.css`) and delete redundant feature-level shell selectors. -3. Keep only feature-local classes for spacing/content/interaction details. -4. For legacy selectors that are still referenced, keep minimal compatibility aliases temporarily. -5. 
Remove compatibility aliases once callsites reach zero, then rerun lint/typecheck/tests. +## Design System Rule (High-Level) -Anti-duplication guidance: +Use existing design-system primitives and tokens for shared shell chrome. +Do not reintroduce duplicated modal/toast/panel/popover shell styling in feature CSS. -- Before adding shell styles, search for existing DS token/primitive coverage. -- If two or more feature files need the same shell rule, move it to DS CSS immediately. -- Prefer extending DS primitives/tokens over introducing another feature-specific wrapper class. -- During refactors, remove unused legacy selectors once callsites are migrated. +(See existing DS files and lint guardrails for implementation details.) -Enforcement workflow: +## Safety and Git Behavior -- Lint guardrails for DS-targeted files live in `.eslintrc.cjs`. -- Popover guardrails are enforced for migrated popover files (`MainHeader`, `Sidebar`, `SidebarHeader`, `SidebarCornerActions`, `OpenAppMenu`, `LaunchScript*`, `ComposerInput`, `FilePreviewPopover`, `WorkspaceHome`) to require `PopoverSurface`/`PopoverMenuItem`. -- Codemod scripts live in `scripts/codemods/`: - - `modal-shell-codemod.mjs` - - `panel-shell-codemod.mjs` - - `toast-shell-codemod.mjs` -- Run `npm run codemod:ds:dry` before UI shell migration PRs. -- Keep `npm run lint:ds`/`npm run lint` green for modal/toast/panel/popover/diff files. +- Prefer safe git operations (`status`, `diff`, `log`). +- Do not reset/revert unrelated user changes. +- If unrelated changes appear, continue focusing on owned files unless they block correctness. +- If conflicts impact correctness, call them out and choose the safest path. +- Fix root cause, not band-aids. -### Backend Guidelines +## Validation Matrix -- Shared logic goes in `src-tauri/src/shared/` first. -- App and daemon are thin adapters around shared cores. -- Avoid duplicating git/worktree/codex/settings/files logic in adapters. -- Prefer explicit, readable adapter helpers over clever abstractions. -- Do not folderize single-file features unless you are splitting them. - -## Daemon: How and When to Add Code - -The daemon runs backend logic outside the Tauri app. - -### When to Update the Daemon - -Update the daemon when one of these is true: - -- A Tauri command is used in remote mode. -- The daemon exposes the same behavior over its JSON-RPC transport. -- Shared core behavior changes and the daemon wiring must pass new inputs. - -### Where Code Goes - -1. Shared behavior or domain logic: - - Add or update code in `src-tauri/src/shared/*.rs`. -2. App-only behavior: - - Update the app adapters or Tauri commands. -3. Daemon-only transport/wiring behavior: - - Update `src-tauri/src/bin/codex_monitor_daemon.rs` and/or `src-tauri/src/bin/codex_monitor_daemon/*` modules. - -### How to Add a New Backend Command - -1. Implement the core logic in a shared module. -2. Wire it in the app. - - Add a Tauri command in `src-tauri/src/lib.rs`. - - Call the shared core from the appropriate adapter. - - Mirror it in `src/services/tauri.ts`. -3. Wire it in the daemon. - - Add a daemon method that calls the same shared core. - - Add/update the JSON-RPC handler branch in `src-tauri/src/bin/codex_monitor_daemon/rpc.rs`. - -### Backend Command Checklist - -For any new backend command or command shape change, update all layers: - -1. Shared logic first (`src-tauri/src/shared/*`) when behavior is domain-level. -2. Tauri command surface (`src-tauri/src/lib.rs`). -3. Frontend IPC client (`src/services/tauri.ts`). -4. 
Daemon JSON-RPC surface (`src-tauri/src/bin/codex_monitor_daemon/rpc.rs`). -5. Tests for touched layers (TS + Rust as applicable). - -### Adapter Patterns to Reuse - -- Shared git unit wrapper: - - `workspaces_core::run_git_command_unit(...)` -- App spawn adapter: - - `spawn_with_app(...)` in `src-tauri/src/workspaces/commands.rs` -- Daemon spawn adapter: - - `spawn_with_client(...)` in `src-tauri/src/bin/codex_monitor_daemon.rs` -- Daemon wrapper modules: - - `mod codex { ... }` and `mod files { ... }` in `codex_monitor_daemon.rs` - -If you find yourself copying logic between app and daemon, extract it into `src-tauri/src/shared/`. - -## App-Server Flow - -- Backend spawns `codex app-server` using the `codex` binary. -- Initialize with `initialize` and then `initialized`. -- Do not send requests before initialization. -- JSON-RPC notifications stream over stdout. -- Threads are listed via `thread/list` and resumed via `thread/resume`. -- Archiving uses `thread/archive`. - -## Event Stack (Tauri → React) - -The app uses a shared event hub so each native event has one `listen` and many subscribers. - -- Backend emits: `src-tauri/src/lib.rs` emits events to the main window. -- Frontend hub: `src/services/events.ts` defines `createEventHub` and module-level hubs. -- React subscription: use `useTauriEvent(subscribeX, handler)`. - -### Adding a New Tauri Event - -1. Emit the event in `src-tauri/src/lib.rs`. -2. Add a hub and `subscribeX` helper in `src/services/events.ts`. -3. Subscribe via `useTauriEvent` in a hook or component. -4. Update `src/services/events.test.ts` if you add new subscription helpers. - -## Workspace Persistence - -- Workspaces live in `workspaces.json` under the app data directory. -- Settings live in `settings.json` under the app data directory. -- On launch, the app connects each workspace once and loads its thread list. 
- -## Common Changes (Where to Look First) - -- UI layout or styling: - - `src/features/*/components/*` and `src/styles/*` -- App-server events: - - `src/features/app/hooks/useAppServerEvents.ts` -- Tauri IPC shape: - - `src/services/tauri.ts` and `src-tauri/src/lib.rs` -- Shared backend behavior: - - `src-tauri/src/shared/*` -- Workspaces/worktrees: - - Shared core: `src-tauri/src/shared/workspaces_core.rs` - - App adapters: `src-tauri/src/workspaces/*` - - Daemon wiring: `src-tauri/src/bin/codex_monitor_daemon/rpc.rs` -- Settings and Codex config: - - Shared core: `src-tauri/src/shared/settings_core.rs` - - App adapters: `src-tauri/src/codex/config.rs`, `src-tauri/src/settings/mod.rs` - - Daemon wiring: `src-tauri/src/bin/codex_monitor_daemon/rpc.rs` -- Files: - - Shared core: `src-tauri/src/shared/files_core.rs` - - App adapters: `src-tauri/src/files/*` - - Daemon wiring: `src-tauri/src/bin/codex_monitor_daemon/rpc.rs` -- Codex threads/approvals/login: - - Shared core: `src-tauri/src/shared/codex_core.rs` - - App adapters: `src-tauri/src/codex/*` - - Daemon wiring: `src-tauri/src/bin/codex_monitor_daemon/rpc.rs` - -## Navigation Hotspots - -High-churn or high-complexity files where extra care is needed: - -- `src/App.tsx` (frontend orchestration root) -- `src/features/settings/components/SettingsView.tsx` (settings surface) -- `src/features/threads/hooks/useThreadsReducer.ts` (thread state transitions) -- `src-tauri/src/shared/git_ui_core.rs` (shared git/github logic) -- `src-tauri/src/shared/workspaces_core.rs` (workspace/worktree core) -- `src-tauri/src/bin/codex_monitor_daemon/rpc.rs` (daemon RPC routing) - -## Threads Feature Split (Frontend) - -`useThreads` is a composition layer that wires focused hooks and shared utilities. - -- Orchestration: `src/features/threads/hooks/useThreads.ts` -- Actions: `src/features/threads/hooks/useThreadActions.ts` -- Approvals: `src/features/threads/hooks/useThreadApprovals.ts` -- Event handlers: `src/features/threads/hooks/useThreadEventHandlers.ts` -- Messaging: `src/features/threads/hooks/useThreadMessaging.ts` -- Storage: `src/features/threads/hooks/useThreadStorage.ts` -- Status helpers: `src/features/threads/hooks/useThreadStatus.ts` -- Selectors: `src/features/threads/hooks/useThreadSelectors.ts` -- Rate limits: `src/features/threads/hooks/useThreadRateLimits.ts` -- Collab links: `src/features/threads/hooks/useThreadLinking.ts` - -## Running Locally +Run validations based on touched areas: -```bash -npm install -npm run tauri:dev -``` - -## iOS (WIP) - -- iOS is supported as WIP. -- Simulator: -```bash -./scripts/build_run_ios.sh -``` -- USB device: -```bash -./scripts/build_run_ios_device.sh --list-devices -./scripts/build_run_ios_device.sh --device "Dimillian’s iPhone" --team Z6P74P6T99 -``` -- If signing is not ready: -```bash -./scripts/build_run_ios_device.sh --open-xcode -``` - -## Release Build - -```bash -npm run tauri:build -``` +- Always: `npm run typecheck` +- Frontend behavior/state/hooks/components: `npm run test` +- Rust backend changes: `cd src-tauri && cargo check` +- Use targeted tests for touched modules before full-suite runs when iterating. 
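+
+As an illustration of the targeted-test bullet above, a minimal sketch of the kind of focused
+unit test that sits next to a shared core module (`branch_slug` is a hypothetical name, not a
+function in this repo). A test like this can be run by name filter from `src-tauri`, for example
+`cargo test branch_slug`, before paying for a full-suite run.
+
+```rust
+// Hypothetical shared-core helper, present only to illustrate the pattern.
+fn branch_slug(branch: &str) -> String {
+    branch
+        .chars()
+        .map(|c| if c.is_ascii_alphanumeric() { c } else { '-' })
+        .collect()
+}
+
+#[cfg(test)]
+mod tests {
+    use super::*;
+
+    // Focused test: runnable on its own via a name filter instead of the whole suite.
+    #[test]
+    fn branch_slug_replaces_separators() {
+        assert_eq!(branch_slug("feat/new ui"), "feat-new-ui");
+    }
+}
+```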
-## Canonical Commands +## Quick Runbook -Prefer these package scripts (source of truth: `package.json`): +Core local commands (keep these inline for daily use): ```bash npm install npm run doctor:strict npm run tauri:dev -npm run tauri:build -npm run lint npm run test npm run typecheck +cd src-tauri && cargo check ``` -## Type Checking +Release build: ```bash -npm run typecheck +npm run tauri:build ``` -## Tests - -```bash -npm run test -``` +Focused test runs: ```bash -npm run test:watch +npm run test -- ``` -## Validation - -At the end of a task: +## Hotspots -1. Run `npm run lint`. -2. Run `npm run test` when you touched threads, settings, updater, shared utils, or backend cores. -3. Run `npm run typecheck`. -4. If you changed Rust backend code, run `cargo check` in `src-tauri`. +Use extra care in high-churn/high-complexity files: -## Notes - -- The window uses `titleBarStyle: "Overlay"` and macOS private APIs for transparency. -- Avoid breaking JSON-RPC format; the app-server is strict. -- App settings and Codex feature toggles are best-effort synced to `CODEX_HOME/config.toml`. -- UI preferences live in `localStorage`. -- GitHub issues require `gh` to be installed and authenticated. -- Custom prompts are loaded from `$CODEX_HOME/prompts` (or `~/.codex/prompts`). +- `src/App.tsx` +- `src/features/settings/components/SettingsView.tsx` +- `src/features/threads/hooks/useThreadsReducer.ts` +- `src-tauri/src/shared/git_ui_core.rs` +- `src-tauri/src/shared/workspaces_core.rs` +- `src-tauri/src/bin/codex_monitor_daemon/rpc.rs` -## Error Toasts +## Canonical References -- Use `pushErrorToast` from `src/services/toasts.ts` for user-facing errors. -- Toast wiring: - - Hook: `src/features/notifications/hooks/useErrorToasts.ts` - - UI: `src/features/notifications/components/ErrorToasts.tsx` - - Styles: `src/styles/error-toasts.css` +- Task-oriented code map: `docs/codebase-map.md` +- Setup/build/release/test commands: `README.md` diff --git a/README.md b/README.md index 192c2a65d..4eca4c1ab 100644 --- a/README.md +++ b/README.md @@ -219,18 +219,30 @@ npm run typecheck cd src-tauri && cargo check ``` +## Codebase Navigation + +For task-oriented file lookup ("if you need X, edit Y"), use: + +- `docs/codebase-map.md` + ## Project Structure ``` src/ features/ feature-sliced UI + hooks + features/app/bootstrap/ app bootstrap orchestration + features/app/orchestration/ app layout/thread/workspace orchestration + features/threads/hooks/threadReducer/ thread reducer slices services/ Tauri IPC wrapper styles/ split CSS by area types.ts shared types src-tauri/ src/lib.rs Tauri app backend command registry src/bin/codex_monitor_daemon.rs remote daemon JSON-RPC process + src/bin/codex_monitor_daemon/rpc/ daemon RPC domain handlers src/shared/ shared backend core used by app + daemon + src/shared/git_ui_core/ git/github shared core modules + src/shared/workspaces_core/ workspace/worktree shared core modules src/workspaces/ workspace/worktree adapters src/codex/ codex app-server adapters src/files/ file adapters @@ -247,7 +259,8 @@ src-tauri/ - Selecting a thread always calls `thread/resume` to refresh messages from disk. - CLI sessions appear if their `cwd` matches the workspace path; they are not live-streamed unless resumed. - The app uses `codex app-server` over stdio; see `src-tauri/src/lib.rs` and `src-tauri/src/codex/`. -- The remote daemon entrypoint is `src-tauri/src/bin/codex_monitor_daemon.rs`; shared domain logic lives in `src-tauri/src/shared/`. 
+- The remote daemon entrypoint is `src-tauri/src/bin/codex_monitor_daemon.rs`; RPC routing lives in `src-tauri/src/bin/codex_monitor_daemon/rpc.rs` and domain handlers in `src-tauri/src/bin/codex_monitor_daemon/rpc/`. +- Shared domain logic lives in `src-tauri/src/shared/` (notably `src-tauri/src/shared/git_ui_core/` and `src-tauri/src/shared/workspaces_core/`). - Codex home resolves from workspace settings (if set), then legacy `.codexmonitor/`, then `$CODEX_HOME`/`~/.codex`. - Worktree agents live under the app data directory (`worktrees/`); legacy `.codex-worktrees/` paths remain supported, and the app no longer edits repo `.gitignore` files. - UI state (panel sizes, reduced transparency toggle, recent thread activity) is stored in `localStorage`. diff --git a/docs/app-server-events.md b/docs/app-server-events.md index a1d3ac7f2..45559543f 100644 --- a/docs/app-server-events.md +++ b/docs/app-server-events.md @@ -1,4 +1,4 @@ -# App-Server Events Reference (Codex `383b45279efda1ef611a4aa286621815fe656b8a`) +# App-Server Events Reference (Codex `2c5eeb6b1fb32776b9c4d3d3ff62b55aa3c464a3`) This document helps agents quickly answer: - Which app-server events CodexMonitor supports right now. @@ -48,9 +48,10 @@ Primary outgoing request layer: ## Supported Events (Current) These are the app-server methods currently supported in -`src/utils/appServerEvents.ts` (`SUPPORTED_APP_SERVER_METHODS`) and routed in -`useAppServerEvents.ts`. +`src/utils/appServerEvents.ts` (`SUPPORTED_APP_SERVER_METHODS`) and then either +routed in `useAppServerEvents.ts` or handled in feature-specific subscriptions. +- `app/list/updated` - `codex/connected` - `*requestApproval` methods (matched via `isApprovalRequestMethod(method)`; suffix check) @@ -98,7 +99,6 @@ CodexMonitor status: Compared against Codex app-server protocol v2 notifications, the following events are currently not routed: -- `app/list/updated` - `rawResponseItem/completed` - `item/mcpToolCall/progress` - `mcpServer/oauthLogin/completed` @@ -138,6 +138,7 @@ Compared against Codex v2 request methods, CodexMonitor currently does not send: - `thread/unarchive` - `thread/rollback` +- `thread/backgroundTerminals/clean` - `thread/loaded/list` - `thread/read` - `skills/remote/read` diff --git a/docs/codebase-map.md b/docs/codebase-map.md new file mode 100644 index 000000000..c3a92a03e --- /dev/null +++ b/docs/codebase-map.md @@ -0,0 +1,141 @@ +# Codebase Map (Task-Oriented) + +Canonical navigation guide for CodexMonitor. Use this as: "if you need X, edit Y". + +## Start Here: How Changes Flow + +For backend behavior, follow this path in order: + +1. Frontend callsite: `src/features/**` hooks/components +2. Frontend IPC API: `src/services/tauri.ts` +3. Tauri command registration: `src-tauri/src/lib.rs` (`invoke_handler`) +4. App adapter: `src-tauri/src/{codex,workspaces,git,files,settings,prompts}/*` +5. Shared core source of truth: `src-tauri/src/shared/*` +6. Daemon RPC method parity: `src-tauri/src/bin/codex_monitor_daemon/rpc.rs` +7. Daemon state/wiring implementation: `src-tauri/src/bin/codex_monitor_daemon.rs` + +If a behavior must work in both app and daemon, implement it in `src-tauri/src/shared/*` first. 
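+
+As a concrete sketch of that flow (assumptions: the `demo_*` names are illustrative and do not
+exist in this repo; real app commands use the `#[tauri::command]` macro and the daemon handlers
+receive `DaemonState`), the shared core owns the behavior and both adapters only parse inputs
+and forward:
+
+```rust
+use serde_json::Value;
+
+// 1. Shared core (would live under src-tauri/src/shared/): single owner of the logic.
+fn demo_core(workspace_path: &str) -> Result<String, String> {
+    if workspace_path.is_empty() {
+        return Err("missing workspace path".to_string());
+    }
+    Ok(format!("ok: {workspace_path}"))
+}
+
+// 2. App adapter (would be registered as a Tauri command in src-tauri/src/lib.rs):
+//    passes the typed argument straight to the shared core.
+fn demo_command(workspace_path: String) -> Result<String, String> {
+    demo_core(&workspace_path)
+}
+
+// 3. Daemon adapter (would sit behind the daemon RPC dispatcher):
+//    decodes JSON-RPC params, then calls the same shared core.
+fn demo_rpc(params: &Value) -> Result<Value, String> {
+    let path = params
+        .get("workspacePath")
+        .and_then(Value::as_str)
+        .ok_or("missing `workspacePath`")?;
+    demo_core(path).map(Value::String)
+}
+
+fn main() {
+    println!("{:?}", demo_command("/tmp/demo".to_string()));
+    println!("{:?}", demo_rpc(&serde_json::json!({ "workspacePath": "/tmp/demo" })));
+}
+```
+
+The two adapters never re-implement the behavior; they only adapt runtime and transport details,
+which is what keeps the app and daemon from drifting apart.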
+ +## If You Need X, Edit Y + +| Need | Primary files to edit | +| --- | --- | +| App-level UI composition/layout wiring | `src/App.tsx`, `src/features/app/components/AppLayout.tsx`, `src/features/app/bootstrap/*`, `src/features/app/orchestration/*`, `src/features/app/hooks/*` | +| Add/change Tauri IPC methods used by frontend | `src/services/tauri.ts`, `src-tauri/src/lib.rs`, matching backend adapter module | +| Add/change app-server event handling in UI | `src/services/events.ts`, `src/features/app/hooks/useAppServerEvents.ts`, `src/utils/appServerEvents.ts`, `src/features/threads/utils/threadNormalize.ts` | +| Change thread state transitions | `src/features/threads/hooks/useThreadsReducer.ts`, `src/features/threads/hooks/threadReducer/*`, `src/features/threads/hooks/useThreads.ts`, focused thread hooks under `src/features/threads/hooks/*` | +| Change workspace lifecycle/worktree behavior | `src/features/workspaces/hooks/useWorkspaces.ts`, `src-tauri/src/workspaces/commands.rs`, `src-tauri/src/shared/workspaces_core.rs`, `src-tauri/src/shared/workspaces_core/*`, `src-tauri/src/shared/worktree_core.rs` | +| Change settings model/load/update | `src/features/settings/components/SettingsView.tsx`, `src/features/settings/hooks/useAppSettings.ts`, `src/services/tauri.ts`, `src-tauri/src/settings/mod.rs`, `src-tauri/src/shared/settings_core.rs`, `src-tauri/src/types.rs`, `src/types.ts` | +| Change Git/GitHub backend behavior | `src/features/git/hooks/*`, `src/services/tauri.ts`, `src-tauri/src/git/mod.rs`, `src-tauri/src/shared/git_ui_core.rs`, `src-tauri/src/shared/git_ui_core/*`, `src-tauri/src/shared/git_core.rs`, `src-tauri/src/bin/codex_monitor_daemon/rpc.rs`, `src-tauri/src/bin/codex_monitor_daemon/rpc/git.rs` | +| Change prompts CRUD/listing behavior | `src/features/prompts/hooks/useCustomPrompts.ts`, `src/features/prompts/components/PromptPanel.tsx`, `src/services/tauri.ts`, `src-tauri/src/prompts.rs`, `src-tauri/src/shared/prompts_core.rs`, `src-tauri/src/bin/codex_monitor_daemon/rpc.rs` | +| Change file read/write for Agents/config | `src/services/tauri.ts`, `src-tauri/src/files/mod.rs`, `src-tauri/src/shared/files_core.rs`, `src-tauri/src/bin/codex_monitor_daemon/rpc.rs` | +| Add/change daemon JSON-RPC surface | `src-tauri/src/bin/codex_monitor_daemon/rpc.rs`, `src-tauri/src/bin/codex_monitor_daemon/rpc/*`, `src-tauri/src/bin/codex_monitor_daemon.rs`, matching shared core | + +## Frontend Navigation + +- Composition root: `src/App.tsx` +- App bootstrap orchestration: `src/features/app/bootstrap/*` +- App layout/thread/workspace orchestration: `src/features/app/orchestration/*` +- Tauri IPC wrapper: `src/services/tauri.ts` +- Tauri event hub (single-listener fanout): `src/services/events.ts` +- Event subscription hook: `src/features/app/hooks/useTauriEvent.ts` +- App-server event router: `src/features/app/hooks/useAppServerEvents.ts` +- Shared frontend types: `src/types.ts` + +### Import Aliases + +Use TS/Vite aliases for refactor-safe imports: + +- `@/*` -> `src/*` +- `@app/*` -> `src/features/app/*` +- `@settings/*` -> `src/features/settings/*` +- `@threads/*` -> `src/features/threads/*` +- `@services/*` -> `src/services/*` +- `@utils/*` -> `src/utils/*` + +### Threads + +- Orchestrator: `src/features/threads/hooks/useThreads.ts` +- Reducer composition entrypoint: `src/features/threads/hooks/useThreadsReducer.ts` +- Reducer slices: `src/features/threads/hooks/threadReducer/*` +- Event-focused handlers: `src/features/threads/hooks/useThreadEventHandlers.ts`, 
`src/features/threads/hooks/useThreadTurnEvents.ts`, `src/features/threads/hooks/useThreadItemEvents.ts`, `src/features/threads/hooks/useThreadApprovalEvents.ts`, `src/features/threads/hooks/useThreadUserInputEvents.ts` +- Message send/steer/interrupt: `src/features/threads/hooks/useThreadMessaging.ts` +- Persistence/local thread metadata: `src/features/threads/hooks/useThreadStorage.ts`, `src/features/threads/utils/threadStorage.ts` + +### Workspaces + +- Workspace state and lifecycle: `src/features/workspaces/hooks/useWorkspaces.ts` +- Workspace home behavior: `src/features/workspaces/hooks/useWorkspaceHome.ts` +- Workspace file list and reads in app layer: `src/features/app/hooks/useWorkspaceFileListing.ts`, `src/features/workspaces/hooks/useWorkspaceFiles.ts` + +### Settings + +- Main settings surface: `src/features/settings/components/SettingsView.tsx` +- Settings state + persistence flow: `src/features/settings/hooks/useAppSettings.ts`, `src/features/app/hooks/useAppSettingsController.ts` +- Typed settings contracts: `src/types.ts` + +### Git + +- Git UI hooks: `src/features/git/hooks/*` +- Git panel components: `src/features/git/components/*` +- Branch workflows: `src/features/git/hooks/useGitBranches.ts`, `src/features/git/hooks/useBranchSwitcher.ts` + +### Prompts + +- Prompt UI and workflow: `src/features/prompts/components/PromptPanel.tsx`, `src/features/prompts/hooks/useCustomPrompts.ts` + +## Backend App (Tauri) Navigation + +- Command registry (what frontend can invoke): `src-tauri/src/lib.rs` +- Codex adapters: `src-tauri/src/codex/mod.rs` +- Workspace/worktree adapters: `src-tauri/src/workspaces/commands.rs` +- Git adapters: `src-tauri/src/git/mod.rs` +- Settings adapters: `src-tauri/src/settings/mod.rs` +- Prompts adapters: `src-tauri/src/prompts.rs` +- File adapters: `src-tauri/src/files/mod.rs` +- Event emission implementation: `src-tauri/src/event_sink.rs` +- Event payload definitions: `src-tauri/src/backend/events.rs` + +## Daemon Navigation + +- Daemon entrypoint and state/wiring: `src-tauri/src/bin/codex_monitor_daemon.rs` +- Daemon JSON-RPC dispatcher/router: `src-tauri/src/bin/codex_monitor_daemon/rpc.rs` +- Daemon domain handlers: `src-tauri/src/bin/codex_monitor_daemon/rpc/*` +- Daemon transport: `src-tauri/src/bin/codex_monitor_daemon/transport.rs` + +When adding a new method, keep method names and payload shape aligned with `src/services/tauri.ts` and app commands in `src-tauri/src/lib.rs`. 
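+
+A minimal sketch of the handler shape these `rpc/*` modules use (assumptions: `example_method`
+is a made-up method name, and the real handlers additionally take `&DaemonState` and call into
+the shared cores). The `Option` return mirrors the `try_handle` functions in this diff, where
+`None` lets the dispatcher move on to another domain module:
+
+```rust
+use serde_json::{json, Value};
+
+// Domain handler sketch in the style of the rpc/* modules added in this change.
+async fn try_handle(method: &str, params: &Value) -> Option<Result<Value, String>> {
+    match method {
+        "example_method" => {
+            // Param keys stay camelCase so they line up with what src/services/tauri.ts sends.
+            let workspace_id = match params.get("workspaceId").and_then(Value::as_str) {
+                Some(value) => value.to_string(),
+                None => return Some(Err("missing `workspaceId`".to_string())),
+            };
+            Some(Ok(json!({ "ok": true, "workspaceId": workspace_id })))
+        }
+        // Not ours: let the dispatcher consult the next domain module.
+        _ => None,
+    }
+}
+
+#[tokio::main]
+async fn main() {
+    let params = json!({ "workspaceId": "ws-1" });
+    println!("{:?}", try_handle("example_method", &params).await);
+    println!("{:?}", try_handle("other_method", &params).await);
+}
+```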
+ +## Shared Cores (Source of Truth) + +All cross-runtime domain behavior belongs in `src-tauri/src/shared/*`: + +- Codex threads/approvals/account/skills/config: `src-tauri/src/shared/codex_core.rs` +- Codex helper commands: `src-tauri/src/shared/codex_aux_core.rs` +- Codex update/version helpers: `src-tauri/src/shared/codex_update_core.rs` +- Workspaces/worktrees: `src-tauri/src/shared/workspaces_core.rs`, `src-tauri/src/shared/workspaces_core/*`, `src-tauri/src/shared/worktree_core.rs` +- Settings model/update: `src-tauri/src/shared/settings_core.rs` +- Files read/write: `src-tauri/src/shared/files_core.rs` +- Git and GitHub logic: `src-tauri/src/shared/git_core.rs`, `src-tauri/src/shared/git_ui_core.rs`, `src-tauri/src/shared/git_ui_core/*` +- Prompts CRUD/listing: `src-tauri/src/shared/prompts_core.rs` +- Usage snapshot and aggregation: `src-tauri/src/shared/local_usage_core.rs` +- Orbit connectivity/auth helpers: `src-tauri/src/shared/orbit_core.rs` +- Process helpers: `src-tauri/src/shared/process_core.rs` + +## Events Map (Backend -> Frontend) + +- Backend emits through sink: `src-tauri/src/event_sink.rs` +- App-server event name: `app-server-event` +- Terminal event names: `terminal-output`, `terminal-exit` +- Frontend fanout hubs: `src/services/events.ts` +- Frontend routing into thread state: `src/features/app/hooks/useAppServerEvents.ts` -> thread hooks/reducer under `src/features/threads/hooks/*` + +If event payload format changes, update parser/guards first in `src/utils/appServerEvents.ts`. + +## Type Contract Files + +Keep Rust and TypeScript contracts in sync: + +- Rust backend types: `src-tauri/src/types.rs` +- Frontend types: `src/types.ts` + +This is required for settings, workspace metadata, app-server payload handling, and RPC response decoding. diff --git a/src-tauri/gen/apple/codex-monitor_iOS/Info.plist b/src-tauri/gen/apple/codex-monitor_iOS/Info.plist index 6de7eadc7..98d0107e0 100644 --- a/src-tauri/gen/apple/codex-monitor_iOS/Info.plist +++ b/src-tauri/gen/apple/codex-monitor_iOS/Info.plist @@ -49,4 +49,4 @@ NSMicrophoneUsageDescription Allow access to the microphone for dictation. 
- + \ No newline at end of file diff --git a/src-tauri/src/bin/codex_monitor_daemon.rs b/src-tauri/src/bin/codex_monitor_daemon.rs index 90b9dcb1b..c866d7592 100644 --- a/src-tauri/src/bin/codex_monitor_daemon.rs +++ b/src-tauri/src/bin/codex_monitor_daemon.rs @@ -890,6 +890,26 @@ impl DaemonState { git_ui_core::get_git_status_core(&self.workspaces, workspace_id).await } + async fn init_git_repo( + &self, + workspace_id: String, + branch: String, + force: bool, + ) -> Result { + git_ui_core::init_git_repo_core(&self.workspaces, workspace_id, branch, force).await + } + + async fn create_github_repo( + &self, + workspace_id: String, + repo: String, + visibility: String, + branch: Option, + ) -> Result { + git_ui_core::create_github_repo_core(&self.workspaces, workspace_id, repo, visibility, branch) + .await + } + async fn list_git_roots( &self, workspace_id: String, diff --git a/src-tauri/src/bin/codex_monitor_daemon/rpc.rs b/src-tauri/src/bin/codex_monitor_daemon/rpc.rs index ef3c2a345..f7dc160e2 100644 --- a/src-tauri/src/bin/codex_monitor_daemon/rpc.rs +++ b/src-tauri/src/bin/codex_monitor_daemon/rpc.rs @@ -1,5 +1,18 @@ use super::*; +#[path = "rpc/codex.rs"] +mod codex; +#[path = "rpc/daemon.rs"] +mod daemon; +#[path = "rpc/dispatcher.rs"] +mod dispatcher; +#[path = "rpc/git.rs"] +mod git; +#[path = "rpc/prompts.rs"] +mod prompts; +#[path = "rpc/workspace.rs"] +mod workspace; + pub(super) fn build_error_response(id: Option, message: &str) -> Option { let id = id?; Some( @@ -51,7 +64,7 @@ pub(super) fn parse_auth_token(params: &Value) -> Option { } } -fn parse_string(value: &Value, key: &str) -> Result { +pub(super) fn parse_string(value: &Value, key: &str) -> Result { match value { Value::Object(map) => map .get(key) @@ -62,7 +75,7 @@ fn parse_string(value: &Value, key: &str) -> Result { } } -fn parse_optional_string(value: &Value, key: &str) -> Option { +pub(super) fn parse_optional_string(value: &Value, key: &str) -> Option { match value { Value::Object(map) => map .get(key) @@ -72,7 +85,7 @@ fn parse_optional_string(value: &Value, key: &str) -> Option { } } -fn parse_optional_u32(value: &Value, key: &str) -> Option { +pub(super) fn parse_optional_u32(value: &Value, key: &str) -> Option { match value { Value::Object(map) => map.get(key).and_then(|value| value.as_u64()).and_then(|v| { if v > u32::MAX as u64 { @@ -85,14 +98,14 @@ fn parse_optional_u32(value: &Value, key: &str) -> Option { } } -fn parse_optional_bool(value: &Value, key: &str) -> Option { +pub(super) fn parse_optional_bool(value: &Value, key: &str) -> Option { match value { Value::Object(map) => map.get(key).and_then(|value| value.as_bool()), _ => None, } } -fn parse_optional_string_array(value: &Value, key: &str) -> Option> { +pub(super) fn parse_optional_string_array(value: &Value, key: &str) -> Option> { match value { Value::Object(map) => map .get(key) @@ -107,642 +120,24 @@ fn parse_optional_string_array(value: &Value, key: &str) -> Option> } } -fn parse_string_array(value: &Value, key: &str) -> Result, String> { +pub(super) fn parse_string_array(value: &Value, key: &str) -> Result, String> { parse_optional_string_array(value, key).ok_or_else(|| format!("missing `{key}`")) } -fn parse_optional_value(value: &Value, key: &str) -> Option { +pub(super) fn parse_optional_value(value: &Value, key: &str) -> Option { match value { Value::Object(map) => map.get(key).cloned(), _ => None, } } -#[derive(Debug, Deserialize)] -#[serde(rename_all = "camelCase")] -struct FileReadRequest { - scope: file_policy::FileScope, - kind: 
file_policy::FileKind, - workspace_id: Option, -} - -#[derive(Debug, Deserialize)] -#[serde(rename_all = "camelCase")] -struct FileWriteRequest { - scope: file_policy::FileScope, - kind: file_policy::FileKind, - workspace_id: Option, - content: String, -} - -fn parse_file_read_request(params: &Value) -> Result { - serde_json::from_value(params.clone()).map_err(|err| err.to_string()) -} - -fn parse_file_write_request(params: &Value) -> Result { - serde_json::from_value(params.clone()).map_err(|err| err.to_string()) -} - pub(super) async fn handle_rpc_request( state: &DaemonState, method: &str, params: Value, client_version: String, ) -> Result { - match method { - "ping" => Ok(json!({ "ok": true })), - "daemon_info" => Ok(state.daemon_info()), - "daemon_shutdown" => { - tokio::spawn(async { - sleep(Duration::from_millis(100)).await; - std::process::exit(0); - }); - Ok(json!({ "ok": true })) - } - "list_workspaces" => { - let workspaces = state.list_workspaces().await; - serde_json::to_value(workspaces).map_err(|err| err.to_string()) - } - "is_workspace_path_dir" => { - let path = parse_string(¶ms, "path")?; - let is_dir = state.is_workspace_path_dir(path).await; - serde_json::to_value(is_dir).map_err(|err| err.to_string()) - } - "add_workspace" => { - let path = parse_string(¶ms, "path")?; - let codex_bin = parse_optional_string(¶ms, "codex_bin"); - let workspace = state.add_workspace(path, codex_bin, client_version).await?; - serde_json::to_value(workspace).map_err(|err| err.to_string()) - } - "add_worktree" => { - let parent_id = parse_string(¶ms, "parentId")?; - let branch = parse_string(¶ms, "branch")?; - let name = parse_optional_string(¶ms, "name"); - let copy_agents_md = parse_optional_bool(¶ms, "copyAgentsMd").unwrap_or(true); - let workspace = state - .add_worktree(parent_id, branch, name, copy_agents_md, client_version) - .await?; - serde_json::to_value(workspace).map_err(|err| err.to_string()) - } - "worktree_setup_status" => { - let workspace_id = parse_string(¶ms, "workspaceId")?; - let status = state.worktree_setup_status(workspace_id).await?; - serde_json::to_value(status).map_err(|err| err.to_string()) - } - "worktree_setup_mark_ran" => { - let workspace_id = parse_string(¶ms, "workspaceId")?; - state.worktree_setup_mark_ran(workspace_id).await?; - Ok(json!({ "ok": true })) - } - "connect_workspace" => { - let id = parse_string(¶ms, "id")?; - state.connect_workspace(id, client_version).await?; - Ok(json!({ "ok": true })) - } - "remove_workspace" => { - let id = parse_string(¶ms, "id")?; - state.remove_workspace(id).await?; - Ok(json!({ "ok": true })) - } - "remove_worktree" => { - let id = parse_string(¶ms, "id")?; - state.remove_worktree(id).await?; - Ok(json!({ "ok": true })) - } - "rename_worktree" => { - let id = parse_string(¶ms, "id")?; - let branch = parse_string(¶ms, "branch")?; - let workspace = state.rename_worktree(id, branch, client_version).await?; - serde_json::to_value(workspace).map_err(|err| err.to_string()) - } - "rename_worktree_upstream" => { - let id = parse_string(¶ms, "id")?; - let old_branch = parse_string(¶ms, "oldBranch")?; - let new_branch = parse_string(¶ms, "newBranch")?; - state - .rename_worktree_upstream(id, old_branch, new_branch) - .await?; - Ok(json!({ "ok": true })) - } - "update_workspace_settings" => { - let id = parse_string(¶ms, "id")?; - let settings_value = match params { - Value::Object(map) => map.get("settings").cloned().unwrap_or(Value::Null), - _ => Value::Null, - }; - let settings: WorkspaceSettings = - 
serde_json::from_value(settings_value).map_err(|err| err.to_string())?; - let workspace = state - .update_workspace_settings(id, settings, client_version) - .await?; - serde_json::to_value(workspace).map_err(|err| err.to_string()) - } - "update_workspace_codex_bin" => { - let id = parse_string(¶ms, "id")?; - let codex_bin = parse_optional_string(¶ms, "codex_bin"); - let workspace = state.update_workspace_codex_bin(id, codex_bin).await?; - serde_json::to_value(workspace).map_err(|err| err.to_string()) - } - "list_workspace_files" => { - let workspace_id = parse_string(¶ms, "workspaceId")?; - let files = state.list_workspace_files(workspace_id).await?; - serde_json::to_value(files).map_err(|err| err.to_string()) - } - "read_workspace_file" => { - let workspace_id = parse_string(¶ms, "workspaceId")?; - let path = parse_string(¶ms, "path")?; - let response = state.read_workspace_file(workspace_id, path).await?; - serde_json::to_value(response).map_err(|err| err.to_string()) - } - "file_read" => { - let request = parse_file_read_request(¶ms)?; - let response = state - .file_read(request.scope, request.kind, request.workspace_id) - .await?; - serde_json::to_value(response).map_err(|err| err.to_string()) - } - "file_write" => { - let request = parse_file_write_request(¶ms)?; - state - .file_write( - request.scope, - request.kind, - request.workspace_id, - request.content, - ) - .await?; - serde_json::to_value(json!({ "ok": true })).map_err(|err| err.to_string()) - } - "get_app_settings" => { - let settings = state.get_app_settings().await; - serde_json::to_value(settings).map_err(|err| err.to_string()) - } - "update_app_settings" => { - let settings_value = match params { - Value::Object(map) => map.get("settings").cloned().unwrap_or(Value::Null), - _ => Value::Null, - }; - let settings: AppSettings = - serde_json::from_value(settings_value).map_err(|err| err.to_string())?; - let updated = state.update_app_settings(settings).await?; - serde_json::to_value(updated).map_err(|err| err.to_string()) - } - "orbit_connect_test" => { - let result = state.orbit_connect_test().await?; - serde_json::to_value(result).map_err(|err| err.to_string()) - } - "orbit_sign_in_start" => { - let result = state.orbit_sign_in_start().await?; - serde_json::to_value(result).map_err(|err| err.to_string()) - } - "orbit_sign_in_poll" => { - let device_code = parse_string(¶ms, "deviceCode")?; - let result = state.orbit_sign_in_poll(device_code).await?; - serde_json::to_value(result).map_err(|err| err.to_string()) - } - "orbit_sign_out" => { - let result = state.orbit_sign_out().await?; - serde_json::to_value(result).map_err(|err| err.to_string()) - } - "get_codex_config_path" => { - let path = settings_core::get_codex_config_path_core()?; - Ok(Value::String(path)) - } - "get_config_model" => { - let workspace_id = parse_string(¶ms, "workspaceId")?; - state.get_config_model(workspace_id).await - } - "start_thread" => { - let workspace_id = parse_string(¶ms, "workspaceId")?; - state.start_thread(workspace_id).await - } - "resume_thread" => { - let workspace_id = parse_string(¶ms, "workspaceId")?; - let thread_id = parse_string(¶ms, "threadId")?; - state.resume_thread(workspace_id, thread_id).await - } - "fork_thread" => { - let workspace_id = parse_string(¶ms, "workspaceId")?; - let thread_id = parse_string(¶ms, "threadId")?; - state.fork_thread(workspace_id, thread_id).await - } - "list_threads" => { - let workspace_id = parse_string(¶ms, "workspaceId")?; - let cursor = parse_optional_string(¶ms, "cursor"); - let limit = 
parse_optional_u32(¶ms, "limit"); - let sort_key = parse_optional_string(¶ms, "sortKey"); - state - .list_threads(workspace_id, cursor, limit, sort_key) - .await - } - "list_mcp_server_status" => { - let workspace_id = parse_string(¶ms, "workspaceId")?; - let cursor = parse_optional_string(¶ms, "cursor"); - let limit = parse_optional_u32(¶ms, "limit"); - state - .list_mcp_server_status(workspace_id, cursor, limit) - .await - } - "archive_thread" => { - let workspace_id = parse_string(¶ms, "workspaceId")?; - let thread_id = parse_string(¶ms, "threadId")?; - state.archive_thread(workspace_id, thread_id).await - } - "compact_thread" => { - let workspace_id = parse_string(¶ms, "workspaceId")?; - let thread_id = parse_string(¶ms, "threadId")?; - state.compact_thread(workspace_id, thread_id).await - } - "set_thread_name" => { - let workspace_id = parse_string(¶ms, "workspaceId")?; - let thread_id = parse_string(¶ms, "threadId")?; - let name = parse_string(¶ms, "name")?; - state.set_thread_name(workspace_id, thread_id, name).await - } - "send_user_message" => { - let workspace_id = parse_string(¶ms, "workspaceId")?; - let thread_id = parse_string(¶ms, "threadId")?; - let text = parse_string(¶ms, "text")?; - let model = parse_optional_string(¶ms, "model"); - let effort = parse_optional_string(¶ms, "effort"); - let access_mode = parse_optional_string(¶ms, "accessMode"); - let images = parse_optional_string_array(¶ms, "images"); - let app_mentions = parse_optional_value(¶ms, "appMentions") - .and_then(|value| value.as_array().cloned()); - let collaboration_mode = parse_optional_value(¶ms, "collaborationMode"); - state - .send_user_message( - workspace_id, - thread_id, - text, - model, - effort, - access_mode, - images, - app_mentions, - collaboration_mode, - ) - .await - } - "turn_interrupt" => { - let workspace_id = parse_string(¶ms, "workspaceId")?; - let thread_id = parse_string(¶ms, "threadId")?; - let turn_id = parse_string(¶ms, "turnId")?; - state.turn_interrupt(workspace_id, thread_id, turn_id).await - } - "turn_steer" => { - let workspace_id = parse_string(¶ms, "workspaceId")?; - let thread_id = parse_string(¶ms, "threadId")?; - let turn_id = parse_string(¶ms, "turnId")?; - let text = parse_string(¶ms, "text")?; - let images = parse_optional_string_array(¶ms, "images"); - let app_mentions = parse_optional_value(¶ms, "appMentions") - .and_then(|value| value.as_array().cloned()); - state - .turn_steer(workspace_id, thread_id, turn_id, text, images, app_mentions) - .await - } - "start_review" => { - let workspace_id = parse_string(¶ms, "workspaceId")?; - let thread_id = parse_string(¶ms, "threadId")?; - let target = params - .as_object() - .and_then(|map| map.get("target")) - .cloned() - .ok_or("missing `target`")?; - let delivery = parse_optional_string(¶ms, "delivery"); - state - .start_review(workspace_id, thread_id, target, delivery) - .await - } - "model_list" => { - let workspace_id = parse_string(¶ms, "workspaceId")?; - state.model_list(workspace_id).await - } - "collaboration_mode_list" => { - let workspace_id = parse_string(¶ms, "workspaceId")?; - state.collaboration_mode_list(workspace_id).await - } - "account_rate_limits" => { - let workspace_id = parse_string(¶ms, "workspaceId")?; - state.account_rate_limits(workspace_id).await - } - "account_read" => { - let workspace_id = parse_string(¶ms, "workspaceId")?; - state.account_read(workspace_id).await - } - "codex_login" => { - let workspace_id = parse_string(¶ms, "workspaceId")?; - state.codex_login(workspace_id).await - } - 
"codex_login_cancel" => { - let workspace_id = parse_string(¶ms, "workspaceId")?; - state.codex_login_cancel(workspace_id).await - } - "skills_list" => { - let workspace_id = parse_string(¶ms, "workspaceId")?; - state.skills_list(workspace_id).await - } - "apps_list" => { - let workspace_id = parse_string(¶ms, "workspaceId")?; - let cursor = parse_optional_string(¶ms, "cursor"); - let limit = parse_optional_u32(¶ms, "limit"); - let thread_id = parse_optional_string(¶ms, "threadId"); - state.apps_list(workspace_id, cursor, limit, thread_id).await - } - "respond_to_server_request" => { - let workspace_id = parse_string(¶ms, "workspaceId")?; - let map = params.as_object().ok_or("missing requestId")?; - let request_id = map - .get("requestId") - .cloned() - .filter(|value| value.is_number() || value.is_string()) - .ok_or("missing requestId")?; - let result = map.get("result").cloned().ok_or("missing `result`")?; - state - .respond_to_server_request(workspace_id, request_id, result) - .await - } - "remember_approval_rule" => { - let workspace_id = parse_string(¶ms, "workspaceId")?; - let command = parse_string_array(¶ms, "command")?; - state.remember_approval_rule(workspace_id, command).await - } - "add_clone" => { - let source_workspace_id = parse_string(¶ms, "sourceWorkspaceId")?; - let copies_folder = parse_string(¶ms, "copiesFolder")?; - let copy_name = parse_string(¶ms, "copyName")?; - let workspace = state - .add_clone( - source_workspace_id, - copies_folder, - copy_name, - client_version, - ) - .await?; - serde_json::to_value(workspace).map_err(|err| err.to_string()) - } - "apply_worktree_changes" => { - let workspace_id = parse_string(¶ms, "workspaceId")?; - state.apply_worktree_changes(workspace_id).await?; - Ok(json!({ "ok": true })) - } - "open_workspace_in" => { - let path = parse_string(¶ms, "path")?; - let app = parse_optional_string(¶ms, "app"); - let command = parse_optional_string(¶ms, "command"); - let args = parse_optional_string_array(¶ms, "args").unwrap_or_default(); - state.open_workspace_in(path, app, args, command).await?; - Ok(json!({ "ok": true })) - } - "get_open_app_icon" => { - let app_name = parse_string(¶ms, "appName")?; - let icon = state.get_open_app_icon(app_name).await?; - serde_json::to_value(icon).map_err(|err| err.to_string()) - } - "get_git_status" => { - let workspace_id = parse_string(¶ms, "workspaceId")?; - state.get_git_status(workspace_id).await - } - "list_git_roots" => { - let workspace_id = parse_string(¶ms, "workspaceId")?; - let depth = parse_optional_u32(¶ms, "depth").map(|value| value as usize); - let roots = state.list_git_roots(workspace_id, depth).await?; - serde_json::to_value(roots).map_err(|err| err.to_string()) - } - "get_git_diffs" => { - let workspace_id = parse_string(¶ms, "workspaceId")?; - let diffs = state.get_git_diffs(workspace_id).await?; - serde_json::to_value(diffs).map_err(|err| err.to_string()) - } - "get_git_log" => { - let workspace_id = parse_string(¶ms, "workspaceId")?; - let limit = parse_optional_u32(¶ms, "limit").map(|value| value as usize); - let log = state.get_git_log(workspace_id, limit).await?; - serde_json::to_value(log).map_err(|err| err.to_string()) - } - "get_git_commit_diff" => { - let workspace_id = parse_string(¶ms, "workspaceId")?; - let sha = parse_string(¶ms, "sha")?; - let diff = state.get_git_commit_diff(workspace_id, sha).await?; - serde_json::to_value(diff).map_err(|err| err.to_string()) - } - "get_git_remote" => { - let workspace_id = parse_string(¶ms, "workspaceId")?; - let remote = 
state.get_git_remote(workspace_id).await?; - serde_json::to_value(remote).map_err(|err| err.to_string()) - } - "stage_git_file" => { - let workspace_id = parse_string(¶ms, "workspaceId")?; - let path = parse_string(¶ms, "path")?; - state.stage_git_file(workspace_id, path).await?; - Ok(json!({ "ok": true })) - } - "stage_git_all" => { - let workspace_id = parse_string(¶ms, "workspaceId")?; - state.stage_git_all(workspace_id).await?; - Ok(json!({ "ok": true })) - } - "unstage_git_file" => { - let workspace_id = parse_string(¶ms, "workspaceId")?; - let path = parse_string(¶ms, "path")?; - state.unstage_git_file(workspace_id, path).await?; - Ok(json!({ "ok": true })) - } - "revert_git_file" => { - let workspace_id = parse_string(¶ms, "workspaceId")?; - let path = parse_string(¶ms, "path")?; - state.revert_git_file(workspace_id, path).await?; - Ok(json!({ "ok": true })) - } - "revert_git_all" => { - let workspace_id = parse_string(¶ms, "workspaceId")?; - state.revert_git_all(workspace_id).await?; - Ok(json!({ "ok": true })) - } - "commit_git" => { - let workspace_id = parse_string(¶ms, "workspaceId")?; - let message = parse_string(¶ms, "message")?; - state.commit_git(workspace_id, message).await?; - Ok(json!({ "ok": true })) - } - "push_git" => { - let workspace_id = parse_string(¶ms, "workspaceId")?; - state.push_git(workspace_id).await?; - Ok(json!({ "ok": true })) - } - "pull_git" => { - let workspace_id = parse_string(¶ms, "workspaceId")?; - state.pull_git(workspace_id).await?; - Ok(json!({ "ok": true })) - } - "fetch_git" => { - let workspace_id = parse_string(¶ms, "workspaceId")?; - state.fetch_git(workspace_id).await?; - Ok(json!({ "ok": true })) - } - "sync_git" => { - let workspace_id = parse_string(¶ms, "workspaceId")?; - state.sync_git(workspace_id).await?; - Ok(json!({ "ok": true })) - } - "get_github_issues" => { - let workspace_id = parse_string(¶ms, "workspaceId")?; - let issues = state.get_github_issues(workspace_id).await?; - serde_json::to_value(issues).map_err(|err| err.to_string()) - } - "get_github_pull_requests" => { - let workspace_id = parse_string(¶ms, "workspaceId")?; - let prs = state.get_github_pull_requests(workspace_id).await?; - serde_json::to_value(prs).map_err(|err| err.to_string()) - } - "get_github_pull_request_diff" => { - let workspace_id = parse_string(¶ms, "workspaceId")?; - let pr_number = - parse_optional_u64(¶ms, "prNumber").ok_or("missing or invalid `prNumber`")?; - let diff = state - .get_github_pull_request_diff(workspace_id, pr_number) - .await?; - serde_json::to_value(diff).map_err(|err| err.to_string()) - } - "get_github_pull_request_comments" => { - let workspace_id = parse_string(¶ms, "workspaceId")?; - let pr_number = - parse_optional_u64(¶ms, "prNumber").ok_or("missing or invalid `prNumber`")?; - let comments = state - .get_github_pull_request_comments(workspace_id, pr_number) - .await?; - serde_json::to_value(comments).map_err(|err| err.to_string()) - } - "list_git_branches" => { - let workspace_id = parse_string(¶ms, "workspaceId")?; - state.list_git_branches(workspace_id).await - } - "checkout_git_branch" => { - let workspace_id = parse_string(¶ms, "workspaceId")?; - let name = parse_string(¶ms, "name")?; - state.checkout_git_branch(workspace_id, name).await?; - Ok(json!({ "ok": true })) - } - "create_git_branch" => { - let workspace_id = parse_string(¶ms, "workspaceId")?; - let name = parse_string(¶ms, "name")?; - state.create_git_branch(workspace_id, name).await?; - Ok(json!({ "ok": true })) - } - "prompts_list" => { - let workspace_id = 
parse_string(¶ms, "workspaceId")?; - let prompts = state.prompts_list(workspace_id).await?; - serde_json::to_value(prompts).map_err(|err| err.to_string()) - } - "prompts_workspace_dir" => { - let workspace_id = parse_string(¶ms, "workspaceId")?; - let dir = state.prompts_workspace_dir(workspace_id).await?; - Ok(Value::String(dir)) - } - "prompts_global_dir" => { - let workspace_id = parse_string(¶ms, "workspaceId")?; - let dir = state.prompts_global_dir(workspace_id).await?; - Ok(Value::String(dir)) - } - "prompts_create" => { - let workspace_id = parse_string(¶ms, "workspaceId")?; - let scope = parse_string(¶ms, "scope")?; - let name = parse_string(¶ms, "name")?; - let description = parse_optional_string(¶ms, "description"); - let argument_hint = parse_optional_string(¶ms, "argumentHint"); - let content = parse_string(¶ms, "content")?; - let prompt = state - .prompts_create( - workspace_id, - scope, - name, - description, - argument_hint, - content, - ) - .await?; - serde_json::to_value(prompt).map_err(|err| err.to_string()) - } - "prompts_update" => { - let workspace_id = parse_string(¶ms, "workspaceId")?; - let path = parse_string(¶ms, "path")?; - let name = parse_string(¶ms, "name")?; - let description = parse_optional_string(¶ms, "description"); - let argument_hint = parse_optional_string(¶ms, "argumentHint"); - let content = parse_string(¶ms, "content")?; - let prompt = state - .prompts_update( - workspace_id, - path, - name, - description, - argument_hint, - content, - ) - .await?; - serde_json::to_value(prompt).map_err(|err| err.to_string()) - } - "prompts_delete" => { - let workspace_id = parse_string(¶ms, "workspaceId")?; - let path = parse_string(¶ms, "path")?; - state.prompts_delete(workspace_id, path).await?; - Ok(json!({ "ok": true })) - } - "prompts_move" => { - let workspace_id = parse_string(¶ms, "workspaceId")?; - let path = parse_string(¶ms, "path")?; - let scope = parse_string(¶ms, "scope")?; - let prompt = state.prompts_move(workspace_id, path, scope).await?; - serde_json::to_value(prompt).map_err(|err| err.to_string()) - } - "codex_doctor" => { - let codex_bin = parse_optional_string(¶ms, "codexBin"); - let codex_args = parse_optional_string(¶ms, "codexArgs"); - state.codex_doctor(codex_bin, codex_args).await - } - "generate_commit_message" => { - let workspace_id = parse_string(¶ms, "workspaceId")?; - let message = state.generate_commit_message(workspace_id).await?; - Ok(Value::String(message)) - } - "generate_run_metadata" => { - let workspace_id = parse_string(¶ms, "workspaceId")?; - let prompt = parse_string(¶ms, "prompt")?; - state.generate_run_metadata(workspace_id, prompt).await - } - "local_usage_snapshot" => { - let days = parse_optional_u32(¶ms, "days"); - let workspace_path = parse_optional_string(¶ms, "workspacePath"); - let snapshot = state.local_usage_snapshot(days, workspace_path).await?; - serde_json::to_value(snapshot).map_err(|err| err.to_string()) - } - "menu_set_accelerators" => { - let updates: Vec = match ¶ms { - Value::Object(map) => map - .get("updates") - .cloned() - .map(serde_json::from_value) - .transpose() - .map_err(|err| err.to_string())? 
- .unwrap_or_default(), - _ => Vec::new(), - }; - state.menu_set_accelerators(updates).await?; - Ok(json!({ "ok": true })) - } - "is_macos_debug_build" => { - let is_debug = state.is_macos_debug_build().await; - Ok(Value::Bool(is_debug)) - } - "send_notification_fallback" => { - let title = parse_string(&params, "title")?; - let body = parse_string(&params, "body")?; - state.send_notification_fallback(title, body).await?; - Ok(json!({ "ok": true })) - } - _ => Err(format!("unknown method: {method}")), - } + dispatcher::dispatch_rpc_request(state, method, &params, &client_version).await } pub(super) async fn forward_events( diff --git a/src-tauri/src/bin/codex_monitor_daemon/rpc/codex.rs b/src-tauri/src/bin/codex_monitor_daemon/rpc/codex.rs new file mode 100644 index 000000000..35e75ffe2 --- /dev/null +++ b/src-tauri/src/bin/codex_monitor_daemon/rpc/codex.rs @@ -0,0 +1,338 @@ +use super::*; + +pub(super) async fn try_handle( + state: &DaemonState, + method: &str, + params: &Value, +) -> Option<Result<Value, String>> { + match method { + "get_codex_config_path" => { + let path = match settings_core::get_codex_config_path_core() { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + Some(Ok(Value::String(path))) + } + "get_config_model" => { + let workspace_id = match parse_string(params, "workspaceId") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + Some(state.get_config_model(workspace_id).await) + } + "start_thread" => { + let workspace_id = match parse_string(params, "workspaceId") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + Some(state.start_thread(workspace_id).await) + } + "resume_thread" => { + let workspace_id = match parse_string(params, "workspaceId") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + let thread_id = match parse_string(params, "threadId") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + Some(state.resume_thread(workspace_id, thread_id).await) + } + "fork_thread" => { + let workspace_id = match parse_string(params, "workspaceId") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + let thread_id = match parse_string(params, "threadId") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + Some(state.fork_thread(workspace_id, thread_id).await) + } + "list_threads" => { + let workspace_id = match parse_string(params, "workspaceId") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + let cursor = parse_optional_string(params, "cursor"); + let limit = parse_optional_u32(params, "limit"); + let sort_key = parse_optional_string(params, "sortKey"); + Some( + state + .list_threads(workspace_id, cursor, limit, sort_key) + .await, + ) + } + "list_mcp_server_status" => { + let workspace_id = match parse_string(params, "workspaceId") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + let cursor = parse_optional_string(params, "cursor"); + let limit = parse_optional_u32(params, "limit"); + Some( + state + .list_mcp_server_status(workspace_id, cursor, limit) + .await, + ) + } + "archive_thread" => { + let workspace_id = match parse_string(params, "workspaceId") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + let thread_id = match parse_string(params, "threadId") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + Some(state.archive_thread(workspace_id, thread_id).await) + } + "compact_thread" => { + let workspace_id = match parse_string(params, "workspaceId") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; +
let thread_id = match parse_string(params, "threadId") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + Some(state.compact_thread(workspace_id, thread_id).await) + } + "set_thread_name" => { + let workspace_id = match parse_string(params, "workspaceId") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + let thread_id = match parse_string(params, "threadId") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + let name = match parse_string(params, "name") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + Some(state.set_thread_name(workspace_id, thread_id, name).await) + } + "send_user_message" => { + let workspace_id = match parse_string(params, "workspaceId") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + let thread_id = match parse_string(params, "threadId") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + let text = match parse_string(params, "text") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + let model = parse_optional_string(params, "model"); + let effort = parse_optional_string(params, "effort"); + let access_mode = parse_optional_string(params, "accessMode"); + let images = parse_optional_string_array(params, "images"); + let app_mentions = parse_optional_value(params, "appMentions") + .and_then(|value| value.as_array().cloned()); + let collaboration_mode = parse_optional_value(params, "collaborationMode"); + Some( + state + .send_user_message( + workspace_id, + thread_id, + text, + model, + effort, + access_mode, + images, + app_mentions, + collaboration_mode, + ) + .await, + ) + } + "turn_interrupt" => { + let workspace_id = match parse_string(params, "workspaceId") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + let thread_id = match parse_string(params, "threadId") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + let turn_id = match parse_string(params, "turnId") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + Some(state.turn_interrupt(workspace_id, thread_id, turn_id).await) + } + "turn_steer" => { + let workspace_id = match parse_string(params, "workspaceId") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + let thread_id = match parse_string(params, "threadId") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + let turn_id = match parse_string(params, "turnId") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + let text = match parse_string(params, "text") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + let images = parse_optional_string_array(params, "images"); + let app_mentions = parse_optional_value(params, "appMentions") + .and_then(|value| value.as_array().cloned()); + Some( + state + .turn_steer(workspace_id, thread_id, turn_id, text, images, app_mentions) + .await, + ) + } + "start_review" => { + let workspace_id = match parse_string(params, "workspaceId") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + let thread_id = match parse_string(params, "threadId") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + let target = match params + .as_object() + .and_then(|map| map.get("target")) + .cloned() + .ok_or("missing `target`") + { + Ok(value) => value, + Err(err) => return Some(Err(err.to_string())), + }; + let delivery = parse_optional_string(params, "delivery"); + Some( + state + .start_review(workspace_id, thread_id, target, delivery) + .await, + ) + } + "model_list" => { + 
let workspace_id = match parse_string(params, "workspaceId") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + Some(state.model_list(workspace_id).await) + } + "collaboration_mode_list" => { + let workspace_id = match parse_string(params, "workspaceId") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + Some(state.collaboration_mode_list(workspace_id).await) + } + "account_rate_limits" => { + let workspace_id = match parse_string(params, "workspaceId") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + Some(state.account_rate_limits(workspace_id).await) + } + "account_read" => { + let workspace_id = match parse_string(params, "workspaceId") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + Some(state.account_read(workspace_id).await) + } + "codex_login" => { + let workspace_id = match parse_string(params, "workspaceId") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + Some(state.codex_login(workspace_id).await) + } + "codex_login_cancel" => { + let workspace_id = match parse_string(params, "workspaceId") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + Some(state.codex_login_cancel(workspace_id).await) + } + "skills_list" => { + let workspace_id = match parse_string(params, "workspaceId") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + Some(state.skills_list(workspace_id).await) + } + "apps_list" => { + let workspace_id = match parse_string(params, "workspaceId") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + let cursor = parse_optional_string(params, "cursor"); + let limit = parse_optional_u32(params, "limit"); + let thread_id = parse_optional_string(params, "threadId"); + Some( + state + .apps_list(workspace_id, cursor, limit, thread_id) + .await, + ) + } + "respond_to_server_request" => { + let workspace_id = match parse_string(params, "workspaceId") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + let map = match params.as_object().ok_or("missing requestId") { + Ok(value) => value, + Err(err) => return Some(Err(err.to_string())), + }; + let request_id = match map + .get("requestId") + .cloned() + .filter(|value| value.is_number() || value.is_string()) + .ok_or("missing requestId") + { + Ok(value) => value, + Err(err) => return Some(Err(err.to_string())), + }; + let result = match map.get("result").cloned().ok_or("missing `result`") { + Ok(value) => value, + Err(err) => return Some(Err(err.to_string())), + }; + Some( + state + .respond_to_server_request(workspace_id, request_id, result) + .await, + ) + } + "remember_approval_rule" => { + let workspace_id = match parse_string(params, "workspaceId") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + let command = match parse_string_array(params, "command") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + Some(state.remember_approval_rule(workspace_id, command).await) + } + "codex_doctor" => { + let codex_bin = parse_optional_string(params, "codexBin"); + let codex_args = parse_optional_string(params, "codexArgs"); + Some(state.codex_doctor(codex_bin, codex_args).await) + } + "generate_run_metadata" => { + let workspace_id = match parse_string(params, "workspaceId") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + let prompt = match parse_string(params, "prompt") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + Some(state.generate_run_metadata(workspace_id, prompt).await) + } + _ => None, + } +} diff 
--git a/src-tauri/src/bin/codex_monitor_daemon/rpc/daemon.rs b/src-tauri/src/bin/codex_monitor_daemon/rpc/daemon.rs new file mode 100644 index 000000000..e491f0372 --- /dev/null +++ b/src-tauri/src/bin/codex_monitor_daemon/rpc/daemon.rs @@ -0,0 +1,60 @@ +use super::*; + +pub(super) async fn try_handle( + state: &DaemonState, + method: &str, + params: &Value, +) -> Option<Result<Value, String>> { + match method { + "ping" => Some(Ok(json!({ "ok": true }))), + "daemon_info" => Some(Ok(state.daemon_info())), + "daemon_shutdown" => { + tokio::spawn(async { + tokio::time::sleep(std::time::Duration::from_millis(100)).await; + std::process::exit(0); + }); + Some(Ok(json!({ "ok": true }))) + } + "menu_set_accelerators" => { + let updates: Vec = match params { + Value::Object(map) => match map + .get("updates") + .cloned() + .map(serde_json::from_value) + .transpose() + { + Ok(value) => value.unwrap_or_default(), + Err(err) => return Some(Err(err.to_string())), + }, + _ => Vec::new(), + }; + Some( + state + .menu_set_accelerators(updates) + .await + .map(|_| json!({ "ok": true })), + ) + } + "is_macos_debug_build" => { + let is_debug = state.is_macos_debug_build().await; + Some(Ok(Value::Bool(is_debug))) + } + "send_notification_fallback" => { + let title = match parse_string(params, "title") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + let body = match parse_string(params, "body") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + Some( + state + .send_notification_fallback(title, body) + .await + .map(|_| json!({ "ok": true })), + ) + } + _ => None, + } +} diff --git a/src-tauri/src/bin/codex_monitor_daemon/rpc/dispatcher.rs b/src-tauri/src/bin/codex_monitor_daemon/rpc/dispatcher.rs new file mode 100644 index 000000000..781b3d7bb --- /dev/null +++ b/src-tauri/src/bin/codex_monitor_daemon/rpc/dispatcher.rs @@ -0,0 +1,30 @@ +use super::*; + +pub(super) async fn dispatch_rpc_request( + state: &DaemonState, + method: &str, + params: &Value, + client_version: &str, +) -> Result<Value, String> { + if let Some(result) = daemon::try_handle(state, method, params).await { + return result; + } + + if let Some(result) = workspace::try_handle(state, method, params, client_version).await { + return result; + } + + if let Some(result) = codex::try_handle(state, method, params).await { + return result; + } + + if let Some(result) = git::try_handle(state, method, params).await { + return result; + } + + if let Some(result) = prompts::try_handle(state, method, params).await { + return result; + } + + Err(format!("unknown method: {method}")) +} diff --git a/src-tauri/src/bin/codex_monitor_daemon/rpc/git.rs b/src-tauri/src/bin/codex_monitor_daemon/rpc/git.rs new file mode 100644 index 000000000..203e240b5 --- /dev/null +++ b/src-tauri/src/bin/codex_monitor_daemon/rpc/git.rs @@ -0,0 +1,359 @@ +use super::*; + +pub(super) async fn try_handle( + state: &DaemonState, + method: &str, + params: &Value, +) -> Option<Result<Value, String>> { + match method { + "get_git_status" => { + let workspace_id = match parse_string(params, "workspaceId") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + Some(state.get_git_status(workspace_id).await) + } + "init_git_repo" => { + let workspace_id = match parse_string(params, "workspaceId") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + let branch = match parse_string(params, "branch") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + let force = parse_optional_bool(params, "force").unwrap_or(false); + Some(state.init_git_repo(workspace_id, branch,
force).await) + } + "create_github_repo" => { + let workspace_id = match parse_string(params, "workspaceId") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + let repo = match parse_string(params, "repo") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + let visibility = match parse_string(params, "visibility") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + let branch = parse_optional_string(params, "branch"); + Some( + state + .create_github_repo(workspace_id, repo, visibility, branch) + .await, + ) + } + "list_git_roots" => { + let workspace_id = match parse_string(params, "workspaceId") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + let depth = parse_optional_u32(params, "depth").map(|value| value as usize); + let roots = match state.list_git_roots(workspace_id, depth).await { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + Some(serde_json::to_value(roots).map_err(|err| err.to_string())) + } + "get_git_diffs" => { + let workspace_id = match parse_string(params, "workspaceId") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + let diffs = match state.get_git_diffs(workspace_id).await { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + Some(serde_json::to_value(diffs).map_err(|err| err.to_string())) + } + "get_git_log" => { + let workspace_id = match parse_string(params, "workspaceId") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + let limit = parse_optional_u32(params, "limit").map(|value| value as usize); + let log = match state.get_git_log(workspace_id, limit).await { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + Some(serde_json::to_value(log).map_err(|err| err.to_string())) + } + "get_git_commit_diff" => { + let workspace_id = match parse_string(params, "workspaceId") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + let sha = match parse_string(params, "sha") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + let diff = match state.get_git_commit_diff(workspace_id, sha).await { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + Some(serde_json::to_value(diff).map_err(|err| err.to_string())) + } + "get_git_remote" => { + let workspace_id = match parse_string(params, "workspaceId") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + let remote = match state.get_git_remote(workspace_id).await { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + Some(serde_json::to_value(remote).map_err(|err| err.to_string())) + } + "stage_git_file" => { + let workspace_id = match parse_string(params, "workspaceId") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + let path = match parse_string(params, "path") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + Some( + state + .stage_git_file(workspace_id, path) + .await + .map(|_| json!({ "ok": true })), + ) + } + "stage_git_all" => { + let workspace_id = match parse_string(params, "workspaceId") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + Some( + state + .stage_git_all(workspace_id) + .await + .map(|_| json!({ "ok": true })), + ) + } + "unstage_git_file" => { + let workspace_id = match parse_string(params, "workspaceId") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + let path = match parse_string(params, "path") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + Some( + state + .unstage_git_file(workspace_id, 
path) + .await + .map(|_| json!({ "ok": true })), + ) + } + "revert_git_file" => { + let workspace_id = match parse_string(params, "workspaceId") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + let path = match parse_string(params, "path") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + Some( + state + .revert_git_file(workspace_id, path) + .await + .map(|_| json!({ "ok": true })), + ) + } + "revert_git_all" => { + let workspace_id = match parse_string(params, "workspaceId") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + Some( + state + .revert_git_all(workspace_id) + .await + .map(|_| json!({ "ok": true })), + ) + } + "commit_git" => { + let workspace_id = match parse_string(params, "workspaceId") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + let message = match parse_string(params, "message") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + Some( + state + .commit_git(workspace_id, message) + .await + .map(|_| json!({ "ok": true })), + ) + } + "push_git" => { + let workspace_id = match parse_string(params, "workspaceId") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + Some( + state + .push_git(workspace_id) + .await + .map(|_| json!({ "ok": true })), + ) + } + "pull_git" => { + let workspace_id = match parse_string(params, "workspaceId") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + Some( + state + .pull_git(workspace_id) + .await + .map(|_| json!({ "ok": true })), + ) + } + "fetch_git" => { + let workspace_id = match parse_string(params, "workspaceId") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + Some( + state + .fetch_git(workspace_id) + .await + .map(|_| json!({ "ok": true })), + ) + } + "sync_git" => { + let workspace_id = match parse_string(params, "workspaceId") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + Some( + state + .sync_git(workspace_id) + .await + .map(|_| json!({ "ok": true })), + ) + } + "get_github_issues" => { + let workspace_id = match parse_string(params, "workspaceId") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + let issues = match state.get_github_issues(workspace_id).await { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + Some(serde_json::to_value(issues).map_err(|err| err.to_string())) + } + "get_github_pull_requests" => { + let workspace_id = match parse_string(params, "workspaceId") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + let prs = match state.get_github_pull_requests(workspace_id).await { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + Some(serde_json::to_value(prs).map_err(|err| err.to_string())) + } + "get_github_pull_request_diff" => { + let workspace_id = match parse_string(params, "workspaceId") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + let pr_number = match super::super::parse_optional_u64(params, "prNumber") + .ok_or("missing or invalid `prNumber`") + { + Ok(value) => value, + Err(err) => return Some(Err(err.to_string())), + }; + let diff = match state + .get_github_pull_request_diff(workspace_id, pr_number) + .await + { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + Some(serde_json::to_value(diff).map_err(|err| err.to_string())) + } + "get_github_pull_request_comments" => { + let workspace_id = match parse_string(params, "workspaceId") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + let pr_number = match 
super::super::parse_optional_u64(params, "prNumber") + .ok_or("missing or invalid `prNumber`") + { + Ok(value) => value, + Err(err) => return Some(Err(err.to_string())), + }; + let comments = match state + .get_github_pull_request_comments(workspace_id, pr_number) + .await + { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + Some(serde_json::to_value(comments).map_err(|err| err.to_string())) + } + "list_git_branches" => { + let workspace_id = match parse_string(params, "workspaceId") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + Some(state.list_git_branches(workspace_id).await) + } + "checkout_git_branch" => { + let workspace_id = match parse_string(params, "workspaceId") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + let name = match parse_string(params, "name") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + Some( + state + .checkout_git_branch(workspace_id, name) + .await + .map(|_| json!({ "ok": true })), + ) + } + "create_git_branch" => { + let workspace_id = match parse_string(params, "workspaceId") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + let name = match parse_string(params, "name") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + Some( + state + .create_git_branch(workspace_id, name) + .await + .map(|_| json!({ "ok": true })), + ) + } + "generate_commit_message" => { + let workspace_id = match parse_string(params, "workspaceId") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + let message = match state.generate_commit_message(workspace_id).await { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + Some(Ok(Value::String(message))) + } + _ => None, + } +} diff --git a/src-tauri/src/bin/codex_monitor_daemon/rpc/prompts.rs b/src-tauri/src/bin/codex_monitor_daemon/rpc/prompts.rs new file mode 100644 index 000000000..353ea4c17 --- /dev/null +++ b/src-tauri/src/bin/codex_monitor_daemon/rpc/prompts.rs @@ -0,0 +1,149 @@ +use super::*; + +pub(super) async fn try_handle( + state: &DaemonState, + method: &str, + params: &Value, +) -> Option> { + match method { + "prompts_list" => { + let workspace_id = match parse_string(params, "workspaceId") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + let prompts = match state.prompts_list(workspace_id).await { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + Some(serde_json::to_value(prompts).map_err(|err| err.to_string())) + } + "prompts_workspace_dir" => { + let workspace_id = match parse_string(params, "workspaceId") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + let dir = match state.prompts_workspace_dir(workspace_id).await { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + Some(Ok(Value::String(dir))) + } + "prompts_global_dir" => { + let workspace_id = match parse_string(params, "workspaceId") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + let dir = match state.prompts_global_dir(workspace_id).await { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + Some(Ok(Value::String(dir))) + } + "prompts_create" => { + let workspace_id = match parse_string(params, "workspaceId") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + let scope = match parse_string(params, "scope") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + let name = match parse_string(params, "name") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + let description = 
parse_optional_string(params, "description"); + let argument_hint = parse_optional_string(params, "argumentHint"); + let content = match parse_string(params, "content") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + let prompt = match state + .prompts_create( + workspace_id, + scope, + name, + description, + argument_hint, + content, + ) + .await + { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + Some(serde_json::to_value(prompt).map_err(|err| err.to_string())) + } + "prompts_update" => { + let workspace_id = match parse_string(params, "workspaceId") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + let path = match parse_string(params, "path") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + let name = match parse_string(params, "name") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + let description = parse_optional_string(params, "description"); + let argument_hint = parse_optional_string(params, "argumentHint"); + let content = match parse_string(params, "content") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + let prompt = match state + .prompts_update( + workspace_id, + path, + name, + description, + argument_hint, + content, + ) + .await + { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + Some(serde_json::to_value(prompt).map_err(|err| err.to_string())) + } + "prompts_delete" => { + let workspace_id = match parse_string(params, "workspaceId") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + let path = match parse_string(params, "path") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + Some( + state + .prompts_delete(workspace_id, path) + .await + .map(|_| json!({ "ok": true })), + ) + } + "prompts_move" => { + let workspace_id = match parse_string(params, "workspaceId") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + let path = match parse_string(params, "path") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + let scope = match parse_string(params, "scope") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + let prompt = match state.prompts_move(workspace_id, path, scope).await { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + Some(serde_json::to_value(prompt).map_err(|err| err.to_string())) + } + _ => None, + } +} diff --git a/src-tauri/src/bin/codex_monitor_daemon/rpc/workspace.rs b/src-tauri/src/bin/codex_monitor_daemon/rpc/workspace.rs new file mode 100644 index 000000000..d26b8c36d --- /dev/null +++ b/src-tauri/src/bin/codex_monitor_daemon/rpc/workspace.rs @@ -0,0 +1,404 @@ +use super::*; + +#[derive(Debug, Deserialize)] +#[serde(rename_all = "camelCase")] +struct FileReadRequest { + scope: file_policy::FileScope, + kind: file_policy::FileKind, + workspace_id: Option<String>, +} + +#[derive(Debug, Deserialize)] +#[serde(rename_all = "camelCase")] +struct FileWriteRequest { + scope: file_policy::FileScope, + kind: file_policy::FileKind, + workspace_id: Option<String>, + content: String, +} + +fn parse_file_read_request(params: &Value) -> Result<FileReadRequest, String> { + serde_json::from_value(params.clone()).map_err(|err| err.to_string()) +} + +fn parse_file_write_request(params: &Value) -> Result<FileWriteRequest, String> { + serde_json::from_value(params.clone()).map_err(|err| err.to_string()) +} + +pub(super) async fn try_handle( + state: &DaemonState, + method: &str, + params: &Value, + client_version: &str, +) -> Option<Result<Value, String>> { + match method { + "list_workspaces" => { + let workspaces =
state.list_workspaces().await; + Some(serde_json::to_value(workspaces).map_err(|err| err.to_string())) + } + "is_workspace_path_dir" => { + let path = match parse_string(params, "path") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + let is_dir = state.is_workspace_path_dir(path).await; + Some(serde_json::to_value(is_dir).map_err(|err| err.to_string())) + } + "add_workspace" => { + let path = match parse_string(params, "path") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + let codex_bin = parse_optional_string(params, "codex_bin"); + let workspace = match state + .add_workspace(path, codex_bin, client_version.to_string()) + .await + { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + Some(serde_json::to_value(workspace).map_err(|err| err.to_string())) + } + "add_worktree" => { + let parent_id = match parse_string(params, "parentId") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + let branch = match parse_string(params, "branch") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + let name = parse_optional_string(params, "name"); + let copy_agents_md = parse_optional_bool(params, "copyAgentsMd").unwrap_or(true); + let workspace = match state + .add_worktree( + parent_id, + branch, + name, + copy_agents_md, + client_version.to_string(), + ) + .await + { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + Some(serde_json::to_value(workspace).map_err(|err| err.to_string())) + } + "worktree_setup_status" => { + let workspace_id = match parse_string(params, "workspaceId") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + let status = match state.worktree_setup_status(workspace_id).await { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + Some(serde_json::to_value(status).map_err(|err| err.to_string())) + } + "worktree_setup_mark_ran" => { + let workspace_id = match parse_string(params, "workspaceId") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + Some( + state + .worktree_setup_mark_ran(workspace_id) + .await + .map(|_| json!({ "ok": true })), + ) + } + "connect_workspace" => { + let id = match parse_string(params, "id") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + Some( + state + .connect_workspace(id, client_version.to_string()) + .await + .map(|_| json!({ "ok": true })), + ) + } + "remove_workspace" => { + let id = match parse_string(params, "id") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + Some( + state + .remove_workspace(id) + .await + .map(|_| json!({ "ok": true })), + ) + } + "remove_worktree" => { + let id = match parse_string(params, "id") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + Some( + state + .remove_worktree(id) + .await + .map(|_| json!({ "ok": true })), + ) + } + "rename_worktree" => { + let id = match parse_string(params, "id") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + let branch = match parse_string(params, "branch") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + let workspace = match state + .rename_worktree(id, branch, client_version.to_string()) + .await + { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + Some(serde_json::to_value(workspace).map_err(|err| err.to_string())) + } + "rename_worktree_upstream" => { + let id = match parse_string(params, "id") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + let old_branch = match parse_string(params, "oldBranch") { + 
Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + let new_branch = match parse_string(params, "newBranch") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + Some( + state + .rename_worktree_upstream(id, old_branch, new_branch) + .await + .map(|_| json!({ "ok": true })), + ) + } + "update_workspace_settings" => { + let id = match parse_string(params, "id") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + let settings_value = match params { + Value::Object(map) => map.get("settings").cloned().unwrap_or(Value::Null), + _ => Value::Null, + }; + let settings: WorkspaceSettings = match serde_json::from_value(settings_value) { + Ok(value) => value, + Err(err) => return Some(Err(err.to_string())), + }; + let workspace = match state + .update_workspace_settings(id, settings, client_version.to_string()) + .await + { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + Some(serde_json::to_value(workspace).map_err(|err| err.to_string())) + } + "update_workspace_codex_bin" => { + let id = match parse_string(params, "id") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + let codex_bin = parse_optional_string(params, "codex_bin"); + let workspace = match state.update_workspace_codex_bin(id, codex_bin).await { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + Some(serde_json::to_value(workspace).map_err(|err| err.to_string())) + } + "list_workspace_files" => { + let workspace_id = match parse_string(params, "workspaceId") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + let files = match state.list_workspace_files(workspace_id).await { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + Some(serde_json::to_value(files).map_err(|err| err.to_string())) + } + "read_workspace_file" => { + let workspace_id = match parse_string(params, "workspaceId") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + let path = match parse_string(params, "path") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + let response = match state.read_workspace_file(workspace_id, path).await { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + Some(serde_json::to_value(response).map_err(|err| err.to_string())) + } + "file_read" => { + let request = match parse_file_read_request(params) { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + let response = match state + .file_read(request.scope, request.kind, request.workspace_id) + .await + { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + Some(serde_json::to_value(response).map_err(|err| err.to_string())) + } + "file_write" => { + let request = match parse_file_write_request(params) { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + if let Err(err) = state + .file_write( + request.scope, + request.kind, + request.workspace_id, + request.content, + ) + .await + { + return Some(Err(err)); + } + Some(serde_json::to_value(json!({ "ok": true })).map_err(|err| err.to_string())) + } + "get_app_settings" => { + let settings = state.get_app_settings().await; + Some(serde_json::to_value(settings).map_err(|err| err.to_string())) + } + "update_app_settings" => { + let settings_value = match params { + Value::Object(map) => map.get("settings").cloned().unwrap_or(Value::Null), + _ => Value::Null, + }; + let settings: AppSettings = match serde_json::from_value(settings_value) { + Ok(value) => value, + Err(err) => return Some(Err(err.to_string())), + }; + let updated = match 
state.update_app_settings(settings).await { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + Some(serde_json::to_value(updated).map_err(|err| err.to_string())) + } + "orbit_connect_test" => { + let result = match state.orbit_connect_test().await { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + Some(serde_json::to_value(result).map_err(|err| err.to_string())) + } + "orbit_sign_in_start" => { + let result = match state.orbit_sign_in_start().await { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + Some(serde_json::to_value(result).map_err(|err| err.to_string())) + } + "orbit_sign_in_poll" => { + let device_code = match parse_string(params, "deviceCode") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + let result = match state.orbit_sign_in_poll(device_code).await { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + Some(serde_json::to_value(result).map_err(|err| err.to_string())) + } + "orbit_sign_out" => { + let result = match state.orbit_sign_out().await { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + Some(serde_json::to_value(result).map_err(|err| err.to_string())) + } + "add_clone" => { + let source_workspace_id = match parse_string(params, "sourceWorkspaceId") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + let copies_folder = match parse_string(params, "copiesFolder") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + let copy_name = match parse_string(params, "copyName") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + let workspace = match state + .add_clone( + source_workspace_id, + copies_folder, + copy_name, + client_version.to_string(), + ) + .await + { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + Some(serde_json::to_value(workspace).map_err(|err| err.to_string())) + } + "apply_worktree_changes" => { + let workspace_id = match parse_string(params, "workspaceId") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + Some( + state + .apply_worktree_changes(workspace_id) + .await + .map(|_| json!({ "ok": true })), + ) + } + "open_workspace_in" => { + let path = match parse_string(params, "path") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + let app = parse_optional_string(params, "app"); + let command = parse_optional_string(params, "command"); + let args = parse_optional_string_array(params, "args").unwrap_or_default(); + Some( + state + .open_workspace_in(path, app, args, command) + .await + .map(|_| json!({ "ok": true })), + ) + } + "get_open_app_icon" => { + let app_name = match parse_string(params, "appName") { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + let icon = match state.get_open_app_icon(app_name).await { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + Some(serde_json::to_value(icon).map_err(|err| err.to_string())) + } + "local_usage_snapshot" => { + let days = parse_optional_u32(params, "days"); + let workspace_path = parse_optional_string(params, "workspacePath"); + let snapshot = match state.local_usage_snapshot(days, workspace_path).await { + Ok(value) => value, + Err(err) => return Some(Err(err)), + }; + Some(serde_json::to_value(snapshot).map_err(|err| err.to_string())) + } + _ => None, + } +} diff --git a/src-tauri/src/git/mod.rs b/src-tauri/src/git/mod.rs index 5b2b2b1c1..a3da3f522 100644 --- a/src-tauri/src/git/mod.rs +++ b/src-tauri/src/git/mod.rs @@ -84,6 +84,43 @@ pub(crate) async fn get_git_status( 
git_ui_core::get_git_status_core(&state.workspaces, workspace_id).await } +#[tauri::command] +pub(crate) async fn init_git_repo( + workspace_id: String, + branch: String, + force: Option, + state: State<'_, AppState>, + app: AppHandle, +) -> Result { + try_remote_value!( + state, + app, + "init_git_repo", + json!({ "workspaceId": &workspace_id, "branch": &branch, "force": force }) + ); + git_ui_core::init_git_repo_core(&state.workspaces, workspace_id, branch, force.unwrap_or(false)) + .await +} + +#[tauri::command] +pub(crate) async fn create_github_repo( + workspace_id: String, + repo: String, + visibility: String, + branch: Option, + state: State<'_, AppState>, + app: AppHandle, +) -> Result { + try_remote_value!( + state, + app, + "create_github_repo", + json!({ "workspaceId": &workspace_id, "repo": &repo, "visibility": &visibility, "branch": branch }) + ); + git_ui_core::create_github_repo_core(&state.workspaces, workspace_id, repo, visibility, branch) + .await +} + #[tauri::command] pub(crate) async fn stage_git_file( workspace_id: String, diff --git a/src-tauri/src/lib.rs b/src-tauri/src/lib.rs index 1c9a6b4de..8c0f3833c 100644 --- a/src-tauri/src/lib.rs +++ b/src-tauri/src/lib.rs @@ -252,6 +252,8 @@ pub fn run() { codex::collaboration_mode_list, workspaces::connect_workspace, git::get_git_status, + git::init_git_repo, + git::create_github_repo, git::list_git_roots, git::get_git_diffs, git::get_git_log, diff --git a/src-tauri/src/shared/codex_update_core.rs b/src-tauri/src/shared/codex_update_core.rs index 17b50acf9..abd82bfef 100644 --- a/src-tauri/src/shared/codex_update_core.rs +++ b/src-tauri/src/shared/codex_update_core.rs @@ -1,3 +1,5 @@ +#![allow(dead_code)] + use serde_json::Value; use std::time::Duration; diff --git a/src-tauri/src/shared/git_core.rs b/src-tauri/src/shared/git_core.rs index cbe24b6c0..682e116de 100644 --- a/src-tauri/src/shared/git_core.rs +++ b/src-tauri/src/shared/git_core.rs @@ -1,5 +1,3 @@ -#![allow(dead_code)] - use std::path::PathBuf; use crate::shared::process_core::tokio_command; @@ -131,6 +129,8 @@ pub(crate) async fn git_remote_branch_exists_live( Err(format_git_error(&output.stdout, &output.stderr)) } +// Used by daemon-only worktree orchestration paths. +#[allow(dead_code)] pub(crate) async fn git_remote_branch_exists_local( repo_path: &PathBuf, remote: &str, @@ -183,6 +183,8 @@ pub(crate) async fn git_find_remote_for_branch_live( Ok(None) } +// Used by daemon-only worktree orchestration paths. 
+#[allow(dead_code)] pub(crate) async fn git_find_remote_tracking_branch_local( repo_path: &PathBuf, branch: &str, diff --git a/src-tauri/src/shared/git_ui_core.rs b/src-tauri/src/shared/git_ui_core.rs index 5d5716e58..f08574d6f 100644 --- a/src-tauri/src/shared/git_ui_core.rs +++ b/src-tauri/src/shared/git_ui_core.rs @@ -1,1590 +1,64 @@ -use std::collections::{HashMap, HashSet}; -use std::fs; -use std::io::{Read, Write}; +use std::collections::HashMap; use std::path::{Path, PathBuf}; -use std::process::Stdio; -use base64::{engine::general_purpose::STANDARD, Engine as _}; -use git2::{BranchType, DiffOptions, Repository, Sort, Status, StatusOptions}; -use serde_json::{json, Value}; +use serde_json::Value; use tokio::sync::Mutex; -use crate::git_utils::{ - checkout_branch, commit_to_entry, diff_patch_to_string, diff_stats_for_path, image_mime_type, - list_git_roots as scan_git_roots, parse_github_repo, resolve_git_root, -}; -use crate::shared::process_core::tokio_command; use crate::types::{ - AppSettings, BranchInfo, GitCommitDiff, GitFileDiff, GitFileStatus, GitHubIssue, - GitHubIssuesResponse, GitHubPullRequest, GitHubPullRequestComment, GitHubPullRequestDiff, - GitHubPullRequestsResponse, GitLogResponse, WorkspaceEntry, + AppSettings, GitCommitDiff, GitFileDiff, GitHubIssuesResponse, GitHubPullRequestComment, + GitHubPullRequestDiff, GitHubPullRequestsResponse, GitLogResponse, WorkspaceEntry, }; -use crate::utils::{git_env_path, normalize_git_path, resolve_git_binary}; - -const INDEX_SKIP_WORKTREE_FLAG: u16 = 0x4000; -const MAX_IMAGE_BYTES: usize = 10 * 1024 * 1024; -const MAX_TEXT_DIFF_BYTES: usize = 2 * 1024 * 1024; - -fn encode_image_base64(data: &[u8]) -> Option { - if data.len() > MAX_IMAGE_BYTES { - return None; - } - Some(STANDARD.encode(data)) -} - -fn blob_to_base64(blob: git2::Blob) -> Option { - if blob.size() > MAX_IMAGE_BYTES { - return None; - } - encode_image_base64(blob.content()) -} - -fn read_image_base64(path: &Path) -> Option { - let metadata = fs::metadata(path).ok()?; - if metadata.len() > MAX_IMAGE_BYTES as u64 { - return None; - } - let data = fs::read(path).ok()?; - encode_image_base64(&data) -} - -fn bytes_look_binary(bytes: &[u8]) -> bool { - bytes.iter().take(8192).any(|byte| *byte == 0) -} - -fn split_lines_preserving_newlines(content: &str) -> Vec { - if content.is_empty() { - return Vec::new(); - } - content - .split_inclusive('\n') - .map(ToString::to_string) - .collect() -} - -fn blob_to_lines(blob: git2::Blob<'_>) -> Option> { - if blob.size() > MAX_TEXT_DIFF_BYTES || blob.is_binary() { - return None; - } - let content = String::from_utf8_lossy(blob.content()); - Some(split_lines_preserving_newlines(content.as_ref())) -} - -fn read_text_lines(path: &Path) -> Option> { - let metadata = fs::metadata(path).ok()?; - if metadata.len() > MAX_TEXT_DIFF_BYTES as u64 { - return None; - } - let data = fs::read(path).ok()?; - if bytes_look_binary(&data) { - return None; - } - let content = String::from_utf8_lossy(&data); - Some(split_lines_preserving_newlines(content.as_ref())) -} - -async fn run_git_command(repo_root: &Path, args: &[&str]) -> Result<(), String> { - let git_bin = resolve_git_binary().map_err(|e| format!("Failed to run git: {e}"))?; - let output = tokio_command(git_bin) - .args(args) - .current_dir(repo_root) - .env("PATH", git_env_path()) - .output() - .await - .map_err(|e| format!("Failed to run git: {e}"))?; - - if output.status.success() { - return Ok(()); - } - - let stderr = String::from_utf8_lossy(&output.stderr); - let stdout = 
String::from_utf8_lossy(&output.stdout); - let detail = if stderr.trim().is_empty() { - stdout.trim() - } else { - stderr.trim() - }; - if detail.is_empty() { - return Err("Git command failed.".to_string()); - } - Err(detail.to_string()) -} -fn action_paths_for_file(repo_root: &Path, path: &str) -> Vec { - let target = normalize_git_path(path).trim().to_string(); - if target.is_empty() { - return Vec::new(); - } - - let repo = match Repository::open(repo_root) { - Ok(repo) => repo, - Err(_) => return vec![target], - }; - - let mut status_options = StatusOptions::new(); - status_options - .include_untracked(true) - .recurse_untracked_dirs(true) - .renames_head_to_index(true) - .renames_index_to_workdir(true) - .include_ignored(false); - - let statuses = match repo.statuses(Some(&mut status_options)) { - Ok(statuses) => statuses, - Err(_) => return vec![target], - }; - - for entry in statuses.iter() { - let status = entry.status(); - if !(status.contains(Status::WT_RENAMED) || status.contains(Status::INDEX_RENAMED)) { - continue; - } - let delta = entry.index_to_workdir().or_else(|| entry.head_to_index()); - let Some(delta) = delta else { - continue; - }; - let (Some(old_path), Some(new_path)) = (delta.old_file().path(), delta.new_file().path()) - else { - continue; - }; - let old_path = normalize_git_path(old_path.to_string_lossy().as_ref()); - let new_path = normalize_git_path(new_path.to_string_lossy().as_ref()); - if old_path != target && new_path != target { - continue; - } - if old_path == new_path || new_path.is_empty() { - return vec![target]; - } - let mut result = Vec::new(); - if !old_path.is_empty() { - result.push(old_path); - } - if !new_path.is_empty() && !result.contains(&new_path) { - result.push(new_path); - } - return if result.is_empty() { - vec![target] - } else { - result - }; - } - - vec![target] -} - -fn parse_upstream_ref(name: &str) -> Option<(String, String)> { - let trimmed = name.strip_prefix("refs/remotes/").unwrap_or(name); - let mut parts = trimmed.splitn(2, '/'); - let remote = parts.next()?; - let branch = parts.next()?; - if remote.is_empty() || branch.is_empty() { - return None; - } - Some((remote.to_string(), branch.to_string())) -} - -fn upstream_remote_and_branch(repo_root: &Path) -> Result, String> { - let repo = Repository::open(repo_root).map_err(|e| e.to_string())?; - let head = match repo.head() { - Ok(head) => head, - Err(_) => return Ok(None), - }; - if !head.is_branch() { - return Ok(None); - } - let branch_name = match head.shorthand() { - Some(name) => name, - None => return Ok(None), - }; - let branch = repo - .find_branch(branch_name, BranchType::Local) - .map_err(|e| e.to_string())?; - let upstream_branch = match branch.upstream() { - Ok(upstream) => upstream, - Err(_) => return Ok(None), - }; - let upstream_ref = upstream_branch.get(); - let upstream_name = upstream_ref.name().or_else(|| upstream_ref.shorthand()); - Ok(upstream_name.and_then(parse_upstream_ref)) -} - -async fn push_with_upstream(repo_root: &Path) -> Result<(), String> { - let upstream = upstream_remote_and_branch(repo_root)?; - if let Some((remote, branch)) = upstream { - let _ = run_git_command(repo_root, &["fetch", "--prune", remote.as_str()]).await; - let refspec = format!("HEAD:{branch}"); - return run_git_command(repo_root, &["push", remote.as_str(), refspec.as_str()]).await; - } - run_git_command(repo_root, &["push"]).await -} - -async fn fetch_with_default_remote(repo_root: &Path) -> Result<(), String> { - let upstream = upstream_remote_and_branch(repo_root)?; - if 
let Some((remote, _)) = upstream { - return run_git_command(repo_root, &["fetch", "--prune", remote.as_str()]).await; - } - run_git_command(repo_root, &["fetch", "--prune"]).await -} - -async fn pull_with_default_strategy(repo_root: &Path) -> Result<(), String> { - fn autostash_unsupported(lower: &str) -> bool { - lower.contains("unknown option") && lower.contains("autostash") - } - - fn needs_reconcile_strategy(lower: &str) -> bool { - lower.contains("need to specify how to reconcile divergent branches") - || lower.contains("you have divergent branches") - } - - match run_git_command(repo_root, &["pull", "--autostash"]).await { - Ok(()) => Ok(()), - Err(err) => { - let lower = err.to_lowercase(); - if autostash_unsupported(&lower) { - match run_git_command(repo_root, &["pull"]).await { - Ok(()) => Ok(()), - Err(no_autostash_err) => { - let no_autostash_lower = no_autostash_err.to_lowercase(); - if needs_reconcile_strategy(&no_autostash_lower) { - return run_git_command(repo_root, &["pull", "--no-rebase"]).await; - } - Err(no_autostash_err) - } - } - } else if needs_reconcile_strategy(&lower) { - match run_git_command(repo_root, &["pull", "--no-rebase", "--autostash"]).await { - Ok(()) => Ok(()), - Err(merge_err) => { - let merge_lower = merge_err.to_lowercase(); - if autostash_unsupported(&merge_lower) { - return run_git_command(repo_root, &["pull", "--no-rebase"]).await; - } - Err(merge_err) - } - } - } else { - Err(err) - } - } - } -} - -fn status_for_index(status: Status) -> Option<&'static str> { - if status.contains(Status::INDEX_NEW) { - Some("A") - } else if status.contains(Status::INDEX_MODIFIED) { - Some("M") - } else if status.contains(Status::INDEX_DELETED) { - Some("D") - } else if status.contains(Status::INDEX_RENAMED) { - Some("R") - } else if status.contains(Status::INDEX_TYPECHANGE) { - Some("T") - } else { - None - } -} +#[path = "git_ui_core/commands.rs"] +mod commands; +#[path = "git_ui_core/context.rs"] +mod context; +#[path = "git_ui_core/diff.rs"] +mod diff; +#[path = "git_ui_core/github.rs"] +mod github; +#[path = "git_ui_core/log.rs"] +mod log; -fn status_for_workdir(status: Status) -> Option<&'static str> { - if status.contains(Status::WT_NEW) { - Some("A") - } else if status.contains(Status::WT_MODIFIED) { - Some("M") - } else if status.contains(Status::WT_DELETED) { - Some("D") - } else if status.contains(Status::WT_RENAMED) { - Some("R") - } else if status.contains(Status::WT_TYPECHANGE) { - Some("T") - } else { - None - } -} - -fn status_for_delta(status: git2::Delta) -> &'static str { - match status { - git2::Delta::Added => "A", - git2::Delta::Modified => "M", - git2::Delta::Deleted => "D", - git2::Delta::Renamed => "R", - git2::Delta::Typechange => "T", - _ => "M", - } -} - -fn has_ignored_parent_directory(repo: &Repository, path: &Path) -> bool { - let mut current = path.parent(); - while let Some(parent) = current { - if parent.as_os_str().is_empty() { - break; - } - let probe = parent.join(".codexmonitor-ignore-probe"); - if repo.status_should_ignore(&probe).unwrap_or(false) { - return true; - } - current = parent.parent(); - } - false -} - -fn collect_ignored_paths_with_git(repo: &Repository, paths: &[PathBuf]) -> Option> { - if paths.is_empty() { - return Some(HashSet::new()); - } - - let repo_root = repo.workdir()?; - let git_bin = resolve_git_binary().ok()?; - let mut child = std::process::Command::new(git_bin) - .arg("check-ignore") - .arg("--stdin") - .arg("-z") - .current_dir(repo_root) - .env("PATH", git_env_path()) - .stdin(Stdio::piped()) - 
.stdout(Stdio::piped()) - .stderr(Stdio::null()) - .spawn() - .ok()?; - - let mut stdout = child.stdout.take()?; - let stdout_thread = std::thread::spawn(move || { - let mut buffer = Vec::new(); - stdout.read_to_end(&mut buffer).ok()?; - Some(buffer) - }); - - let wrote_all_input = { - let mut wrote_all = true; - if let Some(mut stdin) = child.stdin.take() { - for path in paths { - if stdin - .write_all(path.as_os_str().as_encoded_bytes()) - .is_err() - { - wrote_all = false; - break; - } - if stdin.write_all(&[0]).is_err() { - wrote_all = false; - break; - } - } - } else { - wrote_all = false; - } - wrote_all - }; - - if !wrote_all_input { - let _ = child.kill(); - let _ = child.wait(); - let _ = stdout_thread.join(); - return None; - } - - let status = child.wait().ok()?; - let stdout = stdout_thread.join().ok().flatten()?; - match status.code() { - Some(0) | Some(1) => {} - _ => return None, - } - - let mut ignored_paths = HashSet::new(); - for raw in stdout.split(|byte| *byte == 0) { - if raw.is_empty() { - continue; - } - let path = String::from_utf8_lossy(raw); - ignored_paths.insert(PathBuf::from(path.as_ref())); - } - Some(ignored_paths) -} - -fn check_ignore_with_git(repo: &Repository, path: &Path) -> Option { - let ignored_paths = collect_ignored_paths_with_git(repo, &[path.to_path_buf()])?; - Some(ignored_paths.contains(path)) -} - -fn is_tracked_path(repo: &Repository, path: &Path) -> bool { - if let Ok(index) = repo.index() { - if index.get_path(path, 0).is_some() { - return true; - } - } - if let Ok(head) = repo.head() { - if let Ok(tree) = head.peel_to_tree() { - if tree.get_path(path).is_ok() { - return true; - } - } - } - false -} - -fn should_skip_ignored_path_with_cache( - repo: &Repository, - path: &Path, - ignored_paths: Option<&HashSet>, -) -> bool { - if is_tracked_path(repo, path) { - return false; - } - if let Some(ignored_paths) = ignored_paths { - return ignored_paths.contains(path); - } - if let Some(ignored) = check_ignore_with_git(repo, path) { - return ignored; - } - // Fallback when git check-ignore is unavailable. 
- repo.status_should_ignore(path).unwrap_or(false) || has_ignored_parent_directory(repo, path) -} - -fn build_combined_diff(repo: &Repository, diff: &git2::Diff) -> String { - let diff_entries: Vec<(usize, PathBuf)> = diff - .deltas() - .enumerate() - .filter_map(|(index, delta)| { - delta.new_file() - .path() - .or_else(|| delta.old_file().path()) - .map(|path| (index, path.to_path_buf())) - }) - .collect(); - let diff_paths: Vec = diff_entries.iter().map(|(_, path)| path.clone()).collect(); - let ignored_paths = collect_ignored_paths_with_git(repo, &diff_paths); - - let mut combined_diff = String::new(); - for (index, path) in diff_entries { - if should_skip_ignored_path_with_cache(repo, &path, ignored_paths.as_ref()) { - continue; - } - let patch = match git2::Patch::from_diff(diff, index) { - Ok(patch) => patch, - Err(_) => continue, - }; - let Some(mut patch) = patch else { - continue; - }; - let content = match diff_patch_to_string(&mut patch) { - Ok(content) => content, - Err(_) => continue, - }; - if content.trim().is_empty() { - continue; - } - if !combined_diff.is_empty() { - combined_diff.push_str("\n\n"); - } - combined_diff.push_str(&format!("=== {} ===\n", path.display())); - combined_diff.push_str(&content); - } - combined_diff -} - -fn collect_workspace_diff(repo_root: &Path) -> Result { - let repo = Repository::open(repo_root).map_err(|e| e.to_string())?; - let head_tree = repo.head().ok().and_then(|head| head.peel_to_tree().ok()); - - let mut options = DiffOptions::new(); - let index = repo.index().map_err(|e| e.to_string())?; - let diff = match head_tree.as_ref() { - Some(tree) => repo - .diff_tree_to_index(Some(tree), Some(&index), Some(&mut options)) - .map_err(|e| e.to_string())?, - None => repo - .diff_tree_to_index(None, Some(&index), Some(&mut options)) - .map_err(|e| e.to_string())?, - }; - let combined_diff = build_combined_diff(&repo, &diff); - if !combined_diff.trim().is_empty() { - return Ok(combined_diff); - } - - let mut options = DiffOptions::new(); - options - .include_untracked(true) - .recurse_untracked_dirs(true) - .show_untracked_content(true); - let diff = match head_tree.as_ref() { - Some(tree) => repo - .diff_tree_to_workdir_with_index(Some(tree), Some(&mut options)) - .map_err(|e| e.to_string())?, - None => repo - .diff_tree_to_workdir_with_index(None, Some(&mut options)) - .map_err(|e| e.to_string())?, - }; - Ok(build_combined_diff(&repo, &diff)) -} - -fn github_repo_from_path(path: &Path) -> Result { - let repo = Repository::open(path).map_err(|e| e.to_string())?; - let remotes = repo.remotes().map_err(|e| e.to_string())?; - let name = if remotes.iter().any(|remote| remote == Some("origin")) { - "origin".to_string() - } else { - remotes.iter().flatten().next().unwrap_or("").to_string() - }; - if name.is_empty() { - return Err("No git remote configured.".to_string()); - } - let remote = repo.find_remote(&name).map_err(|e| e.to_string())?; - let remote_url = remote.url().ok_or("Remote has no URL configured.")?; - parse_github_repo(remote_url).ok_or("Remote is not a GitHub repository.".to_string()) -} - -fn parse_pr_diff(diff: &str) -> Vec { - let mut entries = Vec::new(); - let mut current_lines: Vec<&str> = Vec::new(); - let mut current_old_path: Option = None; - let mut current_new_path: Option = None; - let mut current_status: Option = None; - - let finalize = |lines: &Vec<&str>, - old_path: &Option, - new_path: &Option, - status: &Option, - results: &mut Vec| { - if lines.is_empty() { - return; - } - let diff_text = lines.join("\n"); - if 
-            return;
-        }
-        let status_value = status.clone().unwrap_or_else(|| "M".to_string());
-        let path = if status_value == "D" {
-            old_path.clone().unwrap_or_default()
-        } else {
-            new_path
-                .clone()
-                .or_else(|| old_path.clone())
-                .unwrap_or_default()
-        };
-        if path.is_empty() {
-            return;
-        }
-        results.push(GitHubPullRequestDiff {
-            path: normalize_git_path(&path),
-            status: status_value,
-            diff: diff_text,
-        });
-    };
-
-    for line in diff.lines() {
-        if line.starts_with("diff --git ") {
-            finalize(
-                &current_lines,
-                &current_old_path,
-                &current_new_path,
-                &current_status,
-                &mut entries,
-            );
-            current_lines = vec![line];
-            current_old_path = None;
-            current_new_path = None;
-            current_status = None;
-
-            let rest = line.trim_start_matches("diff --git ").trim();
-            let mut parts = rest.split_whitespace();
-            let old_part = parts.next().unwrap_or("").trim_start_matches("a/");
-            let new_part = parts.next().unwrap_or("").trim_start_matches("b/");
-            if !old_part.is_empty() {
-                current_old_path = Some(old_part.to_string());
-            }
-            if !new_part.is_empty() {
-                current_new_path = Some(new_part.to_string());
-            }
-            continue;
-        }
-        if line.starts_with("new file mode ") {
-            current_status = Some("A".to_string());
-        } else if line.starts_with("deleted file mode ") {
-            current_status = Some("D".to_string());
-        } else if line.starts_with("rename from ") {
-            current_status = Some("R".to_string());
-            let path = line.trim_start_matches("rename from ").trim();
-            if !path.is_empty() {
-                current_old_path = Some(path.to_string());
-            }
-        } else if line.starts_with("rename to ") {
-            current_status = Some("R".to_string());
-            let path = line.trim_start_matches("rename to ").trim();
-            if !path.is_empty() {
-                current_new_path = Some(path.to_string());
-            }
-        }
-        current_lines.push(line);
-    }
-
-    finalize(
-        &current_lines,
-        &current_old_path,
-        &current_new_path,
-        &current_status,
-        &mut entries,
-    );
-
-    entries
-}
-
-async fn workspace_entry_for_id(
-    workspaces: &Mutex<HashMap<String, WorkspaceEntry>>,
-    workspace_id: &str,
-) -> Result<WorkspaceEntry, String> {
-    let workspaces = workspaces.lock().await;
-    workspaces
-        .get(workspace_id)
-        .cloned()
-        .ok_or_else(|| "workspace not found".to_string())
-}
+#[cfg(test)]
+#[path = "git_ui_core/tests.rs"]
+mod tests;
-async fn resolve_repo_root_for_workspace(
+pub(crate) async fn resolve_repo_root_for_workspace_core(
    workspaces: &Mutex<HashMap<String, WorkspaceEntry>>,
    workspace_id: String,
) -> Result<PathBuf, String> {
-    let entry = workspace_entry_for_id(workspaces, &workspace_id).await?;
-    resolve_git_root(&entry)
-}
-
-async fn get_git_status_inner(
-    workspaces: &Mutex<HashMap<String, WorkspaceEntry>>,
-    workspace_id: String,
-) -> Result<Value, String> {
-    let entry = workspace_entry_for_id(workspaces, &workspace_id).await?;
-    let repo_root = resolve_git_root(&entry)?;
-    let repo = Repository::open(&repo_root).map_err(|e| e.to_string())?;
-
-    let branch_name = repo
-        .head()
-        .ok()
-        .and_then(|head| head.shorthand().map(|s| s.to_string()))
-        .unwrap_or_else(|| "unknown".to_string());
-
-    let mut status_options = StatusOptions::new();
-    status_options
-        .include_untracked(true)
-        .recurse_untracked_dirs(true)
-        .renames_head_to_index(true)
-        .renames_index_to_workdir(true)
-        .include_ignored(false);
-
-    let statuses = repo
-        .statuses(Some(&mut status_options))
-        .map_err(|e| e.to_string())?;
-    let status_paths: Vec<PathBuf> = statuses
-        .iter()
-        .filter_map(|entry| entry.path().map(PathBuf::from))
-        .filter(|path| !path.as_os_str().is_empty())
-        .collect();
-    let ignored_paths = collect_ignored_paths_with_git(&repo, &status_paths);
-
-    let head_tree = repo.head().ok().and_then(|head| head.peel_to_tree().ok());
-    let index = 
repo.index().ok(); - - let mut files = Vec::new(); - let mut staged_files = Vec::new(); - let mut unstaged_files = Vec::new(); - let mut total_additions = 0i64; - let mut total_deletions = 0i64; - for entry in statuses.iter() { - let path = entry.path().unwrap_or(""); - if path.is_empty() { - continue; - } - if should_skip_ignored_path_with_cache(&repo, Path::new(path), ignored_paths.as_ref()) { - continue; - } - if let Some(index) = index.as_ref() { - if let Some(entry) = index.get_path(Path::new(path), 0) { - if entry.flags_extended & INDEX_SKIP_WORKTREE_FLAG != 0 { - continue; - } - } - } - let status = entry.status(); - let normalized_path = normalize_git_path(path); - let include_index = status.intersects( - Status::INDEX_NEW - | Status::INDEX_MODIFIED - | Status::INDEX_DELETED - | Status::INDEX_RENAMED - | Status::INDEX_TYPECHANGE, - ); - let include_workdir = status.intersects( - Status::WT_NEW - | Status::WT_MODIFIED - | Status::WT_DELETED - | Status::WT_RENAMED - | Status::WT_TYPECHANGE, - ); - let mut combined_additions = 0i64; - let mut combined_deletions = 0i64; - - if include_index { - let (additions, deletions) = - diff_stats_for_path(&repo, head_tree.as_ref(), path, true, false).unwrap_or((0, 0)); - if let Some(status_str) = status_for_index(status) { - staged_files.push(GitFileStatus { - path: normalized_path.clone(), - status: status_str.to_string(), - additions, - deletions, - }); - } - combined_additions += additions; - combined_deletions += deletions; - total_additions += additions; - total_deletions += deletions; - } - - if include_workdir { - let (additions, deletions) = - diff_stats_for_path(&repo, head_tree.as_ref(), path, false, true).unwrap_or((0, 0)); - if let Some(status_str) = status_for_workdir(status) { - unstaged_files.push(GitFileStatus { - path: normalized_path.clone(), - status: status_str.to_string(), - additions, - deletions, - }); - } - combined_additions += additions; - combined_deletions += deletions; - total_additions += additions; - total_deletions += deletions; - } - - if include_index || include_workdir { - let status_str = status_for_workdir(status) - .or_else(|| status_for_index(status)) - .unwrap_or("--"); - files.push(GitFileStatus { - path: normalized_path, - status: status_str.to_string(), - additions: combined_additions, - deletions: combined_deletions, - }); - } - } - - Ok(json!({ - "branchName": branch_name, - "files": files, - "stagedFiles": staged_files, - "unstagedFiles": unstaged_files, - "totalAdditions": total_additions, - "totalDeletions": total_deletions, - })) -} - -async fn stage_git_file_inner( - workspaces: &Mutex>, - workspace_id: String, - path: String, -) -> Result<(), String> { - let entry = workspace_entry_for_id(workspaces, &workspace_id).await?; - let repo_root = resolve_git_root(&entry)?; - for path in action_paths_for_file(&repo_root, &path) { - run_git_command(&repo_root, &["add", "-A", "--", &path]).await?; - } - Ok(()) -} - -async fn stage_git_all_inner( - workspaces: &Mutex>, - workspace_id: String, -) -> Result<(), String> { - let entry = workspace_entry_for_id(workspaces, &workspace_id).await?; - let repo_root = resolve_git_root(&entry)?; - run_git_command(&repo_root, &["add", "-A"]).await -} - -async fn unstage_git_file_inner( - workspaces: &Mutex>, - workspace_id: String, - path: String, -) -> Result<(), String> { - let entry = workspace_entry_for_id(workspaces, &workspace_id).await?; - let repo_root = resolve_git_root(&entry)?; - for path in action_paths_for_file(&repo_root, &path) { - 
run_git_command(&repo_root, &["restore", "--staged", "--", &path]).await?; - } - Ok(()) -} - -async fn revert_git_file_inner( - workspaces: &Mutex>, - workspace_id: String, - path: String, -) -> Result<(), String> { - let entry = workspace_entry_for_id(workspaces, &workspace_id).await?; - let repo_root = resolve_git_root(&entry)?; - for path in action_paths_for_file(&repo_root, &path) { - if run_git_command( - &repo_root, - &["restore", "--staged", "--worktree", "--", &path], - ) - .await - .is_ok() - { - continue; - } - run_git_command(&repo_root, &["clean", "-f", "--", &path]).await?; - } - Ok(()) -} - -async fn revert_git_all_inner( - workspaces: &Mutex>, - workspace_id: String, -) -> Result<(), String> { - let entry = workspace_entry_for_id(workspaces, &workspace_id).await?; - let repo_root = resolve_git_root(&entry)?; - run_git_command( - &repo_root, - &["restore", "--staged", "--worktree", "--", "."], - ) - .await?; - run_git_command(&repo_root, &["clean", "-f", "-d"]).await -} - -async fn commit_git_inner( - workspaces: &Mutex>, - workspace_id: String, - message: String, -) -> Result<(), String> { - let entry = workspace_entry_for_id(workspaces, &workspace_id).await?; - let repo_root = resolve_git_root(&entry)?; - run_git_command(&repo_root, &["commit", "-m", &message]).await -} - -async fn push_git_inner( - workspaces: &Mutex>, - workspace_id: String, -) -> Result<(), String> { - let entry = workspace_entry_for_id(workspaces, &workspace_id).await?; - let repo_root = resolve_git_root(&entry)?; - push_with_upstream(&repo_root).await -} - -async fn pull_git_inner( - workspaces: &Mutex>, - workspace_id: String, -) -> Result<(), String> { - let entry = workspace_entry_for_id(workspaces, &workspace_id).await?; - let repo_root = resolve_git_root(&entry)?; - pull_with_default_strategy(&repo_root).await -} - -async fn fetch_git_inner( - workspaces: &Mutex>, - workspace_id: String, -) -> Result<(), String> { - let entry = workspace_entry_for_id(workspaces, &workspace_id).await?; - let repo_root = resolve_git_root(&entry)?; - fetch_with_default_remote(&repo_root).await -} - -async fn sync_git_inner( - workspaces: &Mutex>, - workspace_id: String, -) -> Result<(), String> { - let entry = workspace_entry_for_id(workspaces, &workspace_id).await?; - let repo_root = resolve_git_root(&entry)?; - pull_with_default_strategy(&repo_root).await?; - push_with_upstream(&repo_root).await -} - -async fn list_git_roots_inner( - workspaces: &Mutex>, - workspace_id: String, - depth: Option, -) -> Result, String> { - let entry = workspace_entry_for_id(workspaces, &workspace_id).await?; - let root = PathBuf::from(&entry.path); - let depth = depth.unwrap_or(2).clamp(1, 6); - Ok(scan_git_roots(&root, depth, 200)) -} - -async fn get_git_diffs_inner( - workspaces: &Mutex>, - app_settings: &Mutex, - workspace_id: String, -) -> Result, String> { - let entry = workspace_entry_for_id(workspaces, &workspace_id).await?; - let repo_root = resolve_git_root(&entry)?; - let ignore_whitespace_changes = { - let settings = app_settings.lock().await; - settings.git_diff_ignore_whitespace_changes - }; - - tokio::task::spawn_blocking(move || { - let repo = Repository::open(&repo_root).map_err(|e| e.to_string())?; - let head_tree = repo.head().ok().and_then(|head| head.peel_to_tree().ok()); - - let mut options = DiffOptions::new(); - options - .include_untracked(true) - .recurse_untracked_dirs(true) - .show_untracked_content(true); - options.ignore_whitespace_change(ignore_whitespace_changes); - - let diff = match head_tree.as_ref() 
{ - Some(tree) => repo - .diff_tree_to_workdir_with_index(Some(tree), Some(&mut options)) - .map_err(|e| e.to_string())?, - None => repo - .diff_tree_to_workdir_with_index(None, Some(&mut options)) - .map_err(|e| e.to_string())?, - }; - let diff_paths: Vec = diff - .deltas() - .filter_map(|delta| delta.new_file().path().or_else(|| delta.old_file().path())) - .map(PathBuf::from) - .collect(); - let ignored_paths = collect_ignored_paths_with_git(&repo, &diff_paths); - - let mut results = Vec::new(); - for (index, delta) in diff.deltas().enumerate() { - let old_path = delta.old_file().path(); - let new_path = delta.new_file().path(); - let display_path = new_path.or(old_path); - let Some(display_path) = display_path else { - continue; - }; - if should_skip_ignored_path_with_cache(&repo, display_path, ignored_paths.as_ref()) { - continue; - } - let old_path_str = old_path.map(|path| path.to_string_lossy()); - let new_path_str = new_path.map(|path| path.to_string_lossy()); - let display_path_str = display_path.to_string_lossy(); - let normalized_path = normalize_git_path(&display_path_str); - let old_image_mime = old_path_str.as_deref().and_then(image_mime_type); - let new_image_mime = new_path_str.as_deref().and_then(image_mime_type); - let is_image = old_image_mime.is_some() || new_image_mime.is_some(); - let is_deleted = delta.status() == git2::Delta::Deleted; - let is_added = delta.status() == git2::Delta::Added; - - let old_lines = if !is_added { - head_tree - .as_ref() - .and_then(|tree| old_path.and_then(|path| tree.get_path(path).ok())) - .and_then(|entry| repo.find_blob(entry.id()).ok()) - .and_then(blob_to_lines) - } else { - None - }; - - let new_lines = if !is_deleted { - match new_path { - Some(path) => { - let full_path = repo_root.join(path); - read_text_lines(&full_path) - } - None => None, - } - } else { - None - }; - - if is_image { - let old_image_data = if !is_added && old_image_mime.is_some() { - head_tree - .as_ref() - .and_then(|tree| old_path.and_then(|path| tree.get_path(path).ok())) - .and_then(|entry| repo.find_blob(entry.id()).ok()) - .and_then(blob_to_base64) - } else { - None - }; - - let new_image_data = if !is_deleted && new_image_mime.is_some() { - match new_path { - Some(path) => { - let full_path = repo_root.join(path); - read_image_base64(&full_path) - } - None => None, - } - } else { - None - }; - - results.push(GitFileDiff { - path: normalized_path, - diff: String::new(), - old_lines: None, - new_lines: None, - is_binary: true, - is_image: true, - old_image_data, - new_image_data, - old_image_mime: old_image_mime.map(str::to_string), - new_image_mime: new_image_mime.map(str::to_string), - }); - continue; - } - - let patch = match git2::Patch::from_diff(&diff, index) { - Ok(patch) => patch, - Err(_) => continue, - }; - let Some(mut patch) = patch else { - continue; - }; - let content = match diff_patch_to_string(&mut patch) { - Ok(content) => content, - Err(_) => continue, - }; - if content.trim().is_empty() { - continue; - } - results.push(GitFileDiff { - path: normalized_path, - diff: content, - old_lines, - new_lines, - is_binary: false, - is_image: false, - old_image_data: None, - new_image_data: None, - old_image_mime: None, - new_image_mime: None, - }); - } - - Ok(results) - }) - .await - .map_err(|e| e.to_string())? 
-} - -async fn get_git_log_inner( - workspaces: &Mutex>, - workspace_id: String, - limit: Option, -) -> Result { - let entry = workspace_entry_for_id(workspaces, &workspace_id).await?; - let repo_root = resolve_git_root(&entry)?; - let repo = Repository::open(&repo_root).map_err(|e| e.to_string())?; - let max_items = limit.unwrap_or(40); - let mut revwalk = repo.revwalk().map_err(|e| e.to_string())?; - revwalk.push_head().map_err(|e| e.to_string())?; - revwalk.set_sorting(Sort::TIME).map_err(|e| e.to_string())?; - - let mut total = 0usize; - for oid_result in revwalk { - oid_result.map_err(|e| e.to_string())?; - total += 1; - } - - let mut revwalk = repo.revwalk().map_err(|e| e.to_string())?; - revwalk.push_head().map_err(|e| e.to_string())?; - revwalk.set_sorting(Sort::TIME).map_err(|e| e.to_string())?; - - let mut entries = Vec::new(); - for oid_result in revwalk.take(max_items) { - let oid = oid_result.map_err(|e| e.to_string())?; - let commit = repo.find_commit(oid).map_err(|e| e.to_string())?; - entries.push(commit_to_entry(commit)); - } - - let mut ahead = 0usize; - let mut behind = 0usize; - let mut ahead_entries = Vec::new(); - let mut behind_entries = Vec::new(); - let mut upstream = None; - - if let Ok(head) = repo.head() { - if head.is_branch() { - if let Some(branch_name) = head.shorthand() { - if let Ok(branch) = repo.find_branch(branch_name, BranchType::Local) { - if let Ok(upstream_branch) = branch.upstream() { - let upstream_ref = upstream_branch.get(); - upstream = upstream_ref - .shorthand() - .map(|name| name.to_string()) - .or_else(|| upstream_ref.name().map(|name| name.to_string())); - if let (Some(head_oid), Some(upstream_oid)) = - (head.target(), upstream_ref.target()) - { - let (ahead_count, behind_count) = repo - .graph_ahead_behind(head_oid, upstream_oid) - .map_err(|e| e.to_string())?; - ahead = ahead_count; - behind = behind_count; - - let mut revwalk = repo.revwalk().map_err(|e| e.to_string())?; - revwalk.push(head_oid).map_err(|e| e.to_string())?; - revwalk.hide(upstream_oid).map_err(|e| e.to_string())?; - revwalk.set_sorting(Sort::TIME).map_err(|e| e.to_string())?; - for oid_result in revwalk.take(max_items) { - let oid = oid_result.map_err(|e| e.to_string())?; - let commit = repo.find_commit(oid).map_err(|e| e.to_string())?; - ahead_entries.push(commit_to_entry(commit)); - } - - let mut revwalk = repo.revwalk().map_err(|e| e.to_string())?; - revwalk.push(upstream_oid).map_err(|e| e.to_string())?; - revwalk.hide(head_oid).map_err(|e| e.to_string())?; - revwalk.set_sorting(Sort::TIME).map_err(|e| e.to_string())?; - for oid_result in revwalk.take(max_items) { - let oid = oid_result.map_err(|e| e.to_string())?; - let commit = repo.find_commit(oid).map_err(|e| e.to_string())?; - behind_entries.push(commit_to_entry(commit)); - } - } - } - } - } - } - } - - Ok(GitLogResponse { - total, - entries, - ahead, - behind, - ahead_entries, - behind_entries, - upstream, - }) -} - -async fn get_git_commit_diff_inner( - workspaces: &Mutex>, - app_settings: &Mutex, - workspace_id: String, - sha: String, -) -> Result, String> { - let entry = workspace_entry_for_id(workspaces, &workspace_id).await?; - - let ignore_whitespace_changes = { - let settings = app_settings.lock().await; - settings.git_diff_ignore_whitespace_changes - }; - - let repo_root = resolve_git_root(&entry)?; - let repo = Repository::open(&repo_root).map_err(|e| e.to_string())?; - let oid = git2::Oid::from_str(&sha).map_err(|e| e.to_string())?; - let commit = repo.find_commit(oid).map_err(|e| e.to_string())?; - 
let commit_tree = commit.tree().map_err(|e| e.to_string())?; - let parent_tree = commit.parent(0).ok().and_then(|parent| parent.tree().ok()); - - let mut options = DiffOptions::new(); - options.ignore_whitespace_change(ignore_whitespace_changes); - let diff = repo - .diff_tree_to_tree(parent_tree.as_ref(), Some(&commit_tree), Some(&mut options)) - .map_err(|e| e.to_string())?; - - let mut results = Vec::new(); - for (index, delta) in diff.deltas().enumerate() { - let old_path = delta.old_file().path(); - let new_path = delta.new_file().path(); - let display_path = new_path.or(old_path); - let Some(display_path) = display_path else { - continue; - }; - let old_path_str = old_path.map(|path| path.to_string_lossy()); - let new_path_str = new_path.map(|path| path.to_string_lossy()); - let display_path_str = display_path.to_string_lossy(); - let normalized_path = normalize_git_path(&display_path_str); - let old_image_mime = old_path_str.as_deref().and_then(image_mime_type); - let new_image_mime = new_path_str.as_deref().and_then(image_mime_type); - let is_image = old_image_mime.is_some() || new_image_mime.is_some(); - let is_deleted = delta.status() == git2::Delta::Deleted; - let is_added = delta.status() == git2::Delta::Added; - - let old_lines = if !is_added { - parent_tree - .as_ref() - .and_then(|tree| old_path.and_then(|path| tree.get_path(path).ok())) - .and_then(|entry| repo.find_blob(entry.id()).ok()) - .and_then(blob_to_lines) - } else { - None - }; - - let new_lines = if !is_deleted { - new_path - .and_then(|path| commit_tree.get_path(path).ok()) - .and_then(|entry| repo.find_blob(entry.id()).ok()) - .and_then(blob_to_lines) - } else { - None - }; - - if is_image { - let old_image_data = if !is_added && old_image_mime.is_some() { - parent_tree - .as_ref() - .and_then(|tree| old_path.and_then(|path| tree.get_path(path).ok())) - .and_then(|entry| repo.find_blob(entry.id()).ok()) - .and_then(blob_to_base64) - } else { - None - }; - - let new_image_data = if !is_deleted && new_image_mime.is_some() { - new_path - .and_then(|path| commit_tree.get_path(path).ok()) - .and_then(|entry| repo.find_blob(entry.id()).ok()) - .and_then(blob_to_base64) - } else { - None - }; - - results.push(GitCommitDiff { - path: normalized_path, - status: status_for_delta(delta.status()).to_string(), - diff: String::new(), - old_lines: None, - new_lines: None, - is_binary: true, - is_image: true, - old_image_data, - new_image_data, - old_image_mime: old_image_mime.map(str::to_string), - new_image_mime: new_image_mime.map(str::to_string), - }); - continue; - } - - let patch = match git2::Patch::from_diff(&diff, index) { - Ok(patch) => patch, - Err(_) => continue, - }; - let Some(mut patch) = patch else { - continue; - }; - let content = match diff_patch_to_string(&mut patch) { - Ok(content) => content, - Err(_) => continue, - }; - if content.trim().is_empty() { - continue; - } - results.push(GitCommitDiff { - path: normalized_path, - status: status_for_delta(delta.status()).to_string(), - diff: content, - old_lines, - new_lines, - is_binary: false, - is_image: false, - old_image_data: None, - new_image_data: None, - old_image_mime: None, - new_image_mime: None, - }); - } - - Ok(results) -} - -async fn get_git_remote_inner( - workspaces: &Mutex>, - workspace_id: String, -) -> Result, String> { - let entry = workspace_entry_for_id(workspaces, &workspace_id).await?; - let repo_root = resolve_git_root(&entry)?; - let repo = Repository::open(&repo_root).map_err(|e| e.to_string())?; - let remotes = 
repo.remotes().map_err(|e| e.to_string())?; - let name = if remotes.iter().any(|remote| remote == Some("origin")) { - "origin".to_string() - } else { - remotes.iter().flatten().next().unwrap_or("").to_string() - }; - if name.is_empty() { - return Ok(None); - } - let remote = repo.find_remote(&name).map_err(|e| e.to_string())?; - Ok(remote.url().map(|url| url.to_string())) -} - -async fn get_github_issues_inner( - workspaces: &Mutex>, - workspace_id: String, -) -> Result { - let entry = workspace_entry_for_id(workspaces, &workspace_id).await?; - let repo_root = resolve_git_root(&entry)?; - let repo_name = github_repo_from_path(&repo_root)?; - - let output = tokio_command("gh") - .args([ - "issue", - "list", - "--repo", - &repo_name, - "--limit", - "50", - "--json", - "number,title,url,updatedAt", - ]) - .current_dir(&repo_root) - .output() - .await - .map_err(|e| format!("Failed to run gh: {e}"))?; - - if !output.status.success() { - let stderr = String::from_utf8_lossy(&output.stderr); - let stdout = String::from_utf8_lossy(&output.stdout); - let detail = if stderr.trim().is_empty() { - stdout.trim() - } else { - stderr.trim() - }; - if detail.is_empty() { - return Err("GitHub CLI command failed.".to_string()); - } - return Err(detail.to_string()); - } - - let issues: Vec = - serde_json::from_slice(&output.stdout).map_err(|e| e.to_string())?; - - let search_query = format!("repo:{repo_name} is:issue is:open").replace(' ', "+"); - let total = match tokio_command("gh") - .args([ - "api", - &format!("/search/issues?q={search_query}"), - "--jq", - ".total_count", - ]) - .current_dir(&repo_root) - .output() - .await - { - Ok(output) if output.status.success() => String::from_utf8_lossy(&output.stdout) - .trim() - .parse::() - .unwrap_or(issues.len()), - _ => issues.len(), - }; - - Ok(GitHubIssuesResponse { total, issues }) -} - -async fn get_github_pull_requests_inner( - workspaces: &Mutex>, - workspace_id: String, -) -> Result { - let entry = workspace_entry_for_id(workspaces, &workspace_id).await?; - let repo_root = resolve_git_root(&entry)?; - let repo_name = github_repo_from_path(&repo_root)?; - - let output = tokio_command("gh") - .args([ - "pr", - "list", - "--repo", - &repo_name, - "--state", - "open", - "--limit", - "50", - "--json", - "number,title,url,updatedAt,createdAt,body,headRefName,baseRefName,isDraft,author", - ]) - .current_dir(&repo_root) - .output() - .await - .map_err(|e| format!("Failed to run gh: {e}"))?; - - if !output.status.success() { - let stderr = String::from_utf8_lossy(&output.stderr); - let stdout = String::from_utf8_lossy(&output.stdout); - let detail = if stderr.trim().is_empty() { - stdout.trim() - } else { - stderr.trim() - }; - if detail.is_empty() { - return Err("GitHub CLI command failed.".to_string()); - } - return Err(detail.to_string()); - } - - let pull_requests: Vec = - serde_json::from_slice(&output.stdout).map_err(|e| e.to_string())?; - - let search_query = format!("repo:{repo_name} is:pr is:open").replace(' ', "+"); - let total = match tokio_command("gh") - .args([ - "api", - &format!("/search/issues?q={search_query}"), - "--jq", - ".total_count", - ]) - .current_dir(&repo_root) - .output() - .await - { - Ok(output) if output.status.success() => String::from_utf8_lossy(&output.stdout) - .trim() - .parse::() - .unwrap_or(pull_requests.len()), - _ => pull_requests.len(), - }; - - Ok(GitHubPullRequestsResponse { - total, - pull_requests, - }) -} - -async fn get_github_pull_request_diff_inner( - workspaces: &Mutex>, - workspace_id: String, - pr_number: 
u64, -) -> Result, String> { - let entry = workspace_entry_for_id(workspaces, &workspace_id).await?; - let repo_root = resolve_git_root(&entry)?; - let repo_name = github_repo_from_path(&repo_root)?; - - let output = tokio_command("gh") - .args([ - "pr", - "diff", - &pr_number.to_string(), - "--repo", - &repo_name, - "--color", - "never", - ]) - .current_dir(&repo_root) - .output() - .await - .map_err(|e| format!("Failed to run gh: {e}"))?; - - if !output.status.success() { - let stderr = String::from_utf8_lossy(&output.stderr); - let stdout = String::from_utf8_lossy(&output.stdout); - let detail = if stderr.trim().is_empty() { - stdout.trim() - } else { - stderr.trim() - }; - if detail.is_empty() { - return Err("GitHub CLI command failed.".to_string()); - } - return Err(detail.to_string()); - } - - let diff_text = String::from_utf8_lossy(&output.stdout); - Ok(parse_pr_diff(&diff_text)) + context::resolve_repo_root_for_workspace(workspaces, workspace_id).await } -async fn get_github_pull_request_comments_inner( - workspaces: &Mutex>, - workspace_id: String, - pr_number: u64, -) -> Result, String> { - let entry = workspace_entry_for_id(workspaces, &workspace_id).await?; - let repo_root = resolve_git_root(&entry)?; - let repo_name = github_repo_from_path(&repo_root)?; - - let comments_endpoint = format!("/repos/{repo_name}/issues/{pr_number}/comments?per_page=30"); - let jq_filter = r#"[.[] | {id, body, createdAt: .created_at, url: .html_url, author: (if .user then {login: .user.login} else null end)}]"#; - - let output = tokio_command("gh") - .args(["api", &comments_endpoint, "--jq", jq_filter]) - .current_dir(&repo_root) - .output() - .await - .map_err(|e| format!("Failed to run gh: {e}"))?; - - if !output.status.success() { - let stderr = String::from_utf8_lossy(&output.stderr); - let stdout = String::from_utf8_lossy(&output.stdout); - let detail = if stderr.trim().is_empty() { - stdout.trim() - } else { - stderr.trim() - }; - if detail.is_empty() { - return Err("GitHub CLI command failed.".to_string()); - } - return Err(detail.to_string()); - } - - let comments: Vec = - serde_json::from_slice(&output.stdout).map_err(|e| e.to_string())?; - - Ok(comments) +pub(crate) fn collect_workspace_diff_core(repo_root: &Path) -> Result { + diff::collect_workspace_diff(repo_root) } -async fn list_git_branches_inner( +pub(crate) async fn get_git_status_core( workspaces: &Mutex>, workspace_id: String, ) -> Result { - let entry = workspace_entry_for_id(workspaces, &workspace_id).await?; - let repo_root = resolve_git_root(&entry)?; - let repo = Repository::open(&repo_root).map_err(|e| e.to_string())?; - let mut branches = Vec::new(); - let refs = repo - .branches(Some(BranchType::Local)) - .map_err(|e| e.to_string())?; - for branch_result in refs { - let (branch, _) = branch_result.map_err(|e| e.to_string())?; - let name = branch.name().ok().flatten().unwrap_or("").to_string(); - if name.is_empty() { - continue; - } - let last_commit = branch - .get() - .target() - .and_then(|oid| repo.find_commit(oid).ok()) - .map(|commit| commit.time().seconds()) - .unwrap_or(0); - branches.push(BranchInfo { name, last_commit }); - } - branches.sort_by(|a, b| b.last_commit.cmp(&a.last_commit)); - Ok(json!({ "branches": branches })) -} - -async fn checkout_git_branch_inner( - workspaces: &Mutex>, - workspace_id: String, - name: String, -) -> Result<(), String> { - let entry = workspace_entry_for_id(workspaces, &workspace_id).await?; - let repo_root = resolve_git_root(&entry)?; - let repo = 
Repository::open(&repo_root).map_err(|e| e.to_string())?; - checkout_branch(&repo, &name).map_err(|e| e.to_string()) -} - -async fn create_git_branch_inner( - workspaces: &Mutex>, - workspace_id: String, - name: String, -) -> Result<(), String> { - let entry = workspace_entry_for_id(workspaces, &workspace_id).await?; - let repo_root = resolve_git_root(&entry)?; - let repo = Repository::open(&repo_root).map_err(|e| e.to_string())?; - let head = repo.head().map_err(|e| e.to_string())?; - let target = head.peel_to_commit().map_err(|e| e.to_string())?; - repo.branch(&name, &target, false) - .map_err(|e| e.to_string())?; - checkout_branch(&repo, &name).map_err(|e| e.to_string()) + diff::get_git_status_inner(workspaces, workspace_id).await } -pub(crate) async fn resolve_repo_root_for_workspace_core( +pub(crate) async fn init_git_repo_core( workspaces: &Mutex>, workspace_id: String, -) -> Result { - resolve_repo_root_for_workspace(workspaces, workspace_id).await -} - -pub(crate) fn collect_workspace_diff_core(repo_root: &Path) -> Result { - collect_workspace_diff(repo_root) + branch: String, + force: bool, +) -> Result { + commands::init_git_repo_inner(workspaces, workspace_id, branch, force).await } -pub(crate) async fn get_git_status_core( +pub(crate) async fn create_github_repo_core( workspaces: &Mutex>, workspace_id: String, + repo: String, + visibility: String, + branch: Option, ) -> Result { - get_git_status_inner(workspaces, workspace_id).await + commands::create_github_repo_inner(workspaces, workspace_id, repo, visibility, branch).await } pub(crate) async fn list_git_roots_core( @@ -1592,7 +66,7 @@ pub(crate) async fn list_git_roots_core( workspace_id: String, depth: Option, ) -> Result, String> { - list_git_roots_inner(workspaces, workspace_id, depth).await + commands::list_git_roots_inner(workspaces, workspace_id, depth).await } pub(crate) async fn get_git_diffs_core( @@ -1600,7 +74,7 @@ pub(crate) async fn get_git_diffs_core( app_settings: &Mutex, workspace_id: String, ) -> Result, String> { - get_git_diffs_inner(workspaces, app_settings, workspace_id).await + diff::get_git_diffs_inner(workspaces, app_settings, workspace_id).await } pub(crate) async fn get_git_log_core( @@ -1608,7 +82,7 @@ pub(crate) async fn get_git_log_core( workspace_id: String, limit: Option, ) -> Result { - get_git_log_inner(workspaces, workspace_id, limit).await + log::get_git_log_inner(workspaces, workspace_id, limit).await } pub(crate) async fn get_git_commit_diff_core( @@ -1617,14 +91,14 @@ pub(crate) async fn get_git_commit_diff_core( workspace_id: String, sha: String, ) -> Result, String> { - get_git_commit_diff_inner(workspaces, app_settings, workspace_id, sha).await + diff::get_git_commit_diff_inner(workspaces, app_settings, workspace_id, sha).await } pub(crate) async fn get_git_remote_core( workspaces: &Mutex>, workspace_id: String, ) -> Result, String> { - get_git_remote_inner(workspaces, workspace_id).await + log::get_git_remote_inner(workspaces, workspace_id).await } pub(crate) async fn stage_git_file_core( @@ -1632,14 +106,14 @@ pub(crate) async fn stage_git_file_core( workspace_id: String, path: String, ) -> Result<(), String> { - stage_git_file_inner(workspaces, workspace_id, path).await + commands::stage_git_file_inner(workspaces, workspace_id, path).await } pub(crate) async fn stage_git_all_core( workspaces: &Mutex>, workspace_id: String, ) -> Result<(), String> { - stage_git_all_inner(workspaces, workspace_id).await + commands::stage_git_all_inner(workspaces, workspace_id).await } pub(crate) 
async fn unstage_git_file_core( @@ -1647,7 +121,7 @@ pub(crate) async fn unstage_git_file_core( workspace_id: String, path: String, ) -> Result<(), String> { - unstage_git_file_inner(workspaces, workspace_id, path).await + commands::unstage_git_file_inner(workspaces, workspace_id, path).await } pub(crate) async fn revert_git_file_core( @@ -1655,14 +129,14 @@ pub(crate) async fn revert_git_file_core( workspace_id: String, path: String, ) -> Result<(), String> { - revert_git_file_inner(workspaces, workspace_id, path).await + commands::revert_git_file_inner(workspaces, workspace_id, path).await } pub(crate) async fn revert_git_all_core( workspaces: &Mutex>, workspace_id: String, ) -> Result<(), String> { - revert_git_all_inner(workspaces, workspace_id).await + commands::revert_git_all_inner(workspaces, workspace_id).await } pub(crate) async fn commit_git_core( @@ -1670,49 +144,49 @@ pub(crate) async fn commit_git_core( workspace_id: String, message: String, ) -> Result<(), String> { - commit_git_inner(workspaces, workspace_id, message).await + commands::commit_git_inner(workspaces, workspace_id, message).await } pub(crate) async fn push_git_core( workspaces: &Mutex>, workspace_id: String, ) -> Result<(), String> { - push_git_inner(workspaces, workspace_id).await + commands::push_git_inner(workspaces, workspace_id).await } pub(crate) async fn pull_git_core( workspaces: &Mutex>, workspace_id: String, ) -> Result<(), String> { - pull_git_inner(workspaces, workspace_id).await + commands::pull_git_inner(workspaces, workspace_id).await } pub(crate) async fn fetch_git_core( workspaces: &Mutex>, workspace_id: String, ) -> Result<(), String> { - fetch_git_inner(workspaces, workspace_id).await + commands::fetch_git_inner(workspaces, workspace_id).await } pub(crate) async fn sync_git_core( workspaces: &Mutex>, workspace_id: String, ) -> Result<(), String> { - sync_git_inner(workspaces, workspace_id).await + commands::sync_git_inner(workspaces, workspace_id).await } pub(crate) async fn get_github_issues_core( workspaces: &Mutex>, workspace_id: String, ) -> Result { - get_github_issues_inner(workspaces, workspace_id).await + github::get_github_issues_inner(workspaces, workspace_id).await } pub(crate) async fn get_github_pull_requests_core( workspaces: &Mutex>, workspace_id: String, ) -> Result { - get_github_pull_requests_inner(workspaces, workspace_id).await + github::get_github_pull_requests_inner(workspaces, workspace_id).await } pub(crate) async fn get_github_pull_request_diff_core( @@ -1720,7 +194,7 @@ pub(crate) async fn get_github_pull_request_diff_core( workspace_id: String, pr_number: u64, ) -> Result, String> { - get_github_pull_request_diff_inner(workspaces, workspace_id, pr_number).await + github::get_github_pull_request_diff_inner(workspaces, workspace_id, pr_number).await } pub(crate) async fn get_github_pull_request_comments_core( @@ -1728,14 +202,14 @@ pub(crate) async fn get_github_pull_request_comments_core( workspace_id: String, pr_number: u64, ) -> Result, String> { - get_github_pull_request_comments_inner(workspaces, workspace_id, pr_number).await + github::get_github_pull_request_comments_inner(workspaces, workspace_id, pr_number).await } pub(crate) async fn list_git_branches_core( workspaces: &Mutex>, workspace_id: String, ) -> Result { - list_git_branches_inner(workspaces, workspace_id).await + commands::list_git_branches_inner(workspaces, workspace_id).await } pub(crate) async fn checkout_git_branch_core( @@ -1743,7 +217,7 @@ pub(crate) async fn checkout_git_branch_core( 
workspace_id: String, name: String, ) -> Result<(), String> { - checkout_git_branch_inner(workspaces, workspace_id, name).await + commands::checkout_git_branch_inner(workspaces, workspace_id, name).await } pub(crate) async fn create_git_branch_core( @@ -1751,396 +225,5 @@ pub(crate) async fn create_git_branch_core( workspace_id: String, name: String, ) -> Result<(), String> { - create_git_branch_inner(workspaces, workspace_id, name).await -} - -#[cfg(test)] -mod tests { - use super::*; - use crate::types::{WorkspaceKind, WorkspaceSettings}; - use std::fs; - use std::path::Path; - use tokio::runtime::Runtime; - - fn create_temp_repo() -> (PathBuf, Repository) { - let root = - std::env::temp_dir().join(format!("codex-monitor-test-{}", uuid::Uuid::new_v4())); - fs::create_dir_all(&root).expect("create temp repo root"); - let repo = Repository::init(&root).expect("init repo"); - (root, repo) - } - - #[test] - fn collect_workspace_diff_prefers_staged_changes() { - let (root, repo) = create_temp_repo(); - let file_path = root.join("staged.txt"); - fs::write(&file_path, "staged\n").expect("write staged file"); - let mut index = repo.index().expect("index"); - index.add_path(Path::new("staged.txt")).expect("add path"); - index.write().expect("write index"); - - let diff = collect_workspace_diff(&root).expect("collect diff"); - assert!(diff.contains("staged.txt")); - assert!(diff.contains("staged")); - } - - #[test] - fn collect_workspace_diff_falls_back_to_workdir() { - let (root, _repo) = create_temp_repo(); - let file_path = root.join("unstaged.txt"); - fs::write(&file_path, "unstaged\n").expect("write unstaged file"); - - let diff = collect_workspace_diff(&root).expect("collect diff"); - assert!(diff.contains("unstaged.txt")); - assert!(diff.contains("unstaged")); - } - - #[test] - fn action_paths_for_file_expands_renames() { - let (root, repo) = create_temp_repo(); - fs::write(root.join("a.txt"), "hello\n").expect("write file"); - - let mut index = repo.index().expect("repo index"); - index.add_path(Path::new("a.txt")).expect("add path"); - let tree_id = index.write_tree().expect("write tree"); - let tree = repo.find_tree(tree_id).expect("find tree"); - let sig = git2::Signature::now("Test", "test@example.com").expect("signature"); - repo.commit(Some("HEAD"), &sig, &sig, "init", &tree, &[]) - .expect("commit"); - - fs::rename(root.join("a.txt"), root.join("b.txt")).expect("rename file"); - - let mut index = repo.index().expect("repo index"); - index - .remove_path(Path::new("a.txt")) - .expect("remove old path"); - index.add_path(Path::new("b.txt")).expect("add new path"); - index.write().expect("write index"); - - let paths = action_paths_for_file(&root, "b.txt"); - assert_eq!(paths, vec!["a.txt".to_string(), "b.txt".to_string()]); - } - - #[test] - fn get_git_status_omits_global_ignored_paths() { - let (root, repo) = create_temp_repo(); - fs::write(root.join("tracked.txt"), "tracked\n").expect("write tracked file"); - let mut index = repo.index().expect("repo index"); - index.add_path(Path::new("tracked.txt")).expect("add path"); - let tree_id = index.write_tree().expect("write tree"); - let tree = repo.find_tree(tree_id).expect("find tree"); - let sig = git2::Signature::now("Test", "test@example.com").expect("signature"); - repo.commit(Some("HEAD"), &sig, &sig, "init", &tree, &[]) - .expect("commit"); - - let excludes_path = root.join("global-excludes.txt"); - fs::write(&excludes_path, "ignored_root\n").expect("write excludes file"); - let mut config = repo.config().expect("repo config"); - 
config - .set_str( - "core.excludesfile", - excludes_path.to_string_lossy().as_ref(), - ) - .expect("set core.excludesfile"); - - let ignored_path = root.join("ignored_root/example/foo/bar.txt"); - fs::create_dir_all(ignored_path.parent().expect("parent")).expect("create ignored dir"); - fs::write(&ignored_path, "ignored\n").expect("write ignored file"); - - let workspace = WorkspaceEntry { - id: "w1".to_string(), - name: "w1".to_string(), - path: root.to_string_lossy().to_string(), - codex_bin: None, - kind: WorkspaceKind::Main, - parent_id: None, - worktree: None, - settings: WorkspaceSettings::default(), - }; - let mut entries = HashMap::new(); - entries.insert("w1".to_string(), workspace); - let workspaces = Mutex::new(entries); - - let runtime = Runtime::new().expect("create tokio runtime"); - let status = runtime - .block_on(get_git_status_inner(&workspaces, "w1".to_string())) - .expect("get git status"); - - let has_ignored = status - .get("unstagedFiles") - .and_then(Value::as_array) - .into_iter() - .flatten() - .filter_map(|entry| entry.get("path").and_then(Value::as_str)) - .any(|path| path.starts_with("ignored_root/example/foo/bar")); - assert!(!has_ignored, "ignored files should not appear in unstagedFiles"); - } - - #[test] - fn get_git_diffs_omits_global_ignored_paths() { - let (root, repo) = create_temp_repo(); - fs::write(root.join("tracked.txt"), "tracked\n").expect("write tracked file"); - let mut index = repo.index().expect("repo index"); - index.add_path(Path::new("tracked.txt")).expect("add path"); - let tree_id = index.write_tree().expect("write tree"); - let tree = repo.find_tree(tree_id).expect("find tree"); - let sig = git2::Signature::now("Test", "test@example.com").expect("signature"); - repo.commit(Some("HEAD"), &sig, &sig, "init", &tree, &[]) - .expect("commit"); - - let excludes_path = root.join("global-excludes.txt"); - fs::write(&excludes_path, "ignored_root\n").expect("write excludes file"); - let mut config = repo.config().expect("repo config"); - config - .set_str( - "core.excludesfile", - excludes_path.to_string_lossy().as_ref(), - ) - .expect("set core.excludesfile"); - - let ignored_path = root.join("ignored_root/example/foo/bar.txt"); - fs::create_dir_all(ignored_path.parent().expect("parent")).expect("create ignored dir"); - fs::write(&ignored_path, "ignored\n").expect("write ignored file"); - - let workspace = WorkspaceEntry { - id: "w1".to_string(), - name: "w1".to_string(), - path: root.to_string_lossy().to_string(), - codex_bin: None, - kind: WorkspaceKind::Main, - parent_id: None, - worktree: None, - settings: WorkspaceSettings::default(), - }; - let mut entries = HashMap::new(); - entries.insert("w1".to_string(), workspace); - let workspaces = Mutex::new(entries); - let app_settings = Mutex::new(AppSettings::default()); - - let runtime = Runtime::new().expect("create tokio runtime"); - let diffs = runtime - .block_on(get_git_diffs_inner( - &workspaces, - &app_settings, - "w1".to_string(), - )) - .expect("get git diffs"); - - let has_ignored = diffs - .iter() - .any(|diff| diff.path.starts_with("ignored_root/example/foo/bar")); - assert!(!has_ignored, "ignored files should not appear in diff list"); - } - - #[test] - fn check_ignore_with_git_respects_negated_rule_for_specific_file() { - let (root, repo) = create_temp_repo(); - - let excludes_path = root.join("global-excludes.txt"); - fs::write(&excludes_path, "ignored_root/*\n!ignored_root/keep.txt\n") - .expect("write excludes file"); - let mut config = repo.config().expect("repo config"); - 
config - .set_str( - "core.excludesfile", - excludes_path.to_string_lossy().as_ref(), - ) - .expect("set core.excludesfile"); - - let kept_path = Path::new("ignored_root/keep.txt"); - assert!( - check_ignore_with_git(&repo, kept_path) == Some(false), - "keep.txt should be visible because of negated rule" - ); - } - - #[test] - fn should_skip_ignored_path_respects_negated_rule_for_specific_file() { - let (root, repo) = create_temp_repo(); - - let excludes_path = root.join("global-excludes.txt"); - fs::write(&excludes_path, "ignored_root/*\n!ignored_root/keep.txt\n") - .expect("write excludes file"); - let mut config = repo.config().expect("repo config"); - config - .set_str( - "core.excludesfile", - excludes_path.to_string_lossy().as_ref(), - ) - .expect("set core.excludesfile"); - - assert!( - !should_skip_ignored_path_with_cache(&repo, Path::new("ignored_root/keep.txt"), None), - "keep.txt should not be skipped when unignored by negated rule" - ); - } - - #[test] - fn should_skip_ignored_path_skips_paths_with_ignored_parent() { - let (root, repo) = create_temp_repo(); - - let excludes_path = root.join("global-excludes.txt"); - fs::write(&excludes_path, "ignored_root\n").expect("write excludes file"); - let mut config = repo.config().expect("repo config"); - config - .set_str( - "core.excludesfile", - excludes_path.to_string_lossy().as_ref(), - ) - .expect("set core.excludesfile"); - - assert!( - should_skip_ignored_path_with_cache( - &repo, - Path::new("ignored_root/example/foo/bar.txt"), - None, - ), - "nested path should be skipped when parent directory is ignored" - ); - } - - #[test] - fn should_skip_ignored_path_keeps_tracked_file_under_ignored_parent_pattern() { - let (root, repo) = create_temp_repo(); - let tracked_path = root.join("ignored_root/tracked.txt"); - fs::create_dir_all(tracked_path.parent().expect("parent")).expect("create tracked dir"); - fs::write(&tracked_path, "tracked\n").expect("write tracked file"); - let mut index = repo.index().expect("repo index"); - index - .add_path(Path::new("ignored_root/tracked.txt")) - .expect("add tracked path"); - index.write().expect("write index"); - let tree_id = index.write_tree().expect("write tree"); - let tree = repo.find_tree(tree_id).expect("find tree"); - let sig = git2::Signature::now("Test", "test@example.com").expect("signature"); - repo.commit(Some("HEAD"), &sig, &sig, "init", &tree, &[]) - .expect("commit"); - - let excludes_path = root.join("global-excludes.txt"); - fs::write(&excludes_path, "ignored_root/*\n").expect("write excludes file"); - let mut config = repo.config().expect("repo config"); - config - .set_str( - "core.excludesfile", - excludes_path.to_string_lossy().as_ref(), - ) - .expect("set core.excludesfile"); - - assert!( - !should_skip_ignored_path_with_cache( - &repo, - Path::new("ignored_root/tracked.txt"), - None, - ), - "tracked file should not be skipped even if ignore pattern matches its path" - ); - } - - #[test] - fn check_ignore_with_git_treats_tracked_file_as_not_ignored() { - let (root, repo) = create_temp_repo(); - let tracked_path = root.join("ignored_root/tracked.txt"); - fs::create_dir_all(tracked_path.parent().expect("parent")).expect("create tracked dir"); - fs::write(&tracked_path, "tracked\n").expect("write tracked file"); - let mut index = repo.index().expect("repo index"); - index - .add_path(Path::new("ignored_root/tracked.txt")) - .expect("add tracked path"); - index.write().expect("write index"); - let tree_id = index.write_tree().expect("write tree"); - let tree = 
repo.find_tree(tree_id).expect("find tree"); - let sig = git2::Signature::now("Test", "test@example.com").expect("signature"); - repo.commit(Some("HEAD"), &sig, &sig, "init", &tree, &[]) - .expect("commit"); - - let excludes_path = root.join("global-excludes.txt"); - fs::write(&excludes_path, "ignored_root/*\n").expect("write excludes file"); - let mut config = repo.config().expect("repo config"); - config - .set_str( - "core.excludesfile", - excludes_path.to_string_lossy().as_ref(), - ) - .expect("set core.excludesfile"); - - assert_eq!( - check_ignore_with_git(&repo, Path::new("ignored_root/tracked.txt")), - Some(false), - "git check-ignore should treat tracked files as not ignored" - ); - } - - #[test] - fn should_skip_ignored_path_respects_repo_negation_over_global_ignore() { - let (root, repo) = create_temp_repo(); - - fs::write(root.join(".gitignore"), "!keep.log\n").expect("write repo gitignore"); - let excludes_path = root.join("global-excludes.txt"); - fs::write(&excludes_path, "*.log\n").expect("write excludes file"); - let mut config = repo.config().expect("repo config"); - config - .set_str( - "core.excludesfile", - excludes_path.to_string_lossy().as_ref(), - ) - .expect("set core.excludesfile"); - - assert_eq!( - check_ignore_with_git(&repo, Path::new("keep.log")), - Some(false), - "repo negation should override global ignore for keep.log" - ); - assert!( - !should_skip_ignored_path_with_cache(&repo, Path::new("keep.log"), None), - "keep.log should remain visible when repo .gitignore negates global ignore" - ); - } - - #[test] - fn collect_ignored_paths_with_git_checks_multiple_paths_in_one_call() { - let (root, repo) = create_temp_repo(); - let excludes_path = root.join("global-excludes.txt"); - fs::write(&excludes_path, "ignored_root\n").expect("write excludes file"); - let mut config = repo.config().expect("repo config"); - config - .set_str( - "core.excludesfile", - excludes_path.to_string_lossy().as_ref(), - ) - .expect("set core.excludesfile"); - - let ignored_path = PathBuf::from("ignored_root/example/foo/bar.txt"); - let visible_path = PathBuf::from("visible.txt"); - let ignored_paths = collect_ignored_paths_with_git( - &repo, - &[ignored_path.clone(), visible_path.clone()], - ) - .expect("collect ignored paths"); - - assert!(ignored_paths.contains(&ignored_path)); - assert!(!ignored_paths.contains(&visible_path)); - } - - #[test] - fn collect_ignored_paths_with_git_handles_large_ignored_output() { - let (root, repo) = create_temp_repo(); - let excludes_path = root.join("global-excludes.txt"); - fs::write(&excludes_path, "ignored_root\n").expect("write excludes file"); - let mut config = repo.config().expect("repo config"); - config - .set_str( - "core.excludesfile", - excludes_path.to_string_lossy().as_ref(), - ) - .expect("set core.excludesfile"); - - let total = 6000usize; - let paths: Vec = (0..total) - .map(|i| PathBuf::from(format!("ignored_root/deep/path/file-{i}.txt"))) - .collect(); - let ignored_paths = - collect_ignored_paths_with_git(&repo, &paths).expect("collect ignored paths"); - - assert_eq!(ignored_paths.len(), total); - } + commands::create_git_branch_inner(workspaces, workspace_id, name).await } diff --git a/src-tauri/src/shared/git_ui_core/commands.rs b/src-tauri/src/shared/git_ui_core/commands.rs new file mode 100644 index 000000000..4c30bf7eb --- /dev/null +++ b/src-tauri/src/shared/git_ui_core/commands.rs @@ -0,0 +1,708 @@ +use std::collections::HashMap; +use std::fs; +use std::path::{Path, PathBuf}; + +use git2::{BranchType, Repository, Status, 
StatusOptions}; +use serde_json::{json, Value}; +use tokio::sync::Mutex; + +use crate::git_utils::{checkout_branch, list_git_roots as scan_git_roots, resolve_git_root}; +use crate::shared::process_core::tokio_command; +use crate::types::{BranchInfo, WorkspaceEntry}; +use crate::utils::{git_env_path, normalize_git_path, resolve_git_binary}; + +use super::context::workspace_entry_for_id; + +async fn run_git_command(repo_root: &Path, args: &[&str]) -> Result<(), String> { + let git_bin = resolve_git_binary().map_err(|e| format!("Failed to run git: {e}"))?; + let output = tokio_command(git_bin) + .args(args) + .current_dir(repo_root) + .env("PATH", git_env_path()) + .output() + .await + .map_err(|e| format!("Failed to run git: {e}"))?; + + if output.status.success() { + return Ok(()); + } + + let stderr = String::from_utf8_lossy(&output.stderr); + let stdout = String::from_utf8_lossy(&output.stdout); + let detail = if stderr.trim().is_empty() { + stdout.trim() + } else { + stderr.trim() + }; + if detail.is_empty() { + return Err("Git command failed.".to_string()); + } + Err(detail.to_string()) +} + +async fn run_gh_command(repo_root: &Path, args: &[&str]) -> Result<(String, String), String> { + let output = tokio_command("gh") + .args(args) + .current_dir(repo_root) + .output() + .await + .map_err(|e| format!("Failed to run gh: {e}"))?; + + let stdout = String::from_utf8_lossy(&output.stdout).to_string(); + let stderr = String::from_utf8_lossy(&output.stderr).to_string(); + if output.status.success() { + return Ok((stdout, stderr)); + } + + let detail = if stderr.trim().is_empty() { + stdout.trim() + } else { + stderr.trim() + }; + if detail.is_empty() { + return Err("GitHub CLI command failed.".to_string()); + } + Err(detail.to_string()) +} + +async fn gh_stdout_trim(repo_root: &Path, args: &[&str]) -> Result { + let (stdout, _) = run_gh_command(repo_root, args).await?; + Ok(stdout.trim().to_string()) +} + +async fn gh_git_protocol(repo_root: &Path) -> String { + gh_stdout_trim(repo_root, &["config", "get", "git_protocol"]) + .await + .unwrap_or_else(|_| "https".to_string()) +} + +fn count_effective_dir_entries(root: &Path) -> Result { + let entries = fs::read_dir(root).map_err(|err| format!("Failed to read directory: {err}"))?; + let mut count = 0usize; + for entry in entries { + let entry = + entry.map_err(|err| format!("Failed to read directory entry in {}: {err}", root.display()))?; + let name = entry.file_name(); + let name = name.to_string_lossy(); + if name == ".git" || name == ".DS_Store" || name == "Thumbs.db" { + continue; + } + count += 1; + } + Ok(count) +} + +fn validate_branch_name(name: &str) -> Result { + let trimmed = name.trim(); + if trimmed.is_empty() { + return Err("Branch name is required.".to_string()); + } + if trimmed == "." || trimmed == ".." { + return Err("Branch name cannot be '.' 
or '..'.".to_string()); + } + if trimmed.chars().any(|ch| ch.is_whitespace()) { + return Err("Branch name cannot contain spaces.".to_string()); + } + if trimmed.starts_with('/') || trimmed.ends_with('/') { + return Err("Branch name cannot start or end with '/'.".to_string()); + } + if trimmed.ends_with(".lock") { + return Err("Branch name cannot end with '.lock'.".to_string()); + } + if trimmed.contains("..") { + return Err("Branch name cannot contain '..'.".to_string()); + } + if trimmed.contains("@{") { + return Err("Branch name cannot contain '@{'.".to_string()); + } + let invalid_chars = ['~', '^', ':', '?', '*', '[', '\\']; + if trimmed.chars().any(|ch| invalid_chars.contains(&ch)) { + return Err("Branch name contains invalid characters.".to_string()); + } + if trimmed.ends_with('.') { + return Err("Branch name cannot end with '.'.".to_string()); + } + Ok(trimmed.to_string()) +} + +fn validate_github_repo_name(value: &str) -> Result { + let trimmed = value.trim(); + if trimmed.is_empty() { + return Err("Repository name is required.".to_string()); + } + if trimmed.chars().any(|ch| ch.is_whitespace()) { + return Err("Repository name cannot contain spaces.".to_string()); + } + if trimmed.starts_with('/') || trimmed.ends_with('/') { + return Err("Repository name cannot start or end with '/'.".to_string()); + } + if trimmed.contains("//") { + return Err("Repository name cannot contain '//'.".to_string()); + } + Ok(trimmed.to_string()) +} + +fn github_repo_exists_message(lower: &str) -> bool { + lower.contains("already exists") + || lower.contains("name already exists") + || lower.contains("has already been taken") + || lower.contains("repository with this name already exists") +} + +fn normalize_repo_full_name(value: &str) -> String { + value + .trim() + .trim_start_matches("https://github.com/") + .trim_start_matches("http://github.com/") + .trim_start_matches("git@github.com:") + .trim_end_matches(".git") + .trim_end_matches('/') + .to_string() +} + +fn git_remote_url(repo_root: &Path, remote_name: &str) -> Option { + let repo = Repository::open(repo_root).ok()?; + let remote = repo.find_remote(remote_name).ok()?; + remote.url().map(|url| url.to_string()) +} + +pub(super) fn action_paths_for_file(repo_root: &Path, path: &str) -> Vec { + let target = normalize_git_path(path).trim().to_string(); + if target.is_empty() { + return Vec::new(); + } + + let repo = match Repository::open(repo_root) { + Ok(repo) => repo, + Err(_) => return vec![target], + }; + + let mut status_options = StatusOptions::new(); + status_options + .include_untracked(true) + .recurse_untracked_dirs(true) + .renames_head_to_index(true) + .renames_index_to_workdir(true) + .include_ignored(false); + + let statuses = match repo.statuses(Some(&mut status_options)) { + Ok(statuses) => statuses, + Err(_) => return vec![target], + }; + + for entry in statuses.iter() { + let status = entry.status(); + if !(status.contains(Status::WT_RENAMED) || status.contains(Status::INDEX_RENAMED)) { + continue; + } + let delta = entry.index_to_workdir().or_else(|| entry.head_to_index()); + let Some(delta) = delta else { + continue; + }; + let (Some(old_path), Some(new_path)) = (delta.old_file().path(), delta.new_file().path()) + else { + continue; + }; + let old_path = normalize_git_path(old_path.to_string_lossy().as_ref()); + let new_path = normalize_git_path(new_path.to_string_lossy().as_ref()); + if old_path != target && new_path != target { + continue; + } + if old_path == new_path || new_path.is_empty() { + return vec![target]; + } + 
let mut result = Vec::new(); + if !old_path.is_empty() { + result.push(old_path); + } + if !new_path.is_empty() && !result.contains(&new_path) { + result.push(new_path); + } + return if result.is_empty() { + vec![target] + } else { + result + }; + } + + vec![target] +} + +fn parse_upstream_ref(name: &str) -> Option<(String, String)> { + let trimmed = name.strip_prefix("refs/remotes/").unwrap_or(name); + let mut parts = trimmed.splitn(2, '/'); + let remote = parts.next()?; + let branch = parts.next()?; + if remote.is_empty() || branch.is_empty() { + return None; + } + Some((remote.to_string(), branch.to_string())) +} + +fn upstream_remote_and_branch(repo_root: &Path) -> Result, String> { + let repo = Repository::open(repo_root).map_err(|e| e.to_string())?; + let head = match repo.head() { + Ok(head) => head, + Err(_) => return Ok(None), + }; + if !head.is_branch() { + return Ok(None); + } + let branch_name = match head.shorthand() { + Some(name) => name, + None => return Ok(None), + }; + let branch = repo + .find_branch(branch_name, BranchType::Local) + .map_err(|e| e.to_string())?; + let upstream_branch = match branch.upstream() { + Ok(upstream) => upstream, + Err(_) => return Ok(None), + }; + let upstream_ref = upstream_branch.get(); + let upstream_name = upstream_ref.name().or_else(|| upstream_ref.shorthand()); + Ok(upstream_name.and_then(parse_upstream_ref)) +} + +async fn push_with_upstream(repo_root: &Path) -> Result<(), String> { + let upstream = upstream_remote_and_branch(repo_root)?; + if let Some((remote, branch)) = upstream { + let _ = run_git_command(repo_root, &["fetch", "--prune", remote.as_str()]).await; + let refspec = format!("HEAD:{branch}"); + return run_git_command(repo_root, &["push", remote.as_str(), refspec.as_str()]).await; + } + run_git_command(repo_root, &["push"]).await +} + +async fn fetch_with_default_remote(repo_root: &Path) -> Result<(), String> { + let upstream = upstream_remote_and_branch(repo_root)?; + if let Some((remote, _)) = upstream { + return run_git_command(repo_root, &["fetch", "--prune", remote.as_str()]).await; + } + run_git_command(repo_root, &["fetch", "--prune"]).await +} + +async fn pull_with_default_strategy(repo_root: &Path) -> Result<(), String> { + fn autostash_unsupported(lower: &str) -> bool { + lower.contains("unknown option") && lower.contains("autostash") + } + + fn needs_reconcile_strategy(lower: &str) -> bool { + lower.contains("need to specify how to reconcile divergent branches") + || lower.contains("you have divergent branches") + } + + match run_git_command(repo_root, &["pull", "--autostash"]).await { + Ok(()) => Ok(()), + Err(err) => { + let lower = err.to_lowercase(); + if autostash_unsupported(&lower) { + match run_git_command(repo_root, &["pull"]).await { + Ok(()) => Ok(()), + Err(no_autostash_err) => { + let no_autostash_lower = no_autostash_err.to_lowercase(); + if needs_reconcile_strategy(&no_autostash_lower) { + return run_git_command(repo_root, &["pull", "--no-rebase"]).await; + } + Err(no_autostash_err) + } + } + } else if needs_reconcile_strategy(&lower) { + match run_git_command(repo_root, &["pull", "--no-rebase", "--autostash"]).await { + Ok(()) => Ok(()), + Err(merge_err) => { + let merge_lower = merge_err.to_lowercase(); + if autostash_unsupported(&merge_lower) { + return run_git_command(repo_root, &["pull", "--no-rebase"]).await; + } + Err(merge_err) + } + } + } else { + Err(err) + } + } + } +} + +pub(super) async fn stage_git_file_inner( + workspaces: &Mutex>, + workspace_id: String, + path: String, +) -> 
Result<(), String> {
+    let entry = workspace_entry_for_id(workspaces, &workspace_id).await?;
+    let repo_root = resolve_git_root(&entry)?;
+    for path in action_paths_for_file(&repo_root, &path) {
+        run_git_command(&repo_root, &["add", "-A", "--", &path]).await?;
+    }
+    Ok(())
+}
+
+pub(super) async fn stage_git_all_inner(
+    workspaces: &Mutex<HashMap<String, WorkspaceEntry>>,
+    workspace_id: String,
+) -> Result<(), String> {
+    let entry = workspace_entry_for_id(workspaces, &workspace_id).await?;
+    let repo_root = resolve_git_root(&entry)?;
+    run_git_command(&repo_root, &["add", "-A"]).await
+}
+
+pub(super) async fn unstage_git_file_inner(
+    workspaces: &Mutex<HashMap<String, WorkspaceEntry>>,
+    workspace_id: String,
+    path: String,
+) -> Result<(), String> {
+    let entry = workspace_entry_for_id(workspaces, &workspace_id).await?;
+    let repo_root = resolve_git_root(&entry)?;
+    for path in action_paths_for_file(&repo_root, &path) {
+        run_git_command(&repo_root, &["restore", "--staged", "--", &path]).await?;
+    }
+    Ok(())
+}
+
+pub(super) async fn revert_git_file_inner(
+    workspaces: &Mutex<HashMap<String, WorkspaceEntry>>,
+    workspace_id: String,
+    path: String,
+) -> Result<(), String> {
+    let entry = workspace_entry_for_id(workspaces, &workspace_id).await?;
+    let repo_root = resolve_git_root(&entry)?;
+    for path in action_paths_for_file(&repo_root, &path) {
+        if run_git_command(
+            &repo_root,
+            &["restore", "--staged", "--worktree", "--", &path],
+        )
+        .await
+        .is_ok()
+        {
+            continue;
+        }
+        run_git_command(&repo_root, &["clean", "-f", "--", &path]).await?;
+    }
+    Ok(())
+}
+
+pub(super) async fn revert_git_all_inner(
+    workspaces: &Mutex<HashMap<String, WorkspaceEntry>>,
+    workspace_id: String,
+) -> Result<(), String> {
+    let entry = workspace_entry_for_id(workspaces, &workspace_id).await?;
+    let repo_root = resolve_git_root(&entry)?;
+    run_git_command(
+        &repo_root,
+        &["restore", "--staged", "--worktree", "--", "."],
+    )
+    .await?;
+    run_git_command(&repo_root, &["clean", "-f", "-d"]).await
+}
+
+pub(super) async fn commit_git_inner(
+    workspaces: &Mutex<HashMap<String, WorkspaceEntry>>,
+    workspace_id: String,
+    message: String,
+) -> Result<(), String> {
+    let entry = workspace_entry_for_id(workspaces, &workspace_id).await?;
+    let repo_root = resolve_git_root(&entry)?;
+    run_git_command(&repo_root, &["commit", "-m", &message]).await
+}
+
+pub(super) async fn push_git_inner(
+    workspaces: &Mutex<HashMap<String, WorkspaceEntry>>,
+    workspace_id: String,
+) -> Result<(), String> {
+    let entry = workspace_entry_for_id(workspaces, &workspace_id).await?;
+    let repo_root = resolve_git_root(&entry)?;
+    push_with_upstream(&repo_root).await
+}
+
+pub(super) async fn pull_git_inner(
+    workspaces: &Mutex<HashMap<String, WorkspaceEntry>>,
+    workspace_id: String,
+) -> Result<(), String> {
+    let entry = workspace_entry_for_id(workspaces, &workspace_id).await?;
+    let repo_root = resolve_git_root(&entry)?;
+    pull_with_default_strategy(&repo_root).await
+}
+
+pub(super) async fn fetch_git_inner(
+    workspaces: &Mutex<HashMap<String, WorkspaceEntry>>,
+    workspace_id: String,
+) -> Result<(), String> {
+    let entry = workspace_entry_for_id(workspaces, &workspace_id).await?;
+    let repo_root = resolve_git_root(&entry)?;
+    fetch_with_default_remote(&repo_root).await
+}
+
+pub(super) async fn sync_git_inner(
+    workspaces: &Mutex<HashMap<String, WorkspaceEntry>>,
+    workspace_id: String,
+) -> Result<(), String> {
+    let entry = workspace_entry_for_id(workspaces, &workspace_id).await?;
+    let repo_root = resolve_git_root(&entry)?;
+    pull_with_default_strategy(&repo_root).await?;
+    push_with_upstream(&repo_root).await
+}
+
+pub(super) async fn list_git_roots_inner(
+    workspaces: &Mutex<HashMap<String, WorkspaceEntry>>,
+    workspace_id: String,
+    depth: Option,
+) -> Result, String> {
+    let entry =
workspace_entry_for_id(workspaces, &workspace_id).await?; + let root = PathBuf::from(&entry.path); + let depth = depth.unwrap_or(2).clamp(1, 6); + Ok(scan_git_roots(&root, depth, 200)) +} + +pub(super) async fn init_git_repo_inner( + workspaces: &Mutex>, + workspace_id: String, + branch: String, + force: bool, +) -> Result { + const INITIAL_COMMIT_MESSAGE: &str = "Initial commit"; + + let entry = workspace_entry_for_id(workspaces, &workspace_id).await?; + let repo_root = resolve_git_root(&entry)?; + let branch = validate_branch_name(&branch)?; + + if Repository::open(&repo_root).is_ok() { + return Ok(json!({ "status": "already_initialized" })); + } + + if !force { + let entry_count = count_effective_dir_entries(&repo_root)?; + if entry_count > 0 { + return Ok(json!({ "status": "needs_confirmation", "entryCount": entry_count })); + } + } + + let init_with_branch = + run_git_command(&repo_root, &["init", "--initial-branch", branch.as_str()]).await; + + if let Err(error) = init_with_branch { + let lower = error.to_lowercase(); + let unsupported = lower.contains("initial-branch") + && (lower.contains("unknown option") + || lower.contains("unrecognized option") + || lower.contains("unknown switch") + || lower.contains("usage:")); + if !unsupported { + return Err(error); + } + + run_git_command(&repo_root, &["init"]).await?; + let head_ref = format!("refs/heads/{branch}"); + run_git_command(&repo_root, &["symbolic-ref", "HEAD", head_ref.as_str()]).await?; + } + + let commit_error = match run_git_command(&repo_root, &["add", "-A"]).await { + Ok(()) => match run_git_command( + &repo_root, + &["commit", "--allow-empty", "-m", INITIAL_COMMIT_MESSAGE], + ) + .await + { + Ok(()) => None, + Err(err) => Some(err), + }, + Err(err) => Some(err), + }; + + if let Some(commit_error) = commit_error { + return Ok(json!({ "status": "initialized", "commitError": commit_error })); + } + + Ok(json!({ "status": "initialized" })) +} + +pub(super) async fn create_github_repo_inner( + workspaces: &Mutex>, + workspace_id: String, + repo: String, + visibility: String, + branch: Option, +) -> Result { + let entry = workspace_entry_for_id(workspaces, &workspace_id).await?; + let repo_root = resolve_git_root(&entry)?; + let repo = normalize_repo_full_name(&validate_github_repo_name(&repo)?); + + let visibility_flag = match visibility.trim() { + "private" => "--private", + "public" => "--public", + other => return Err(format!("Invalid repo visibility: {other}")), + }; + + let local_repo = Repository::open(&repo_root) + .map_err(|_| "Git is not initialized in this folder yet.".to_string())?; + let origin_url_before = local_repo + .find_remote("origin") + .ok() + .and_then(|remote| remote.url().map(|url| url.to_string())); + + let full_name = if repo.contains('/') { + repo + } else { + let owner = gh_stdout_trim(&repo_root, &["api", "user", "--jq", ".login"]).await?; + if owner.trim().is_empty() { + return Err("Failed to determine GitHub username.".to_string()); + } + format!("{owner}/{repo}") + }; + + if origin_url_before.is_none() { + let create_result = run_gh_command( + &repo_root, + &[ + "repo", + "create", + &full_name, + visibility_flag, + "--source=.", + "--remote=origin", + ], + ) + .await; + + if let Err(error) = create_result { + let lower = error.to_lowercase(); + if !github_repo_exists_message(&lower) { + return Err(error); + } + } + } + + if git_remote_url(&repo_root, "origin").is_none() { + let protocol = gh_git_protocol(&repo_root).await; + let jq_field = if protocol.trim() == "ssh" { + ".sshUrl" + } else { + 
".httpsUrl" + }; + let remote_url = gh_stdout_trim( + &repo_root, + &[ + "repo", + "view", + &full_name, + "--json", + "sshUrl,httpsUrl", + "--jq", + jq_field, + ], + ) + .await?; + if remote_url.trim().is_empty() { + return Err("Failed to resolve GitHub remote URL.".to_string()); + } + run_git_command(&repo_root, &["remote", "add", "origin", remote_url.trim()]).await?; + } + + let remote_url = git_remote_url(&repo_root, "origin"); + let push_result = run_git_command(&repo_root, &["push", "-u", "origin", "HEAD"]).await; + + let default_branch = if let Some(branch) = branch { + Some(validate_branch_name(&branch)?) + } else { + let repo = Repository::open(&repo_root).map_err(|e| e.to_string())?; + let head = repo.head().ok(); + let name = head + .as_ref() + .filter(|head| head.is_branch()) + .and_then(|head| head.shorthand()) + .map(str::to_string); + name.and_then(|name| validate_branch_name(&name).ok()) + }; + + let default_branch_result = if let Some(branch) = default_branch.as_deref() { + run_gh_command( + &repo_root, + &[ + "api", + "-X", + "PATCH", + &format!("/repos/{full_name}"), + "-f", + &format!("default_branch={branch}"), + ], + ) + .await + .map(|_| ()) + } else { + Ok(()) + }; + + let push_error = push_result.err(); + let default_branch_error = default_branch_result.err(); + + if push_error.is_some() || default_branch_error.is_some() { + return Ok(json!({ + "status": "partial", + "repo": full_name, + "remoteUrl": remote_url, + "pushError": push_error, + "defaultBranchError": default_branch_error, + })); + } + + Ok(json!({ + "status": "ok", + "repo": full_name, + "remoteUrl": remote_url, + })) +} + +pub(super) async fn list_git_branches_inner( + workspaces: &Mutex>, + workspace_id: String, +) -> Result { + let entry = workspace_entry_for_id(workspaces, &workspace_id).await?; + let repo_root = resolve_git_root(&entry)?; + let repo = Repository::open(&repo_root).map_err(|e| e.to_string())?; + let mut branches = Vec::new(); + let refs = repo + .branches(Some(BranchType::Local)) + .map_err(|e| e.to_string())?; + for branch_result in refs { + let (branch, _) = branch_result.map_err(|e| e.to_string())?; + let name = branch.name().ok().flatten().unwrap_or("").to_string(); + if name.is_empty() { + continue; + } + let last_commit = branch + .get() + .target() + .and_then(|oid| repo.find_commit(oid).ok()) + .map(|commit| commit.time().seconds()) + .unwrap_or(0); + branches.push(BranchInfo { name, last_commit }); + } + branches.sort_by(|a, b| b.last_commit.cmp(&a.last_commit)); + Ok(json!({ "branches": branches })) +} + +pub(super) async fn checkout_git_branch_inner( + workspaces: &Mutex>, + workspace_id: String, + name: String, +) -> Result<(), String> { + let entry = workspace_entry_for_id(workspaces, &workspace_id).await?; + let repo_root = resolve_git_root(&entry)?; + let repo = Repository::open(&repo_root).map_err(|e| e.to_string())?; + checkout_branch(&repo, &name).map_err(|e| e.to_string()) +} + +pub(super) async fn create_git_branch_inner( + workspaces: &Mutex>, + workspace_id: String, + name: String, +) -> Result<(), String> { + let entry = workspace_entry_for_id(workspaces, &workspace_id).await?; + let repo_root = resolve_git_root(&entry)?; + let repo = Repository::open(&repo_root).map_err(|e| e.to_string())?; + let head = repo.head().map_err(|e| e.to_string())?; + let target = head.peel_to_commit().map_err(|e| e.to_string())?; + repo.branch(&name, &target, false) + .map_err(|e| e.to_string())?; + checkout_branch(&repo, &name).map_err(|e| e.to_string()) +} diff --git 
a/src-tauri/src/shared/git_ui_core/context.rs b/src-tauri/src/shared/git_ui_core/context.rs
new file mode 100644
index 000000000..13e02648a
--- /dev/null
+++ b/src-tauri/src/shared/git_ui_core/context.rs
@@ -0,0 +1,26 @@
+use std::collections::HashMap;
+use std::path::PathBuf;
+
+use tokio::sync::Mutex;
+
+use crate::git_utils::resolve_git_root;
+use crate::types::WorkspaceEntry;
+
+pub(super) async fn workspace_entry_for_id(
+    workspaces: &Mutex<HashMap<String, WorkspaceEntry>>,
+    workspace_id: &str,
+) -> Result<WorkspaceEntry, String> {
+    let workspaces = workspaces.lock().await;
+    workspaces
+        .get(workspace_id)
+        .cloned()
+        .ok_or_else(|| "workspace not found".to_string())
+}
+
+pub(super) async fn resolve_repo_root_for_workspace(
+    workspaces: &Mutex<HashMap<String, WorkspaceEntry>>,
+    workspace_id: String,
+) -> Result<PathBuf, String> {
+    let entry = workspace_entry_for_id(workspaces, &workspace_id).await?;
+    resolve_git_root(&entry)
+}
diff --git a/src-tauri/src/shared/git_ui_core/diff.rs b/src-tauri/src/shared/git_ui_core/diff.rs
new file mode 100644
index 000000000..9ed0bc200
--- /dev/null
+++ b/src-tauri/src/shared/git_ui_core/diff.rs
@@ -0,0 +1,746 @@
+use std::collections::{HashMap, HashSet};
+use std::fs;
+use std::io::{Read, Write};
+use std::path::{Path, PathBuf};
+use std::process::Stdio;
+
+use base64::{engine::general_purpose::STANDARD, Engine as _};
+use git2::{DiffOptions, Repository, Status, StatusOptions};
+use serde_json::{json, Value};
+use tokio::sync::Mutex;
+
+use crate::git_utils::{
+    diff_patch_to_string, diff_stats_for_path, image_mime_type, resolve_git_root,
+};
+use crate::types::{AppSettings, GitCommitDiff, GitFileDiff, GitFileStatus, WorkspaceEntry};
+use crate::utils::{git_env_path, normalize_git_path, resolve_git_binary};
+
+use super::context::workspace_entry_for_id;
+
+const INDEX_SKIP_WORKTREE_FLAG: u16 = 0x4000;
+const MAX_IMAGE_BYTES: usize = 10 * 1024 * 1024;
+const MAX_TEXT_DIFF_BYTES: usize = 2 * 1024 * 1024;
+
+fn encode_image_base64(data: &[u8]) -> Option<String> {
+    if data.len() > MAX_IMAGE_BYTES {
+        return None;
+    }
+    Some(STANDARD.encode(data))
+}
+
+fn blob_to_base64(blob: git2::Blob) -> Option<String> {
+    if blob.size() > MAX_IMAGE_BYTES {
+        return None;
+    }
+    encode_image_base64(blob.content())
+}
+
+fn read_image_base64(path: &Path) -> Option<String> {
+    let metadata = fs::metadata(path).ok()?;
+    if metadata.len() > MAX_IMAGE_BYTES as u64 {
+        return None;
+    }
+    let data = fs::read(path).ok()?;
+    encode_image_base64(&data)
+}
+
+fn bytes_look_binary(bytes: &[u8]) -> bool {
+    bytes.iter().take(8192).any(|byte| *byte == 0)
+}
+
+fn split_lines_preserving_newlines(content: &str) -> Vec<String> {
+    if content.is_empty() {
+        return Vec::new();
+    }
+    content
+        .split_inclusive('\n')
+        .map(ToString::to_string)
+        .collect()
+}
+
+fn blob_to_lines(blob: git2::Blob<'_>) -> Option<Vec<String>> {
+    if blob.size() > MAX_TEXT_DIFF_BYTES || blob.is_binary() {
+        return None;
+    }
+    let content = String::from_utf8_lossy(blob.content());
+    Some(split_lines_preserving_newlines(content.as_ref()))
+}
+
+fn read_text_lines(path: &Path) -> Option<Vec<String>> {
+    let metadata = fs::metadata(path).ok()?;
+    if metadata.len() > MAX_TEXT_DIFF_BYTES as u64 {
+        return None;
+    }
+    let data = fs::read(path).ok()?;
+    if bytes_look_binary(&data) {
+        return None;
+    }
+    let content = String::from_utf8_lossy(&data);
+    Some(split_lines_preserving_newlines(content.as_ref()))
+}
+
+fn status_for_index(status: Status) -> Option<&'static str> {
+    if status.contains(Status::INDEX_NEW) {
+        Some("A")
+    } else if status.contains(Status::INDEX_MODIFIED) {
+        Some("M")
+    } else if status.contains(Status::INDEX_DELETED) {
+        Some("D")
+    }
else if status.contains(Status::INDEX_RENAMED) { + Some("R") + } else if status.contains(Status::INDEX_TYPECHANGE) { + Some("T") + } else { + None + } +} + +fn status_for_workdir(status: Status) -> Option<&'static str> { + if status.contains(Status::WT_NEW) { + Some("A") + } else if status.contains(Status::WT_MODIFIED) { + Some("M") + } else if status.contains(Status::WT_DELETED) { + Some("D") + } else if status.contains(Status::WT_RENAMED) { + Some("R") + } else if status.contains(Status::WT_TYPECHANGE) { + Some("T") + } else { + None + } +} + +fn status_for_delta(status: git2::Delta) -> &'static str { + match status { + git2::Delta::Added => "A", + git2::Delta::Modified => "M", + git2::Delta::Deleted => "D", + git2::Delta::Renamed => "R", + git2::Delta::Typechange => "T", + _ => "M", + } +} + +fn has_ignored_parent_directory(repo: &Repository, path: &Path) -> bool { + let mut current = path.parent(); + while let Some(parent) = current { + if parent.as_os_str().is_empty() { + break; + } + let probe = parent.join(".codexmonitor-ignore-probe"); + if repo.status_should_ignore(&probe).unwrap_or(false) { + return true; + } + current = parent.parent(); + } + false +} + +pub(super) fn collect_ignored_paths_with_git( + repo: &Repository, + paths: &[PathBuf], +) -> Option> { + if paths.is_empty() { + return Some(HashSet::new()); + } + + let repo_root = repo.workdir()?; + let git_bin = resolve_git_binary().ok()?; + let mut child = std::process::Command::new(git_bin) + .arg("check-ignore") + .arg("--stdin") + .arg("-z") + .current_dir(repo_root) + .env("PATH", git_env_path()) + .stdin(Stdio::piped()) + .stdout(Stdio::piped()) + .stderr(Stdio::null()) + .spawn() + .ok()?; + + let mut stdout = child.stdout.take()?; + let stdout_thread = std::thread::spawn(move || { + let mut buffer = Vec::new(); + stdout.read_to_end(&mut buffer).ok()?; + Some(buffer) + }); + + let wrote_all_input = { + let mut wrote_all = true; + if let Some(mut stdin) = child.stdin.take() { + for path in paths { + if stdin + .write_all(path.as_os_str().as_encoded_bytes()) + .is_err() + { + wrote_all = false; + break; + } + if stdin.write_all(&[0]).is_err() { + wrote_all = false; + break; + } + } + } else { + wrote_all = false; + } + wrote_all + }; + + if !wrote_all_input { + let _ = child.kill(); + let _ = child.wait(); + let _ = stdout_thread.join(); + return None; + } + + let status = child.wait().ok()?; + let stdout = stdout_thread.join().ok().flatten()?; + match status.code() { + Some(0) | Some(1) => {} + _ => return None, + } + + let mut ignored_paths = HashSet::new(); + for raw in stdout.split(|byte| *byte == 0) { + if raw.is_empty() { + continue; + } + let path = String::from_utf8_lossy(raw); + ignored_paths.insert(PathBuf::from(path.as_ref())); + } + Some(ignored_paths) +} + +pub(super) fn check_ignore_with_git(repo: &Repository, path: &Path) -> Option { + let ignored_paths = collect_ignored_paths_with_git(repo, &[path.to_path_buf()])?; + Some(ignored_paths.contains(path)) +} + +fn is_tracked_path(repo: &Repository, path: &Path) -> bool { + if let Ok(index) = repo.index() { + if index.get_path(path, 0).is_some() { + return true; + } + } + if let Ok(head) = repo.head() { + if let Ok(tree) = head.peel_to_tree() { + if tree.get_path(path).is_ok() { + return true; + } + } + } + false +} + +pub(super) fn should_skip_ignored_path_with_cache( + repo: &Repository, + path: &Path, + ignored_paths: Option<&HashSet>, +) -> bool { + if is_tracked_path(repo, path) { + return false; + } + if let Some(ignored_paths) = ignored_paths { + 
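+        // Prefer the caller-supplied batch results from `git check-ignore` when available.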
return ignored_paths.contains(path); + } + if let Some(ignored) = check_ignore_with_git(repo, path) { + return ignored; + } + // Fallback when git check-ignore is unavailable. + repo.status_should_ignore(path).unwrap_or(false) || has_ignored_parent_directory(repo, path) +} + +fn build_combined_diff(repo: &Repository, diff: &git2::Diff) -> String { + let diff_entries: Vec<(usize, PathBuf)> = diff + .deltas() + .enumerate() + .filter_map(|(index, delta)| { + delta + .new_file() + .path() + .or_else(|| delta.old_file().path()) + .map(|path| (index, path.to_path_buf())) + }) + .collect(); + let diff_paths: Vec = diff_entries.iter().map(|(_, path)| path.clone()).collect(); + let ignored_paths = collect_ignored_paths_with_git(repo, &diff_paths); + + let mut combined_diff = String::new(); + for (index, path) in diff_entries { + if should_skip_ignored_path_with_cache(repo, &path, ignored_paths.as_ref()) { + continue; + } + let patch = match git2::Patch::from_diff(diff, index) { + Ok(patch) => patch, + Err(_) => continue, + }; + let Some(mut patch) = patch else { + continue; + }; + let content = match diff_patch_to_string(&mut patch) { + Ok(content) => content, + Err(_) => continue, + }; + if content.trim().is_empty() { + continue; + } + if !combined_diff.is_empty() { + combined_diff.push_str("\n\n"); + } + combined_diff.push_str(&format!("=== {} ===\n", path.display())); + combined_diff.push_str(&content); + } + combined_diff +} + +pub(super) fn collect_workspace_diff(repo_root: &Path) -> Result { + let repo = Repository::open(repo_root).map_err(|e| e.to_string())?; + let head_tree = repo.head().ok().and_then(|head| head.peel_to_tree().ok()); + + let mut options = DiffOptions::new(); + let index = repo.index().map_err(|e| e.to_string())?; + let diff = match head_tree.as_ref() { + Some(tree) => repo + .diff_tree_to_index(Some(tree), Some(&index), Some(&mut options)) + .map_err(|e| e.to_string())?, + None => repo + .diff_tree_to_index(None, Some(&index), Some(&mut options)) + .map_err(|e| e.to_string())?, + }; + let combined_diff = build_combined_diff(&repo, &diff); + if !combined_diff.trim().is_empty() { + return Ok(combined_diff); + } + + let mut options = DiffOptions::new(); + options + .include_untracked(true) + .recurse_untracked_dirs(true) + .show_untracked_content(true); + let diff = match head_tree.as_ref() { + Some(tree) => repo + .diff_tree_to_workdir_with_index(Some(tree), Some(&mut options)) + .map_err(|e| e.to_string())?, + None => repo + .diff_tree_to_workdir_with_index(None, Some(&mut options)) + .map_err(|e| e.to_string())?, + }; + Ok(build_combined_diff(&repo, &diff)) +} + +pub(super) async fn get_git_status_inner( + workspaces: &Mutex>, + workspace_id: String, +) -> Result { + let entry = workspace_entry_for_id(workspaces, &workspace_id).await?; + let repo_root = resolve_git_root(&entry)?; + let repo = Repository::open(&repo_root).map_err(|e| e.to_string())?; + + let branch_name = repo + .head() + .ok() + .and_then(|head| head.shorthand().map(|s| s.to_string())) + .unwrap_or_else(|| "unknown".to_string()); + + let mut status_options = StatusOptions::new(); + status_options + .include_untracked(true) + .recurse_untracked_dirs(true) + .renames_head_to_index(true) + .renames_index_to_workdir(true) + .include_ignored(false); + + let statuses = repo + .statuses(Some(&mut status_options)) + .map_err(|e| e.to_string())?; + let status_paths: Vec = statuses + .iter() + .filter_map(|entry| entry.path().map(PathBuf::from)) + .filter(|path| !path.as_os_str().is_empty()) + .collect(); + let 
ignored_paths = collect_ignored_paths_with_git(&repo, &status_paths); + + let head_tree = repo.head().ok().and_then(|head| head.peel_to_tree().ok()); + let index = repo.index().ok(); + + let mut files = Vec::new(); + let mut staged_files = Vec::new(); + let mut unstaged_files = Vec::new(); + let mut total_additions = 0i64; + let mut total_deletions = 0i64; + for entry in statuses.iter() { + let path = entry.path().unwrap_or(""); + if path.is_empty() { + continue; + } + if should_skip_ignored_path_with_cache(&repo, Path::new(path), ignored_paths.as_ref()) { + continue; + } + if let Some(index) = index.as_ref() { + if let Some(entry) = index.get_path(Path::new(path), 0) { + if entry.flags_extended & INDEX_SKIP_WORKTREE_FLAG != 0 { + continue; + } + } + } + let status = entry.status(); + let normalized_path = normalize_git_path(path); + let include_index = status.intersects( + Status::INDEX_NEW + | Status::INDEX_MODIFIED + | Status::INDEX_DELETED + | Status::INDEX_RENAMED + | Status::INDEX_TYPECHANGE, + ); + let include_workdir = status.intersects( + Status::WT_NEW + | Status::WT_MODIFIED + | Status::WT_DELETED + | Status::WT_RENAMED + | Status::WT_TYPECHANGE, + ); + let mut combined_additions = 0i64; + let mut combined_deletions = 0i64; + + if include_index { + let (additions, deletions) = + diff_stats_for_path(&repo, head_tree.as_ref(), path, true, false).unwrap_or((0, 0)); + if let Some(status_str) = status_for_index(status) { + staged_files.push(GitFileStatus { + path: normalized_path.clone(), + status: status_str.to_string(), + additions, + deletions, + }); + } + combined_additions += additions; + combined_deletions += deletions; + total_additions += additions; + total_deletions += deletions; + } + + if include_workdir { + let (additions, deletions) = + diff_stats_for_path(&repo, head_tree.as_ref(), path, false, true).unwrap_or((0, 0)); + if let Some(status_str) = status_for_workdir(status) { + unstaged_files.push(GitFileStatus { + path: normalized_path.clone(), + status: status_str.to_string(), + additions, + deletions, + }); + } + combined_additions += additions; + combined_deletions += deletions; + total_additions += additions; + total_deletions += deletions; + } + + if include_index || include_workdir { + let status_str = status_for_workdir(status) + .or_else(|| status_for_index(status)) + .unwrap_or("--"); + files.push(GitFileStatus { + path: normalized_path, + status: status_str.to_string(), + additions: combined_additions, + deletions: combined_deletions, + }); + } + } + + Ok(json!({ + "branchName": branch_name, + "files": files, + "stagedFiles": staged_files, + "unstagedFiles": unstaged_files, + "totalAdditions": total_additions, + "totalDeletions": total_deletions, + })) +} + +pub(super) async fn get_git_diffs_inner( + workspaces: &Mutex>, + app_settings: &Mutex, + workspace_id: String, +) -> Result, String> { + let entry = workspace_entry_for_id(workspaces, &workspace_id).await?; + let repo_root = resolve_git_root(&entry)?; + let ignore_whitespace_changes = { + let settings = app_settings.lock().await; + settings.git_diff_ignore_whitespace_changes + }; + + tokio::task::spawn_blocking(move || { + let repo = Repository::open(&repo_root).map_err(|e| e.to_string())?; + let head_tree = repo.head().ok().and_then(|head| head.peel_to_tree().ok()); + + let mut options = DiffOptions::new(); + options + .include_untracked(true) + .recurse_untracked_dirs(true) + .show_untracked_content(true); + options.ignore_whitespace_change(ignore_whitespace_changes); + + let diff = match 
head_tree.as_ref() { + Some(tree) => repo + .diff_tree_to_workdir_with_index(Some(tree), Some(&mut options)) + .map_err(|e| e.to_string())?, + None => repo + .diff_tree_to_workdir_with_index(None, Some(&mut options)) + .map_err(|e| e.to_string())?, + }; + let diff_paths: Vec = diff + .deltas() + .filter_map(|delta| delta.new_file().path().or_else(|| delta.old_file().path())) + .map(PathBuf::from) + .collect(); + let ignored_paths = collect_ignored_paths_with_git(&repo, &diff_paths); + + let mut results = Vec::new(); + for (index, delta) in diff.deltas().enumerate() { + let old_path = delta.old_file().path(); + let new_path = delta.new_file().path(); + let display_path = new_path.or(old_path); + let Some(display_path) = display_path else { + continue; + }; + if should_skip_ignored_path_with_cache(&repo, display_path, ignored_paths.as_ref()) { + continue; + } + let old_path_str = old_path.map(|path| path.to_string_lossy()); + let new_path_str = new_path.map(|path| path.to_string_lossy()); + let display_path_str = display_path.to_string_lossy(); + let normalized_path = normalize_git_path(&display_path_str); + let old_image_mime = old_path_str.as_deref().and_then(image_mime_type); + let new_image_mime = new_path_str.as_deref().and_then(image_mime_type); + let is_image = old_image_mime.is_some() || new_image_mime.is_some(); + let is_deleted = delta.status() == git2::Delta::Deleted; + let is_added = delta.status() == git2::Delta::Added; + + let old_lines = if !is_added { + head_tree + .as_ref() + .and_then(|tree| old_path.and_then(|path| tree.get_path(path).ok())) + .and_then(|entry| repo.find_blob(entry.id()).ok()) + .and_then(blob_to_lines) + } else { + None + }; + + let new_lines = if !is_deleted { + match new_path { + Some(path) => { + let full_path = repo_root.join(path); + read_text_lines(&full_path) + } + None => None, + } + } else { + None + }; + + if is_image { + let old_image_data = if !is_added && old_image_mime.is_some() { + head_tree + .as_ref() + .and_then(|tree| old_path.and_then(|path| tree.get_path(path).ok())) + .and_then(|entry| repo.find_blob(entry.id()).ok()) + .and_then(blob_to_base64) + } else { + None + }; + + let new_image_data = if !is_deleted && new_image_mime.is_some() { + match new_path { + Some(path) => { + let full_path = repo_root.join(path); + read_image_base64(&full_path) + } + None => None, + } + } else { + None + }; + + results.push(GitFileDiff { + path: normalized_path, + diff: String::new(), + old_lines: None, + new_lines: None, + is_binary: true, + is_image: true, + old_image_data, + new_image_data, + old_image_mime: old_image_mime.map(str::to_string), + new_image_mime: new_image_mime.map(str::to_string), + }); + continue; + } + + let patch = match git2::Patch::from_diff(&diff, index) { + Ok(patch) => patch, + Err(_) => continue, + }; + let Some(mut patch) = patch else { + continue; + }; + let content = match diff_patch_to_string(&mut patch) { + Ok(content) => content, + Err(_) => continue, + }; + if content.trim().is_empty() { + continue; + } + results.push(GitFileDiff { + path: normalized_path, + diff: content, + old_lines, + new_lines, + is_binary: false, + is_image: false, + old_image_data: None, + new_image_data: None, + old_image_mime: None, + new_image_mime: None, + }); + } + + Ok(results) + }) + .await + .map_err(|e| e.to_string())? 
+} + +pub(super) async fn get_git_commit_diff_inner( + workspaces: &Mutex>, + app_settings: &Mutex, + workspace_id: String, + sha: String, +) -> Result, String> { + let entry = workspace_entry_for_id(workspaces, &workspace_id).await?; + + let ignore_whitespace_changes = { + let settings = app_settings.lock().await; + settings.git_diff_ignore_whitespace_changes + }; + + let repo_root = resolve_git_root(&entry)?; + let repo = Repository::open(&repo_root).map_err(|e| e.to_string())?; + let oid = git2::Oid::from_str(&sha).map_err(|e| e.to_string())?; + let commit = repo.find_commit(oid).map_err(|e| e.to_string())?; + let commit_tree = commit.tree().map_err(|e| e.to_string())?; + let parent_tree = commit.parent(0).ok().and_then(|parent| parent.tree().ok()); + + let mut options = DiffOptions::new(); + options.ignore_whitespace_change(ignore_whitespace_changes); + let diff = repo + .diff_tree_to_tree(parent_tree.as_ref(), Some(&commit_tree), Some(&mut options)) + .map_err(|e| e.to_string())?; + + let mut results = Vec::new(); + for (index, delta) in diff.deltas().enumerate() { + let old_path = delta.old_file().path(); + let new_path = delta.new_file().path(); + let display_path = new_path.or(old_path); + let Some(display_path) = display_path else { + continue; + }; + let old_path_str = old_path.map(|path| path.to_string_lossy()); + let new_path_str = new_path.map(|path| path.to_string_lossy()); + let display_path_str = display_path.to_string_lossy(); + let normalized_path = normalize_git_path(&display_path_str); + let old_image_mime = old_path_str.as_deref().and_then(image_mime_type); + let new_image_mime = new_path_str.as_deref().and_then(image_mime_type); + let is_image = old_image_mime.is_some() || new_image_mime.is_some(); + let is_deleted = delta.status() == git2::Delta::Deleted; + let is_added = delta.status() == git2::Delta::Added; + + let old_lines = if !is_added { + parent_tree + .as_ref() + .and_then(|tree| old_path.and_then(|path| tree.get_path(path).ok())) + .and_then(|entry| repo.find_blob(entry.id()).ok()) + .and_then(blob_to_lines) + } else { + None + }; + + let new_lines = if !is_deleted { + new_path + .and_then(|path| commit_tree.get_path(path).ok()) + .and_then(|entry| repo.find_blob(entry.id()).ok()) + .and_then(blob_to_lines) + } else { + None + }; + + if is_image { + let old_image_data = if !is_added && old_image_mime.is_some() { + parent_tree + .as_ref() + .and_then(|tree| old_path.and_then(|path| tree.get_path(path).ok())) + .and_then(|entry| repo.find_blob(entry.id()).ok()) + .and_then(blob_to_base64) + } else { + None + }; + + let new_image_data = if !is_deleted && new_image_mime.is_some() { + new_path + .and_then(|path| commit_tree.get_path(path).ok()) + .and_then(|entry| repo.find_blob(entry.id()).ok()) + .and_then(blob_to_base64) + } else { + None + }; + + results.push(GitCommitDiff { + path: normalized_path, + status: status_for_delta(delta.status()).to_string(), + diff: String::new(), + old_lines: None, + new_lines: None, + is_binary: true, + is_image: true, + old_image_data, + new_image_data, + old_image_mime: old_image_mime.map(str::to_string), + new_image_mime: new_image_mime.map(str::to_string), + }); + continue; + } + + let patch = match git2::Patch::from_diff(&diff, index) { + Ok(patch) => patch, + Err(_) => continue, + }; + let Some(mut patch) = patch else { + continue; + }; + let content = match diff_patch_to_string(&mut patch) { + Ok(content) => content, + Err(_) => continue, + }; + if content.trim().is_empty() { + continue; + } + results.push(GitCommitDiff 
{
+            path: normalized_path,
+            status: status_for_delta(delta.status()).to_string(),
+            diff: content,
+            old_lines,
+            new_lines,
+            is_binary: false,
+            is_image: false,
+            old_image_data: None,
+            new_image_data: None,
+            old_image_mime: None,
+            new_image_mime: None,
+        });
+    }
+
+    Ok(results)
+}
diff --git a/src-tauri/src/shared/git_ui_core/github.rs b/src-tauri/src/shared/git_ui_core/github.rs
new file mode 100644
index 000000000..c5114a67d
--- /dev/null
+++ b/src-tauri/src/shared/git_ui_core/github.rs
@@ -0,0 +1,329 @@
+use std::collections::HashMap;
+use std::path::Path;
+
+use git2::Repository;
+use tokio::sync::Mutex;
+
+use crate::git_utils::{parse_github_repo, resolve_git_root};
+use crate::shared::process_core::tokio_command;
+use crate::types::{
+    GitHubIssue, GitHubIssuesResponse, GitHubPullRequest, GitHubPullRequestComment,
+    GitHubPullRequestDiff, GitHubPullRequestsResponse, WorkspaceEntry,
+};
+use crate::utils::normalize_git_path;
+
+use super::context::workspace_entry_for_id;
+
+fn github_repo_from_path(path: &Path) -> Result<String, String> {
+    let repo = Repository::open(path).map_err(|e| e.to_string())?;
+    let remotes = repo.remotes().map_err(|e| e.to_string())?;
+    let name = if remotes.iter().any(|remote| remote == Some("origin")) {
+        "origin".to_string()
+    } else {
+        remotes.iter().flatten().next().unwrap_or("").to_string()
+    };
+    if name.is_empty() {
+        return Err("No git remote configured.".to_string());
+    }
+    let remote = repo.find_remote(&name).map_err(|e| e.to_string())?;
+    let remote_url = remote.url().ok_or("Remote has no URL configured.")?;
+    parse_github_repo(remote_url).ok_or("Remote is not a GitHub repository.".to_string())
+}
+
+fn parse_pr_diff(diff: &str) -> Vec<GitHubPullRequestDiff> {
+    let mut entries = Vec::new();
+    let mut current_lines: Vec<&str> = Vec::new();
+    let mut current_old_path: Option<String> = None;
+    let mut current_new_path: Option<String> = None;
+    let mut current_status: Option<String> = None;
+
+    let finalize = |lines: &Vec<&str>,
+                    old_path: &Option<String>,
+                    new_path: &Option<String>,
+                    status: &Option<String>,
+                    results: &mut Vec<GitHubPullRequestDiff>| {
+        if lines.is_empty() {
+            return;
+        }
+        let diff_text = lines.join("\n");
+        if diff_text.trim().is_empty() {
+            return;
+        }
+        let status_value = status.clone().unwrap_or_else(|| "M".to_string());
+        let path = if status_value == "D" {
+            old_path.clone().unwrap_or_default()
+        } else {
+            new_path
+                .clone()
+                .or_else(|| old_path.clone())
+                .unwrap_or_default()
+        };
+        if path.is_empty() {
+            return;
+        }
+        results.push(GitHubPullRequestDiff {
+            path: normalize_git_path(&path),
+            status: status_value,
+            diff: diff_text,
+        });
+    };
+
+    for line in diff.lines() {
+        if line.starts_with("diff --git ") {
+            finalize(
+                &current_lines,
+                &current_old_path,
+                &current_new_path,
+                &current_status,
+                &mut entries,
+            );
+            current_lines = vec![line];
+            current_old_path = None;
+            current_new_path = None;
+            current_status = None;
+
+            let rest = line.trim_start_matches("diff --git ").trim();
+            let mut parts = rest.split_whitespace();
+            let old_part = parts.next().unwrap_or("").trim_start_matches("a/");
+            let new_part = parts.next().unwrap_or("").trim_start_matches("b/");
+            if !old_part.is_empty() {
+                current_old_path = Some(old_part.to_string());
+            }
+            if !new_part.is_empty() {
+                current_new_path = Some(new_part.to_string());
+            }
+            continue;
+        }
+        if line.starts_with("new file mode ") {
+            current_status = Some("A".to_string());
+        } else if line.starts_with("deleted file mode ") {
+            current_status = Some("D".to_string());
+        } else if line.starts_with("rename from ") {
+            current_status = Some("R".to_string());
+            let path = line.trim_start_matches("rename from ").trim();
+            if !path.is_empty() {
+                current_old_path = Some(path.to_string());
+            }
+        } else if line.starts_with("rename to ") {
+            current_status = Some("R".to_string());
+            let path = line.trim_start_matches("rename to ").trim();
+            if !path.is_empty() {
+                current_new_path = Some(path.to_string());
+            }
+        }
+        current_lines.push(line);
+    }
+
+    finalize(
+        &current_lines,
+        &current_old_path,
+        &current_new_path,
+        &current_status,
+        &mut entries,
+    );
+
+    entries
+}
+
+fn command_failure_detail(stdout: &[u8], stderr: &[u8], fallback: &str) -> String {
+    let stderr = String::from_utf8_lossy(stderr);
+    let stdout = String::from_utf8_lossy(stdout);
+    let detail = if stderr.trim().is_empty() {
+        stdout.trim()
+    } else {
+        stderr.trim()
+    };
+    if detail.is_empty() {
+        fallback.to_string()
+    } else {
+        detail.to_string()
+    }
+}
+
+pub(super) async fn get_github_issues_inner(
+    workspaces: &Mutex<HashMap<String, WorkspaceEntry>>,
+    workspace_id: String,
+) -> Result<GitHubIssuesResponse, String> {
+    let entry = workspace_entry_for_id(workspaces, &workspace_id).await?;
+    let repo_root = resolve_git_root(&entry)?;
+    let repo_name = github_repo_from_path(&repo_root)?;
+
+    let output = tokio_command("gh")
+        .args([
+            "issue",
+            "list",
+            "--repo",
+            &repo_name,
+            "--limit",
+            "50",
+            "--json",
+            "number,title,url,updatedAt",
+        ])
+        .current_dir(&repo_root)
+        .output()
+        .await
+        .map_err(|e| format!("Failed to run gh: {e}"))?;
+
+    if !output.status.success() {
+        return Err(command_failure_detail(
+            &output.stdout,
+            &output.stderr,
+            "GitHub CLI command failed.",
+        ));
+    }
+
+    let issues: Vec<GitHubIssue> =
+        serde_json::from_slice(&output.stdout).map_err(|e| e.to_string())?;
+
+    let search_query = format!("repo:{repo_name} is:issue is:open").replace(' ', "+");
+    let total = match tokio_command("gh")
+        .args([
+            "api",
+            &format!("/search/issues?q={search_query}"),
+            "--jq",
+            ".total_count",
+        ])
+        .current_dir(&repo_root)
+        .output()
+        .await
+    {
+        Ok(output) if output.status.success() => String::from_utf8_lossy(&output.stdout)
+            .trim()
+            .parse::<usize>()
+            .unwrap_or(issues.len()),
+        _ => issues.len(),
+    };
+
+    Ok(GitHubIssuesResponse { total, issues })
+}
+
+pub(super) async fn get_github_pull_requests_inner(
+    workspaces: &Mutex<HashMap<String, WorkspaceEntry>>,
+    workspace_id: String,
+) -> Result<GitHubPullRequestsResponse, String> {
+    let entry = workspace_entry_for_id(workspaces, &workspace_id).await?;
+    let repo_root = resolve_git_root(&entry)?;
+    let repo_name = github_repo_from_path(&repo_root)?;
+
+    let output = tokio_command("gh")
+        .args([
+            "pr",
+            "list",
+            "--repo",
+            &repo_name,
+            "--state",
+            "open",
+            "--limit",
+            "50",
+            "--json",
+            "number,title,url,updatedAt,createdAt,body,headRefName,baseRefName,isDraft,author",
+        ])
+        .current_dir(&repo_root)
+        .output()
+        .await
+        .map_err(|e| format!("Failed to run gh: {e}"))?;
+
+    if !output.status.success() {
+        return Err(command_failure_detail(
+            &output.stdout,
+            &output.stderr,
+            "GitHub CLI command failed.",
+        ));
+    }
+
+    let pull_requests: Vec<GitHubPullRequest> =
+        serde_json::from_slice(&output.stdout).map_err(|e| e.to_string())?;
+
+    let search_query = format!("repo:{repo_name} is:pr is:open").replace(' ', "+");
+    let total = match tokio_command("gh")
+        .args([
+            "api",
+            &format!("/search/issues?q={search_query}"),
+            "--jq",
+            ".total_count",
+        ])
+        .current_dir(&repo_root)
+        .output()
+        .await
+    {
+        Ok(output) if output.status.success() => String::from_utf8_lossy(&output.stdout)
+            .trim()
+            .parse::<usize>()
+            .unwrap_or(pull_requests.len()),
+        _ => pull_requests.len(),
+    };
+
+    Ok(GitHubPullRequestsResponse {
+        total,
+        pull_requests,
+    })
+}
+
+pub(super) async fn
get_github_pull_request_diff_inner( + workspaces: &Mutex>, + workspace_id: String, + pr_number: u64, +) -> Result, String> { + let entry = workspace_entry_for_id(workspaces, &workspace_id).await?; + let repo_root = resolve_git_root(&entry)?; + let repo_name = github_repo_from_path(&repo_root)?; + + let output = tokio_command("gh") + .args([ + "pr", + "diff", + &pr_number.to_string(), + "--repo", + &repo_name, + "--color", + "never", + ]) + .current_dir(&repo_root) + .output() + .await + .map_err(|e| format!("Failed to run gh: {e}"))?; + + if !output.status.success() { + return Err(command_failure_detail( + &output.stdout, + &output.stderr, + "GitHub CLI command failed.", + )); + } + + let diff_text = String::from_utf8_lossy(&output.stdout); + Ok(parse_pr_diff(&diff_text)) +} + +pub(super) async fn get_github_pull_request_comments_inner( + workspaces: &Mutex>, + workspace_id: String, + pr_number: u64, +) -> Result, String> { + let entry = workspace_entry_for_id(workspaces, &workspace_id).await?; + let repo_root = resolve_git_root(&entry)?; + let repo_name = github_repo_from_path(&repo_root)?; + + let comments_endpoint = format!("/repos/{repo_name}/issues/{pr_number}/comments?per_page=30"); + let jq_filter = r#"[.[] | {id, body, createdAt: .created_at, url: .html_url, author: (if .user then {login: .user.login} else null end)}]"#; + + let output = tokio_command("gh") + .args(["api", &comments_endpoint, "--jq", jq_filter]) + .current_dir(&repo_root) + .output() + .await + .map_err(|e| format!("Failed to run gh: {e}"))?; + + if !output.status.success() { + return Err(command_failure_detail( + &output.stdout, + &output.stderr, + "GitHub CLI command failed.", + )); + } + + let comments: Vec = + serde_json::from_slice(&output.stdout).map_err(|e| e.to_string())?; + + Ok(comments) +} diff --git a/src-tauri/src/shared/git_ui_core/log.rs b/src-tauri/src/shared/git_ui_core/log.rs new file mode 100644 index 000000000..4d2d48e7c --- /dev/null +++ b/src-tauri/src/shared/git_ui_core/log.rs @@ -0,0 +1,121 @@ +use std::collections::HashMap; + +use git2::{BranchType, Repository, Sort}; +use tokio::sync::Mutex; + +use crate::git_utils::{commit_to_entry, resolve_git_root}; +use crate::types::{GitLogResponse, WorkspaceEntry}; + +use super::context::workspace_entry_for_id; + +pub(super) async fn get_git_log_inner( + workspaces: &Mutex>, + workspace_id: String, + limit: Option, +) -> Result { + let entry = workspace_entry_for_id(workspaces, &workspace_id).await?; + let repo_root = resolve_git_root(&entry)?; + let repo = Repository::open(&repo_root).map_err(|e| e.to_string())?; + let max_items = limit.unwrap_or(40); + let mut revwalk = repo.revwalk().map_err(|e| e.to_string())?; + revwalk.push_head().map_err(|e| e.to_string())?; + revwalk.set_sorting(Sort::TIME).map_err(|e| e.to_string())?; + + let mut total = 0usize; + for oid_result in revwalk { + oid_result.map_err(|e| e.to_string())?; + total += 1; + } + + let mut revwalk = repo.revwalk().map_err(|e| e.to_string())?; + revwalk.push_head().map_err(|e| e.to_string())?; + revwalk.set_sorting(Sort::TIME).map_err(|e| e.to_string())?; + + let mut entries = Vec::new(); + for oid_result in revwalk.take(max_items) { + let oid = oid_result.map_err(|e| e.to_string())?; + let commit = repo.find_commit(oid).map_err(|e| e.to_string())?; + entries.push(commit_to_entry(commit)); + } + + let mut ahead = 0usize; + let mut behind = 0usize; + let mut ahead_entries = Vec::new(); + let mut behind_entries = Vec::new(); + let mut upstream = None; + + if let Ok(head) = repo.head() { + 
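+        // Ahead/behind counts and entries are only computed when HEAD is a local branch with a configured upstream.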
if head.is_branch() { + if let Some(branch_name) = head.shorthand() { + if let Ok(branch) = repo.find_branch(branch_name, BranchType::Local) { + if let Ok(upstream_branch) = branch.upstream() { + let upstream_ref = upstream_branch.get(); + upstream = upstream_ref + .shorthand() + .map(|name| name.to_string()) + .or_else(|| upstream_ref.name().map(|name| name.to_string())); + if let (Some(head_oid), Some(upstream_oid)) = + (head.target(), upstream_ref.target()) + { + let (ahead_count, behind_count) = repo + .graph_ahead_behind(head_oid, upstream_oid) + .map_err(|e| e.to_string())?; + ahead = ahead_count; + behind = behind_count; + + let mut revwalk = repo.revwalk().map_err(|e| e.to_string())?; + revwalk.push(head_oid).map_err(|e| e.to_string())?; + revwalk.hide(upstream_oid).map_err(|e| e.to_string())?; + revwalk.set_sorting(Sort::TIME).map_err(|e| e.to_string())?; + for oid_result in revwalk.take(max_items) { + let oid = oid_result.map_err(|e| e.to_string())?; + let commit = repo.find_commit(oid).map_err(|e| e.to_string())?; + ahead_entries.push(commit_to_entry(commit)); + } + + let mut revwalk = repo.revwalk().map_err(|e| e.to_string())?; + revwalk.push(upstream_oid).map_err(|e| e.to_string())?; + revwalk.hide(head_oid).map_err(|e| e.to_string())?; + revwalk.set_sorting(Sort::TIME).map_err(|e| e.to_string())?; + for oid_result in revwalk.take(max_items) { + let oid = oid_result.map_err(|e| e.to_string())?; + let commit = repo.find_commit(oid).map_err(|e| e.to_string())?; + behind_entries.push(commit_to_entry(commit)); + } + } + } + } + } + } + } + + Ok(GitLogResponse { + total, + entries, + ahead, + behind, + ahead_entries, + behind_entries, + upstream, + }) +} + +pub(super) async fn get_git_remote_inner( + workspaces: &Mutex>, + workspace_id: String, +) -> Result, String> { + let entry = workspace_entry_for_id(workspaces, &workspace_id).await?; + let repo_root = resolve_git_root(&entry)?; + let repo = Repository::open(&repo_root).map_err(|e| e.to_string())?; + let remotes = repo.remotes().map_err(|e| e.to_string())?; + let name = if remotes.iter().any(|remote| remote == Some("origin")) { + "origin".to_string() + } else { + remotes.iter().flatten().next().unwrap_or("").to_string() + }; + if name.is_empty() { + return Ok(None); + } + let remote = repo.find_remote(&name).map_err(|e| e.to_string())?; + Ok(remote.url().map(|url| url.to_string())) +} diff --git a/src-tauri/src/shared/git_ui_core/tests.rs b/src-tauri/src/shared/git_ui_core/tests.rs new file mode 100644 index 000000000..b3f4ceac1 --- /dev/null +++ b/src-tauri/src/shared/git_ui_core/tests.rs @@ -0,0 +1,395 @@ +use std::collections::HashMap; +use std::fs; +use std::path::{Path, PathBuf}; + +use git2::Repository; +use serde_json::Value; +use tokio::runtime::Runtime; +use tokio::sync::Mutex; + +use crate::types::{AppSettings, WorkspaceEntry, WorkspaceKind, WorkspaceSettings}; + +use super::commands; +use super::diff; + +fn create_temp_repo() -> (PathBuf, Repository) { + let root = std::env::temp_dir().join(format!("codex-monitor-test-{}", uuid::Uuid::new_v4())); + fs::create_dir_all(&root).expect("create temp repo root"); + let repo = Repository::init(&root).expect("init repo"); + (root, repo) +} + +#[test] +fn collect_workspace_diff_prefers_staged_changes() { + let (root, repo) = create_temp_repo(); + let file_path = root.join("staged.txt"); + fs::write(&file_path, "staged\n").expect("write staged file"); + let mut index = repo.index().expect("index"); + index.add_path(Path::new("staged.txt")).expect("add path"); + 
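+    // Persist the index so the staged change is visible when the diff reopens the repository.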
index.write().expect("write index"); + + let diff_output = diff::collect_workspace_diff(&root).expect("collect diff"); + assert!(diff_output.contains("staged.txt")); + assert!(diff_output.contains("staged")); +} + +#[test] +fn collect_workspace_diff_falls_back_to_workdir() { + let (root, _repo) = create_temp_repo(); + let file_path = root.join("unstaged.txt"); + fs::write(&file_path, "unstaged\n").expect("write unstaged file"); + + let diff_output = diff::collect_workspace_diff(&root).expect("collect diff"); + assert!(diff_output.contains("unstaged.txt")); + assert!(diff_output.contains("unstaged")); +} + +#[test] +fn action_paths_for_file_expands_renames() { + let (root, repo) = create_temp_repo(); + fs::write(root.join("a.txt"), "hello\n").expect("write file"); + + let mut index = repo.index().expect("repo index"); + index.add_path(Path::new("a.txt")).expect("add path"); + let tree_id = index.write_tree().expect("write tree"); + let tree = repo.find_tree(tree_id).expect("find tree"); + let sig = git2::Signature::now("Test", "test@example.com").expect("signature"); + repo.commit(Some("HEAD"), &sig, &sig, "init", &tree, &[]) + .expect("commit"); + + fs::rename(root.join("a.txt"), root.join("b.txt")).expect("rename file"); + + let mut index = repo.index().expect("repo index"); + index + .remove_path(Path::new("a.txt")) + .expect("remove old path"); + index.add_path(Path::new("b.txt")).expect("add new path"); + index.write().expect("write index"); + + let paths = commands::action_paths_for_file(&root, "b.txt"); + assert_eq!(paths, vec!["a.txt".to_string(), "b.txt".to_string()]); +} + +#[test] +fn get_git_status_omits_global_ignored_paths() { + let (root, repo) = create_temp_repo(); + fs::write(root.join("tracked.txt"), "tracked\n").expect("write tracked file"); + let mut index = repo.index().expect("repo index"); + index.add_path(Path::new("tracked.txt")).expect("add path"); + let tree_id = index.write_tree().expect("write tree"); + let tree = repo.find_tree(tree_id).expect("find tree"); + let sig = git2::Signature::now("Test", "test@example.com").expect("signature"); + repo.commit(Some("HEAD"), &sig, &sig, "init", &tree, &[]) + .expect("commit"); + + let excludes_path = root.join("global-excludes.txt"); + fs::write(&excludes_path, "ignored_root\n").expect("write excludes file"); + let mut config = repo.config().expect("repo config"); + config + .set_str( + "core.excludesfile", + excludes_path.to_string_lossy().as_ref(), + ) + .expect("set core.excludesfile"); + + let ignored_path = root.join("ignored_root/example/foo/bar.txt"); + fs::create_dir_all(ignored_path.parent().expect("parent")).expect("create ignored dir"); + fs::write(&ignored_path, "ignored\n").expect("write ignored file"); + + let workspace = WorkspaceEntry { + id: "w1".to_string(), + name: "w1".to_string(), + path: root.to_string_lossy().to_string(), + codex_bin: None, + kind: WorkspaceKind::Main, + parent_id: None, + worktree: None, + settings: WorkspaceSettings::default(), + }; + let mut entries = HashMap::new(); + entries.insert("w1".to_string(), workspace); + let workspaces = Mutex::new(entries); + + let runtime = Runtime::new().expect("create tokio runtime"); + let status = runtime + .block_on(diff::get_git_status_inner(&workspaces, "w1".to_string())) + .expect("get git status"); + + let has_ignored = status + .get("unstagedFiles") + .and_then(Value::as_array) + .into_iter() + .flatten() + .filter_map(|entry| entry.get("path").and_then(Value::as_str)) + .any(|path| path.starts_with("ignored_root/example/foo/bar")); + 
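+    // Files matched by the global excludes file must not show up as unstaged changes.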
assert!( + !has_ignored, + "ignored files should not appear in unstagedFiles" + ); +} + +#[test] +fn get_git_diffs_omits_global_ignored_paths() { + let (root, repo) = create_temp_repo(); + fs::write(root.join("tracked.txt"), "tracked\n").expect("write tracked file"); + let mut index = repo.index().expect("repo index"); + index.add_path(Path::new("tracked.txt")).expect("add path"); + let tree_id = index.write_tree().expect("write tree"); + let tree = repo.find_tree(tree_id).expect("find tree"); + let sig = git2::Signature::now("Test", "test@example.com").expect("signature"); + repo.commit(Some("HEAD"), &sig, &sig, "init", &tree, &[]) + .expect("commit"); + + let excludes_path = root.join("global-excludes.txt"); + fs::write(&excludes_path, "ignored_root\n").expect("write excludes file"); + let mut config = repo.config().expect("repo config"); + config + .set_str( + "core.excludesfile", + excludes_path.to_string_lossy().as_ref(), + ) + .expect("set core.excludesfile"); + + let ignored_path = root.join("ignored_root/example/foo/bar.txt"); + fs::create_dir_all(ignored_path.parent().expect("parent")).expect("create ignored dir"); + fs::write(&ignored_path, "ignored\n").expect("write ignored file"); + + let workspace = WorkspaceEntry { + id: "w1".to_string(), + name: "w1".to_string(), + path: root.to_string_lossy().to_string(), + codex_bin: None, + kind: WorkspaceKind::Main, + parent_id: None, + worktree: None, + settings: WorkspaceSettings::default(), + }; + let mut entries = HashMap::new(); + entries.insert("w1".to_string(), workspace); + let workspaces = Mutex::new(entries); + let app_settings = Mutex::new(AppSettings::default()); + + let runtime = Runtime::new().expect("create tokio runtime"); + let diffs = runtime + .block_on(diff::get_git_diffs_inner( + &workspaces, + &app_settings, + "w1".to_string(), + )) + .expect("get git diffs"); + + let has_ignored = diffs + .iter() + .any(|diff| diff.path.starts_with("ignored_root/example/foo/bar")); + assert!(!has_ignored, "ignored files should not appear in diff list"); +} + +#[test] +fn check_ignore_with_git_respects_negated_rule_for_specific_file() { + let (root, repo) = create_temp_repo(); + + let excludes_path = root.join("global-excludes.txt"); + fs::write(&excludes_path, "ignored_root/*\n!ignored_root/keep.txt\n") + .expect("write excludes file"); + let mut config = repo.config().expect("repo config"); + config + .set_str( + "core.excludesfile", + excludes_path.to_string_lossy().as_ref(), + ) + .expect("set core.excludesfile"); + + let kept_path = Path::new("ignored_root/keep.txt"); + assert!( + diff::check_ignore_with_git(&repo, kept_path) == Some(false), + "keep.txt should be visible because of negated rule" + ); +} + +#[test] +fn should_skip_ignored_path_respects_negated_rule_for_specific_file() { + let (root, repo) = create_temp_repo(); + + let excludes_path = root.join("global-excludes.txt"); + fs::write(&excludes_path, "ignored_root/*\n!ignored_root/keep.txt\n") + .expect("write excludes file"); + let mut config = repo.config().expect("repo config"); + config + .set_str( + "core.excludesfile", + excludes_path.to_string_lossy().as_ref(), + ) + .expect("set core.excludesfile"); + + assert!( + !diff::should_skip_ignored_path_with_cache(&repo, Path::new("ignored_root/keep.txt"), None), + "keep.txt should not be skipped when unignored by negated rule" + ); +} + +#[test] +fn should_skip_ignored_path_skips_paths_with_ignored_parent() { + let (root, repo) = create_temp_repo(); + + let excludes_path = root.join("global-excludes.txt"); + 
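+    // Ignore the entire `ignored_root` directory through a global excludes file.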
fs::write(&excludes_path, "ignored_root\n").expect("write excludes file"); + let mut config = repo.config().expect("repo config"); + config + .set_str( + "core.excludesfile", + excludes_path.to_string_lossy().as_ref(), + ) + .expect("set core.excludesfile"); + + assert!( + diff::should_skip_ignored_path_with_cache( + &repo, + Path::new("ignored_root/example/foo/bar.txt"), + None, + ), + "nested path should be skipped when parent directory is ignored" + ); +} + +#[test] +fn should_skip_ignored_path_keeps_tracked_file_under_ignored_parent_pattern() { + let (root, repo) = create_temp_repo(); + let tracked_path = root.join("ignored_root/tracked.txt"); + fs::create_dir_all(tracked_path.parent().expect("parent")).expect("create tracked dir"); + fs::write(&tracked_path, "tracked\n").expect("write tracked file"); + let mut index = repo.index().expect("repo index"); + index + .add_path(Path::new("ignored_root/tracked.txt")) + .expect("add tracked path"); + index.write().expect("write index"); + let tree_id = index.write_tree().expect("write tree"); + let tree = repo.find_tree(tree_id).expect("find tree"); + let sig = git2::Signature::now("Test", "test@example.com").expect("signature"); + repo.commit(Some("HEAD"), &sig, &sig, "init", &tree, &[]) + .expect("commit"); + + let excludes_path = root.join("global-excludes.txt"); + fs::write(&excludes_path, "ignored_root/*\n").expect("write excludes file"); + let mut config = repo.config().expect("repo config"); + config + .set_str( + "core.excludesfile", + excludes_path.to_string_lossy().as_ref(), + ) + .expect("set core.excludesfile"); + + assert!( + !diff::should_skip_ignored_path_with_cache( + &repo, + Path::new("ignored_root/tracked.txt"), + None, + ), + "tracked file should not be skipped even if ignore pattern matches its path" + ); +} + +#[test] +fn check_ignore_with_git_treats_tracked_file_as_not_ignored() { + let (root, repo) = create_temp_repo(); + let tracked_path = root.join("ignored_root/tracked.txt"); + fs::create_dir_all(tracked_path.parent().expect("parent")).expect("create tracked dir"); + fs::write(&tracked_path, "tracked\n").expect("write tracked file"); + let mut index = repo.index().expect("repo index"); + index + .add_path(Path::new("ignored_root/tracked.txt")) + .expect("add tracked path"); + index.write().expect("write index"); + let tree_id = index.write_tree().expect("write tree"); + let tree = repo.find_tree(tree_id).expect("find tree"); + let sig = git2::Signature::now("Test", "test@example.com").expect("signature"); + repo.commit(Some("HEAD"), &sig, &sig, "init", &tree, &[]) + .expect("commit"); + + let excludes_path = root.join("global-excludes.txt"); + fs::write(&excludes_path, "ignored_root/*\n").expect("write excludes file"); + let mut config = repo.config().expect("repo config"); + config + .set_str( + "core.excludesfile", + excludes_path.to_string_lossy().as_ref(), + ) + .expect("set core.excludesfile"); + + assert_eq!( + diff::check_ignore_with_git(&repo, Path::new("ignored_root/tracked.txt")), + Some(false), + "git check-ignore should treat tracked files as not ignored" + ); +} + +#[test] +fn should_skip_ignored_path_respects_repo_negation_over_global_ignore() { + let (root, repo) = create_temp_repo(); + + fs::write(root.join(".gitignore"), "!keep.log\n").expect("write repo gitignore"); + let excludes_path = root.join("global-excludes.txt"); + fs::write(&excludes_path, "*.log\n").expect("write excludes file"); + let mut config = repo.config().expect("repo config"); + config + .set_str( + "core.excludesfile", + 
excludes_path.to_string_lossy().as_ref(), + ) + .expect("set core.excludesfile"); + + assert_eq!( + diff::check_ignore_with_git(&repo, Path::new("keep.log")), + Some(false), + "repo negation should override global ignore for keep.log" + ); + assert!( + !diff::should_skip_ignored_path_with_cache(&repo, Path::new("keep.log"), None), + "keep.log should remain visible when repo .gitignore negates global ignore" + ); +} + +#[test] +fn collect_ignored_paths_with_git_checks_multiple_paths_in_one_call() { + let (root, repo) = create_temp_repo(); + let excludes_path = root.join("global-excludes.txt"); + fs::write(&excludes_path, "ignored_root\n").expect("write excludes file"); + let mut config = repo.config().expect("repo config"); + config + .set_str( + "core.excludesfile", + excludes_path.to_string_lossy().as_ref(), + ) + .expect("set core.excludesfile"); + + let ignored_path = PathBuf::from("ignored_root/example/foo/bar.txt"); + let visible_path = PathBuf::from("visible.txt"); + let ignored_paths = + diff::collect_ignored_paths_with_git(&repo, &[ignored_path.clone(), visible_path.clone()]) + .expect("collect ignored paths"); + + assert!(ignored_paths.contains(&ignored_path)); + assert!(!ignored_paths.contains(&visible_path)); +} + +#[test] +fn collect_ignored_paths_with_git_handles_large_ignored_output() { + let (root, repo) = create_temp_repo(); + let excludes_path = root.join("global-excludes.txt"); + fs::write(&excludes_path, "ignored_root\n").expect("write excludes file"); + let mut config = repo.config().expect("repo config"); + config + .set_str( + "core.excludesfile", + excludes_path.to_string_lossy().as_ref(), + ) + .expect("set core.excludesfile"); + + let total = 6000usize; + let paths: Vec = (0..total) + .map(|i| PathBuf::from(format!("ignored_root/deep/path/file-{i}.txt"))) + .collect(); + let ignored_paths = + diff::collect_ignored_paths_with_git(&repo, &paths).expect("collect ignored paths"); + + assert_eq!(ignored_paths.len(), total); +} diff --git a/src-tauri/src/shared/workspaces_core.rs b/src-tauri/src/shared/workspaces_core.rs index 8fefe2037..0b97a6871 100644 --- a/src-tauri/src/shared/workspaces_core.rs +++ b/src-tauri/src/shared/workspaces_core.rs @@ -1,1667 +1,22 @@ -use std::collections::HashMap; -use std::future::Future; -#[cfg(target_os = "windows")] -use std::path::Path; -use std::path::PathBuf; -use std::process::Stdio; -use std::sync::Arc; - -use tokio::io::AsyncWriteExt; -use tokio::sync::Mutex; - -use crate::backend::app_server::WorkspaceSession; -use crate::codex::args::resolve_workspace_codex_args; -use crate::codex::home::resolve_workspace_codex_home; -use crate::git_utils::resolve_git_root; -#[cfg(target_os = "windows")] -use crate::shared::process_core::{build_cmd_c_command, resolve_windows_executable}; -use crate::shared::process_core::{kill_child_process_tree, tokio_command}; -use crate::shared::{git_core, worktree_core}; -use crate::storage::write_workspaces; -use crate::types::{ - AppSettings, WorkspaceEntry, WorkspaceInfo, WorkspaceKind, WorkspaceSettings, WorktreeInfo, - WorktreeSetupStatus, +mod connect; +mod crud_persistence; +mod git_orchestration; +mod helpers; +mod io; +mod worktree; + +pub(crate) use connect::connect_workspace_core; +pub(crate) use crud_persistence::{ + add_clone_core, add_workspace_core, remove_workspace_core, update_workspace_codex_bin_core, + update_workspace_settings_core, +}; +pub(crate) use git_orchestration::{apply_worktree_changes_core, run_git_command_unit}; +pub(crate) use helpers::{is_workspace_path_dir_core, 
list_workspaces_core}; +pub(crate) use io::{ + get_open_app_icon_core, list_workspace_files_core, open_workspace_in_core, + read_workspace_file_core, +}; +pub(crate) use worktree::{ + add_worktree_core, remove_worktree_core, rename_worktree_core, rename_worktree_upstream_core, + worktree_setup_mark_ran_core, worktree_setup_status_core, }; -use uuid::Uuid; - -pub(crate) const WORKTREE_SETUP_MARKERS_DIR: &str = "worktree-setup"; -pub(crate) const WORKTREE_SETUP_MARKER_EXT: &str = "ran"; -const AGENTS_MD_FILE_NAME: &str = "AGENTS.md"; - -fn copy_agents_md_from_parent_to_worktree( - parent_repo_root: &PathBuf, - worktree_root: &PathBuf, -) -> Result<(), String> { - let source_path = parent_repo_root.join(AGENTS_MD_FILE_NAME); - if !source_path.is_file() { - return Ok(()); - } - - let destination_path = worktree_root.join(AGENTS_MD_FILE_NAME); - if destination_path.is_file() { - return Ok(()); - } - - let temp_path = worktree_root.join(format!("{AGENTS_MD_FILE_NAME}.tmp")); - - std::fs::copy(&source_path, &temp_path).map_err(|err| { - format!( - "Failed to copy {} from {} to {}: {err}", - AGENTS_MD_FILE_NAME, - source_path.display(), - temp_path.display() - ) - })?; - - std::fs::rename(&temp_path, &destination_path).map_err(|err| { - let _ = std::fs::remove_file(&temp_path); - format!( - "Failed to finalize {} copy to {}: {err}", - AGENTS_MD_FILE_NAME, - destination_path.display() - ) - })?; - - Ok(()) -} - -pub(crate) fn normalize_setup_script(script: Option) -> Option { - match script { - Some(value) if value.trim().is_empty() => None, - Some(value) => Some(value), - None => None, - } -} - -pub(crate) fn worktree_setup_marker_path(data_dir: &PathBuf, workspace_id: &str) -> PathBuf { - data_dir - .join(WORKTREE_SETUP_MARKERS_DIR) - .join(format!("{workspace_id}.{WORKTREE_SETUP_MARKER_EXT}")) -} - -pub(crate) fn is_workspace_path_dir_core(path: &str) -> bool { - PathBuf::from(path).is_dir() -} - -pub(crate) async fn list_workspaces_core( - workspaces: &Mutex>, - sessions: &Mutex>>, -) -> Vec { - let workspaces = workspaces.lock().await; - let sessions = sessions.lock().await; - let mut result = Vec::new(); - for entry in workspaces.values() { - result.push(WorkspaceInfo { - id: entry.id.clone(), - name: entry.name.clone(), - path: entry.path.clone(), - codex_bin: entry.codex_bin.clone(), - connected: sessions.contains_key(&entry.id), - kind: entry.kind.clone(), - parent_id: entry.parent_id.clone(), - worktree: entry.worktree.clone(), - settings: entry.settings.clone(), - }); - } - sort_workspaces(&mut result); - result -} - -async fn resolve_entry_and_parent( - workspaces: &Mutex>, - workspace_id: &str, -) -> Result<(WorkspaceEntry, Option), String> { - let workspaces = workspaces.lock().await; - let entry = workspaces - .get(workspace_id) - .cloned() - .ok_or_else(|| "workspace not found".to_string())?; - let parent_entry = entry - .parent_id - .as_ref() - .and_then(|parent_id| workspaces.get(parent_id)) - .cloned(); - Ok((entry, parent_entry)) -} - -async fn resolve_workspace_root( - workspaces: &Mutex>, - workspace_id: &str, -) -> Result { - let workspaces = workspaces.lock().await; - let entry = workspaces - .get(workspace_id) - .cloned() - .ok_or_else(|| "workspace not found".to_string())?; - Ok(PathBuf::from(entry.path)) -} - -pub(crate) async fn worktree_setup_status_core( - workspaces: &Mutex>, - workspace_id: &str, - data_dir: &PathBuf, -) -> Result { - let entry = { - let workspaces = workspaces.lock().await; - workspaces - .get(workspace_id) - .cloned() - .ok_or_else(|| "workspace 
not found".to_string())? - }; - - let script = normalize_setup_script(entry.settings.worktree_setup_script.clone()); - let marker_exists = if entry.kind.is_worktree() { - worktree_setup_marker_path(data_dir, &entry.id).exists() - } else { - false - }; - let should_run = entry.kind.is_worktree() && script.is_some() && !marker_exists; - - Ok(WorktreeSetupStatus { should_run, script }) -} - -pub(crate) async fn worktree_setup_mark_ran_core( - workspaces: &Mutex>, - workspace_id: &str, - data_dir: &PathBuf, -) -> Result<(), String> { - let entry = { - let workspaces = workspaces.lock().await; - workspaces - .get(workspace_id) - .cloned() - .ok_or_else(|| "workspace not found".to_string())? - }; - if !entry.kind.is_worktree() { - return Err("Not a worktree workspace.".to_string()); - } - let marker_path = worktree_setup_marker_path(data_dir, &entry.id); - if let Some(parent) = marker_path.parent() { - std::fs::create_dir_all(parent) - .map_err(|err| format!("Failed to prepare worktree marker directory: {err}"))?; - } - let ran_at = std::time::SystemTime::now() - .duration_since(std::time::UNIX_EPOCH) - .map(|duration| duration.as_secs()) - .unwrap_or(0); - std::fs::write(&marker_path, format!("ran_at={ran_at}\n")) - .map_err(|err| format!("Failed to write worktree setup marker: {err}"))?; - Ok(()) -} - -pub(crate) async fn add_workspace_core( - path: String, - codex_bin: Option, - workspaces: &Mutex>, - sessions: &Mutex>>, - app_settings: &Mutex, - storage_path: &PathBuf, - spawn_session: F, -) -> Result -where - F: Fn(WorkspaceEntry, Option, Option, Option) -> Fut, - Fut: Future, String>>, -{ - if !PathBuf::from(&path).is_dir() { - return Err("Workspace path must be a folder.".to_string()); - } - - let name = PathBuf::from(&path) - .file_name() - .and_then(|s| s.to_str()) - .unwrap_or("Workspace") - .to_string(); - let entry = WorkspaceEntry { - id: Uuid::new_v4().to_string(), - name: name.clone(), - path: path.clone(), - codex_bin, - kind: WorkspaceKind::Main, - parent_id: None, - worktree: None, - settings: WorkspaceSettings::default(), - }; - - let (default_bin, codex_args) = { - let settings = app_settings.lock().await; - ( - settings.codex_bin.clone(), - resolve_workspace_codex_args(&entry, None, Some(&settings)), - ) - }; - let codex_home = resolve_workspace_codex_home(&entry, None); - let session = spawn_session(entry.clone(), default_bin, codex_args, codex_home).await?; - - if let Err(error) = { - let mut workspaces = workspaces.lock().await; - workspaces.insert(entry.id.clone(), entry.clone()); - let list: Vec<_> = workspaces.values().cloned().collect(); - write_workspaces(storage_path, &list) - } { - { - let mut workspaces = workspaces.lock().await; - workspaces.remove(&entry.id); - } - let mut child = session.child.lock().await; - kill_child_process_tree(&mut child).await; - return Err(error); - } - - sessions.lock().await.insert(entry.id.clone(), session); - - Ok(WorkspaceInfo { - id: entry.id, - name: entry.name, - path: entry.path, - codex_bin: entry.codex_bin, - connected: true, - kind: entry.kind, - parent_id: entry.parent_id, - worktree: entry.worktree, - settings: entry.settings, - }) -} - -pub(crate) async fn add_clone_core( - source_workspace_id: String, - copy_name: String, - copies_folder: String, - workspaces: &Mutex>, - sessions: &Mutex>>, - app_settings: &Mutex, - storage_path: &PathBuf, - spawn_session: F, -) -> Result -where - F: Fn(WorkspaceEntry, Option, Option, Option) -> Fut, - Fut: Future, String>>, -{ - let copy_name = copy_name.trim().to_string(); - if 
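The worktree setup flow above reduces to a per-workspace marker file: setup should run only for a worktree that has a non-empty script and no marker yet, and marking it as ran just records a timestamp. A condensed sketch of that decision, with the directory and extension taken from `WORKTREE_SETUP_MARKERS_DIR` / `WORKTREE_SETUP_MARKER_EXT` above (error handling trimmed):

```rust
use std::path::{Path, PathBuf};
use std::time::{SystemTime, UNIX_EPOCH};

fn marker_path(data_dir: &Path, workspace_id: &str) -> PathBuf {
    data_dir
        .join("worktree-setup")
        .join(format!("{workspace_id}.ran"))
}

/// Run the setup script only for worktrees that have a script and no marker yet.
fn should_run_setup(is_worktree: bool, script: Option<&str>, data_dir: &Path, id: &str) -> bool {
    let has_script = script.map(str::trim).is_some_and(|s| !s.is_empty());
    is_worktree && has_script && !marker_path(data_dir, id).exists()
}

/// Record that setup ran, so reconnecting to the worktree does not rerun it.
fn mark_setup_ran(data_dir: &Path, id: &str) -> std::io::Result<()> {
    let marker = marker_path(data_dir, id);
    if let Some(parent) = marker.parent() {
        std::fs::create_dir_all(parent)?;
    }
    let ran_at = SystemTime::now()
        .duration_since(UNIX_EPOCH)
        .map(|d| d.as_secs())
        .unwrap_or(0);
    std::fs::write(marker, format!("ran_at={ran_at}\n"))
}
```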
copy_name.is_empty() { - return Err("Copy name is required.".to_string()); - } - - let copies_folder = copies_folder.trim().to_string(); - if copies_folder.is_empty() { - return Err("Copies folder is required.".to_string()); - } - let copies_folder_path = PathBuf::from(&copies_folder); - std::fs::create_dir_all(&copies_folder_path) - .map_err(|e| format!("Failed to create copies folder: {e}"))?; - if !copies_folder_path.is_dir() { - return Err("Copies folder must be a directory.".to_string()); - } - - let (source_entry, inherited_group_id) = { - let workspaces = workspaces.lock().await; - let source_entry = workspaces - .get(&source_workspace_id) - .cloned() - .ok_or_else(|| "source workspace not found".to_string())?; - let inherited_group_id = if source_entry.kind.is_worktree() { - source_entry - .parent_id - .as_ref() - .and_then(|parent_id| workspaces.get(parent_id)) - .and_then(|parent| parent.settings.group_id.clone()) - } else { - source_entry.settings.group_id.clone() - }; - (source_entry, inherited_group_id) - }; - - let destination_path = - worktree_core::build_clone_destination_path(&copies_folder_path, ©_name); - let destination_path_string = destination_path.to_string_lossy().to_string(); - - if let Err(error) = git_core::run_git_command( - &copies_folder_path, - &["clone", &source_entry.path, &destination_path_string], - ) - .await - { - let _ = tokio::fs::remove_dir_all(&destination_path).await; - return Err(error); - } - - if let Some(origin_url) = git_core::git_get_origin_url(&PathBuf::from(&source_entry.path)).await - { - let _ = git_core::run_git_command( - &destination_path, - &["remote", "set-url", "origin", &origin_url], - ) - .await; - } - - let entry = WorkspaceEntry { - id: Uuid::new_v4().to_string(), - name: copy_name, - path: destination_path_string, - codex_bin: source_entry.codex_bin.clone(), - kind: WorkspaceKind::Main, - parent_id: None, - worktree: None, - settings: WorkspaceSettings { - group_id: inherited_group_id, - ..WorkspaceSettings::default() - }, - }; - - let (default_bin, codex_args) = { - let settings = app_settings.lock().await; - ( - settings.codex_bin.clone(), - resolve_workspace_codex_args(&entry, None, Some(&settings)), - ) - }; - let codex_home = resolve_workspace_codex_home(&entry, None); - let session = match spawn_session(entry.clone(), default_bin, codex_args, codex_home).await { - Ok(session) => session, - Err(error) => { - let _ = tokio::fs::remove_dir_all(&destination_path).await; - return Err(error); - } - }; - - if let Err(error) = { - let mut workspaces = workspaces.lock().await; - workspaces.insert(entry.id.clone(), entry.clone()); - let list: Vec<_> = workspaces.values().cloned().collect(); - write_workspaces(storage_path, &list) - } { - { - let mut workspaces = workspaces.lock().await; - workspaces.remove(&entry.id); - } - let mut child = session.child.lock().await; - kill_child_process_tree(&mut child).await; - let _ = tokio::fs::remove_dir_all(&destination_path).await; - return Err(error); - } - - sessions.lock().await.insert(entry.id.clone(), session); - - Ok(WorkspaceInfo { - id: entry.id, - name: entry.name, - path: entry.path, - codex_bin: entry.codex_bin, - connected: true, - kind: entry.kind, - parent_id: entry.parent_id, - worktree: entry.worktree, - settings: entry.settings, - }) -} - -pub(crate) async fn apply_worktree_changes_core( - workspaces: &Mutex>, - workspace_id: String, -) -> Result<(), String> { - let (entry, parent) = { - let workspaces = workspaces.lock().await; - let entry = workspaces - .get(&workspace_id) 
- .cloned() - .ok_or_else(|| "workspace not found".to_string())?; - if !entry.kind.is_worktree() { - return Err("Not a worktree workspace.".to_string()); - } - let parent_id = entry - .parent_id - .clone() - .ok_or_else(|| "worktree parent not found".to_string())?; - let parent = workspaces - .get(&parent_id) - .cloned() - .ok_or_else(|| "worktree parent not found".to_string())?; - (entry, parent) - }; - - apply_worktree_changes_inner_core(&entry, &parent).await -} - -async fn apply_worktree_changes_inner_core( - entry: &WorkspaceEntry, - parent: &WorkspaceEntry, -) -> Result<(), String> { - let worktree_root = resolve_git_root(entry)?; - let parent_root = resolve_git_root(parent)?; - - let parent_status = - git_core::run_git_command_bytes(&parent_root, &["status", "--porcelain"]).await?; - if !String::from_utf8_lossy(&parent_status).trim().is_empty() { - return Err( - "Your current branch has uncommitted changes. Please commit, stash, or discard them before applying worktree changes." - .to_string(), - ); - } - - let mut patch: Vec = Vec::new(); - let staged_patch = git_core::run_git_diff( - &worktree_root, - &["diff", "--binary", "--no-color", "--cached"], - ) - .await?; - patch.extend_from_slice(&staged_patch); - let unstaged_patch = - git_core::run_git_diff(&worktree_root, &["diff", "--binary", "--no-color"]).await?; - patch.extend_from_slice(&unstaged_patch); - - let untracked_output = git_core::run_git_command_bytes( - &worktree_root, - &["ls-files", "--others", "--exclude-standard", "-z"], - ) - .await?; - for raw_path in untracked_output.split(|byte| *byte == 0) { - if raw_path.is_empty() { - continue; - } - let path = String::from_utf8_lossy(raw_path).to_string(); - let diff = git_core::run_git_diff( - &worktree_root, - &[ - "diff", - "--binary", - "--no-color", - "--no-index", - "--", - worktree_core::null_device_path(), - &path, - ], - ) - .await?; - patch.extend_from_slice(&diff); - } - - if String::from_utf8_lossy(&patch).trim().is_empty() { - return Err("No changes to apply.".to_string()); - } - - let git_bin = - crate::utils::resolve_git_binary().map_err(|e| format!("Failed to run git: {e}"))?; - let mut child = tokio_command(git_bin) - .args(["apply", "--3way", "--whitespace=nowarn", "-"]) - .current_dir(&parent_root) - .env("PATH", crate::utils::git_env_path()) - .stdin(Stdio::piped()) - .stdout(Stdio::piped()) - .stderr(Stdio::piped()) - .spawn() - .map_err(|e| format!("Failed to run git: {e}"))?; - - if let Some(mut stdin) = child.stdin.take() { - stdin - .write_all(&patch) - .await - .map_err(|e| format!("Failed to write git apply input: {e}"))?; - } - - let output = child - .wait_with_output() - .await - .map_err(|e| format!("Failed to run git: {e}"))?; - - if output.status.success() { - return Ok(()); - } - - let stderr = String::from_utf8_lossy(&output.stderr); - let stdout = String::from_utf8_lossy(&output.stdout); - let detail = if stderr.trim().is_empty() { - stdout.trim() - } else { - stderr.trim() - }; - if detail.is_empty() { - return Err("Git apply failed.".to_string()); - } - - if detail.contains("Applied patch to") { - if detail.contains("with conflicts") { - return Err( - "Applied with conflicts. Resolve conflicts in the parent repo before retrying." - .to_string(), - ); - } - return Err( - "Patch applied partially. Resolve changes in the parent repo before retrying." 
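`apply_worktree_changes_inner_core` above builds one combined binary patch (staged, unstaged, and untracked files diffed against the null device) and streams it into `git apply --3way` in the parent checkout. A trimmed sketch of just the apply step with tokio, assuming the patch bytes are already assembled (the real code also resolves the git binary, sets `PATH`, and classifies conflict output):

```rust
use std::path::Path;
use std::process::Stdio;
use tokio::io::AsyncWriteExt;
use tokio::process::Command; // requires tokio's "process" and "io-util" features

/// Stream an already-built patch into `git apply --3way` inside `parent_root`.
async fn apply_patch(parent_root: &Path, patch: &[u8]) -> Result<(), String> {
    let mut child = Command::new("git")
        .args(["apply", "--3way", "--whitespace=nowarn", "-"])
        .current_dir(parent_root)
        .stdin(Stdio::piped())
        .stdout(Stdio::piped())
        .stderr(Stdio::piped())
        .spawn()
        .map_err(|e| format!("Failed to run git: {e}"))?;

    if let Some(mut stdin) = child.stdin.take() {
        stdin
            .write_all(patch)
            .await
            .map_err(|e| format!("Failed to write git apply input: {e}"))?;
        // Dropping stdin closes the pipe so git sees EOF.
    }

    let output = child
        .wait_with_output()
        .await
        .map_err(|e| format!("Failed to run git: {e}"))?;
    if output.status.success() {
        Ok(())
    } else {
        Err(String::from_utf8_lossy(&output.stderr).trim().to_string())
    }
}
```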
- .to_string(), - ); - } - - Err(detail.to_string()) -} - -pub(crate) async fn open_workspace_in_core( - path: String, - app: Option, - args: Vec, - command: Option, -) -> Result<(), String> { - fn output_snippet(bytes: &[u8]) -> Option { - const MAX_CHARS: usize = 240; - let text = String::from_utf8_lossy(bytes).trim().replace('\n', "\\n"); - if text.is_empty() { - return None; - } - let mut chars = text.chars(); - let snippet: String = chars.by_ref().take(MAX_CHARS).collect(); - if chars.next().is_some() { - Some(format!("{snippet}...")) - } else { - Some(snippet) - } - } - - let target_label = command - .as_ref() - .map(|value| format!("command `{value}`")) - .or_else(|| app.as_ref().map(|value| format!("app `{value}`"))) - .unwrap_or_else(|| "target".to_string()); - - let output = if let Some(command) = command { - let trimmed = command.trim(); - if trimmed.is_empty() { - return Err("Missing app or command".to_string()); - } - - #[cfg(target_os = "windows")] - let mut cmd = { - let resolved = resolve_windows_executable(trimmed, None); - let resolved_path = resolved.as_deref().unwrap_or_else(|| Path::new(trimmed)); - let ext = resolved_path - .extension() - .and_then(|ext| ext.to_str()) - .map(|ext| ext.to_ascii_lowercase()); - - if matches!(ext.as_deref(), Some("cmd") | Some("bat")) { - let mut cmd = tokio_command("cmd"); - let mut command_args = args.clone(); - command_args.push(path.clone()); - let command_line = build_cmd_c_command(resolved_path, &command_args)?; - cmd.arg("/D"); - cmd.arg("/S"); - cmd.arg("/C"); - cmd.raw_arg(command_line); - cmd - } else { - let mut cmd = tokio_command(resolved_path); - cmd.args(&args).arg(&path); - cmd - } - }; - - #[cfg(not(target_os = "windows"))] - let mut cmd = { - let mut cmd = tokio_command(trimmed); - cmd.args(&args).arg(&path); - cmd - }; - - cmd.output() - .await - .map_err(|error| format!("Failed to open app ({target_label}): {error}"))? - } else if let Some(app) = app { - let trimmed = app.trim(); - if trimmed.is_empty() { - return Err("Missing app or command".to_string()); - } - - #[cfg(target_os = "macos")] - let mut cmd = { - let mut cmd = tokio_command("open"); - cmd.arg("-a").arg(trimmed).arg(&path); - if !args.is_empty() { - cmd.arg("--args").args(&args); - } - cmd - }; - - #[cfg(not(target_os = "macos"))] - let mut cmd = { - let mut cmd = tokio_command(trimmed); - cmd.args(&args).arg(&path); - cmd - }; - - cmd.output() - .await - .map_err(|error| format!("Failed to open app ({target_label}): {error}"))? - } else { - return Err("Missing app or command".to_string()); - }; - - if output.status.success() { - return Ok(()); - } - - let exit_detail = output - .status - .code() - .map(|code| format!("exit code {code}")) - .unwrap_or_else(|| "terminated by signal".to_string()); - let mut details = Vec::new(); - if let Some(stderr) = output_snippet(&output.stderr) { - details.push(format!("stderr: {stderr}")); - } - if let Some(stdout) = output_snippet(&output.stdout) { - details.push(format!("stdout: {stdout}")); - } - - if details.is_empty() { - Err(format!( - "Failed to open app ({target_label} returned {exit_detail})." 
- )) - } else { - Err(format!( - "Failed to open app ({target_label} returned {exit_detail}; {}).", - details.join("; ") - )) - } -} - -#[cfg(target_os = "macos")] -pub(crate) async fn get_open_app_icon_core( - app_name: String, - icon_loader: F, -) -> Result, String> -where - F: Fn(&str) -> Option + Send + Sync + 'static, -{ - let trimmed = app_name.trim().to_string(); - if trimmed.is_empty() { - return Ok(None); - } - let icon_loader = Arc::new(icon_loader); - tokio::task::spawn_blocking(move || icon_loader(&trimmed)) - .await - .map_err(|err| err.to_string()) -} - -#[cfg(not(target_os = "macos"))] -pub(crate) async fn get_open_app_icon_core( - app_name: String, - icon_loader: F, -) -> Result, String> -where - F: Fn(&str) -> Option + Send + Sync + 'static, -{ - let _ = app_name; - let _ = icon_loader; - Ok(None) -} - -pub(crate) fn run_git_command_unit( - repo_path: &PathBuf, - args: &[&str], - run_git_command: F, -) -> impl Future> -where - F: Fn(PathBuf, Vec) -> Fut, - Fut: Future>, -{ - // Own the inputs so the returned future does not borrow temporary references. - let repo_path = repo_path.clone(); - let args_owned = args - .iter() - .map(|value| value.to_string()) - .collect::>(); - async move { - run_git_command(repo_path, args_owned) - .await - .map(|_output| ()) - } -} - -pub(crate) async fn add_worktree_core< - FSpawn, - FutSpawn, - FSanitize, - FUniquePath, - FBranchExists, - FutBranchExists, - FFindRemoteTracking, - FutFindRemoteTracking, - FRunGit, - FutRunGit, ->( - parent_id: String, - branch: String, - name: Option, - copy_agents_md: bool, - data_dir: &PathBuf, - workspaces: &Mutex>, - sessions: &Mutex>>, - app_settings: &Mutex, - storage_path: &PathBuf, - sanitize_worktree_name: FSanitize, - unique_worktree_path: FUniquePath, - git_branch_exists: FBranchExists, - git_find_remote_tracking_branch: Option, - run_git_command: FRunGit, - spawn_session: FSpawn, -) -> Result -where - FSpawn: Fn(WorkspaceEntry, Option, Option, Option) -> FutSpawn, - FutSpawn: Future, String>>, - FSanitize: Fn(&str) -> String, - FUniquePath: Fn(&PathBuf, &str) -> Result, - FBranchExists: Fn(&PathBuf, &str) -> FutBranchExists, - FutBranchExists: Future>, - FFindRemoteTracking: Fn(&PathBuf, &str) -> FutFindRemoteTracking, - FutFindRemoteTracking: Future, String>>, - FRunGit: Fn(&PathBuf, &[&str]) -> FutRunGit, - FutRunGit: Future>, -{ - let branch = branch.trim().to_string(); - if branch.is_empty() { - return Err("Branch name is required.".to_string()); - } - let name = name - .map(|value| value.trim().to_string()) - .filter(|value| !value.is_empty()); - - let parent_entry = { - let workspaces = workspaces.lock().await; - workspaces - .get(&parent_id) - .cloned() - .ok_or_else(|| "parent workspace not found".to_string())? 
- }; - - if parent_entry.kind.is_worktree() { - return Err("Cannot create a worktree from another worktree.".to_string()); - } - - let worktree_root = data_dir.join("worktrees").join(&parent_entry.id); - std::fs::create_dir_all(&worktree_root) - .map_err(|err| format!("Failed to create worktree directory: {err}"))?; - - let safe_name = sanitize_worktree_name(&branch); - let worktree_path = unique_worktree_path(&worktree_root, &safe_name)?; - let worktree_path_string = worktree_path.to_string_lossy().to_string(); - - let repo_path = PathBuf::from(&parent_entry.path); - let branch_exists = git_branch_exists(&repo_path, &branch).await?; - if branch_exists { - run_git_command( - &repo_path, - &["worktree", "add", &worktree_path_string, &branch], - ) - .await?; - } else if let Some(find_remote_tracking) = git_find_remote_tracking_branch { - if let Some(remote_ref) = find_remote_tracking(&repo_path, &branch).await? { - run_git_command( - &repo_path, - &[ - "worktree", - "add", - "-b", - &branch, - &worktree_path_string, - &remote_ref, - ], - ) - .await?; - } else { - run_git_command( - &repo_path, - &["worktree", "add", "-b", &branch, &worktree_path_string], - ) - .await?; - } - } else { - run_git_command( - &repo_path, - &["worktree", "add", "-b", &branch, &worktree_path_string], - ) - .await?; - } - - if copy_agents_md { - if let Err(error) = copy_agents_md_from_parent_to_worktree(&repo_path, &worktree_path) { - eprintln!( - "add_worktree: optional {} copy failed for {}: {}", - AGENTS_MD_FILE_NAME, - worktree_path.display(), - error - ); - } - } - - let entry = WorkspaceEntry { - id: Uuid::new_v4().to_string(), - name: name.clone().unwrap_or_else(|| branch.clone()), - path: worktree_path_string, - codex_bin: parent_entry.codex_bin.clone(), - kind: WorkspaceKind::Worktree, - parent_id: Some(parent_entry.id.clone()), - worktree: Some(WorktreeInfo { branch }), - settings: WorkspaceSettings { - worktree_setup_script: normalize_setup_script( - parent_entry.settings.worktree_setup_script.clone(), - ), - ..WorkspaceSettings::default() - }, - }; - - let (default_bin, codex_args) = { - let settings = app_settings.lock().await; - ( - settings.codex_bin.clone(), - resolve_workspace_codex_args(&entry, Some(&parent_entry), Some(&settings)), - ) - }; - let codex_home = resolve_workspace_codex_home(&entry, Some(&parent_entry)); - let session = spawn_session(entry.clone(), default_bin, codex_args, codex_home).await?; - - { - let mut workspaces = workspaces.lock().await; - workspaces.insert(entry.id.clone(), entry.clone()); - let list: Vec<_> = workspaces.values().cloned().collect(); - write_workspaces(storage_path, &list)?; - } - - sessions.lock().await.insert(entry.id.clone(), session); - - Ok(WorkspaceInfo { - id: entry.id, - name: entry.name, - path: entry.path, - codex_bin: entry.codex_bin, - connected: true, - kind: entry.kind, - parent_id: entry.parent_id, - worktree: entry.worktree, - settings: entry.settings, - }) -} - -pub(crate) async fn connect_workspace_core( - workspace_id: String, - workspaces: &Mutex>, - sessions: &Mutex>>, - app_settings: &Mutex, - spawn_session: F, -) -> Result<(), String> -where - F: Fn(WorkspaceEntry, Option, Option, Option) -> Fut, - Fut: Future, String>>, -{ - let (entry, parent_entry) = resolve_entry_and_parent(workspaces, &workspace_id).await?; - let (default_bin, codex_args) = { - let settings = app_settings.lock().await; - ( - settings.codex_bin.clone(), - resolve_workspace_codex_args(&entry, parent_entry.as_ref(), Some(&settings)), - ) - }; - let codex_home = 
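`add_worktree_core` above receives its git probes and the session spawner as generic closures, so the app and daemon adapters inject their own process handling and the shared core stays runtime-agnostic. A stripped-down sketch of that shape with hypothetical names (the real signature carries many more parameters and returns a `WorkspaceInfo`):

```rust
use std::future::Future;
use std::path::{Path, PathBuf};

/// Shared-core logic stays generic over how git is actually invoked.
async fn create_branch_checkout<FExists, FutExists, FRun, FutRun>(
    repo: &Path,
    branch: &str,
    target: &Path,
    branch_exists: FExists,
    run_git: FRun,
) -> Result<(), String>
where
    FExists: Fn(&Path, &str) -> FutExists,
    FutExists: Future<Output = Result<bool, String>>,
    FRun: Fn(&Path, Vec<String>) -> FutRun,
    FutRun: Future<Output = Result<(), String>>,
{
    let target: String = target.to_string_lossy().to_string();
    let args = if branch_exists(repo, branch).await? {
        // Existing branch: attach a worktree to it.
        vec!["worktree".into(), "add".into(), target, branch.into()]
    } else {
        // New branch: create it together with the worktree.
        vec![
            "worktree".into(),
            "add".into(),
            "-b".into(),
            branch.into(),
            target,
        ]
    };
    run_git(repo, args).await
}
```

The adapters then pass closures that call their own git helpers, while the branching logic lives only in the shared core.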
resolve_workspace_codex_home(&entry, parent_entry.as_ref()); - let session = spawn_session(entry.clone(), default_bin, codex_args, codex_home).await?; - sessions.lock().await.insert(entry.id, session); - Ok(()) -} - -async fn kill_session_by_id(sessions: &Mutex>>, id: &str) { - if let Some(session) = sessions.lock().await.remove(id) { - let mut child = session.child.lock().await; - kill_child_process_tree(&mut child).await; - } -} - -pub(crate) async fn remove_workspace_core( - id: String, - workspaces: &Mutex>, - sessions: &Mutex>>, - storage_path: &PathBuf, - run_git_command: FRunGit, - is_missing_worktree_error: FIsMissing, - remove_dir_all: FRemoveDirAll, - require_all_children_removed_to_remove_parent: bool, - continue_on_child_error: bool, -) -> Result<(), String> -where - FRunGit: Fn(&PathBuf, &[&str]) -> FutRunGit, - FutRunGit: Future>, - FIsMissing: Fn(&str) -> bool, - FRemoveDirAll: Fn(&PathBuf) -> Result<(), String>, -{ - let (entry, child_worktrees) = { - let workspaces = workspaces.lock().await; - let entry = workspaces - .get(&id) - .cloned() - .ok_or_else(|| "workspace not found".to_string())?; - if entry.kind.is_worktree() { - return Err("Use remove_worktree for worktree agents.".to_string()); - } - let children = workspaces - .values() - .filter(|workspace| workspace.parent_id.as_deref() == Some(&id)) - .cloned() - .collect::>(); - (entry, children) - }; - - let repo_path = PathBuf::from(&entry.path); - let mut removed_child_ids = Vec::new(); - let mut failures: Vec<(String, String)> = Vec::new(); - - for child in &child_worktrees { - kill_session_by_id(sessions, &child.id).await; - - let child_path = PathBuf::from(&child.path); - if child_path.exists() { - if let Err(error) = - run_git_command(&repo_path, &["worktree", "remove", "--force", &child.path]).await - { - if is_missing_worktree_error(&error) { - if child_path.exists() { - if let Err(fs_error) = remove_dir_all(&child_path) { - if continue_on_child_error { - failures.push((child.id.clone(), fs_error)); - continue; - } - return Err(fs_error); - } - } - } else { - if continue_on_child_error { - failures.push((child.id.clone(), error)); - continue; - } - return Err(error); - } - } - } - removed_child_ids.push(child.id.clone()); - } - - let _ = run_git_command(&repo_path, &["worktree", "prune", "--expire", "now"]).await; - - let mut ids_to_remove = removed_child_ids; - if failures.is_empty() || !require_all_children_removed_to_remove_parent { - kill_session_by_id(sessions, &id).await; - ids_to_remove.push(id.clone()); - } - - { - let mut workspaces = workspaces.lock().await; - for workspace_id in ids_to_remove { - workspaces.remove(&workspace_id); - } - let list: Vec<_> = workspaces.values().cloned().collect(); - write_workspaces(storage_path, &list)?; - } - - if failures.is_empty() { - return Ok(()); - } - - if require_all_children_removed_to_remove_parent { - let mut message = - "Failed to remove one or more worktrees; parent workspace was not removed.".to_string(); - for (child_id, error) in failures { - message.push_str(&format!("\n- {child_id}: {error}")); - } - return Err(message); - } - - Ok(()) -} - -pub(crate) async fn remove_worktree_core( - id: String, - workspaces: &Mutex>, - sessions: &Mutex>>, - storage_path: &PathBuf, - run_git_command: FRunGit, - is_missing_worktree_error: FIsMissing, - remove_dir_all: FRemoveDirAll, -) -> Result<(), String> -where - FRunGit: Fn(&PathBuf, &[&str]) -> FutRunGit, - FutRunGit: Future>, - FIsMissing: Fn(&str) -> bool, - FRemoveDirAll: Fn(&PathBuf) -> Result<(), String>, -{ 
- let (entry, parent) = { - let workspaces = workspaces.lock().await; - let entry = workspaces - .get(&id) - .cloned() - .ok_or_else(|| "workspace not found".to_string())?; - if !entry.kind.is_worktree() { - return Err("Not a worktree workspace.".to_string()); - } - let parent_id = entry - .parent_id - .clone() - .ok_or_else(|| "worktree parent not found".to_string())?; - let parent = workspaces - .get(&parent_id) - .cloned() - .ok_or_else(|| "worktree parent not found".to_string())?; - (entry, parent) - }; - - let parent_path = PathBuf::from(&parent.path); - let entry_path = PathBuf::from(&entry.path); - kill_session_by_id(sessions, &entry.id).await; - - if entry_path.exists() { - if let Err(error) = run_git_command( - &parent_path, - &["worktree", "remove", "--force", &entry.path], - ) - .await - { - if is_missing_worktree_error(&error) { - if entry_path.exists() { - remove_dir_all(&entry_path)?; - } - } else { - return Err(error); - } - } - } - let _ = run_git_command(&parent_path, &["worktree", "prune", "--expire", "now"]).await; - - { - let mut workspaces = workspaces.lock().await; - workspaces.remove(&entry.id); - let list: Vec<_> = workspaces.values().cloned().collect(); - write_workspaces(storage_path, &list)?; - } - - Ok(()) -} - -pub(crate) async fn rename_worktree_core< - FSpawn, - FutSpawn, - FResolveGitRoot, - FUniqueBranch, - FutUniqueBranch, - FSanitize, - FUniqueRenamePath, - FRunGit, - FutRunGit, ->( - id: String, - branch: String, - data_dir: &PathBuf, - workspaces: &Mutex>, - sessions: &Mutex>>, - app_settings: &Mutex, - storage_path: &PathBuf, - resolve_git_root: FResolveGitRoot, - unique_branch_name: FUniqueBranch, - sanitize_worktree_name: FSanitize, - unique_worktree_path_for_rename: FUniqueRenamePath, - run_git_command: FRunGit, - spawn_session: FSpawn, -) -> Result -where - FSpawn: Fn(WorkspaceEntry, Option, Option, Option) -> FutSpawn, - FutSpawn: Future, String>>, - FResolveGitRoot: Fn(&WorkspaceEntry) -> Result, - FUniqueBranch: Fn(&PathBuf, &str) -> FutUniqueBranch, - FutUniqueBranch: Future>, - FSanitize: Fn(&str) -> String, - FUniqueRenamePath: Fn(&PathBuf, &str, &PathBuf) -> Result, - FRunGit: Fn(&PathBuf, &[&str]) -> FutRunGit, - FutRunGit: Future>, -{ - let trimmed = branch.trim(); - if trimmed.is_empty() { - return Err("Branch name is required.".to_string()); - } - - let (entry, parent) = { - let workspaces = workspaces.lock().await; - let entry = workspaces - .get(&id) - .cloned() - .ok_or_else(|| "workspace not found".to_string())?; - if !entry.kind.is_worktree() { - return Err("Not a worktree workspace.".to_string()); - } - let parent_id = entry - .parent_id - .clone() - .ok_or_else(|| "worktree parent not found".to_string())?; - let parent = workspaces - .get(&parent_id) - .cloned() - .ok_or_else(|| "worktree parent not found".to_string())?; - (entry, parent) - }; - - let old_branch = entry - .worktree - .as_ref() - .map(|worktree| worktree.branch.clone()) - .ok_or_else(|| "worktree metadata missing".to_string())?; - if old_branch == trimmed { - return Err("Branch name is unchanged.".to_string()); - } - - let parent_root = resolve_git_root(&parent)?; - let final_branch = unique_branch_name(&parent_root, trimmed).await?; - if final_branch == old_branch { - return Err("Branch name is unchanged.".to_string()); - } - - run_git_command(&parent_root, &["branch", "-m", &old_branch, &final_branch]).await?; - - let worktree_root = data_dir.join("worktrees").join(&parent.id); - std::fs::create_dir_all(&worktree_root) - .map_err(|err| format!("Failed to create 
worktree directory: {err}"))?; - - let safe_name = sanitize_worktree_name(&final_branch); - let current_path = PathBuf::from(&entry.path); - let next_path = unique_worktree_path_for_rename(&worktree_root, &safe_name, ¤t_path)?; - let next_path_string = next_path.to_string_lossy().to_string(); - if next_path_string != entry.path { - if let Err(error) = run_git_command( - &parent_root, - &["worktree", "move", &entry.path, &next_path_string], - ) - .await - { - let _ = - run_git_command(&parent_root, &["branch", "-m", &final_branch, &old_branch]).await; - return Err(error); - } - } - - let (entry_snapshot, list) = { - let mut workspaces = workspaces.lock().await; - let entry = match workspaces.get_mut(&id) { - Some(entry) => entry, - None => return Err("workspace not found".to_string()), - }; - if entry.name.trim() == old_branch { - entry.name = final_branch.clone(); - } - entry.path = next_path_string.clone(); - match entry.worktree.as_mut() { - Some(worktree) => { - worktree.branch = final_branch.clone(); - } - None => { - entry.worktree = Some(WorktreeInfo { - branch: final_branch.clone(), - }); - } - } - let snapshot = entry.clone(); - let list: Vec<_> = workspaces.values().cloned().collect(); - (snapshot, list) - }; - write_workspaces(storage_path, &list)?; - - let was_connected = sessions.lock().await.contains_key(&entry_snapshot.id); - if was_connected { - kill_session_by_id(sessions, &entry_snapshot.id).await; - let (default_bin, codex_args) = { - let settings = app_settings.lock().await; - ( - settings.codex_bin.clone(), - resolve_workspace_codex_args(&entry_snapshot, Some(&parent), Some(&settings)), - ) - }; - let codex_home = resolve_workspace_codex_home(&entry_snapshot, Some(&parent)); - match spawn_session(entry_snapshot.clone(), default_bin, codex_args, codex_home).await { - Ok(session) => { - sessions - .lock() - .await - .insert(entry_snapshot.id.clone(), session); - } - Err(error) => { - eprintln!( - "rename_worktree: respawn failed for {} after rename: {error}", - entry_snapshot.id - ); - } - } - } - - let connected = sessions.lock().await.contains_key(&entry_snapshot.id); - Ok(WorkspaceInfo { - id: entry_snapshot.id, - name: entry_snapshot.name, - path: entry_snapshot.path, - codex_bin: entry_snapshot.codex_bin, - connected, - kind: entry_snapshot.kind, - parent_id: entry_snapshot.parent_id, - worktree: entry_snapshot.worktree, - settings: entry_snapshot.settings, - }) -} - -pub(crate) async fn rename_worktree_upstream_core< - FResolveGitRoot, - FBranchExists, - FutBranchExists, - FFindRemote, - FutFindRemote, - FRemoteExists, - FutRemoteExists, - FRemoteBranchExists, - FutRemoteBranchExists, - FRunGit, - FutRunGit, ->( - id: String, - old_branch: String, - new_branch: String, - workspaces: &Mutex>, - resolve_git_root: FResolveGitRoot, - git_branch_exists: FBranchExists, - git_find_remote_for_branch: FFindRemote, - git_remote_exists: FRemoteExists, - git_remote_branch_exists: FRemoteBranchExists, - run_git_command: FRunGit, -) -> Result<(), String> -where - FResolveGitRoot: Fn(&WorkspaceEntry) -> Result, - FBranchExists: Fn(&PathBuf, &str) -> FutBranchExists, - FutBranchExists: Future>, - FFindRemote: Fn(&PathBuf, &str) -> FutFindRemote, - FutFindRemote: Future, String>>, - FRemoteExists: Fn(&PathBuf, &str) -> FutRemoteExists, - FutRemoteExists: Future>, - FRemoteBranchExists: Fn(&PathBuf, &str, &str) -> FutRemoteBranchExists, - FutRemoteBranchExists: Future>, - FRunGit: Fn(&PathBuf, &[&str]) -> FutRunGit, - FutRunGit: Future>, -{ - let old_branch = 
old_branch.trim().to_string(); - let new_branch = new_branch.trim().to_string(); - if old_branch.is_empty() || new_branch.is_empty() { - return Err("Branch name is required.".to_string()); - } - if old_branch == new_branch { - return Err("Branch name is unchanged.".to_string()); - } - - let (_entry, parent) = { - let workspaces = workspaces.lock().await; - let entry = workspaces - .get(&id) - .cloned() - .ok_or_else(|| "workspace not found".to_string())?; - if !entry.kind.is_worktree() { - return Err("Not a worktree workspace.".to_string()); - } - let parent_id = entry - .parent_id - .clone() - .ok_or_else(|| "worktree parent not found".to_string())?; - let parent = workspaces - .get(&parent_id) - .cloned() - .ok_or_else(|| "worktree parent not found".to_string())?; - (entry, parent) - }; - - let parent_root = resolve_git_root(&parent)?; - if !git_branch_exists(&parent_root, &new_branch).await? { - return Err("Local branch not found.".to_string()); - } - - let remote_for_old = git_find_remote_for_branch(&parent_root, &old_branch).await?; - let remote_name = match remote_for_old.as_ref() { - Some(remote) => remote.clone(), - None => { - if git_remote_exists(&parent_root, "origin").await? { - "origin".to_string() - } else { - return Err("No git remote configured for this worktree.".to_string()); - } - } - }; - - if git_remote_branch_exists(&parent_root, &remote_name, &new_branch).await? { - return Err("Remote branch already exists.".to_string()); - } - - if remote_for_old.is_some() { - run_git_command( - &parent_root, - &["push", &remote_name, &format!("{new_branch}:{new_branch}")], - ) - .await?; - run_git_command( - &parent_root, - &["push", &remote_name, &format!(":{old_branch}")], - ) - .await?; - } else { - run_git_command(&parent_root, &["push", &remote_name, &new_branch]).await?; - } - - run_git_command( - &parent_root, - &[ - "branch", - "--set-upstream-to", - &format!("{remote_name}/{new_branch}"), - &new_branch, - ], - ) - .await?; - - Ok(()) -} - -pub(crate) async fn update_workspace_settings_core( - id: String, - mut settings: WorkspaceSettings, - workspaces: &Mutex>, - sessions: &Mutex>>, - app_settings: &Mutex, - storage_path: &PathBuf, - apply_settings_update: FApplySettings, - spawn_session: FSpawn, -) -> Result -where - FApplySettings: Fn( - &mut HashMap, - &str, - WorkspaceSettings, - ) -> Result, - FSpawn: Fn(WorkspaceEntry, Option, Option, Option) -> FutSpawn, - FutSpawn: Future, String>>, -{ - settings.worktree_setup_script = normalize_setup_script(settings.worktree_setup_script); - - let ( - previous_entry, - entry_snapshot, - parent_entry, - previous_codex_home, - previous_codex_args, - previous_worktree_setup_script, - child_entries, - ) = { - let mut workspaces = workspaces.lock().await; - let previous_entry = workspaces - .get(&id) - .cloned() - .ok_or_else(|| "workspace not found".to_string())?; - let previous_codex_home = previous_entry.settings.codex_home.clone(); - let previous_codex_args = previous_entry.settings.codex_args.clone(); - let previous_worktree_setup_script = previous_entry.settings.worktree_setup_script.clone(); - let entry_snapshot = apply_settings_update(&mut workspaces, &id, settings)?; - let parent_entry = entry_snapshot - .parent_id - .as_ref() - .and_then(|parent_id| workspaces.get(parent_id)) - .cloned(); - let child_entries = workspaces - .values() - .filter(|entry| entry.parent_id.as_deref() == Some(&id)) - .cloned() - .collect::>(); - ( - previous_entry, - entry_snapshot, - parent_entry, - previous_codex_home, - previous_codex_args, - 
previous_worktree_setup_script, - child_entries, - ) - }; - - let codex_home_changed = previous_codex_home != entry_snapshot.settings.codex_home; - let codex_args_changed = previous_codex_args != entry_snapshot.settings.codex_args; - let worktree_setup_script_changed = - previous_worktree_setup_script != entry_snapshot.settings.worktree_setup_script; - let connected = sessions.lock().await.contains_key(&id); - if connected && (codex_home_changed || codex_args_changed) { - let rollback_entry = previous_entry.clone(); - let (default_bin, codex_args) = { - let settings = app_settings.lock().await; - ( - settings.codex_bin.clone(), - resolve_workspace_codex_args( - &entry_snapshot, - parent_entry.as_ref(), - Some(&settings), - ), - ) - }; - let codex_home = resolve_workspace_codex_home(&entry_snapshot, parent_entry.as_ref()); - let new_session = match spawn_session( - entry_snapshot.clone(), - default_bin, - codex_args, - codex_home, - ) - .await - { - Ok(session) => session, - Err(error) => { - let mut workspaces = workspaces.lock().await; - workspaces.insert(rollback_entry.id.clone(), rollback_entry); - return Err(error); - } - }; - if let Some(old_session) = sessions - .lock() - .await - .insert(entry_snapshot.id.clone(), new_session) - { - let mut child = old_session.child.lock().await; - kill_child_process_tree(&mut child).await; - } - } - if codex_home_changed || codex_args_changed { - let app_settings_snapshot = app_settings.lock().await.clone(); - let default_bin = app_settings_snapshot.codex_bin.clone(); - for child in &child_entries { - let connected = sessions.lock().await.contains_key(&child.id); - if !connected { - continue; - } - let previous_child_home = resolve_workspace_codex_home(child, Some(&previous_entry)); - let next_child_home = resolve_workspace_codex_home(child, Some(&entry_snapshot)); - let previous_child_args = resolve_workspace_codex_args( - child, - Some(&previous_entry), - Some(&app_settings_snapshot), - ); - let next_child_args = resolve_workspace_codex_args( - child, - Some(&entry_snapshot), - Some(&app_settings_snapshot), - ); - if previous_child_home == next_child_home && previous_child_args == next_child_args { - continue; - } - let new_session = match spawn_session( - child.clone(), - default_bin.clone(), - next_child_args, - next_child_home, - ) - .await - { - Ok(session) => session, - Err(error) => { - eprintln!( - "update_workspace_settings: respawn failed for worktree {} after parent override change: {error}", - child.id - ); - continue; - } - }; - if let Some(old_session) = sessions.lock().await.insert(child.id.clone(), new_session) { - let mut child = old_session.child.lock().await; - kill_child_process_tree(&mut child).await; - } - } - } - if worktree_setup_script_changed && !entry_snapshot.kind.is_worktree() { - let child_ids = child_entries - .iter() - .map(|child| child.id.clone()) - .collect::>(); - if !child_ids.is_empty() { - let mut workspaces = workspaces.lock().await; - for child_id in child_ids { - if let Some(child) = workspaces.get_mut(&child_id) { - child.settings.worktree_setup_script = - entry_snapshot.settings.worktree_setup_script.clone(); - } - } - } - } - let list: Vec<_> = { - let workspaces = workspaces.lock().await; - workspaces.values().cloned().collect() - }; - write_workspaces(storage_path, &list)?; - Ok(WorkspaceInfo { - id: entry_snapshot.id, - name: entry_snapshot.name, - path: entry_snapshot.path, - codex_bin: entry_snapshot.codex_bin, - connected, - kind: entry_snapshot.kind, - parent_id: entry_snapshot.parent_id, - 
worktree: entry_snapshot.worktree, - settings: entry_snapshot.settings, - }) -} - -pub(crate) async fn update_workspace_codex_bin_core( - id: String, - codex_bin: Option, - workspaces: &Mutex>, - sessions: &Mutex>>, - storage_path: &PathBuf, -) -> Result { - let (entry_snapshot, list) = { - let mut workspaces = workspaces.lock().await; - let entry_snapshot = match workspaces.get_mut(&id) { - Some(entry) => { - entry.codex_bin = codex_bin.clone(); - entry.clone() - } - None => return Err("workspace not found".to_string()), - }; - let list: Vec<_> = workspaces.values().cloned().collect(); - (entry_snapshot, list) - }; - write_workspaces(storage_path, &list)?; - - let connected = sessions.lock().await.contains_key(&id); - Ok(WorkspaceInfo { - id: entry_snapshot.id, - name: entry_snapshot.name, - path: entry_snapshot.path, - codex_bin: entry_snapshot.codex_bin, - connected, - kind: entry_snapshot.kind, - parent_id: entry_snapshot.parent_id, - worktree: entry_snapshot.worktree, - settings: entry_snapshot.settings, - }) -} - -pub(crate) async fn list_workspace_files_core( - workspaces: &Mutex>, - workspace_id: &str, - list_files: F, -) -> Result, String> -where - F: Fn(&PathBuf) -> Vec, -{ - let root = resolve_workspace_root(workspaces, workspace_id).await?; - Ok(list_files(&root)) -} - -pub(crate) async fn read_workspace_file_core( - workspaces: &Mutex>, - workspace_id: &str, - path: &str, - read_file: F, -) -> Result -where - F: Fn(&PathBuf, &str) -> Result, -{ - let root = resolve_workspace_root(workspaces, workspace_id).await?; - read_file(&root, path) -} - -fn sort_workspaces(workspaces: &mut [WorkspaceInfo]) { - workspaces.sort_by(|a, b| { - let a_order = a.settings.sort_order.unwrap_or(u32::MAX); - let b_order = b.settings.sort_order.unwrap_or(u32::MAX); - if a_order != b_order { - return a_order.cmp(&b_order); - } - a.name.cmp(&b.name).then_with(|| a.id.cmp(&b.id)) - }); -} - -#[cfg(test)] -mod tests { - use super::copy_agents_md_from_parent_to_worktree; - use super::AGENTS_MD_FILE_NAME; - use uuid::Uuid; - - fn make_temp_dir() -> std::path::PathBuf { - let dir = std::env::temp_dir().join(format!("codex-monitor-{}", Uuid::new_v4())); - std::fs::create_dir_all(&dir).expect("failed to create temp dir"); - dir - } - - #[test] - fn copies_agents_md_when_missing_in_worktree() { - let parent = make_temp_dir(); - let worktree = make_temp_dir(); - let parent_agents = parent.join(AGENTS_MD_FILE_NAME); - let worktree_agents = worktree.join(AGENTS_MD_FILE_NAME); - - std::fs::write(&parent_agents, "parent").expect("failed to write parent AGENTS.md"); - - copy_agents_md_from_parent_to_worktree(&parent, &worktree).expect("copy should succeed"); - - let copied = std::fs::read_to_string(&worktree_agents) - .expect("worktree AGENTS.md should exist after copy"); - assert_eq!(copied, "parent"); - - let _ = std::fs::remove_dir_all(parent); - let _ = std::fs::remove_dir_all(worktree); - } - - #[test] - fn does_not_overwrite_existing_worktree_agents_md() { - let parent = make_temp_dir(); - let worktree = make_temp_dir(); - let parent_agents = parent.join(AGENTS_MD_FILE_NAME); - let worktree_agents = worktree.join(AGENTS_MD_FILE_NAME); - - std::fs::write(&parent_agents, "parent").expect("failed to write parent AGENTS.md"); - std::fs::write(&worktree_agents, "branch-specific") - .expect("failed to write worktree AGENTS.md"); - - copy_agents_md_from_parent_to_worktree(&parent, &worktree).expect("copy should succeed"); - - let retained = std::fs::read_to_string(&worktree_agents) - .expect("worktree AGENTS.md 
should still exist"); - assert_eq!(retained, "branch-specific"); - - let _ = std::fs::remove_dir_all(parent); - let _ = std::fs::remove_dir_all(worktree); - } -} diff --git a/src-tauri/src/shared/workspaces_core/connect.rs b/src-tauri/src/shared/workspaces_core/connect.rs new file mode 100644 index 000000000..e59c5d122 --- /dev/null +++ b/src-tauri/src/shared/workspaces_core/connect.rs @@ -0,0 +1,49 @@ +use std::collections::HashMap; +use std::future::Future; +use std::path::PathBuf; +use std::sync::Arc; + +use tokio::sync::Mutex; + +use crate::backend::app_server::WorkspaceSession; +use crate::codex::args::resolve_workspace_codex_args; +use crate::codex::home::resolve_workspace_codex_home; +use crate::shared::process_core::kill_child_process_tree; +use crate::types::{AppSettings, WorkspaceEntry}; + +use super::helpers::resolve_entry_and_parent; + +pub(crate) async fn connect_workspace_core( + workspace_id: String, + workspaces: &Mutex>, + sessions: &Mutex>>, + app_settings: &Mutex, + spawn_session: F, +) -> Result<(), String> +where + F: Fn(WorkspaceEntry, Option, Option, Option) -> Fut, + Fut: Future, String>>, +{ + let (entry, parent_entry) = resolve_entry_and_parent(workspaces, &workspace_id).await?; + let (default_bin, codex_args) = { + let settings = app_settings.lock().await; + ( + settings.codex_bin.clone(), + resolve_workspace_codex_args(&entry, parent_entry.as_ref(), Some(&settings)), + ) + }; + let codex_home = resolve_workspace_codex_home(&entry, parent_entry.as_ref()); + let session = spawn_session(entry.clone(), default_bin, codex_args, codex_home).await?; + sessions.lock().await.insert(entry.id, session); + Ok(()) +} + +pub(super) async fn kill_session_by_id( + sessions: &Mutex>>, + id: &str, +) { + if let Some(session) = sessions.lock().await.remove(id) { + let mut child = session.child.lock().await; + kill_child_process_tree(&mut child).await; + } +} diff --git a/src-tauri/src/shared/workspaces_core/crud_persistence.rs b/src-tauri/src/shared/workspaces_core/crud_persistence.rs new file mode 100644 index 000000000..19a868b64 --- /dev/null +++ b/src-tauri/src/shared/workspaces_core/crud_persistence.rs @@ -0,0 +1,541 @@ +use std::collections::HashMap; +use std::future::Future; +use std::path::PathBuf; +use std::sync::Arc; + +use tokio::sync::Mutex; +use uuid::Uuid; + +use crate::backend::app_server::WorkspaceSession; +use crate::codex::args::resolve_workspace_codex_args; +use crate::codex::home::resolve_workspace_codex_home; +use crate::shared::process_core::kill_child_process_tree; +use crate::shared::{git_core, worktree_core}; +use crate::storage::write_workspaces; +use crate::types::{AppSettings, WorkspaceEntry, WorkspaceInfo, WorkspaceKind, WorkspaceSettings}; + +use super::connect::kill_session_by_id; +use super::helpers::normalize_setup_script; + +pub(crate) async fn add_workspace_core( + path: String, + codex_bin: Option, + workspaces: &Mutex>, + sessions: &Mutex>>, + app_settings: &Mutex, + storage_path: &PathBuf, + spawn_session: F, +) -> Result +where + F: Fn(WorkspaceEntry, Option, Option, Option) -> Fut, + Fut: Future, String>>, +{ + if !PathBuf::from(&path).is_dir() { + return Err("Workspace path must be a folder.".to_string()); + } + + let name = PathBuf::from(&path) + .file_name() + .and_then(|s| s.to_str()) + .unwrap_or("Workspace") + .to_string(); + let entry = WorkspaceEntry { + id: Uuid::new_v4().to_string(), + name: name.clone(), + path: path.clone(), + codex_bin, + kind: WorkspaceKind::Main, + parent_id: None, + worktree: None, + settings: 
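The `copy_agents_md_from_parent_to_worktree` helper and its tests above copy through a temporary file and then rename, so a half-written AGENTS.md is never observed in the worktree and an existing branch-specific file is never overwritten. The same copy-if-absent pattern in isolation (illustrative; the real helper also builds detailed error messages):

```rust
use std::path::Path;

/// Copy `name` from `src_dir` into `dst_dir` unless it already exists there,
/// staging through a temp file so readers never observe a partial copy.
fn copy_if_absent(src_dir: &Path, dst_dir: &Path, name: &str) -> std::io::Result<()> {
    let source = src_dir.join(name);
    let destination = dst_dir.join(name);
    if !source.is_file() || destination.is_file() {
        // Nothing to copy, or the worktree already has its own file.
        return Ok(());
    }

    let temp = dst_dir.join(format!("{name}.tmp"));
    std::fs::copy(&source, &temp)?;
    // Rename is atomic on the same filesystem; clean up the temp file on failure.
    if let Err(err) = std::fs::rename(&temp, &destination) {
        let _ = std::fs::remove_file(&temp);
        return Err(err);
    }
    Ok(())
}
```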
WorkspaceSettings::default(), + }; + + let (default_bin, codex_args) = { + let settings = app_settings.lock().await; + ( + settings.codex_bin.clone(), + resolve_workspace_codex_args(&entry, None, Some(&settings)), + ) + }; + let codex_home = resolve_workspace_codex_home(&entry, None); + let session = spawn_session(entry.clone(), default_bin, codex_args, codex_home).await?; + + if let Err(error) = { + let mut workspaces = workspaces.lock().await; + workspaces.insert(entry.id.clone(), entry.clone()); + let list: Vec<_> = workspaces.values().cloned().collect(); + write_workspaces(storage_path, &list) + } { + { + let mut workspaces = workspaces.lock().await; + workspaces.remove(&entry.id); + } + let mut child = session.child.lock().await; + kill_child_process_tree(&mut child).await; + return Err(error); + } + + sessions.lock().await.insert(entry.id.clone(), session); + + Ok(WorkspaceInfo { + id: entry.id, + name: entry.name, + path: entry.path, + codex_bin: entry.codex_bin, + connected: true, + kind: entry.kind, + parent_id: entry.parent_id, + worktree: entry.worktree, + settings: entry.settings, + }) +} + +pub(crate) async fn add_clone_core( + source_workspace_id: String, + copy_name: String, + copies_folder: String, + workspaces: &Mutex>, + sessions: &Mutex>>, + app_settings: &Mutex, + storage_path: &PathBuf, + spawn_session: F, +) -> Result +where + F: Fn(WorkspaceEntry, Option, Option, Option) -> Fut, + Fut: Future, String>>, +{ + let copy_name = copy_name.trim().to_string(); + if copy_name.is_empty() { + return Err("Copy name is required.".to_string()); + } + + let copies_folder = copies_folder.trim().to_string(); + if copies_folder.is_empty() { + return Err("Copies folder is required.".to_string()); + } + let copies_folder_path = PathBuf::from(&copies_folder); + std::fs::create_dir_all(&copies_folder_path) + .map_err(|e| format!("Failed to create copies folder: {e}"))?; + if !copies_folder_path.is_dir() { + return Err("Copies folder must be a directory.".to_string()); + } + + let (source_entry, inherited_group_id) = { + let workspaces = workspaces.lock().await; + let source_entry = workspaces + .get(&source_workspace_id) + .cloned() + .ok_or_else(|| "source workspace not found".to_string())?; + let inherited_group_id = if source_entry.kind.is_worktree() { + source_entry + .parent_id + .as_ref() + .and_then(|parent_id| workspaces.get(parent_id)) + .and_then(|parent| parent.settings.group_id.clone()) + } else { + source_entry.settings.group_id.clone() + }; + (source_entry, inherited_group_id) + }; + + let destination_path = + worktree_core::build_clone_destination_path(&copies_folder_path, ©_name); + let destination_path_string = destination_path.to_string_lossy().to_string(); + + if let Err(error) = git_core::run_git_command( + &copies_folder_path, + &["clone", &source_entry.path, &destination_path_string], + ) + .await + { + let _ = tokio::fs::remove_dir_all(&destination_path).await; + return Err(error); + } + + if let Some(origin_url) = git_core::git_get_origin_url(&PathBuf::from(&source_entry.path)).await + { + let _ = git_core::run_git_command( + &destination_path, + &["remote", "set-url", "origin", &origin_url], + ) + .await; + } + + let entry = WorkspaceEntry { + id: Uuid::new_v4().to_string(), + name: copy_name, + path: destination_path_string, + codex_bin: source_entry.codex_bin.clone(), + kind: WorkspaceKind::Main, + parent_id: None, + worktree: None, + settings: WorkspaceSettings { + group_id: inherited_group_id, + ..WorkspaceSettings::default() + }, + }; + + let (default_bin, 
codex_args) = { + let settings = app_settings.lock().await; + ( + settings.codex_bin.clone(), + resolve_workspace_codex_args(&entry, None, Some(&settings)), + ) + }; + let codex_home = resolve_workspace_codex_home(&entry, None); + let session = match spawn_session(entry.clone(), default_bin, codex_args, codex_home).await { + Ok(session) => session, + Err(error) => { + let _ = tokio::fs::remove_dir_all(&destination_path).await; + return Err(error); + } + }; + + if let Err(error) = { + let mut workspaces = workspaces.lock().await; + workspaces.insert(entry.id.clone(), entry.clone()); + let list: Vec<_> = workspaces.values().cloned().collect(); + write_workspaces(storage_path, &list) + } { + { + let mut workspaces = workspaces.lock().await; + workspaces.remove(&entry.id); + } + let mut child = session.child.lock().await; + kill_child_process_tree(&mut child).await; + let _ = tokio::fs::remove_dir_all(&destination_path).await; + return Err(error); + } + + sessions.lock().await.insert(entry.id.clone(), session); + + Ok(WorkspaceInfo { + id: entry.id, + name: entry.name, + path: entry.path, + codex_bin: entry.codex_bin, + connected: true, + kind: entry.kind, + parent_id: entry.parent_id, + worktree: entry.worktree, + settings: entry.settings, + }) +} + +pub(crate) async fn remove_workspace_core( + id: String, + workspaces: &Mutex>, + sessions: &Mutex>>, + storage_path: &PathBuf, + run_git_command: FRunGit, + is_missing_worktree_error: FIsMissing, + remove_dir_all: FRemoveDirAll, + require_all_children_removed_to_remove_parent: bool, + continue_on_child_error: bool, +) -> Result<(), String> +where + FRunGit: Fn(&PathBuf, &[&str]) -> FutRunGit, + FutRunGit: Future>, + FIsMissing: Fn(&str) -> bool, + FRemoveDirAll: Fn(&PathBuf) -> Result<(), String>, +{ + let (entry, child_worktrees) = { + let workspaces = workspaces.lock().await; + let entry = workspaces + .get(&id) + .cloned() + .ok_or_else(|| "workspace not found".to_string())?; + if entry.kind.is_worktree() { + return Err("Use remove_worktree for worktree agents.".to_string()); + } + let children = workspaces + .values() + .filter(|workspace| workspace.parent_id.as_deref() == Some(&id)) + .cloned() + .collect::>(); + (entry, children) + }; + + let repo_path = PathBuf::from(&entry.path); + let mut removed_child_ids = Vec::new(); + let mut failures: Vec<(String, String)> = Vec::new(); + + for child in &child_worktrees { + kill_session_by_id(sessions, &child.id).await; + + let child_path = PathBuf::from(&child.path); + if child_path.exists() { + if let Err(error) = + run_git_command(&repo_path, &["worktree", "remove", "--force", &child.path]).await + { + if is_missing_worktree_error(&error) { + if child_path.exists() { + if let Err(fs_error) = remove_dir_all(&child_path) { + if continue_on_child_error { + failures.push((child.id.clone(), fs_error)); + continue; + } + return Err(fs_error); + } + } + } else { + if continue_on_child_error { + failures.push((child.id.clone(), error)); + continue; + } + return Err(error); + } + } + } + removed_child_ids.push(child.id.clone()); + } + + let _ = run_git_command(&repo_path, &["worktree", "prune", "--expire", "now"]).await; + + let mut ids_to_remove = removed_child_ids; + if failures.is_empty() || !require_all_children_removed_to_remove_parent { + kill_session_by_id(sessions, &id).await; + ids_to_remove.push(id.clone()); + } + + { + let mut workspaces = workspaces.lock().await; + for workspace_id in ids_to_remove { + workspaces.remove(&workspace_id); + } + let list: Vec<_> = 
workspaces.values().cloned().collect(); + write_workspaces(storage_path, &list)?; + } + + if failures.is_empty() { + return Ok(()); + } + + if require_all_children_removed_to_remove_parent { + let mut message = + "Failed to remove one or more worktrees; parent workspace was not removed.".to_string(); + for (child_id, error) in failures { + message.push_str(&format!("\n- {child_id}: {error}")); + } + return Err(message); + } + + Ok(()) +} + +pub(crate) async fn update_workspace_settings_core( + id: String, + mut settings: WorkspaceSettings, + workspaces: &Mutex>, + sessions: &Mutex>>, + app_settings: &Mutex, + storage_path: &PathBuf, + apply_settings_update: FApplySettings, + spawn_session: FSpawn, +) -> Result +where + FApplySettings: Fn( + &mut HashMap, + &str, + WorkspaceSettings, + ) -> Result, + FSpawn: Fn(WorkspaceEntry, Option, Option, Option) -> FutSpawn, + FutSpawn: Future, String>>, +{ + settings.worktree_setup_script = normalize_setup_script(settings.worktree_setup_script); + + let ( + previous_entry, + entry_snapshot, + parent_entry, + previous_codex_home, + previous_codex_args, + previous_worktree_setup_script, + child_entries, + ) = { + let mut workspaces = workspaces.lock().await; + let previous_entry = workspaces + .get(&id) + .cloned() + .ok_or_else(|| "workspace not found".to_string())?; + let previous_codex_home = previous_entry.settings.codex_home.clone(); + let previous_codex_args = previous_entry.settings.codex_args.clone(); + let previous_worktree_setup_script = previous_entry.settings.worktree_setup_script.clone(); + let entry_snapshot = apply_settings_update(&mut workspaces, &id, settings)?; + let parent_entry = entry_snapshot + .parent_id + .as_ref() + .and_then(|parent_id| workspaces.get(parent_id)) + .cloned(); + let child_entries = workspaces + .values() + .filter(|entry| entry.parent_id.as_deref() == Some(&id)) + .cloned() + .collect::>(); + ( + previous_entry, + entry_snapshot, + parent_entry, + previous_codex_home, + previous_codex_args, + previous_worktree_setup_script, + child_entries, + ) + }; + + let codex_home_changed = previous_codex_home != entry_snapshot.settings.codex_home; + let codex_args_changed = previous_codex_args != entry_snapshot.settings.codex_args; + let worktree_setup_script_changed = + previous_worktree_setup_script != entry_snapshot.settings.worktree_setup_script; + let connected = sessions.lock().await.contains_key(&id); + if connected && (codex_home_changed || codex_args_changed) { + let rollback_entry = previous_entry.clone(); + let (default_bin, codex_args) = { + let settings = app_settings.lock().await; + ( + settings.codex_bin.clone(), + resolve_workspace_codex_args( + &entry_snapshot, + parent_entry.as_ref(), + Some(&settings), + ), + ) + }; + let codex_home = resolve_workspace_codex_home(&entry_snapshot, parent_entry.as_ref()); + let new_session = match spawn_session( + entry_snapshot.clone(), + default_bin, + codex_args, + codex_home, + ) + .await + { + Ok(session) => session, + Err(error) => { + let mut workspaces = workspaces.lock().await; + workspaces.insert(rollback_entry.id.clone(), rollback_entry); + return Err(error); + } + }; + if let Some(old_session) = sessions + .lock() + .await + .insert(entry_snapshot.id.clone(), new_session) + { + let mut child = old_session.child.lock().await; + kill_child_process_tree(&mut child).await; + } + } + if codex_home_changed || codex_args_changed { + let app_settings_snapshot = app_settings.lock().await.clone(); + let default_bin = app_settings_snapshot.codex_bin.clone(); + for child in 
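`remove_workspace_core` above lets callers pick a policy: fail fast on the first child worktree that cannot be removed, or keep going, collect per-child failures, and decide afterwards whether the parent may be removed. The aggregation itself is a small pattern (a sketch with hypothetical names; the real function also prunes worktrees and persists the updated workspace list):

```rust
/// Per-child outcomes collected while removing a parent workspace's worktrees.
type Failures = Vec<(String, String)>; // (child id, error message)

/// Mirror of the policy above: the parent goes away unless the caller asked
/// for all children to be removed first and some of them failed.
fn remove_parent(failures: &Failures, require_all_children_removed: bool) -> bool {
    failures.is_empty() || !require_all_children_removed
}

/// Fold the per-child errors into the single message surfaced to the caller.
fn failure_message(failures: &Failures) -> String {
    let mut message =
        "Failed to remove one or more worktrees; parent workspace was not removed.".to_string();
    for (child_id, error) in failures {
        message.push_str(&format!("\n- {child_id}: {error}"));
    }
    message
}
```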
&child_entries {
+            let connected = sessions.lock().await.contains_key(&child.id);
+            if !connected {
+                continue;
+            }
+            let previous_child_home = resolve_workspace_codex_home(child, Some(&previous_entry));
+            let next_child_home = resolve_workspace_codex_home(child, Some(&entry_snapshot));
+            let previous_child_args = resolve_workspace_codex_args(
+                child,
+                Some(&previous_entry),
+                Some(&app_settings_snapshot),
+            );
+            let next_child_args = resolve_workspace_codex_args(
+                child,
+                Some(&entry_snapshot),
+                Some(&app_settings_snapshot),
+            );
+            if previous_child_home == next_child_home && previous_child_args == next_child_args {
+                continue;
+            }
+            let new_session = match spawn_session(
+                child.clone(),
+                default_bin.clone(),
+                next_child_args,
+                next_child_home,
+            )
+            .await
+            {
+                Ok(session) => session,
+                Err(error) => {
+                    eprintln!(
+                        "update_workspace_settings: respawn failed for worktree {} after parent override change: {error}",
+                        child.id
+                    );
+                    continue;
+                }
+            };
+            if let Some(old_session) = sessions.lock().await.insert(child.id.clone(), new_session) {
+                let mut child = old_session.child.lock().await;
+                kill_child_process_tree(&mut child).await;
+            }
+        }
+    }
+    if worktree_setup_script_changed && !entry_snapshot.kind.is_worktree() {
+        let child_ids = child_entries
+            .iter()
+            .map(|child| child.id.clone())
+            .collect::<Vec<_>>();
+        if !child_ids.is_empty() {
+            let mut workspaces = workspaces.lock().await;
+            for child_id in child_ids {
+                if let Some(child) = workspaces.get_mut(&child_id) {
+                    child.settings.worktree_setup_script =
+                        entry_snapshot.settings.worktree_setup_script.clone();
+                }
+            }
+        }
+    }
+    let list: Vec<_> = {
+        let workspaces = workspaces.lock().await;
+        workspaces.values().cloned().collect()
+    };
+    write_workspaces(storage_path, &list)?;
+    Ok(WorkspaceInfo {
+        id: entry_snapshot.id,
+        name: entry_snapshot.name,
+        path: entry_snapshot.path,
+        codex_bin: entry_snapshot.codex_bin,
+        connected,
+        kind: entry_snapshot.kind,
+        parent_id: entry_snapshot.parent_id,
+        worktree: entry_snapshot.worktree,
+        settings: entry_snapshot.settings,
+    })
+}
+
+pub(crate) async fn update_workspace_codex_bin_core(
+    id: String,
+    codex_bin: Option<String>,
+    workspaces: &Mutex<HashMap<String, WorkspaceEntry>>,
+    sessions: &Mutex<HashMap<String, Arc<WorkspaceSession>>>,
+    storage_path: &PathBuf,
+) -> Result<WorkspaceInfo, String> {
+    let (entry_snapshot, list) = {
+        let mut workspaces = workspaces.lock().await;
+        let entry_snapshot = match workspaces.get_mut(&id) {
+            Some(entry) => {
+                entry.codex_bin = codex_bin.clone();
+                entry.clone()
+            }
+            None => return Err("workspace not found".to_string()),
+        };
+        let list: Vec<_> = workspaces.values().cloned().collect();
+        (entry_snapshot, list)
+    };
+    write_workspaces(storage_path, &list)?;
+
+    let connected = sessions.lock().await.contains_key(&id);
+    Ok(WorkspaceInfo {
+        id: entry_snapshot.id,
+        name: entry_snapshot.name,
+        path: entry_snapshot.path,
+        codex_bin: entry_snapshot.codex_bin,
+        connected,
+        kind: entry_snapshot.kind,
+        parent_id: entry_snapshot.parent_id,
+        worktree: entry_snapshot.worktree,
+        settings: entry_snapshot.settings,
+    })
+}
diff --git a/src-tauri/src/shared/workspaces_core/git_orchestration.rs b/src-tauri/src/shared/workspaces_core/git_orchestration.rs
new file mode 100644
index 000000000..6ee25e29b
--- /dev/null
+++ b/src-tauri/src/shared/workspaces_core/git_orchestration.rs
@@ -0,0 +1,172 @@
+use std::collections::HashMap;
+use std::future::Future;
+use std::path::PathBuf;
+use std::process::Stdio;
+
+use tokio::io::AsyncWriteExt;
+use tokio::sync::Mutex;
+
+use crate::git_utils::resolve_git_root;
+use crate::shared::process_core::tokio_command;
+use crate::shared::{git_core, worktree_core};
+use crate::types::WorkspaceEntry;
+
+pub(crate) fn run_git_command_unit<F, Fut>(
+    repo_path: &PathBuf,
+    args: &[&str],
+    run_git_command: F,
+) -> impl Future<Output = Result<(), String>>
+where
+    F: Fn(PathBuf, Vec<String>) -> Fut,
+    Fut: Future<Output = Result<String, String>>,
+{
+    let repo_path = repo_path.clone();
+    let args_owned = args
+        .iter()
+        .map(|value| value.to_string())
+        .collect::<Vec<_>>();
+    async move {
+        run_git_command(repo_path, args_owned)
+            .await
+            .map(|_output| ())
+    }
+}
+
+pub(crate) async fn apply_worktree_changes_core(
+    workspaces: &Mutex<HashMap<String, WorkspaceEntry>>,
+    workspace_id: String,
+) -> Result<(), String> {
+    let (entry, parent) = {
+        let workspaces = workspaces.lock().await;
+        let entry = workspaces
+            .get(&workspace_id)
+            .cloned()
+            .ok_or_else(|| "workspace not found".to_string())?;
+        if !entry.kind.is_worktree() {
+            return Err("Not a worktree workspace.".to_string());
+        }
+        let parent_id = entry
+            .parent_id
+            .clone()
+            .ok_or_else(|| "worktree parent not found".to_string())?;
+        let parent = workspaces
+            .get(&parent_id)
+            .cloned()
+            .ok_or_else(|| "worktree parent not found".to_string())?;
+        (entry, parent)
+    };
+
+    apply_worktree_changes_inner_core(&entry, &parent).await
+}
+
+pub(super) async fn apply_worktree_changes_inner_core(
+    entry: &WorkspaceEntry,
+    parent: &WorkspaceEntry,
+) -> Result<(), String> {
+    let worktree_root = resolve_git_root(entry)?;
+    let parent_root = resolve_git_root(parent)?;
+
+    let parent_status =
+        git_core::run_git_command_bytes(&parent_root, &["status", "--porcelain"]).await?;
+    if !String::from_utf8_lossy(&parent_status).trim().is_empty() {
+        return Err(
+            "Your current branch has uncommitted changes. Please commit, stash, or discard them before applying worktree changes."
+                .to_string(),
+        );
+    }
+
+    let mut patch: Vec<u8> = Vec::new();
+    let staged_patch = git_core::run_git_diff(
+        &worktree_root,
+        &["diff", "--binary", "--no-color", "--cached"],
+    )
+    .await?;
+    patch.extend_from_slice(&staged_patch);
+    let unstaged_patch =
+        git_core::run_git_diff(&worktree_root, &["diff", "--binary", "--no-color"]).await?;
+    patch.extend_from_slice(&unstaged_patch);
+
+    let untracked_output = git_core::run_git_command_bytes(
+        &worktree_root,
+        &["ls-files", "--others", "--exclude-standard", "-z"],
+    )
+    .await?;
+    for raw_path in untracked_output.split(|byte| *byte == 0) {
+        if raw_path.is_empty() {
+            continue;
+        }
+        let path = String::from_utf8_lossy(raw_path).to_string();
+        let diff = git_core::run_git_diff(
+            &worktree_root,
+            &[
+                "diff",
+                "--binary",
+                "--no-color",
+                "--no-index",
+                "--",
+                worktree_core::null_device_path(),
+                &path,
+            ],
+        )
+        .await?;
+        patch.extend_from_slice(&diff);
+    }
+
+    if String::from_utf8_lossy(&patch).trim().is_empty() {
+        return Err("No changes to apply.".to_string());
+    }
+
+    let git_bin =
+        crate::utils::resolve_git_binary().map_err(|e| format!("Failed to run git: {e}"))?;
+    let mut child = tokio_command(git_bin)
+        .args(["apply", "--3way", "--whitespace=nowarn", "-"])
+        .current_dir(&parent_root)
+        .env("PATH", crate::utils::git_env_path())
+        .stdin(Stdio::piped())
+        .stdout(Stdio::piped())
+        .stderr(Stdio::piped())
+        .spawn()
+        .map_err(|e| format!("Failed to run git: {e}"))?;
+
+    if let Some(mut stdin) = child.stdin.take() {
+        stdin
+            .write_all(&patch)
+            .await
+            .map_err(|e| format!("Failed to write git apply input: {e}"))?;
+    }
+
+    let output = child
+        .wait_with_output()
+        .await
+        .map_err(|e| format!("Failed to run git: {e}"))?;
+
+    if output.status.success() {
+        return Ok(());
+    }
+
+    let stderr = String::from_utf8_lossy(&output.stderr);
+    let stdout = String::from_utf8_lossy(&output.stdout);
+    let detail = if stderr.trim().is_empty() {
+        stdout.trim()
+    } else {
+        stderr.trim()
+    };
+    if detail.is_empty() {
+        return Err("Git apply failed.".to_string());
+    }
+
+    if detail.contains("Applied patch to") {
+        if detail.contains("with conflicts") {
+            return Err(
+                "Applied with conflicts. Resolve conflicts in the parent repo before retrying."
+                    .to_string(),
+            );
+        }
+        return Err(
+            "Patch applied partially. Resolve changes in the parent repo before retrying."
+                .to_string(),
+        );
+    }
+
+    Err(detail.to_string())
+}
diff --git a/src-tauri/src/shared/workspaces_core/helpers.rs b/src-tauri/src/shared/workspaces_core/helpers.rs
new file mode 100644
index 000000000..bcf3525a4
--- /dev/null
+++ b/src-tauri/src/shared/workspaces_core/helpers.rs
@@ -0,0 +1,183 @@
+use std::collections::HashMap;
+use std::path::PathBuf;
+use std::sync::Arc;
+
+use tokio::sync::Mutex;
+
+use crate::backend::app_server::WorkspaceSession;
+use crate::types::{WorkspaceEntry, WorkspaceInfo};
+
+pub(crate) const WORKTREE_SETUP_MARKERS_DIR: &str = "worktree-setup";
+pub(crate) const WORKTREE_SETUP_MARKER_EXT: &str = "ran";
+pub(super) const AGENTS_MD_FILE_NAME: &str = "AGENTS.md";
+
+pub(super) fn copy_agents_md_from_parent_to_worktree(
+    parent_repo_root: &PathBuf,
+    worktree_root: &PathBuf,
+) -> Result<(), String> {
+    let source_path = parent_repo_root.join(AGENTS_MD_FILE_NAME);
+    if !source_path.is_file() {
+        return Ok(());
+    }
+
+    let destination_path = worktree_root.join(AGENTS_MD_FILE_NAME);
+    if destination_path.is_file() {
+        return Ok(());
+    }
+
+    let temp_path = worktree_root.join(format!("{AGENTS_MD_FILE_NAME}.tmp"));
+
+    std::fs::copy(&source_path, &temp_path).map_err(|err| {
+        format!(
+            "Failed to copy {} from {} to {}: {err}",
+            AGENTS_MD_FILE_NAME,
+            source_path.display(),
+            temp_path.display()
+        )
+    })?;
+
+    std::fs::rename(&temp_path, &destination_path).map_err(|err| {
+        let _ = std::fs::remove_file(&temp_path);
+        format!(
+            "Failed to finalize {} copy to {}: {err}",
+            AGENTS_MD_FILE_NAME,
+            destination_path.display()
+        )
+    })?;
+
+    Ok(())
+}
+
+pub(crate) fn normalize_setup_script(script: Option<String>) -> Option<String> {
+    match script {
+        Some(value) if value.trim().is_empty() => None,
+        Some(value) => Some(value),
+        None => None,
+    }
+}
+
+pub(crate) fn worktree_setup_marker_path(data_dir: &PathBuf, workspace_id: &str) -> PathBuf {
+    data_dir
+        .join(WORKTREE_SETUP_MARKERS_DIR)
+        .join(format!("{workspace_id}.{WORKTREE_SETUP_MARKER_EXT}"))
+}
+
+pub(crate) fn is_workspace_path_dir_core(path: &str) -> bool {
+    PathBuf::from(path).is_dir()
+}
+
+pub(crate) async fn list_workspaces_core(
+    workspaces: &Mutex<HashMap<String, WorkspaceEntry>>,
+    sessions: &Mutex<HashMap<String, Arc<WorkspaceSession>>>,
+) -> Vec<WorkspaceInfo> {
+    let workspaces = workspaces.lock().await;
+    let sessions = sessions.lock().await;
+    let mut result = Vec::new();
+    for entry in workspaces.values() {
+        result.push(WorkspaceInfo {
+            id: entry.id.clone(),
+            name: entry.name.clone(),
+            path: entry.path.clone(),
+            codex_bin: entry.codex_bin.clone(),
+            connected: sessions.contains_key(&entry.id),
+            kind: entry.kind.clone(),
+            parent_id: entry.parent_id.clone(),
+            worktree: entry.worktree.clone(),
+            settings: entry.settings.clone(),
+        });
+    }
+    sort_workspaces(&mut result);
+    result
+}
+
+pub(super) async fn resolve_entry_and_parent(
+    workspaces: &Mutex<HashMap<String, WorkspaceEntry>>,
+    workspace_id: &str,
+) -> Result<(WorkspaceEntry, Option<WorkspaceEntry>), String> {
+    let workspaces = workspaces.lock().await;
+    let entry = workspaces
+        .get(workspace_id)
.cloned() + .ok_or_else(|| "workspace not found".to_string())?; + let parent_entry = entry + .parent_id + .as_ref() + .and_then(|parent_id| workspaces.get(parent_id)) + .cloned(); + Ok((entry, parent_entry)) +} + +pub(super) async fn resolve_workspace_root( + workspaces: &Mutex>, + workspace_id: &str, +) -> Result { + let workspaces = workspaces.lock().await; + let entry = workspaces + .get(workspace_id) + .cloned() + .ok_or_else(|| "workspace not found".to_string())?; + Ok(PathBuf::from(entry.path)) +} + +pub(super) fn sort_workspaces(workspaces: &mut [WorkspaceInfo]) { + workspaces.sort_by(|a, b| { + let a_order = a.settings.sort_order.unwrap_or(u32::MAX); + let b_order = b.settings.sort_order.unwrap_or(u32::MAX); + if a_order != b_order { + return a_order.cmp(&b_order); + } + a.name.cmp(&b.name).then_with(|| a.id.cmp(&b.id)) + }); +} + +#[cfg(test)] +mod tests { + use super::{copy_agents_md_from_parent_to_worktree, AGENTS_MD_FILE_NAME}; + use uuid::Uuid; + + fn make_temp_dir() -> std::path::PathBuf { + let dir = std::env::temp_dir().join(format!("codex-monitor-{}", Uuid::new_v4())); + std::fs::create_dir_all(&dir).expect("failed to create temp dir"); + dir + } + + #[test] + fn copies_agents_md_when_missing_in_worktree() { + let parent = make_temp_dir(); + let worktree = make_temp_dir(); + let parent_agents = parent.join(AGENTS_MD_FILE_NAME); + let worktree_agents = worktree.join(AGENTS_MD_FILE_NAME); + + std::fs::write(&parent_agents, "parent").expect("failed to write parent AGENTS.md"); + + copy_agents_md_from_parent_to_worktree(&parent, &worktree).expect("copy should succeed"); + + let copied = std::fs::read_to_string(&worktree_agents) + .expect("worktree AGENTS.md should exist after copy"); + assert_eq!(copied, "parent"); + + let _ = std::fs::remove_dir_all(parent); + let _ = std::fs::remove_dir_all(worktree); + } + + #[test] + fn does_not_overwrite_existing_worktree_agents_md() { + let parent = make_temp_dir(); + let worktree = make_temp_dir(); + let parent_agents = parent.join(AGENTS_MD_FILE_NAME); + let worktree_agents = worktree.join(AGENTS_MD_FILE_NAME); + + std::fs::write(&parent_agents, "parent").expect("failed to write parent AGENTS.md"); + std::fs::write(&worktree_agents, "branch-specific") + .expect("failed to write worktree AGENTS.md"); + + copy_agents_md_from_parent_to_worktree(&parent, &worktree).expect("copy should succeed"); + + let retained = std::fs::read_to_string(&worktree_agents) + .expect("worktree AGENTS.md should still exist"); + assert_eq!(retained, "branch-specific"); + + let _ = std::fs::remove_dir_all(parent); + let _ = std::fs::remove_dir_all(worktree); + } +} diff --git a/src-tauri/src/shared/workspaces_core/io.rs b/src-tauri/src/shared/workspaces_core/io.rs new file mode 100644 index 000000000..4445007c8 --- /dev/null +++ b/src-tauri/src/shared/workspaces_core/io.rs @@ -0,0 +1,197 @@ +use std::collections::HashMap; +#[cfg(target_os = "windows")] +use std::path::Path; +use std::path::PathBuf; + +use tokio::sync::Mutex; + +use crate::shared::process_core::tokio_command; +#[cfg(target_os = "windows")] +use crate::shared::process_core::{build_cmd_c_command, resolve_windows_executable}; +use crate::types::WorkspaceEntry; + +use super::helpers::resolve_workspace_root; + +pub(crate) async fn open_workspace_in_core( + path: String, + app: Option, + args: Vec, + command: Option, +) -> Result<(), String> { + fn output_snippet(bytes: &[u8]) -> Option { + const MAX_CHARS: usize = 240; + let text = String::from_utf8_lossy(bytes).trim().replace('\n', "\\n"); + if 
text.is_empty() { + return None; + } + let mut chars = text.chars(); + let snippet: String = chars.by_ref().take(MAX_CHARS).collect(); + if chars.next().is_some() { + Some(format!("{snippet}...")) + } else { + Some(snippet) + } + } + + let target_label = command + .as_ref() + .map(|value| format!("command `{value}`")) + .or_else(|| app.as_ref().map(|value| format!("app `{value}`"))) + .unwrap_or_else(|| "target".to_string()); + + let output = if let Some(command) = command { + let trimmed = command.trim(); + if trimmed.is_empty() { + return Err("Missing app or command".to_string()); + } + + #[cfg(target_os = "windows")] + let mut cmd = { + let resolved = resolve_windows_executable(trimmed, None); + let resolved_path = resolved.as_deref().unwrap_or_else(|| Path::new(trimmed)); + let ext = resolved_path + .extension() + .and_then(|ext| ext.to_str()) + .map(|ext| ext.to_ascii_lowercase()); + + if matches!(ext.as_deref(), Some("cmd") | Some("bat")) { + let mut cmd = tokio_command("cmd"); + let mut command_args = args.clone(); + command_args.push(path.clone()); + let command_line = build_cmd_c_command(resolved_path, &command_args)?; + cmd.arg("/D"); + cmd.arg("/S"); + cmd.arg("/C"); + cmd.raw_arg(command_line); + cmd + } else { + let mut cmd = tokio_command(resolved_path); + cmd.args(&args).arg(&path); + cmd + } + }; + + #[cfg(not(target_os = "windows"))] + let mut cmd = { + let mut cmd = tokio_command(trimmed); + cmd.args(&args).arg(&path); + cmd + }; + + cmd.output() + .await + .map_err(|error| format!("Failed to open app ({target_label}): {error}"))? + } else if let Some(app) = app { + let trimmed = app.trim(); + if trimmed.is_empty() { + return Err("Missing app or command".to_string()); + } + + #[cfg(target_os = "macos")] + let mut cmd = { + let mut cmd = tokio_command("open"); + cmd.arg("-a").arg(trimmed).arg(&path); + if !args.is_empty() { + cmd.arg("--args").args(&args); + } + cmd + }; + + #[cfg(not(target_os = "macos"))] + let mut cmd = { + let mut cmd = tokio_command(trimmed); + cmd.args(&args).arg(&path); + cmd + }; + + cmd.output() + .await + .map_err(|error| format!("Failed to open app ({target_label}): {error}"))? + } else { + return Err("Missing app or command".to_string()); + }; + + if output.status.success() { + return Ok(()); + } + + let exit_detail = output + .status + .code() + .map(|code| format!("exit code {code}")) + .unwrap_or_else(|| "terminated by signal".to_string()); + let mut details = Vec::new(); + if let Some(stderr) = output_snippet(&output.stderr) { + details.push(format!("stderr: {stderr}")); + } + if let Some(stdout) = output_snippet(&output.stdout) { + details.push(format!("stdout: {stdout}")); + } + + if details.is_empty() { + Err(format!( + "Failed to open app ({target_label} returned {exit_detail})." 
+ )) + } else { + Err(format!( + "Failed to open app ({target_label} returned {exit_detail}; {}).", + details.join("; ") + )) + } +} + +#[cfg(target_os = "macos")] +pub(crate) async fn get_open_app_icon_core( + app_name: String, + icon_loader: F, +) -> Result, String> +where + F: Fn(&str) -> Option + Send + Sync + 'static, +{ + let trimmed = app_name.trim().to_string(); + if trimmed.is_empty() { + return Ok(None); + } + let icon_loader = std::sync::Arc::new(icon_loader); + tokio::task::spawn_blocking(move || icon_loader(&trimmed)) + .await + .map_err(|err| err.to_string()) +} + +#[cfg(not(target_os = "macos"))] +pub(crate) async fn get_open_app_icon_core( + app_name: String, + icon_loader: F, +) -> Result, String> +where + F: Fn(&str) -> Option + Send + Sync + 'static, +{ + let _ = app_name; + let _ = icon_loader; + Ok(None) +} + +pub(crate) async fn list_workspace_files_core( + workspaces: &Mutex>, + workspace_id: &str, + list_files: F, +) -> Result, String> +where + F: Fn(&PathBuf) -> Vec, +{ + let root = resolve_workspace_root(workspaces, workspace_id).await?; + Ok(list_files(&root)) +} + +pub(crate) async fn read_workspace_file_core( + workspaces: &Mutex>, + workspace_id: &str, + path: &str, + read_file: F, +) -> Result +where + F: Fn(&PathBuf, &str) -> Result, +{ + let root = resolve_workspace_root(workspaces, workspace_id).await?; + read_file(&root, path) +} diff --git a/src-tauri/src/shared/workspaces_core/worktree.rs b/src-tauri/src/shared/workspaces_core/worktree.rs new file mode 100644 index 000000000..c954abb10 --- /dev/null +++ b/src-tauri/src/shared/workspaces_core/worktree.rs @@ -0,0 +1,588 @@ +use std::collections::HashMap; +use std::future::Future; +use std::path::PathBuf; +use std::sync::Arc; + +use tokio::sync::Mutex; +use uuid::Uuid; + +use crate::backend::app_server::WorkspaceSession; +use crate::codex::args::resolve_workspace_codex_args; +use crate::codex::home::resolve_workspace_codex_home; +use crate::storage::write_workspaces; +use crate::types::{ + AppSettings, WorkspaceEntry, WorkspaceInfo, WorkspaceKind, WorkspaceSettings, WorktreeInfo, + WorktreeSetupStatus, +}; + +use super::connect::kill_session_by_id; +use super::helpers::{ + copy_agents_md_from_parent_to_worktree, normalize_setup_script, worktree_setup_marker_path, + AGENTS_MD_FILE_NAME, +}; + +pub(crate) async fn worktree_setup_status_core( + workspaces: &Mutex>, + workspace_id: &str, + data_dir: &PathBuf, +) -> Result { + let entry = { + let workspaces = workspaces.lock().await; + workspaces + .get(workspace_id) + .cloned() + .ok_or_else(|| "workspace not found".to_string())? + }; + + let script = normalize_setup_script(entry.settings.worktree_setup_script.clone()); + let marker_exists = if entry.kind.is_worktree() { + worktree_setup_marker_path(data_dir, &entry.id).exists() + } else { + false + }; + let should_run = entry.kind.is_worktree() && script.is_some() && !marker_exists; + + Ok(WorktreeSetupStatus { should_run, script }) +} + +pub(crate) async fn worktree_setup_mark_ran_core( + workspaces: &Mutex>, + workspace_id: &str, + data_dir: &PathBuf, +) -> Result<(), String> { + let entry = { + let workspaces = workspaces.lock().await; + workspaces + .get(workspace_id) + .cloned() + .ok_or_else(|| "workspace not found".to_string())? 
+ }; + if !entry.kind.is_worktree() { + return Err("Not a worktree workspace.".to_string()); + } + let marker_path = worktree_setup_marker_path(data_dir, &entry.id); + if let Some(parent) = marker_path.parent() { + std::fs::create_dir_all(parent) + .map_err(|err| format!("Failed to prepare worktree marker directory: {err}"))?; + } + let ran_at = std::time::SystemTime::now() + .duration_since(std::time::UNIX_EPOCH) + .map(|duration| duration.as_secs()) + .unwrap_or(0); + std::fs::write(&marker_path, format!("ran_at={ran_at}\n")) + .map_err(|err| format!("Failed to write worktree setup marker: {err}"))?; + Ok(()) +} + +pub(crate) async fn add_worktree_core< + FSpawn, + FutSpawn, + FSanitize, + FUniquePath, + FBranchExists, + FutBranchExists, + FFindRemoteTracking, + FutFindRemoteTracking, + FRunGit, + FutRunGit, +>( + parent_id: String, + branch: String, + name: Option, + copy_agents_md: bool, + data_dir: &PathBuf, + workspaces: &Mutex>, + sessions: &Mutex>>, + app_settings: &Mutex, + storage_path: &PathBuf, + sanitize_worktree_name: FSanitize, + unique_worktree_path: FUniquePath, + git_branch_exists: FBranchExists, + git_find_remote_tracking_branch: Option, + run_git_command: FRunGit, + spawn_session: FSpawn, +) -> Result +where + FSpawn: Fn(WorkspaceEntry, Option, Option, Option) -> FutSpawn, + FutSpawn: Future, String>>, + FSanitize: Fn(&str) -> String, + FUniquePath: Fn(&PathBuf, &str) -> Result, + FBranchExists: Fn(&PathBuf, &str) -> FutBranchExists, + FutBranchExists: Future>, + FFindRemoteTracking: Fn(&PathBuf, &str) -> FutFindRemoteTracking, + FutFindRemoteTracking: Future, String>>, + FRunGit: Fn(&PathBuf, &[&str]) -> FutRunGit, + FutRunGit: Future>, +{ + let branch = branch.trim().to_string(); + if branch.is_empty() { + return Err("Branch name is required.".to_string()); + } + let name = name + .map(|value| value.trim().to_string()) + .filter(|value| !value.is_empty()); + + let parent_entry = { + let workspaces = workspaces.lock().await; + workspaces + .get(&parent_id) + .cloned() + .ok_or_else(|| "parent workspace not found".to_string())? + }; + + if parent_entry.kind.is_worktree() { + return Err("Cannot create a worktree from another worktree.".to_string()); + } + + let worktree_root = data_dir.join("worktrees").join(&parent_entry.id); + std::fs::create_dir_all(&worktree_root) + .map_err(|err| format!("Failed to create worktree directory: {err}"))?; + + let safe_name = sanitize_worktree_name(&branch); + let worktree_path = unique_worktree_path(&worktree_root, &safe_name)?; + let worktree_path_string = worktree_path.to_string_lossy().to_string(); + + let repo_path = PathBuf::from(&parent_entry.path); + let branch_exists = git_branch_exists(&repo_path, &branch).await?; + if branch_exists { + run_git_command( + &repo_path, + &["worktree", "add", &worktree_path_string, &branch], + ) + .await?; + } else if let Some(find_remote_tracking) = git_find_remote_tracking_branch { + if let Some(remote_ref) = find_remote_tracking(&repo_path, &branch).await? 
{ + run_git_command( + &repo_path, + &[ + "worktree", + "add", + "-b", + &branch, + &worktree_path_string, + &remote_ref, + ], + ) + .await?; + } else { + run_git_command( + &repo_path, + &["worktree", "add", "-b", &branch, &worktree_path_string], + ) + .await?; + } + } else { + run_git_command( + &repo_path, + &["worktree", "add", "-b", &branch, &worktree_path_string], + ) + .await?; + } + + if copy_agents_md { + if let Err(error) = copy_agents_md_from_parent_to_worktree(&repo_path, &worktree_path) { + eprintln!( + "add_worktree: optional {} copy failed for {}: {}", + AGENTS_MD_FILE_NAME, + worktree_path.display(), + error + ); + } + } + + let entry = WorkspaceEntry { + id: Uuid::new_v4().to_string(), + name: name.clone().unwrap_or_else(|| branch.clone()), + path: worktree_path_string, + codex_bin: parent_entry.codex_bin.clone(), + kind: WorkspaceKind::Worktree, + parent_id: Some(parent_entry.id.clone()), + worktree: Some(WorktreeInfo { branch }), + settings: WorkspaceSettings { + worktree_setup_script: normalize_setup_script( + parent_entry.settings.worktree_setup_script.clone(), + ), + ..WorkspaceSettings::default() + }, + }; + + let (default_bin, codex_args) = { + let settings = app_settings.lock().await; + ( + settings.codex_bin.clone(), + resolve_workspace_codex_args(&entry, Some(&parent_entry), Some(&settings)), + ) + }; + let codex_home = resolve_workspace_codex_home(&entry, Some(&parent_entry)); + let session = spawn_session(entry.clone(), default_bin, codex_args, codex_home).await?; + + { + let mut workspaces = workspaces.lock().await; + workspaces.insert(entry.id.clone(), entry.clone()); + let list: Vec<_> = workspaces.values().cloned().collect(); + write_workspaces(storage_path, &list)?; + } + + sessions.lock().await.insert(entry.id.clone(), session); + + Ok(WorkspaceInfo { + id: entry.id, + name: entry.name, + path: entry.path, + codex_bin: entry.codex_bin, + connected: true, + kind: entry.kind, + parent_id: entry.parent_id, + worktree: entry.worktree, + settings: entry.settings, + }) +} + +pub(crate) async fn remove_worktree_core( + id: String, + workspaces: &Mutex>, + sessions: &Mutex>>, + storage_path: &PathBuf, + run_git_command: FRunGit, + is_missing_worktree_error: FIsMissing, + remove_dir_all: FRemoveDirAll, +) -> Result<(), String> +where + FRunGit: Fn(&PathBuf, &[&str]) -> FutRunGit, + FutRunGit: Future>, + FIsMissing: Fn(&str) -> bool, + FRemoveDirAll: Fn(&PathBuf) -> Result<(), String>, +{ + let (entry, parent) = { + let workspaces = workspaces.lock().await; + let entry = workspaces + .get(&id) + .cloned() + .ok_or_else(|| "workspace not found".to_string())?; + if !entry.kind.is_worktree() { + return Err("Not a worktree workspace.".to_string()); + } + let parent_id = entry + .parent_id + .clone() + .ok_or_else(|| "worktree parent not found".to_string())?; + let parent = workspaces + .get(&parent_id) + .cloned() + .ok_or_else(|| "worktree parent not found".to_string())?; + (entry, parent) + }; + + let parent_path = PathBuf::from(&parent.path); + let entry_path = PathBuf::from(&entry.path); + kill_session_by_id(sessions, &entry.id).await; + + if entry_path.exists() { + if let Err(error) = run_git_command( + &parent_path, + &["worktree", "remove", "--force", &entry.path], + ) + .await + { + if is_missing_worktree_error(&error) { + if entry_path.exists() { + remove_dir_all(&entry_path)?; + } + } else { + return Err(error); + } + } + } + let _ = run_git_command(&parent_path, &["worktree", "prune", "--expire", "now"]).await; + + { + let mut workspaces = 
workspaces.lock().await; + workspaces.remove(&entry.id); + let list: Vec<_> = workspaces.values().cloned().collect(); + write_workspaces(storage_path, &list)?; + } + + Ok(()) +} + +pub(crate) async fn rename_worktree_core< + FSpawn, + FutSpawn, + FResolveGitRoot, + FUniqueBranch, + FutUniqueBranch, + FSanitize, + FUniqueRenamePath, + FRunGit, + FutRunGit, +>( + id: String, + branch: String, + data_dir: &PathBuf, + workspaces: &Mutex>, + sessions: &Mutex>>, + app_settings: &Mutex, + storage_path: &PathBuf, + resolve_git_root: FResolveGitRoot, + unique_branch_name: FUniqueBranch, + sanitize_worktree_name: FSanitize, + unique_worktree_path_for_rename: FUniqueRenamePath, + run_git_command: FRunGit, + spawn_session: FSpawn, +) -> Result +where + FSpawn: Fn(WorkspaceEntry, Option, Option, Option) -> FutSpawn, + FutSpawn: Future, String>>, + FResolveGitRoot: Fn(&WorkspaceEntry) -> Result, + FUniqueBranch: Fn(&PathBuf, &str) -> FutUniqueBranch, + FutUniqueBranch: Future>, + FSanitize: Fn(&str) -> String, + FUniqueRenamePath: Fn(&PathBuf, &str, &PathBuf) -> Result, + FRunGit: Fn(&PathBuf, &[&str]) -> FutRunGit, + FutRunGit: Future>, +{ + let trimmed = branch.trim(); + if trimmed.is_empty() { + return Err("Branch name is required.".to_string()); + } + + let (entry, parent) = { + let workspaces = workspaces.lock().await; + let entry = workspaces + .get(&id) + .cloned() + .ok_or_else(|| "workspace not found".to_string())?; + if !entry.kind.is_worktree() { + return Err("Not a worktree workspace.".to_string()); + } + let parent_id = entry + .parent_id + .clone() + .ok_or_else(|| "worktree parent not found".to_string())?; + let parent = workspaces + .get(&parent_id) + .cloned() + .ok_or_else(|| "worktree parent not found".to_string())?; + (entry, parent) + }; + + let old_branch = entry + .worktree + .as_ref() + .map(|worktree| worktree.branch.clone()) + .ok_or_else(|| "worktree metadata missing".to_string())?; + if old_branch == trimmed { + return Err("Branch name is unchanged.".to_string()); + } + + let parent_root = resolve_git_root(&parent)?; + let final_branch = unique_branch_name(&parent_root, trimmed).await?; + if final_branch == old_branch { + return Err("Branch name is unchanged.".to_string()); + } + + run_git_command(&parent_root, &["branch", "-m", &old_branch, &final_branch]).await?; + + let worktree_root = data_dir.join("worktrees").join(&parent.id); + std::fs::create_dir_all(&worktree_root) + .map_err(|err| format!("Failed to create worktree directory: {err}"))?; + + let safe_name = sanitize_worktree_name(&final_branch); + let current_path = PathBuf::from(&entry.path); + let next_path = unique_worktree_path_for_rename(&worktree_root, &safe_name, ¤t_path)?; + let next_path_string = next_path.to_string_lossy().to_string(); + if next_path_string != entry.path { + if let Err(error) = run_git_command( + &parent_root, + &["worktree", "move", &entry.path, &next_path_string], + ) + .await + { + let _ = + run_git_command(&parent_root, &["branch", "-m", &final_branch, &old_branch]).await; + return Err(error); + } + } + + let (entry_snapshot, list) = { + let mut workspaces = workspaces.lock().await; + let entry = match workspaces.get_mut(&id) { + Some(entry) => entry, + None => return Err("workspace not found".to_string()), + }; + if entry.name.trim() == old_branch { + entry.name = final_branch.clone(); + } + entry.path = next_path_string.clone(); + match entry.worktree.as_mut() { + Some(worktree) => { + worktree.branch = final_branch.clone(); + } + None => { + entry.worktree = Some(WorktreeInfo { + 
branch: final_branch.clone(), + }); + } + } + let snapshot = entry.clone(); + let list: Vec<_> = workspaces.values().cloned().collect(); + (snapshot, list) + }; + write_workspaces(storage_path, &list)?; + + let was_connected = sessions.lock().await.contains_key(&entry_snapshot.id); + if was_connected { + kill_session_by_id(sessions, &entry_snapshot.id).await; + let (default_bin, codex_args) = { + let settings = app_settings.lock().await; + ( + settings.codex_bin.clone(), + resolve_workspace_codex_args(&entry_snapshot, Some(&parent), Some(&settings)), + ) + }; + let codex_home = resolve_workspace_codex_home(&entry_snapshot, Some(&parent)); + match spawn_session(entry_snapshot.clone(), default_bin, codex_args, codex_home).await { + Ok(session) => { + sessions + .lock() + .await + .insert(entry_snapshot.id.clone(), session); + } + Err(error) => { + eprintln!( + "rename_worktree: respawn failed for {} after rename: {error}", + entry_snapshot.id + ); + } + } + } + + let connected = sessions.lock().await.contains_key(&entry_snapshot.id); + Ok(WorkspaceInfo { + id: entry_snapshot.id, + name: entry_snapshot.name, + path: entry_snapshot.path, + codex_bin: entry_snapshot.codex_bin, + connected, + kind: entry_snapshot.kind, + parent_id: entry_snapshot.parent_id, + worktree: entry_snapshot.worktree, + settings: entry_snapshot.settings, + }) +} + +pub(crate) async fn rename_worktree_upstream_core< + FResolveGitRoot, + FBranchExists, + FutBranchExists, + FFindRemote, + FutFindRemote, + FRemoteExists, + FutRemoteExists, + FRemoteBranchExists, + FutRemoteBranchExists, + FRunGit, + FutRunGit, +>( + id: String, + old_branch: String, + new_branch: String, + workspaces: &Mutex>, + resolve_git_root: FResolveGitRoot, + git_branch_exists: FBranchExists, + git_find_remote_for_branch: FFindRemote, + git_remote_exists: FRemoteExists, + git_remote_branch_exists: FRemoteBranchExists, + run_git_command: FRunGit, +) -> Result<(), String> +where + FResolveGitRoot: Fn(&WorkspaceEntry) -> Result, + FBranchExists: Fn(&PathBuf, &str) -> FutBranchExists, + FutBranchExists: Future>, + FFindRemote: Fn(&PathBuf, &str) -> FutFindRemote, + FutFindRemote: Future, String>>, + FRemoteExists: Fn(&PathBuf, &str) -> FutRemoteExists, + FutRemoteExists: Future>, + FRemoteBranchExists: Fn(&PathBuf, &str, &str) -> FutRemoteBranchExists, + FutRemoteBranchExists: Future>, + FRunGit: Fn(&PathBuf, &[&str]) -> FutRunGit, + FutRunGit: Future>, +{ + let old_branch = old_branch.trim().to_string(); + let new_branch = new_branch.trim().to_string(); + if old_branch.is_empty() || new_branch.is_empty() { + return Err("Branch name is required.".to_string()); + } + if old_branch == new_branch { + return Err("Branch name is unchanged.".to_string()); + } + + let (_entry, parent) = { + let workspaces = workspaces.lock().await; + let entry = workspaces + .get(&id) + .cloned() + .ok_or_else(|| "workspace not found".to_string())?; + if !entry.kind.is_worktree() { + return Err("Not a worktree workspace.".to_string()); + } + let parent_id = entry + .parent_id + .clone() + .ok_or_else(|| "worktree parent not found".to_string())?; + let parent = workspaces + .get(&parent_id) + .cloned() + .ok_or_else(|| "worktree parent not found".to_string())?; + (entry, parent) + }; + + let parent_root = resolve_git_root(&parent)?; + if !git_branch_exists(&parent_root, &new_branch).await? 
{ + return Err("Local branch not found.".to_string()); + } + + let remote_for_old = git_find_remote_for_branch(&parent_root, &old_branch).await?; + let remote_name = match remote_for_old.as_ref() { + Some(remote) => remote.clone(), + None => { + if git_remote_exists(&parent_root, "origin").await? { + "origin".to_string() + } else { + return Err("No git remote configured for this worktree.".to_string()); + } + } + }; + + if git_remote_branch_exists(&parent_root, &remote_name, &new_branch).await? { + return Err("Remote branch already exists.".to_string()); + } + + if remote_for_old.is_some() { + run_git_command( + &parent_root, + &["push", &remote_name, &format!("{new_branch}:{new_branch}")], + ) + .await?; + run_git_command( + &parent_root, + &["push", &remote_name, &format!(":{old_branch}")], + ) + .await?; + } else { + run_git_command(&parent_root, &["push", &remote_name, &new_branch]).await?; + } + + run_git_command( + &parent_root, + &[ + "branch", + "--set-upstream-to", + &format!("{remote_name}/{new_branch}"), + &new_branch, + ], + ) + .await?; + + Ok(()) +} diff --git a/src-tauri/src/shared/worktree_core.rs b/src-tauri/src/shared/worktree_core.rs index 0de075c12..ad0b7b6c8 100644 --- a/src-tauri/src/shared/worktree_core.rs +++ b/src-tauri/src/shared/worktree_core.rs @@ -1,5 +1,3 @@ -#![allow(dead_code)] - use std::path::PathBuf; fn sanitize_name(value: &str, fallback: &str) -> String { @@ -42,6 +40,8 @@ pub(crate) fn unique_worktree_path_best_effort(base_dir: &PathBuf, name: &str) - candidate } +// Used by daemon-only worktree creation paths. +#[allow(dead_code)] pub(crate) fn unique_worktree_path_strict( base_dir: &PathBuf, name: &str, diff --git a/src-tauri/src/utils.rs b/src-tauri/src/utils.rs index 649b33a03..6c9ff7fa7 100644 --- a/src-tauri/src/utils.rs +++ b/src-tauri/src/utils.rs @@ -2,7 +2,6 @@ use std::env; use std::ffi::OsString; use std::path::PathBuf; -#[allow(dead_code)] pub(crate) fn normalize_git_path(path: &str) -> String { path.replace('\\', "/") } diff --git a/src-tauri/src/workspaces/git.rs b/src-tauri/src/workspaces/git.rs index 0bbe30735..cc1671c51 100644 --- a/src-tauri/src/workspaces/git.rs +++ b/src-tauri/src/workspaces/git.rs @@ -29,11 +29,6 @@ pub(crate) async fn git_remote_branch_exists( git_core::git_remote_branch_exists_live(repo_path, remote, branch).await } -#[allow(dead_code)] -pub(crate) async fn git_list_remotes(repo_path: &PathBuf) -> Result, String> { - git_core::git_list_remotes(repo_path).await -} - pub(crate) async fn git_find_remote_for_branch( repo_path: &PathBuf, branch: &str, diff --git a/src-tauri/src/workspaces/worktree.rs b/src-tauri/src/workspaces/worktree.rs index afe8d26a4..4e5433f39 100644 --- a/src-tauri/src/workspaces/worktree.rs +++ b/src-tauri/src/workspaces/worktree.rs @@ -6,7 +6,7 @@ pub(crate) fn sanitize_worktree_name(branch: &str) -> String { worktree_core::sanitize_worktree_name(branch) } -#[allow(dead_code)] +#[cfg(test)] pub(crate) fn sanitize_clone_dir_name(name: &str) -> String { worktree_core::sanitize_clone_dir_name(name) } diff --git a/src/App.tsx b/src/App.tsx index e4ea3d6a7..388af4c95 100644 --- a/src/App.tsx +++ b/src/App.tsx @@ -1,4 +1,4 @@ -import { lazy, Suspense, useCallback, useEffect, useLayoutEffect, useMemo, useRef, useState } from "react"; +import { lazy, Suspense, useCallback, useEffect, useMemo, useRef, useState } from "react"; import "./styles/base.css"; import "./styles/ds-tokens.css"; import "./styles/ds-modal.css"; @@ -31,126 +31,124 @@ import "./styles/tabbar.css"; import 
"./styles/worktree-modal.css"; import "./styles/clone-modal.css"; import "./styles/branch-switcher-modal.css"; +import "./styles/git-init-modal.css"; import "./styles/settings.css"; import "./styles/compact-base.css"; import "./styles/compact-phone.css"; import "./styles/compact-tablet.css"; -import successSoundUrl from "./assets/success-notification.mp3"; -import errorSoundUrl from "./assets/error-notification.mp3"; -import { AppLayout } from "./features/app/components/AppLayout"; -import { AppModals } from "./features/app/components/AppModals"; -import { MainHeaderActions } from "./features/app/components/MainHeaderActions"; -import { useLayoutNodes } from "./features/layout/hooks/useLayoutNodes"; -import { useWorkspaceDropZone } from "./features/workspaces/hooks/useWorkspaceDropZone"; -import { useThreads } from "./features/threads/hooks/useThreads"; -import { useWindowDrag } from "./features/layout/hooks/useWindowDrag"; -import { useGitPanelController } from "./features/app/hooks/useGitPanelController"; -import { useGitRemote } from "./features/git/hooks/useGitRemote"; -import { useGitRepoScan } from "./features/git/hooks/useGitRepoScan"; -import { usePullRequestComposer } from "./features/git/hooks/usePullRequestComposer"; -import { useGitActions } from "./features/git/hooks/useGitActions"; -import { useAutoExitEmptyDiff } from "./features/git/hooks/useAutoExitEmptyDiff"; -import { useModels } from "./features/models/hooks/useModels"; -import { useCollaborationModes } from "./features/collaboration/hooks/useCollaborationModes"; -import { useCollaborationModeSelection } from "./features/collaboration/hooks/useCollaborationModeSelection"; -import { useSkills } from "./features/skills/hooks/useSkills"; -import { useApps } from "./features/apps/hooks/useApps"; -import { useCustomPrompts } from "./features/prompts/hooks/useCustomPrompts"; -import { useWorkspaceFileListing } from "./features/app/hooks/useWorkspaceFileListing"; -import { useGitBranches } from "./features/git/hooks/useGitBranches"; -import { useBranchSwitcher } from "./features/git/hooks/useBranchSwitcher"; -import { useBranchSwitcherShortcut } from "./features/git/hooks/useBranchSwitcherShortcut"; -import { useDebugLog } from "./features/debug/hooks/useDebugLog"; -import { useWorkspaceRefreshOnFocus } from "./features/workspaces/hooks/useWorkspaceRefreshOnFocus"; -import { useWorkspaceRestore } from "./features/workspaces/hooks/useWorkspaceRestore"; -import { useRenameWorktreePrompt } from "./features/workspaces/hooks/useRenameWorktreePrompt"; -import { useLayoutController } from "./features/app/hooks/useLayoutController"; -import { useWindowLabel } from "./features/layout/hooks/useWindowLabel"; +import successSoundUrl from "@/assets/success-notification.mp3"; +import errorSoundUrl from "@/assets/error-notification.mp3"; +import { AppLayout } from "@app/components/AppLayout"; +import { AppModals } from "@app/components/AppModals"; +import { MainHeaderActions } from "@app/components/MainHeaderActions"; +import { useLayoutNodes } from "@/features/layout/hooks/useLayoutNodes"; +import { useWorkspaceDropZone } from "@/features/workspaces/hooks/useWorkspaceDropZone"; +import { useThreads } from "@threads/hooks/useThreads"; +import { useWindowDrag } from "@/features/layout/hooks/useWindowDrag"; +import { useGitPanelController } from "@app/hooks/useGitPanelController"; +import { useGitRemote } from "@/features/git/hooks/useGitRemote"; +import { useGitRepoScan } from "@/features/git/hooks/useGitRepoScan"; +import { 
usePullRequestComposer } from "@/features/git/hooks/usePullRequestComposer"; +import { useGitActions } from "@/features/git/hooks/useGitActions"; +import { useAutoExitEmptyDiff } from "@/features/git/hooks/useAutoExitEmptyDiff"; +import { isMissingRepo } from "@/features/git/utils/repoErrors"; +import { useInitGitRepoPrompt } from "@/features/git/hooks/useInitGitRepoPrompt"; +import { useModels } from "@/features/models/hooks/useModels"; +import { useCollaborationModes } from "@/features/collaboration/hooks/useCollaborationModes"; +import { useCollaborationModeSelection } from "@/features/collaboration/hooks/useCollaborationModeSelection"; +import { useSkills } from "@/features/skills/hooks/useSkills"; +import { useApps } from "@/features/apps/hooks/useApps"; +import { useCustomPrompts } from "@/features/prompts/hooks/useCustomPrompts"; +import { useWorkspaceFileListing } from "@app/hooks/useWorkspaceFileListing"; +import { useGitBranches } from "@/features/git/hooks/useGitBranches"; +import { useBranchSwitcher } from "@/features/git/hooks/useBranchSwitcher"; +import { useBranchSwitcherShortcut } from "@/features/git/hooks/useBranchSwitcherShortcut"; +import { useWorkspaceRefreshOnFocus } from "@/features/workspaces/hooks/useWorkspaceRefreshOnFocus"; +import { useWorkspaceRestore } from "@/features/workspaces/hooks/useWorkspaceRestore"; +import { useRenameWorktreePrompt } from "@/features/workspaces/hooks/useRenameWorktreePrompt"; +import { useLayoutController } from "@app/hooks/useLayoutController"; +import { useWindowLabel } from "@/features/layout/hooks/useWindowLabel"; import { revealItemInDir } from "@tauri-apps/plugin-opener"; import { SidebarCollapseButton, TitlebarExpandControls, -} from "./features/layout/components/SidebarToggleControls"; -import { useAppSettingsController } from "./features/app/hooks/useAppSettingsController"; -import { useUpdaterController } from "./features/app/hooks/useUpdaterController"; -import { useResponseRequiredNotificationsController } from "./features/app/hooks/useResponseRequiredNotificationsController"; -import { useErrorToasts } from "./features/notifications/hooks/useErrorToasts"; -import { useComposerShortcuts } from "./features/composer/hooks/useComposerShortcuts"; -import { useComposerMenuActions } from "./features/composer/hooks/useComposerMenuActions"; -import { useComposerEditorState } from "./features/composer/hooks/useComposerEditorState"; -import { useDictationController } from "./features/app/hooks/useDictationController"; -import { useComposerController } from "./features/app/hooks/useComposerController"; -import { useComposerInsert } from "./features/app/hooks/useComposerInsert"; -import { useRenameThreadPrompt } from "./features/threads/hooks/useRenameThreadPrompt"; -import { useWorktreePrompt } from "./features/workspaces/hooks/useWorktreePrompt"; -import { useClonePrompt } from "./features/workspaces/hooks/useClonePrompt"; -import { useWorkspaceController } from "./features/app/hooks/useWorkspaceController"; -import { useWorkspaceSelection } from "./features/workspaces/hooks/useWorkspaceSelection"; -import { useLocalUsage } from "./features/home/hooks/useLocalUsage"; -import { useGitHubPanelController } from "./features/app/hooks/useGitHubPanelController"; -import { useSettingsModalState } from "./features/app/hooks/useSettingsModalState"; -import { useSyncSelectedDiffPath } from "./features/app/hooks/useSyncSelectedDiffPath"; -import { useMenuAcceleratorController } from "./features/app/hooks/useMenuAcceleratorController"; -import 
{ useAppMenuEvents } from "./features/app/hooks/useAppMenuEvents"; -import { usePlanReadyActions } from "./features/app/hooks/usePlanReadyActions"; -import { useWorkspaceActions } from "./features/app/hooks/useWorkspaceActions"; -import { useWorkspaceCycling } from "./features/app/hooks/useWorkspaceCycling"; -import { useThreadRows } from "./features/app/hooks/useThreadRows"; -import { useInterruptShortcut } from "./features/app/hooks/useInterruptShortcut"; -import { useArchiveShortcut } from "./features/app/hooks/useArchiveShortcut"; -import { useLiquidGlassEffect } from "./features/app/hooks/useLiquidGlassEffect"; -import { useCopyThread } from "./features/threads/hooks/useCopyThread"; -import { useTerminalController } from "./features/terminal/hooks/useTerminalController"; -import { useWorkspaceLaunchScript } from "./features/app/hooks/useWorkspaceLaunchScript"; -import { useWorkspaceLaunchScripts } from "./features/app/hooks/useWorkspaceLaunchScripts"; -import { useWorktreeSetupScript } from "./features/app/hooks/useWorktreeSetupScript"; -import { useGitCommitController } from "./features/app/hooks/useGitCommitController"; -import { WorkspaceHome } from "./features/workspaces/components/WorkspaceHome"; -import { MobileServerSetupWizard } from "./features/mobile/components/MobileServerSetupWizard"; -import { useMobileServerSetup } from "./features/mobile/hooks/useMobileServerSetup"; -import { useWorkspaceHome } from "./features/workspaces/hooks/useWorkspaceHome"; -import { useWorkspaceAgentMd } from "./features/workspaces/hooks/useWorkspaceAgentMd"; -import { useThreadCodexParams } from "./features/threads/hooks/useThreadCodexParams"; -import { makeThreadCodexParamsKey } from "./features/threads/utils/threadStorage"; -import { - buildThreadCodexSeedPatch, - createPendingThreadSeed, - resolveThreadCodexState, - type PendingNewThreadSeed, -} from "./features/threads/utils/threadCodexParamsSeed"; -import { isMobilePlatform } from "./utils/platformPaths"; +} from "@/features/layout/components/SidebarToggleControls"; +import { useUpdaterController } from "@app/hooks/useUpdaterController"; +import { useResponseRequiredNotificationsController } from "@app/hooks/useResponseRequiredNotificationsController"; +import { useErrorToasts } from "@/features/notifications/hooks/useErrorToasts"; +import { useComposerShortcuts } from "@/features/composer/hooks/useComposerShortcuts"; +import { useComposerMenuActions } from "@/features/composer/hooks/useComposerMenuActions"; +import { useComposerEditorState } from "@/features/composer/hooks/useComposerEditorState"; +import { useComposerController } from "@app/hooks/useComposerController"; +import { useComposerInsert } from "@app/hooks/useComposerInsert"; +import { useRenameThreadPrompt } from "@threads/hooks/useRenameThreadPrompt"; +import { useWorktreePrompt } from "@/features/workspaces/hooks/useWorktreePrompt"; +import { useClonePrompt } from "@/features/workspaces/hooks/useClonePrompt"; +import { useWorkspaceController } from "@app/hooks/useWorkspaceController"; +import { useWorkspaceSelection } from "@/features/workspaces/hooks/useWorkspaceSelection"; +import { useGitHubPanelController } from "@app/hooks/useGitHubPanelController"; +import { useSettingsModalState } from "@app/hooks/useSettingsModalState"; +import { useSyncSelectedDiffPath } from "@app/hooks/useSyncSelectedDiffPath"; +import { useMenuAcceleratorController } from "@app/hooks/useMenuAcceleratorController"; +import { useAppMenuEvents } from "@app/hooks/useAppMenuEvents"; +import { 
usePlanReadyActions } from "@app/hooks/usePlanReadyActions"; +import { useWorkspaceActions } from "@app/hooks/useWorkspaceActions"; +import { useWorkspaceCycling } from "@app/hooks/useWorkspaceCycling"; +import { useThreadRows } from "@app/hooks/useThreadRows"; +import { useInterruptShortcut } from "@app/hooks/useInterruptShortcut"; +import { useArchiveShortcut } from "@app/hooks/useArchiveShortcut"; +import { useCopyThread } from "@threads/hooks/useCopyThread"; +import { useTerminalController } from "@/features/terminal/hooks/useTerminalController"; +import { useWorkspaceLaunchScript } from "@app/hooks/useWorkspaceLaunchScript"; +import { useWorkspaceLaunchScripts } from "@app/hooks/useWorkspaceLaunchScripts"; +import { useWorktreeSetupScript } from "@app/hooks/useWorktreeSetupScript"; +import { useGitCommitController } from "@app/hooks/useGitCommitController"; +import { WorkspaceHome } from "@/features/workspaces/components/WorkspaceHome"; +import { MobileServerSetupWizard } from "@/features/mobile/components/MobileServerSetupWizard"; +import { useMobileServerSetup } from "@/features/mobile/hooks/useMobileServerSetup"; +import { useWorkspaceHome } from "@/features/workspaces/hooks/useWorkspaceHome"; +import { useWorkspaceAgentMd } from "@/features/workspaces/hooks/useWorkspaceAgentMd"; import type { - AccessMode, - AppMention, ComposerEditorSettings, WorkspaceInfo, -} from "./types"; -import { OPEN_APP_STORAGE_KEY } from "./features/app/constants"; -import { useOpenAppIcons } from "./features/app/hooks/useOpenAppIcons"; -import { useCodeCssVars } from "./features/app/hooks/useCodeCssVars"; -import { useAccountSwitching } from "./features/app/hooks/useAccountSwitching"; -import { useNewAgentDraft } from "./features/app/hooks/useNewAgentDraft"; -import { useSystemNotificationThreadLinks } from "./features/app/hooks/useSystemNotificationThreadLinks"; -import { useThreadListSortKey } from "./features/app/hooks/useThreadListSortKey"; -import { useThreadListActions } from "./features/app/hooks/useThreadListActions"; -import { useGitRootSelection } from "./features/app/hooks/useGitRootSelection"; -import { useTabActivationGuard } from "./features/app/hooks/useTabActivationGuard"; -import { useRemoteThreadRefreshOnFocus } from "./features/app/hooks/useRemoteThreadRefreshOnFocus"; +} from "@/types"; +import { OPEN_APP_STORAGE_KEY } from "@app/constants"; +import { useOpenAppIcons } from "@app/hooks/useOpenAppIcons"; +import { useAccountSwitching } from "@app/hooks/useAccountSwitching"; +import { useNewAgentDraft } from "@app/hooks/useNewAgentDraft"; +import { useSystemNotificationThreadLinks } from "@app/hooks/useSystemNotificationThreadLinks"; +import { useThreadListSortKey } from "@app/hooks/useThreadListSortKey"; +import { useThreadListActions } from "@app/hooks/useThreadListActions"; +import { useGitRootSelection } from "@app/hooks/useGitRootSelection"; +import { useTabActivationGuard } from "@app/hooks/useTabActivationGuard"; +import { useRemoteThreadRefreshOnFocus } from "@app/hooks/useRemoteThreadRefreshOnFocus"; +import { useAppBootstrapOrchestration } from "@app/bootstrap/useAppBootstrapOrchestration"; +import { + useThreadCodexBootstrapOrchestration, + useThreadCodexSyncOrchestration, + useThreadSelectionHandlersOrchestration, + useThreadUiOrchestration, +} from "@app/orchestration/useThreadOrchestration"; +import { + useWorkspaceInsightsOrchestration, + useWorkspaceOrderingOrchestration, +} from "@app/orchestration/useWorkspaceOrchestration"; +import { useAppShellOrchestration } from 
"@app/orchestration/useLayoutOrchestration"; const AboutView = lazy(() => - import("./features/about/components/AboutView").then((module) => ({ + import("@/features/about/components/AboutView").then((module) => ({ default: module.AboutView, })), ); const SettingsView = lazy(() => - import("./features/settings/components/SettingsView").then((module) => ({ + import("@settings/components/SettingsView").then((module) => ({ default: module.SettingsView, })), ); const GitHubPanelData = lazy(() => - import("./features/git/components/GitHubPanelData").then((module) => ({ + import("@/features/git/components/GitHubPanelData").then((module) => ({ default: module.GitHubPanelData, })), ); @@ -167,9 +165,6 @@ function MainApp() { scaleShortcutTitle, scaleShortcutText, queueSaveSettings, - } = useAppSettingsController(); - useCodeCssVars(appSettings); - const { dictationModel, dictationState, dictationLevel, @@ -181,8 +176,6 @@ function MainApp() { clearDictationTranscript, clearDictationError, clearDictationHint, - } = useDictationController(appSettings); - const { debugOpen, setDebugOpen, debugEntries, @@ -190,20 +183,8 @@ function MainApp() { addDebugEntry, handleCopyDebug, clearDebugEntries, - } = useDebugLog(); - const shouldReduceTransparency = reduceTransparency || isMobilePlatform(); - useLiquidGlassEffect({ reduceTransparency: shouldReduceTransparency, onDebug: addDebugEntry }); - const { version: threadCodexParamsVersion, getThreadCodexParams, patchThreadCodexParams } = - useThreadCodexParams(); - const [accessMode, setAccessMode] = useState("current"); - const [preferredModelId, setPreferredModelId] = useState(null); - const [preferredEffort, setPreferredEffort] = useState(null); - const [preferredCollabModeId, setPreferredCollabModeId] = useState( - null, - ); - const [threadCodexSelectionKey, setThreadCodexSelectionKey] = useState( - null, - ); + shouldReduceTransparency, + } = useAppBootstrapOrchestration(); const { threadListSortKey, setThreadListSortKey } = useThreadListSortKey(); const [activeTab, setActiveTab] = useState< "home" | "projects" | "codex" | "git" | "log" @@ -262,14 +243,26 @@ function MainApp() { () => new Map(workspaces.map((workspace) => [workspace.id, workspace])), [workspaces], ); - const activeWorkspaceIdForParamsRef = useRef(activeWorkspaceId ?? null); - useEffect(() => { - activeWorkspaceIdForParamsRef.current = activeWorkspaceId ?? null; - }, [activeWorkspaceId]); - const activeThreadIdRef = useRef(null); - // When sending the first message from the "no thread" composer, snapshot the - // collaboration mode so it can be applied to the created thread. 
- const pendingNewThreadSeedRef = useRef(null); + const { + threadCodexParamsVersion, + getThreadCodexParams, + patchThreadCodexParams, + accessMode, + setAccessMode, + preferredModelId, + setPreferredModelId, + preferredEffort, + setPreferredEffort, + preferredCollabModeId, + setPreferredCollabModeId, + threadCodexSelectionKey, + setThreadCodexSelectionKey, + activeThreadIdRef, + pendingNewThreadSeedRef, + persistThreadCodexParams, + } = useThreadCodexBootstrapOrchestration({ + activeWorkspaceId, + }); const { sidebarWidth, rightPanelWidth, @@ -439,7 +432,7 @@ function MainApp() { useEffect(() => { resetGitHubPanelState(); }, [activeWorkspaceId, resetGitHubPanelState]); - const { remote: gitRemoteUrl } = useGitRemote(activeWorkspace); + const { remote: gitRemoteUrl, refresh: refreshGitRemote } = useGitRemote(activeWorkspace); const { repos: gitRootCandidates, isLoading: gitRootScanLoading, @@ -480,96 +473,22 @@ function MainApp() { onDebug: addDebugEntry, }); - const persistThreadCodexParams = useCallback( - (patch: { - modelId?: string | null; - effort?: string | null; - accessMode?: AccessMode | null; - collaborationModeId?: string | null; - }) => { - const workspaceId = activeWorkspaceIdForParamsRef.current; - const threadId = activeThreadIdRef.current; - if (!workspaceId || !threadId) { - return; - } - patchThreadCodexParams(workspaceId, threadId, patch); - }, - [patchThreadCodexParams], - ); - - const handleSelectModel = useCallback( - (id: string | null) => { - setSelectedModelId(id); - const hasActiveThread = Boolean(activeThreadIdRef.current); - if (!appSettingsLoading) { - // Picking a model inside a thread should not overwrite the global defaults - // configured in Settings; it should remain thread-scoped. - if (!hasActiveThread) { - setAppSettings((current) => { - if (current.lastComposerModelId === id) { - return current; - } - const nextSettings = { ...current, lastComposerModelId: id }; - void queueSaveSettings(nextSettings); - return nextSettings; - }); - } - } - persistThreadCodexParams({ modelId: id }); - }, - [ - appSettingsLoading, - persistThreadCodexParams, - queueSaveSettings, - setAppSettings, - setSelectedModelId, - ], - ); - - const handleSelectEffort = useCallback( - (raw: string | null) => { - const next = typeof raw === "string" && raw.trim().length > 0 ? raw.trim() : null; - setSelectedEffort(next); - const hasActiveThread = Boolean(activeThreadIdRef.current); - if (!appSettingsLoading) { - // Keep per-thread overrides from mutating the global defaults. 
- if (!hasActiveThread) { - setAppSettings((current) => { - if (current.lastComposerReasoningEffort === next) { - return current; - } - const nextSettings = { ...current, lastComposerReasoningEffort: next }; - void queueSaveSettings(nextSettings); - return nextSettings; - }); - } - } - persistThreadCodexParams({ effort: next }); - }, - [ - appSettingsLoading, - persistThreadCodexParams, - queueSaveSettings, - setAppSettings, - setSelectedEffort, - ], - ); - - const handleSelectCollaborationMode = useCallback( - (id: string | null) => { - setSelectedCollaborationModeId(id); - persistThreadCodexParams({ collaborationModeId: id }); - }, - [persistThreadCodexParams, setSelectedCollaborationModeId], - ); - - const handleSelectAccessMode = useCallback( - (mode: AccessMode) => { - setAccessMode(mode); - persistThreadCodexParams({ accessMode: mode }); - }, - [persistThreadCodexParams], - ); + const { + handleSelectModel, + handleSelectEffort, + handleSelectCollaborationMode, + handleSelectAccessMode, + } = useThreadSelectionHandlersOrchestration({ + appSettingsLoading, + setAppSettings, + queueSaveSettings, + activeThreadIdRef, + setSelectedModelId, + setSelectedEffort, + setSelectedCollaborationModeId, + setAccessMode, + persistThreadCodexParams, + }); const composerShortcuts = { modelShortcut: appSettings.composerModelShortcut, @@ -662,6 +581,10 @@ function MainApp() { }, []); const { applyWorktreeChanges: handleApplyWorktreeChanges, + createGitHubRepo: handleCreateGitHubRepo, + createGitHubRepoLoading, + initGitRepo: handleInitGitRepo, + initGitRepoLoading, revertAllGitChanges: handleRevertAllGitChanges, revertGitFile: handleRevertGitFile, stageGitAll: handleStageGitAll, @@ -674,9 +597,27 @@ function MainApp() { activeWorkspace, onRefreshGitStatus: refreshGitStatus, onRefreshGitDiffs: refreshGitDiffs, + onClearGitRootCandidates: clearGitRootCandidates, onError: alertError, }); + const { + initGitRepoPrompt, + openInitGitRepoPrompt, + handleInitGitRepoPromptBranchChange, + handleInitGitRepoPromptCreateRemoteChange, + handleInitGitRepoPromptRepoNameChange, + handleInitGitRepoPromptPrivateChange, + handleInitGitRepoPromptCancel, + handleInitGitRepoPromptConfirm, + } = useInitGitRepoPrompt({ + activeWorkspace, + initGitRepo: handleInitGitRepo, + createGitHubRepo: handleCreateGitHubRepo, + refreshGitRemote, + isBusy: initGitRepoLoading || createGitHubRepoLoading, + }); + const resolvedModel = selectedModel?.model ?? null; const resolvedEffort = reasoningSupported ? selectedEffort : null; const { activeGitRoot, handleSetGitRoot, handlePickGitRoot } = useGitRootSelection({ @@ -823,91 +764,29 @@ function MainApp() { onDebug: addDebugEntry, }); - useLayoutEffect(() => { - const workspaceId = activeWorkspaceId ?? null; - const threadId = activeThreadId ?? null; - activeThreadIdRef.current = threadId; - - if (!workspaceId) { - return; - } - - const stored = threadId ? 
getThreadCodexParams(workspaceId, threadId) : null; - const resolved = resolveThreadCodexState({ - workspaceId, - threadId, + useThreadCodexSyncOrchestration({ + activeWorkspaceId, + activeThreadId, + appSettings: { defaultAccessMode: appSettings.defaultAccessMode, lastComposerModelId: appSettings.lastComposerModelId, lastComposerReasoningEffort: appSettings.lastComposerReasoningEffort, - stored, - pendingSeed: pendingNewThreadSeedRef.current, - }); - - setThreadCodexSelectionKey(resolved.scopeKey); - setAccessMode(resolved.accessMode); - setPreferredModelId(resolved.preferredModelId); - setPreferredEffort(resolved.preferredEffort); - setPreferredCollabModeId(resolved.preferredCollabModeId); - }, [ - activeThreadId, - activeWorkspaceId, - appSettings.defaultAccessMode, - appSettings.lastComposerModelId, - appSettings.lastComposerReasoningEffort, - getThreadCodexParams, - setPreferredCollabModeId, - setPreferredEffort, - setPreferredModelId, - setThreadCodexSelectionKey, + }, threadCodexParamsVersion, - ]); - - const seededThreadParamsRef = useRef(new Set()); - useEffect(() => { - const workspaceId = activeWorkspaceId ?? null; - const threadId = activeThreadId ?? null; - if (!workspaceId || !threadId) { - return; - } - - const key = makeThreadCodexParamsKey(workspaceId, threadId); - if (seededThreadParamsRef.current.has(key)) { - return; - } - - const stored = getThreadCodexParams(workspaceId, threadId); - if (stored) { - seededThreadParamsRef.current.add(key); - return; - } - - seededThreadParamsRef.current.add(key); - const pendingSeed = pendingNewThreadSeedRef.current; - patchThreadCodexParams( - workspaceId, - threadId, - buildThreadCodexSeedPatch({ - workspaceId, - selectedModelId, - resolvedEffort, - accessMode, - selectedCollaborationModeId, - pendingSeed, - }), - ); - if (pendingSeed?.workspaceId === workspaceId) { - pendingNewThreadSeedRef.current = null; - } - }, [ - activeThreadId, - activeWorkspaceId, - accessMode, getThreadCodexParams, patchThreadCodexParams, + setThreadCodexSelectionKey, + setAccessMode, + setPreferredModelId, + setPreferredEffort, + setPreferredCollabModeId, + activeThreadIdRef, + pendingNewThreadSeedRef, + selectedModelId, resolvedEffort, + accessMode, selectedCollaborationModeId, - selectedModelId, - ]); + }); const { handleSetThreadListSortKey, handleRefreshAllWorkspaceThreads } = useThreadListActions({ @@ -1212,50 +1091,30 @@ function MainApp() { }, }); - const latestAgentRuns = useMemo(() => { - const entries: Array<{ - threadId: string; - message: string; - timestamp: number; - projectName: string; - groupName?: string | null; - workspaceId: string; - isProcessing: boolean; - }> = []; - workspaces.forEach((workspace) => { - const threads = threadsByWorkspace[workspace.id] ?? []; - threads.forEach((thread) => { - const entry = lastAgentMessageByThread[thread.id]; - if (!entry) { - return; - } - entries.push({ - threadId: thread.id, - message: entry.text, - timestamp: entry.timestamp, - projectName: workspace.name, - groupName: getWorkspaceGroupName(workspace.id), - workspaceId: workspace.id, - isProcessing: threadStatusById[thread.id]?.isProcessing ?? 
false - }); - }); - }); - return entries.sort((a, b) => b.timestamp - a.timestamp).slice(0, 3); - }, [ + const showHome = !activeWorkspace; + const { + latestAgentRuns, + isLoadingLatestAgents, + usageMetric, + setUsageMetric, + usageWorkspaceId, + setUsageWorkspaceId, + usageWorkspaceOptions, + localUsageSnapshot, + isLoadingLocalUsage, + localUsageError, + refreshLocalUsage, + } = useWorkspaceInsightsOrchestration({ + workspaces, + workspacesById, + hasLoaded, + showHome, + threadsByWorkspace, lastAgentMessageByThread, - getWorkspaceGroupName, threadStatusById, - threadsByWorkspace, - workspaces - ]); - const isLoadingLatestAgents = useMemo( - () => - !hasLoaded || - workspaces.some( - (workspace) => threadListLoadingByWorkspace[workspace.id] ?? false - ), - [hasLoaded, threadListLoadingByWorkspace, workspaces] - ); + threadListLoadingByWorkspace, + getWorkspaceGroupName, + }); const activeRateLimits = activeWorkspaceId ? rateLimitsByWorkspace[activeWorkspaceId] ?? null @@ -1269,7 +1128,6 @@ function MainApp() { const hasActivePlan = Boolean( activePlan && (activePlan.steps.length > 0 || activePlan.explanation) ); - const showHome = !activeWorkspace; const showWorkspaceHome = Boolean(activeWorkspace && !activeThreadId && !isNewAgentDraftMode); const showComposer = (!isCompact ? centerMode === "chat" || centerMode === "diff" @@ -1287,40 +1145,6 @@ function MainApp() { hasComposerSurface: showComposer || showWorkspaceHome, onDebug: addDebugEntry, }); - const [usageMetric, setUsageMetric] = useState<"tokens" | "time">("tokens"); - const [usageWorkspaceId, setUsageWorkspaceId] = useState(null); - const usageWorkspaceOptions = useMemo( - () => - workspaces.map((workspace) => { - const groupName = getWorkspaceGroupName(workspace.id); - const label = groupName - ? `${groupName} / ${workspace.name}` - : workspace.name; - return { id: workspace.id, label }; - }), - [getWorkspaceGroupName, workspaces], - ); - const usageWorkspacePath = useMemo(() => { - if (!usageWorkspaceId) { - return null; - } - return workspacesById.get(usageWorkspaceId)?.path ?? null; - }, [usageWorkspaceId, workspacesById]); - useEffect(() => { - if (!usageWorkspaceId) { - return; - } - if (workspaces.some((workspace) => workspace.id === usageWorkspaceId)) { - return; - } - setUsageWorkspaceId(null); - }, [usageWorkspaceId, workspaces]); - const { - snapshot: localUsageSnapshot, - isLoading: isLoadingLocalUsage, - error: localUsageError, - refresh: refreshLocalUsage, - } = useLocalUsage(showHome, usageWorkspacePath); const canInterrupt = activeThreadId ? threadStatusById[activeThreadId]?.isProcessing ?? false : false; @@ -1631,8 +1455,6 @@ function MainApp() { useTabActivationGuard({ activeTab, - activeWorkspace, - isPhone, isTablet, setActiveTab, }); @@ -1703,21 +1525,6 @@ function MainApp() { onDropPaths: handleDropWorkspacePaths, }); - const handleArchiveActiveThread = useCallback(() => { - if (!activeWorkspaceId || !activeThreadId) { - return; - } - removeThread(activeWorkspaceId, activeThreadId); - clearDraftForThread(activeThreadId); - removeImagesForThread(activeThreadId); - }, [ - activeThreadId, - activeWorkspaceId, - clearDraftForThread, - removeImagesForThread, - removeThread, - ]); - useInterruptShortcut({ isEnabled: canInterrupt, shortcut: appSettings.interruptShortcut, @@ -1755,94 +1562,32 @@ function MainApp() { queueMessage, }); - const rememberPendingNewThreadSeed = useCallback(() => { - pendingNewThreadSeedRef.current = createPendingThreadSeed({ - activeThreadId: activeThreadId ?? 
null, - activeWorkspaceId: activeWorkspaceId ?? null, - selectedCollaborationModeId, - accessMode, - }); - }, [accessMode, activeThreadId, activeWorkspaceId, selectedCollaborationModeId]); - - const handleComposerSendWithDraftStart = useCallback( - (text: string, images: string[], appMentions?: AppMention[]) => { - rememberPendingNewThreadSeed(); - return runWithDraftStart(() => ( - appMentions && appMentions.length > 0 - ? handleComposerSend(text, images, appMentions) - : handleComposerSend(text, images) - )); - }, - [handleComposerSend, rememberPendingNewThreadSeed, runWithDraftStart], - ); - const handleComposerQueueWithDraftStart = useCallback( - (text: string, images: string[], appMentions?: AppMention[]) => { - // Queueing without an active thread would no-op; bootstrap through send so user input is not lost. - const runner = activeThreadId - ? () => ( - appMentions && appMentions.length > 0 - ? handleComposerQueue(text, images, appMentions) - : handleComposerQueue(text, images) - ) - : () => ( - appMentions && appMentions.length > 0 - ? handleComposerSend(text, images, appMentions) - : handleComposerSend(text, images) - ); - if (!activeThreadId) { - rememberPendingNewThreadSeed(); - } - return runWithDraftStart(runner); - }, - [ - activeThreadId, - handleComposerQueue, - handleComposerSend, - rememberPendingNewThreadSeed, - runWithDraftStart, - ], - ); - - const handleSelectWorkspaceInstance = useCallback( - (workspaceId: string, threadId: string) => { - exitDiffView(); - resetPullRequestSelection(); - clearDraftState(); - selectWorkspace(workspaceId); - setActiveThreadId(threadId, workspaceId); - if (isCompact) { - setActiveTab("codex"); - } - }, - [ - clearDraftState, - exitDiffView, - isCompact, - resetPullRequestSelection, - selectWorkspace, - setActiveTab, - setActiveThreadId, - ], - ); - - const handleOpenThreadLink = useCallback( - (threadId: string) => { - if (!activeWorkspaceId) { - return; - } - exitDiffView(); - resetPullRequestSelection(); - clearDraftState(); - setActiveThreadId(threadId, activeWorkspaceId); - }, - [ - activeWorkspaceId, - clearDraftState, - exitDiffView, - resetPullRequestSelection, - setActiveThreadId, - ], - ); + const { + handleComposerSendWithDraftStart, + handleComposerQueueWithDraftStart, + handleSelectWorkspaceInstance, + handleOpenThreadLink, + handleArchiveActiveThread, + } = useThreadUiOrchestration({ + activeWorkspaceId, + activeThreadId, + accessMode, + selectedCollaborationModeId, + pendingNewThreadSeedRef, + runWithDraftStart, + handleComposerSend, + handleComposerQueue, + clearDraftState, + exitDiffView, + resetPullRequestSelection, + selectWorkspace, + setActiveThreadId, + setActiveTab, + isCompact, + removeThread, + clearDraftForThread, + removeImagesForThread, + }); const { handlePlanAccept, handlePlanSubmitChanges } = usePlanReadyActions({ activeWorkspace, @@ -1855,58 +1600,38 @@ function MainApp() { setSelectedCollaborationModeId, }); - const orderValue = (entry: WorkspaceInfo) => - typeof entry.settings.sortOrder === "number" - ? entry.settings.sortOrder - : Number.MAX_SAFE_INTEGER; - - const handleMoveWorkspace = async ( - workspaceId: string, - direction: "up" | "down" - ) => { - const target = workspacesById.get(workspaceId); - if (!target || (target.kind ?? "main") === "worktree") { - return; - } - const targetGroupId = target.settings.groupId ?? null; - const ordered = workspaces - .filter( - (entry) => - (entry.kind ?? "main") !== "worktree" && - (entry.settings.groupId ?? 
null) === targetGroupId, - ) - .slice() - .sort((a, b) => { - const orderDiff = orderValue(a) - orderValue(b); - if (orderDiff !== 0) { - return orderDiff; - } - return a.name.localeCompare(b.name); - }); - const index = ordered.findIndex((entry) => entry.id === workspaceId); - if (index === -1) { - return; - } - const nextIndex = direction === "up" ? index - 1 : index + 1; - if (nextIndex < 0 || nextIndex >= ordered.length) { - return; - } - const next = ordered.slice(); - const temp = next[index]; - next[index] = next[nextIndex]; - next[nextIndex] = temp; - await Promise.all( - next.map((entry, idx) => - updateWorkspaceSettings(entry.id, { - sortOrder: idx - }) - ) - ); - }; + const { handleMoveWorkspace } = useWorkspaceOrderingOrchestration({ + workspaces, + workspacesById, + updateWorkspaceSettings, + }); - const showGitDetail = - Boolean(selectedDiffPath) && isPhone && centerMode === "diff"; - const isThreadOpen = Boolean(activeThreadId && showComposer); + const { + showGitDetail, + isThreadOpen, + dropOverlayActive, + dropOverlayText, + appClassName, + appStyle, + } = useAppShellOrchestration({ + isCompact, + isPhone, + isTablet, + sidebarCollapsed, + rightPanelCollapsed, + shouldReduceTransparency, + isWorkspaceDropActive, + centerMode, + selectedDiffPath, + showComposer, + activeThreadId, + sidebarWidth, + rightPanelWidth, + planPanelHeight, + terminalPanelHeight, + debugPanelHeight, + appSettings, + }); useArchiveShortcut({ isEnabled: isThreadOpen, @@ -1957,15 +1682,6 @@ function MainApp() { }); useMenuAcceleratorController({ appSettings, onDebug: addDebugEntry }); - const dropOverlayActive = isWorkspaceDropActive; - const dropOverlayText = "Drop Project Here"; - const appClassName = `app ${isCompact ? "layout-compact" : "layout-desktop"}${ - isPhone ? " layout-phone" : "" - }${isTablet ? " layout-tablet" : ""}${ - shouldReduceTransparency ? " reduced-transparency" : "" - }${!isCompact && sidebarCollapsed ? " sidebar-collapsed" : ""}${ - !isCompact && rightPanelCollapsed ? " right-panel-collapsed" : "" - }`; const { sidebarNode, messagesNode, @@ -2252,6 +1968,8 @@ function MainApp() { void handleSetGitRoot(null); }, onPickGitRoot: handlePickGitRoot, + onInitGitRepo: openInitGitRepoPrompt, + initGitRepoLoading, onStageGitAll: handleStageGitAll, onStageGitFile: handleStageGitFile, onUnstageGitFile: handleUnstageGitFile, @@ -2415,9 +2133,18 @@ function MainApp() { onWorkspaceDrop: handleWorkspaceDrop, }); + const gitRootOverride = activeWorkspace?.settings.gitRoot; + const hasGitRootOverride = + typeof gitRootOverride === "string" && gitRootOverride.trim().length > 0; + const showGitInitBanner = + Boolean(activeWorkspace) && !hasGitRootOverride && isMissingRepo(gitStatus.error); + const workspaceHomeNode = activeWorkspace ? ( +
{shouldLoadGitHubPanelData ? ( @@ -2569,6 +2278,14 @@ function MainApp() { onRenamePromptChange={handleRenamePromptChange} onRenamePromptCancel={handleRenamePromptCancel} onRenamePromptConfirm={handleRenamePromptConfirm} + initGitRepoPrompt={initGitRepoPrompt} + initGitRepoPromptBusy={initGitRepoLoading || createGitHubRepoLoading} + onInitGitRepoPromptBranchChange={handleInitGitRepoPromptBranchChange} + onInitGitRepoPromptCreateRemoteChange={handleInitGitRepoPromptCreateRemoteChange} + onInitGitRepoPromptRepoNameChange={handleInitGitRepoPromptRepoNameChange} + onInitGitRepoPromptPrivateChange={handleInitGitRepoPromptPrivateChange} + onInitGitRepoPromptCancel={handleInitGitRepoPromptCancel} + onInitGitRepoPromptConfirm={handleInitGitRepoPromptConfirm} worktreePrompt={worktreePrompt} onWorktreePromptNameChange={updateWorktreeName} onWorktreePromptChange={updateWorktreeBranch} diff --git a/src/features/app/bootstrap/useAppBootstrap.ts b/src/features/app/bootstrap/useAppBootstrap.ts new file mode 100644 index 000000000..bda739e21 --- /dev/null +++ b/src/features/app/bootstrap/useAppBootstrap.ts @@ -0,0 +1,29 @@ +import { isMobilePlatform } from "@utils/platformPaths"; +import { useDebugLog } from "@/features/debug/hooks/useDebugLog"; +import { useAppSettingsController } from "@app/hooks/useAppSettingsController"; +import { useCodeCssVars } from "@app/hooks/useCodeCssVars"; +import { useDictationController } from "@app/hooks/useDictationController"; +import { useLiquidGlassEffect } from "@app/hooks/useLiquidGlassEffect"; + +export function useAppBootstrap() { + const appSettingsState = useAppSettingsController(); + useCodeCssVars(appSettingsState.appSettings); + + const dictationState = useDictationController(appSettingsState.appSettings); + const debugState = useDebugLog(); + + const shouldReduceTransparency = + appSettingsState.reduceTransparency || isMobilePlatform(); + + useLiquidGlassEffect({ + reduceTransparency: shouldReduceTransparency, + onDebug: debugState.addDebugEntry, + }); + + return { + ...appSettingsState, + ...dictationState, + ...debugState, + shouldReduceTransparency, + }; +} diff --git a/src/features/app/bootstrap/useAppBootstrapOrchestration.ts b/src/features/app/bootstrap/useAppBootstrapOrchestration.ts new file mode 100644 index 000000000..7ce996b06 --- /dev/null +++ b/src/features/app/bootstrap/useAppBootstrapOrchestration.ts @@ -0,0 +1 @@ +export { useAppBootstrap as useAppBootstrapOrchestration } from "./useAppBootstrap"; diff --git a/src/features/app/components/AppModals.tsx b/src/features/app/components/AppModals.tsx index e64abddf4..1301a5dfe 100644 --- a/src/features/app/components/AppModals.tsx +++ b/src/features/app/components/AppModals.tsx @@ -28,6 +28,11 @@ const BranchSwitcherPrompt = lazy(() => default: module.BranchSwitcherPrompt, })), ); +const InitGitRepoPrompt = lazy(() => + import("../../git/components/InitGitRepoPrompt").then((module) => ({ + default: module.InitGitRepoPrompt, + })), +); type RenamePromptState = ReturnType["renamePrompt"]; @@ -40,6 +45,21 @@ type AppModalsProps = { onRenamePromptChange: (value: string) => void; onRenamePromptCancel: () => void; onRenamePromptConfirm: () => void; + initGitRepoPrompt: { + workspaceName: string; + branch: string; + createRemote: boolean; + repoName: string; + isPrivate: boolean; + error: string | null; + } | null; + initGitRepoPromptBusy: boolean; + onInitGitRepoPromptBranchChange: (value: string) => void; + onInitGitRepoPromptCreateRemoteChange: (value: boolean) => void; + 
onInitGitRepoPromptRepoNameChange: (value: string) => void; + onInitGitRepoPromptPrivateChange: (value: boolean) => void; + onInitGitRepoPromptCancel: () => void; + onInitGitRepoPromptConfirm: () => void; worktreePrompt: WorktreePromptState; onWorktreePromptNameChange: (value: string) => void; onWorktreePromptChange: (value: string) => void; @@ -73,6 +93,14 @@ export const AppModals = memo(function AppModals({ onRenamePromptChange, onRenamePromptCancel, onRenamePromptConfirm, + initGitRepoPrompt, + initGitRepoPromptBusy, + onInitGitRepoPromptBranchChange, + onInitGitRepoPromptCreateRemoteChange, + onInitGitRepoPromptRepoNameChange, + onInitGitRepoPromptPrivateChange, + onInitGitRepoPromptCancel, + onInitGitRepoPromptConfirm, worktreePrompt, onWorktreePromptNameChange, onWorktreePromptChange, @@ -117,6 +145,25 @@ export const AppModals = memo(function AppModals({ /> )} + {initGitRepoPrompt && ( + + + + )} {worktreePrompt && ( { + it("does not force home tab on phone when no workspace is selected", () => { + const setActiveTab = vi.fn(); + + renderHook(() => + useTabActivationGuard({ + activeTab: "git", + isTablet: false, + setActiveTab, + }), + ); + + expect(setActiveTab).not.toHaveBeenCalled(); + }); + + it("redirects tablet home tab selection to codex", () => { + const setActiveTab = vi.fn(); + + renderHook(() => + useTabActivationGuard({ + activeTab: "home", + isTablet: true, + setActiveTab, + }), + ); + + expect(setActiveTab).toHaveBeenCalledWith("codex"); + }); +}); diff --git a/src/features/app/hooks/useTabActivationGuard.ts b/src/features/app/hooks/useTabActivationGuard.ts index f5299839e..55463dd19 100644 --- a/src/features/app/hooks/useTabActivationGuard.ts +++ b/src/features/app/hooks/useTabActivationGuard.ts @@ -1,32 +1,18 @@ import { useEffect } from "react"; -import type { WorkspaceInfo } from "../../../types"; type AppTab = "home" | "projects" | "codex" | "git" | "log"; type UseTabActivationGuardOptions = { activeTab: AppTab; - activeWorkspace: WorkspaceInfo | null; - isPhone: boolean; isTablet: boolean; setActiveTab: (tab: AppTab) => void; }; export function useTabActivationGuard({ activeTab, - activeWorkspace, - isPhone, isTablet, setActiveTab, }: UseTabActivationGuardOptions) { - useEffect(() => { - if (!isPhone) { - return; - } - if (!activeWorkspace && activeTab !== "home" && activeTab !== "projects") { - setActiveTab("home"); - } - }, [activeTab, activeWorkspace, isPhone, setActiveTab]); - useEffect(() => { if (!isTablet) { return; diff --git a/src/features/app/orchestration/useLayoutOrchestration.ts b/src/features/app/orchestration/useLayoutOrchestration.ts new file mode 100644 index 000000000..039c0b3dc --- /dev/null +++ b/src/features/app/orchestration/useLayoutOrchestration.ts @@ -0,0 +1,90 @@ +import { useMemo, type CSSProperties } from "react"; +import type { AppSettings } from "@/types"; + +type UseAppShellOrchestrationOptions = { + isCompact: boolean; + isPhone: boolean; + isTablet: boolean; + sidebarCollapsed: boolean; + rightPanelCollapsed: boolean; + shouldReduceTransparency: boolean; + isWorkspaceDropActive: boolean; + centerMode: "chat" | "diff"; + selectedDiffPath: string | null; + showComposer: boolean; + activeThreadId: string | null; + sidebarWidth: number; + rightPanelWidth: number; + planPanelHeight: number; + terminalPanelHeight: number; + debugPanelHeight: number; + appSettings: Pick; +}; + +export function useAppShellOrchestration({ + isCompact, + isPhone, + isTablet, + sidebarCollapsed, + rightPanelCollapsed, + shouldReduceTransparency, + 
isWorkspaceDropActive, + centerMode, + selectedDiffPath, + showComposer, + activeThreadId, + sidebarWidth, + rightPanelWidth, + planPanelHeight, + terminalPanelHeight, + debugPanelHeight, + appSettings, +}: UseAppShellOrchestrationOptions) { + const showGitDetail = Boolean(selectedDiffPath) && isPhone && centerMode === "diff"; + const isThreadOpen = Boolean(activeThreadId && showComposer); + + const appClassName = `app ${isCompact ? "layout-compact" : "layout-desktop"}${ + isPhone ? " layout-phone" : "" + }${isTablet ? " layout-tablet" : ""}${ + shouldReduceTransparency ? " reduced-transparency" : "" + }${!isCompact && sidebarCollapsed ? " sidebar-collapsed" : ""}${ + !isCompact && rightPanelCollapsed ? " right-panel-collapsed" : "" + }`; + + const appStyle = useMemo( + () => ({ + "--sidebar-width": `${isCompact ? sidebarWidth : sidebarCollapsed ? 0 : sidebarWidth}px`, + "--right-panel-width": `${ + isCompact ? rightPanelWidth : rightPanelCollapsed ? 0 : rightPanelWidth + }px`, + "--plan-panel-height": `${planPanelHeight}px`, + "--terminal-panel-height": `${terminalPanelHeight}px`, + "--debug-panel-height": `${debugPanelHeight}px`, + "--ui-font-family": appSettings.uiFontFamily, + "--code-font-family": appSettings.codeFontFamily, + "--code-font-size": `${appSettings.codeFontSize}px`, + } as CSSProperties), + [ + appSettings.codeFontFamily, + appSettings.codeFontSize, + appSettings.uiFontFamily, + debugPanelHeight, + isCompact, + planPanelHeight, + rightPanelCollapsed, + rightPanelWidth, + sidebarCollapsed, + sidebarWidth, + terminalPanelHeight, + ], + ); + + return { + showGitDetail, + isThreadOpen, + dropOverlayActive: isWorkspaceDropActive, + dropOverlayText: "Drop Project Here", + appClassName, + appStyle, + }; +} diff --git a/src/features/app/orchestration/useThreadCodexOrchestration.ts b/src/features/app/orchestration/useThreadCodexOrchestration.ts new file mode 100644 index 000000000..7d3f735ad --- /dev/null +++ b/src/features/app/orchestration/useThreadCodexOrchestration.ts @@ -0,0 +1,105 @@ +import { useCallback, useMemo, useRef, useState } from "react"; +import type { Dispatch, MutableRefObject, SetStateAction } from "react"; +import type { AccessMode } from "@/types"; +import { useThreadCodexParams } from "@threads/hooks/useThreadCodexParams"; +import { + type PendingNewThreadSeed, +} from "@threads/utils/threadCodexParamsSeed"; + +type ThreadCodexOrchestration = { + accessMode: AccessMode; + setAccessMode: Dispatch>; + preferredModelId: string | null; + setPreferredModelId: Dispatch>; + preferredEffort: string | null; + setPreferredEffort: Dispatch>; + preferredCollabModeId: string | null; + setPreferredCollabModeId: Dispatch>; + threadCodexSelectionKey: string | null; + setThreadCodexSelectionKey: Dispatch>; + threadCodexParamsVersion: number; + getThreadCodexParams: ReturnType["getThreadCodexParams"]; + patchThreadCodexParams: ReturnType["patchThreadCodexParams"]; + persistThreadCodexParams: (patch: { + modelId?: string | null; + effort?: string | null; + accessMode?: AccessMode | null; + collaborationModeId?: string | null; + }) => void; + activeThreadIdRef: MutableRefObject; + pendingNewThreadSeedRef: MutableRefObject; +}; + +type UseThreadCodexOrchestrationParams = { + activeWorkspaceIdForParamsRef: MutableRefObject; +}; + +export function useThreadCodexOrchestration({ + activeWorkspaceIdForParamsRef, +}: UseThreadCodexOrchestrationParams): ThreadCodexOrchestration { + const { + version: threadCodexParamsVersion, + getThreadCodexParams, + patchThreadCodexParams, + } = 
useThreadCodexParams(); + const [accessMode, setAccessMode] = useState("current"); + const [preferredModelId, setPreferredModelId] = useState(null); + const [preferredEffort, setPreferredEffort] = useState(null); + const [preferredCollabModeId, setPreferredCollabModeId] = useState( + null, + ); + const [threadCodexSelectionKey, setThreadCodexSelectionKey] = useState( + null, + ); + const activeThreadIdRef = useRef(null); + const pendingNewThreadSeedRef = useRef(null); + + const persistThreadCodexParams = useCallback( + (patch: { + modelId?: string | null; + effort?: string | null; + accessMode?: AccessMode | null; + collaborationModeId?: string | null; + }) => { + const workspaceId = activeWorkspaceIdForParamsRef.current; + const threadId = activeThreadIdRef.current; + if (!workspaceId || !threadId) { + return; + } + patchThreadCodexParams(workspaceId, threadId, patch); + }, + [activeWorkspaceIdForParamsRef, patchThreadCodexParams], + ); + + return useMemo( + () => ({ + accessMode, + setAccessMode, + preferredModelId, + setPreferredModelId, + preferredEffort, + setPreferredEffort, + preferredCollabModeId, + setPreferredCollabModeId, + threadCodexSelectionKey, + setThreadCodexSelectionKey, + threadCodexParamsVersion, + getThreadCodexParams, + patchThreadCodexParams, + persistThreadCodexParams, + activeThreadIdRef, + pendingNewThreadSeedRef, + }), + [ + accessMode, + preferredCollabModeId, + preferredEffort, + preferredModelId, + threadCodexSelectionKey, + threadCodexParamsVersion, + getThreadCodexParams, + patchThreadCodexParams, + persistThreadCodexParams, + ], + ); +} diff --git a/src/features/app/orchestration/useThreadOrchestration.ts b/src/features/app/orchestration/useThreadOrchestration.ts new file mode 100644 index 000000000..40d189b5d --- /dev/null +++ b/src/features/app/orchestration/useThreadOrchestration.ts @@ -0,0 +1,439 @@ +import { useCallback, useEffect, useLayoutEffect, useRef } from "react"; +import type { Dispatch, MutableRefObject, SetStateAction } from "react"; +import type { AccessMode, AppMention, AppSettings } from "@/types"; +import { useThreadCodexParams } from "@threads/hooks/useThreadCodexParams"; +import { + buildThreadCodexSeedPatch, + createPendingThreadSeed, + resolveThreadCodexState, + type PendingNewThreadSeed, +} from "@threads/utils/threadCodexParamsSeed"; +import { makeThreadCodexParamsKey } from "@threads/utils/threadStorage"; +import { useThreadCodexOrchestration } from "./useThreadCodexOrchestration"; + +type SetState = Dispatch>; + +type PersistThreadCodexParams = ( + patch: { + modelId?: string | null; + effort?: string | null; + accessMode?: AccessMode | null; + collaborationModeId?: string | null; + }, +) => void; + +type UseThreadSelectionHandlersOrchestrationParams = { + appSettingsLoading: boolean; + setAppSettings: SetState; + queueSaveSettings: (next: AppSettings) => Promise; + activeThreadIdRef: MutableRefObject; + setSelectedModelId: (id: string | null) => void; + setSelectedEffort: (effort: string | null) => void; + setSelectedCollaborationModeId: (id: string | null) => void; + setAccessMode: SetState; + persistThreadCodexParams: PersistThreadCodexParams; +}; + +type UseThreadCodexBootstrapOrchestrationParams = { + activeWorkspaceId: string | null | undefined; +}; + +type UseThreadCodexSyncOrchestrationParams = { + activeWorkspaceId: string | null | undefined; + activeThreadId: string | null; + appSettings: Pick< + AppSettings, + "defaultAccessMode" | "lastComposerModelId" | "lastComposerReasoningEffort" + >; + threadCodexParamsVersion: 
number; + getThreadCodexParams: ReturnType["getThreadCodexParams"]; + patchThreadCodexParams: ReturnType["patchThreadCodexParams"]; + setThreadCodexSelectionKey: SetState; + setAccessMode: SetState; + setPreferredModelId: SetState; + setPreferredEffort: SetState; + setPreferredCollabModeId: SetState; + activeThreadIdRef: MutableRefObject; + pendingNewThreadSeedRef: MutableRefObject; + selectedModelId: string | null; + resolvedEffort: string | null; + accessMode: AccessMode; + selectedCollaborationModeId: string | null; +}; + +type MainTab = "home" | "projects" | "codex" | "git" | "log"; + +type SendOrQueueHandler = ( + text: string, + images: string[], + appMentions?: AppMention[], +) => Promise; + +type UseThreadUiOrchestrationParams = { + activeWorkspaceId: string | null | undefined; + activeThreadId: string | null; + accessMode: AccessMode; + selectedCollaborationModeId: string | null; + pendingNewThreadSeedRef: MutableRefObject; + runWithDraftStart: (runner: () => Promise) => Promise; + handleComposerSend: SendOrQueueHandler; + handleComposerQueue: SendOrQueueHandler; + clearDraftState: () => void; + exitDiffView: () => void; + resetPullRequestSelection: () => void; + selectWorkspace: (workspaceId: string) => void; + setActiveThreadId: (threadId: string | null, workspaceId?: string) => void; + setActiveTab: SetState; + isCompact: boolean; + removeThread: (workspaceId: string, threadId: string) => void; + clearDraftForThread: (threadId: string) => void; + removeImagesForThread: (threadId: string) => void; +}; + +export function useThreadCodexBootstrapOrchestration({ + activeWorkspaceId, +}: UseThreadCodexBootstrapOrchestrationParams) { + const activeWorkspaceIdForParamsRef = useRef(activeWorkspaceId ?? null); + + useEffect(() => { + activeWorkspaceIdForParamsRef.current = activeWorkspaceId ?? null; + }, [activeWorkspaceId]); + + return useThreadCodexOrchestration({ activeWorkspaceIdForParamsRef }); +} + +export function useThreadCodexSyncOrchestration({ + activeWorkspaceId, + activeThreadId, + appSettings, + threadCodexParamsVersion, + getThreadCodexParams, + patchThreadCodexParams, + setThreadCodexSelectionKey, + setAccessMode, + setPreferredModelId, + setPreferredEffort, + setPreferredCollabModeId, + activeThreadIdRef, + pendingNewThreadSeedRef, + selectedModelId, + resolvedEffort, + accessMode, + selectedCollaborationModeId, +}: UseThreadCodexSyncOrchestrationParams) { + useLayoutEffect(() => { + const workspaceId = activeWorkspaceId ?? null; + const threadId = activeThreadId ?? null; + activeThreadIdRef.current = threadId; + + if (!workspaceId) { + return; + } + + const stored = threadId ? 
getThreadCodexParams(workspaceId, threadId) : null; + const resolved = resolveThreadCodexState({ + workspaceId, + threadId, + defaultAccessMode: appSettings.defaultAccessMode, + lastComposerModelId: appSettings.lastComposerModelId, + lastComposerReasoningEffort: appSettings.lastComposerReasoningEffort, + stored, + pendingSeed: pendingNewThreadSeedRef.current, + }); + + setThreadCodexSelectionKey(resolved.scopeKey); + setAccessMode(resolved.accessMode); + setPreferredModelId(resolved.preferredModelId); + setPreferredEffort(resolved.preferredEffort); + setPreferredCollabModeId(resolved.preferredCollabModeId); + }, [ + activeThreadId, + activeWorkspaceId, + appSettings.defaultAccessMode, + appSettings.lastComposerModelId, + appSettings.lastComposerReasoningEffort, + getThreadCodexParams, + setPreferredCollabModeId, + setPreferredEffort, + setPreferredModelId, + setThreadCodexSelectionKey, + threadCodexParamsVersion, + setAccessMode, + activeThreadIdRef, + pendingNewThreadSeedRef, + ]); + + const seededThreadParamsRef = useRef(new Set()); + useEffect(() => { + const workspaceId = activeWorkspaceId ?? null; + const threadId = activeThreadId ?? null; + if (!workspaceId || !threadId) { + return; + } + + const key = makeThreadCodexParamsKey(workspaceId, threadId); + if (seededThreadParamsRef.current.has(key)) { + return; + } + + const stored = getThreadCodexParams(workspaceId, threadId); + if (stored) { + seededThreadParamsRef.current.add(key); + return; + } + + seededThreadParamsRef.current.add(key); + const pendingSeed = pendingNewThreadSeedRef.current; + patchThreadCodexParams( + workspaceId, + threadId, + buildThreadCodexSeedPatch({ + workspaceId, + selectedModelId, + resolvedEffort, + accessMode, + selectedCollaborationModeId, + pendingSeed, + }), + ); + if (pendingSeed?.workspaceId === workspaceId) { + pendingNewThreadSeedRef.current = null; + } + }, [ + activeThreadId, + activeWorkspaceId, + accessMode, + getThreadCodexParams, + patchThreadCodexParams, + resolvedEffort, + selectedCollaborationModeId, + selectedModelId, + pendingNewThreadSeedRef, + ]); +} + +export function useThreadSelectionHandlersOrchestration({ + appSettingsLoading, + setAppSettings, + queueSaveSettings, + activeThreadIdRef, + setSelectedModelId, + setSelectedEffort, + setSelectedCollaborationModeId, + setAccessMode, + persistThreadCodexParams, +}: UseThreadSelectionHandlersOrchestrationParams) { + const handleSelectModel = useCallback( + (id: string | null) => { + setSelectedModelId(id); + const hasActiveThread = Boolean(activeThreadIdRef.current); + if (!appSettingsLoading && !hasActiveThread) { + setAppSettings((current) => { + if (current.lastComposerModelId === id) { + return current; + } + const nextSettings = { ...current, lastComposerModelId: id }; + void queueSaveSettings(nextSettings); + return nextSettings; + }); + } + persistThreadCodexParams({ modelId: id }); + }, + [ + activeThreadIdRef, + appSettingsLoading, + persistThreadCodexParams, + queueSaveSettings, + setAppSettings, + setSelectedModelId, + ], + ); + + const handleSelectEffort = useCallback( + (raw: string | null) => { + const next = typeof raw === "string" && raw.trim().length > 0 ? 
raw.trim() : null; + setSelectedEffort(next); + const hasActiveThread = Boolean(activeThreadIdRef.current); + if (!appSettingsLoading && !hasActiveThread) { + setAppSettings((current) => { + if (current.lastComposerReasoningEffort === next) { + return current; + } + const nextSettings = { ...current, lastComposerReasoningEffort: next }; + void queueSaveSettings(nextSettings); + return nextSettings; + }); + } + persistThreadCodexParams({ effort: next }); + }, + [ + activeThreadIdRef, + appSettingsLoading, + persistThreadCodexParams, + queueSaveSettings, + setAppSettings, + setSelectedEffort, + ], + ); + + const handleSelectCollaborationMode = useCallback( + (id: string | null) => { + setSelectedCollaborationModeId(id); + persistThreadCodexParams({ collaborationModeId: id }); + }, + [persistThreadCodexParams, setSelectedCollaborationModeId], + ); + + const handleSelectAccessMode = useCallback( + (mode: AccessMode) => { + setAccessMode(mode); + persistThreadCodexParams({ accessMode: mode }); + }, + [persistThreadCodexParams, setAccessMode], + ); + + return { + handleSelectModel, + handleSelectEffort, + handleSelectCollaborationMode, + handleSelectAccessMode, + }; +} + +export function useThreadUiOrchestration({ + activeWorkspaceId, + activeThreadId, + accessMode, + selectedCollaborationModeId, + pendingNewThreadSeedRef, + runWithDraftStart, + handleComposerSend, + handleComposerQueue, + clearDraftState, + exitDiffView, + resetPullRequestSelection, + selectWorkspace, + setActiveThreadId, + setActiveTab, + isCompact, + removeThread, + clearDraftForThread, + removeImagesForThread, +}: UseThreadUiOrchestrationParams) { + const rememberPendingNewThreadSeed = useCallback(() => { + pendingNewThreadSeedRef.current = createPendingThreadSeed({ + activeThreadId: activeThreadId ?? null, + activeWorkspaceId: activeWorkspaceId ?? null, + selectedCollaborationModeId, + accessMode, + }); + }, [ + accessMode, + activeThreadId, + activeWorkspaceId, + pendingNewThreadSeedRef, + selectedCollaborationModeId, + ]); + + const handleComposerSendWithDraftStart = useCallback( + (text: string, images: string[], appMentions?: AppMention[]) => { + rememberPendingNewThreadSeed(); + return runWithDraftStart(() => + appMentions && appMentions.length > 0 + ? handleComposerSend(text, images, appMentions) + : handleComposerSend(text, images), + ); + }, + [handleComposerSend, rememberPendingNewThreadSeed, runWithDraftStart], + ); + + const handleComposerQueueWithDraftStart = useCallback( + (text: string, images: string[], appMentions?: AppMention[]) => { + const runner = activeThreadId + ? () => + appMentions && appMentions.length > 0 + ? handleComposerQueue(text, images, appMentions) + : handleComposerQueue(text, images) + : () => + appMentions && appMentions.length > 0 + ? 
handleComposerSend(text, images, appMentions) + : handleComposerSend(text, images); + + if (!activeThreadId) { + rememberPendingNewThreadSeed(); + } + return runWithDraftStart(runner); + }, + [ + activeThreadId, + handleComposerQueue, + handleComposerSend, + rememberPendingNewThreadSeed, + runWithDraftStart, + ], + ); + + const handleSelectWorkspaceInstance = useCallback( + (workspaceId: string, threadId: string) => { + exitDiffView(); + resetPullRequestSelection(); + clearDraftState(); + selectWorkspace(workspaceId); + setActiveThreadId(threadId, workspaceId); + if (isCompact) { + setActiveTab("codex"); + } + }, + [ + clearDraftState, + exitDiffView, + isCompact, + resetPullRequestSelection, + selectWorkspace, + setActiveTab, + setActiveThreadId, + ], + ); + + const handleOpenThreadLink = useCallback( + (threadId: string) => { + if (!activeWorkspaceId) { + return; + } + exitDiffView(); + resetPullRequestSelection(); + clearDraftState(); + setActiveThreadId(threadId, activeWorkspaceId); + }, + [ + activeWorkspaceId, + clearDraftState, + exitDiffView, + resetPullRequestSelection, + setActiveThreadId, + ], + ); + + const handleArchiveActiveThread = useCallback(() => { + if (!activeWorkspaceId || !activeThreadId) { + return; + } + removeThread(activeWorkspaceId, activeThreadId); + clearDraftForThread(activeThreadId); + removeImagesForThread(activeThreadId); + }, [ + activeThreadId, + activeWorkspaceId, + clearDraftForThread, + removeImagesForThread, + removeThread, + ]); + + return { + handleComposerSendWithDraftStart, + handleComposerQueueWithDraftStart, + handleSelectWorkspaceInstance, + handleOpenThreadLink, + handleArchiveActiveThread, + }; +} diff --git a/src/features/app/orchestration/useWorkspaceOrchestration.ts b/src/features/app/orchestration/useWorkspaceOrchestration.ts new file mode 100644 index 000000000..43c4dc4b2 --- /dev/null +++ b/src/features/app/orchestration/useWorkspaceOrchestration.ts @@ -0,0 +1,212 @@ +import { useCallback, useEffect, useMemo, useState } from "react"; +import type { WorkspaceInfo } from "@/types"; +import { useLocalUsage } from "@/features/home/hooks/useLocalUsage"; + +type ThreadSummary = { + id: string; + name?: string | null; + updatedAt: number; +}; + +type LastAgentMessage = { + text: string; + timestamp: number; +}; + +type ThreadStatus = { + isProcessing?: boolean; +}; + +type UseWorkspaceInsightsOrchestrationOptions = { + workspaces: WorkspaceInfo[]; + workspacesById: Map; + hasLoaded: boolean; + showHome: boolean; + threadsByWorkspace: Record; + lastAgentMessageByThread: Record; + threadStatusById: Record; + threadListLoadingByWorkspace: Record; + getWorkspaceGroupName: (workspaceId: string) => string | null | undefined; +}; + +type UseWorkspaceOrderingOrchestrationOptions = { + workspaces: WorkspaceInfo[]; + workspacesById: Map; + updateWorkspaceSettings: ( + workspaceId: string, + settings: Partial, + ) => Promise; +}; + +export function useWorkspaceInsightsOrchestration({ + workspaces, + workspacesById, + hasLoaded, + showHome, + threadsByWorkspace, + lastAgentMessageByThread, + threadStatusById, + threadListLoadingByWorkspace, + getWorkspaceGroupName, +}: UseWorkspaceInsightsOrchestrationOptions) { + const latestAgentRuns = useMemo(() => { + const entries: Array<{ + threadId: string; + message: string; + timestamp: number; + projectName: string; + groupName?: string | null; + workspaceId: string; + isProcessing: boolean; + }> = []; + + workspaces.forEach((workspace) => { + const threads = threadsByWorkspace[workspace.id] ?? 
[]; + threads.forEach((thread) => { + const entry = lastAgentMessageByThread[thread.id]; + if (!entry) { + return; + } + entries.push({ + threadId: thread.id, + message: entry.text, + timestamp: entry.timestamp, + projectName: workspace.name, + groupName: getWorkspaceGroupName(workspace.id), + workspaceId: workspace.id, + isProcessing: threadStatusById[thread.id]?.isProcessing ?? false, + }); + }); + }); + + return entries.sort((a, b) => b.timestamp - a.timestamp).slice(0, 3); + }, [ + getWorkspaceGroupName, + lastAgentMessageByThread, + threadStatusById, + threadsByWorkspace, + workspaces, + ]); + + const isLoadingLatestAgents = useMemo( + () => + !hasLoaded || workspaces.some((workspace) => threadListLoadingByWorkspace[workspace.id] ?? false), + [hasLoaded, threadListLoadingByWorkspace, workspaces], + ); + + const [usageMetric, setUsageMetric] = useState<"tokens" | "time">("tokens"); + const [usageWorkspaceId, setUsageWorkspaceId] = useState(null); + + const usageWorkspaceOptions = useMemo( + () => + workspaces.map((workspace) => { + const groupName = getWorkspaceGroupName(workspace.id); + const label = groupName ? `${groupName} / ${workspace.name}` : workspace.name; + return { id: workspace.id, label }; + }), + [getWorkspaceGroupName, workspaces], + ); + + const usageWorkspacePath = useMemo(() => { + if (!usageWorkspaceId) { + return null; + } + return workspacesById.get(usageWorkspaceId)?.path ?? null; + }, [usageWorkspaceId, workspacesById]); + + useEffect(() => { + if (!usageWorkspaceId) { + return; + } + if (workspaces.some((workspace) => workspace.id === usageWorkspaceId)) { + return; + } + setUsageWorkspaceId(null); + }, [usageWorkspaceId, workspaces]); + + const { + snapshot: localUsageSnapshot, + isLoading: isLoadingLocalUsage, + error: localUsageError, + refresh: refreshLocalUsage, + } = useLocalUsage(showHome, usageWorkspacePath); + + return { + latestAgentRuns, + isLoadingLatestAgents, + usageMetric, + setUsageMetric, + usageWorkspaceId, + setUsageWorkspaceId, + usageWorkspaceOptions, + localUsageSnapshot, + isLoadingLocalUsage, + localUsageError, + refreshLocalUsage, + }; +} + +export function useWorkspaceOrderingOrchestration({ + workspaces, + workspacesById, + updateWorkspaceSettings, +}: UseWorkspaceOrderingOrchestrationOptions) { + const orderValue = useCallback( + (entry: WorkspaceInfo) => + typeof entry.settings.sortOrder === "number" + ? entry.settings.sortOrder + : Number.MAX_SAFE_INTEGER, + [], + ); + + const handleMoveWorkspace = useCallback( + async (workspaceId: string, direction: "up" | "down") => { + const target = workspacesById.get(workspaceId); + if (!target || (target.kind ?? "main") === "worktree") { + return; + } + + const targetGroupId = target.settings.groupId ?? null; + const ordered = workspaces + .filter( + (entry) => + (entry.kind ?? "main") !== "worktree" && + (entry.settings.groupId ?? null) === targetGroupId, + ) + .slice() + .sort((a, b) => { + const orderDiff = orderValue(a) - orderValue(b); + if (orderDiff !== 0) { + return orderDiff; + } + return a.name.localeCompare(b.name); + }); + + const index = ordered.findIndex((entry) => entry.id === workspaceId); + if (index === -1) { + return; + } + + const nextIndex = direction === "up" ? 
index - 1 : index + 1; + if (nextIndex < 0 || nextIndex >= ordered.length) { + return; + } + + const next = ordered.slice(); + const temp = next[index]; + next[index] = next[nextIndex]; + next[nextIndex] = temp; + + await Promise.all( + next.map((entry, idx) => + updateWorkspaceSettings(entry.id, { + sortOrder: idx, + }), + ), + ); + }, + [orderValue, updateWorkspaceSettings, workspaces, workspacesById], + ); + + return { handleMoveWorkspace }; +} diff --git a/src/features/git/components/GitDiffPanel.test.tsx b/src/features/git/components/GitDiffPanel.test.tsx index c90a32b73..59f09d920 100644 --- a/src/features/git/components/GitDiffPanel.test.tsx +++ b/src/features/git/components/GitDiffPanel.test.tsx @@ -1,5 +1,5 @@ /** @vitest-environment jsdom */ -import { fireEvent, render, screen, waitFor } from "@testing-library/react"; +import { fireEvent, render, screen, waitFor, within } from "@testing-library/react"; import { describe, expect, it, vi } from "vitest"; import type { GitLogEntry } from "../../../types"; import { GitDiffPanel } from "./GitDiffPanel"; @@ -68,6 +68,33 @@ const baseProps = { }; describe("GitDiffPanel", () => { + it("shows an initialize git button when the repo is missing", () => { + const onInitGitRepo = vi.fn(); + const { container } = render( + , + ); + + const initButton = within(container).getByRole("button", { name: "Initialize Git" }); + fireEvent.click(initButton); + expect(onInitGitRepo).toHaveBeenCalledTimes(1); + }); + + it("does not show initialize git when the git root path is invalid", () => { + const { container } = render( + , + ); + + expect(within(container).queryByRole("button", { name: "Initialize Git" })).toBeNull(); + }); + it("enables commit when message exists and only unstaged changes", () => { const onCommit = vi.fn(); render( diff --git a/src/features/git/components/GitDiffPanel.tsx b/src/features/git/components/GitDiffPanel.tsx index b6817d242..3d5107f2f 100644 --- a/src/features/git/components/GitDiffPanel.tsx +++ b/src/features/git/components/GitDiffPanel.tsx @@ -95,6 +95,8 @@ type GitDiffPanelProps = { onSelectGitRoot?: (path: string) => void; onClearGitRoot?: () => void; onPickGitRoot?: () => void | Promise; + onInitGitRepo?: () => void | Promise; + initGitRepoLoading?: boolean; selectedPath?: string | null; onSelectFile?: (path: string) => void; stagedFiles: { @@ -202,6 +204,8 @@ export function GitDiffPanel({ onSelectGitRoot, onClearGitRoot, onPickGitRoot, + onInitGitRepo, + initGitRepoLoading = false, commitMessage = "", commitMessageLoading = false, commitMessageError = null, @@ -699,6 +703,8 @@ export function GitDiffPanel({ gitRootScanDepth={gitRootScanDepth} onGitRootScanDepthChange={onGitRootScanDepthChange} onPickGitRoot={onPickGitRoot} + onInitGitRepo={onInitGitRepo} + initGitRepoLoading={initGitRepoLoading} hasGitRoot={hasGitRoot} onClearGitRoot={onClearGitRoot} gitRootScanError={gitRootScanError} diff --git a/src/features/git/components/GitDiffPanel.utils.ts b/src/features/git/components/GitDiffPanel.utils.ts index 369673d95..9a11f9410 100644 --- a/src/features/git/components/GitDiffPanel.utils.ts +++ b/src/features/git/components/GitDiffPanel.utils.ts @@ -1,4 +1,5 @@ import { isAbsolutePath as isAbsolutePathForPlatform } from "../../../utils/platformPaths"; +export { isGitRootNotFound, isMissingRepo } from "../utils/repoErrors"; export const DEPTH_OPTIONS = [1, 2, 3, 4, 5, 6]; @@ -114,20 +115,6 @@ export function getStatusClass(status: string) { } } -export function isMissingRepo(error: string | null | undefined) { - if 
(!error) { - return false; - } - const normalized = error.toLowerCase(); - return ( - normalized.includes("could not find repository") || - normalized.includes("not a git repository") || - (normalized.includes("repository") && normalized.includes("notfound")) || - normalized.includes("repository not found") || - normalized.includes("git root not found") - ); -} - export function hasPushSyncConflict(pushError: string | null | undefined) { if (!pushError) { return false; diff --git a/src/features/git/components/GitDiffPanelModeContent.tsx b/src/features/git/components/GitDiffPanelModeContent.tsx index 64b220718..c7b18d66d 100644 --- a/src/features/git/components/GitDiffPanelModeContent.tsx +++ b/src/features/git/components/GitDiffPanelModeContent.tsx @@ -13,7 +13,12 @@ import { type DiffFile, GitLogEntryRow, } from "./GitDiffPanelShared"; -import { DEPTH_OPTIONS, normalizeRootPath } from "./GitDiffPanel.utils"; +import { + DEPTH_OPTIONS, + isGitRootNotFound, + isMissingRepo, + normalizeRootPath, +} from "./GitDiffPanel.utils"; type GitMode = "diff" | "log" | "issues" | "prs"; @@ -169,6 +174,8 @@ type GitDiffModeContentProps = { gitRootScanDepth: number; onGitRootScanDepthChange?: (depth: number) => void; onPickGitRoot?: () => void | Promise; + onInitGitRepo?: () => void | Promise; + initGitRepoLoading: boolean; hasGitRoot: boolean; onClearGitRoot?: () => void; gitRootScanError: string | null | undefined; @@ -223,6 +230,8 @@ export function GitDiffModeContent({ gitRootScanDepth, onGitRootScanDepthChange, onPickGitRoot, + onInitGitRepo, + initGitRepoLoading, hasGitRoot, onClearGitRoot, gitRootScanError, @@ -261,18 +270,40 @@ export function GitDiffModeContent({ onDiffListClick, }: GitDiffModeContentProps) { const normalizedGitRoot = normalizeRootPath(gitRoot); + const missingRepo = isMissingRepo(error); + const gitRootNotFound = isGitRootNotFound(error); + const showInitGitRepo = Boolean(onInitGitRepo) && missingRepo && !gitRootNotFound; + const gitRootTitle = gitRootNotFound + ? "Git root folder not found." + : missingRepo + ? "This workspace isn't a Git repository yet." + : "Choose a repo for this workspace."; return (
{showGitRootPanel && (
-
Choose a repo for this workspace.
+
{gitRootTitle}
+ {showInitGitRepo && ( +
+ +
+ )}
@@ -287,7 +318,7 @@ export function GitDiffModeContent({ onGitRootScanDepthChange?.(value); } }} - disabled={gitRootScanLoading} + disabled={gitRootScanLoading || initGitRepoLoading} > {DEPTH_OPTIONS.map((depth) => (
)}
- {activeSection === "projects" && ( - - )} - {activeSection === "environments" && ( - - )} - {activeSection === "display" && ( - - )} - {activeSection === "composer" && ( - - )} - {activeSection === "dictation" && ( - - )} - {activeSection === "shortcuts" && ( - - )} - {activeSection === "open-apps" && ( - - )} - {activeSection === "git" && ( - - )} - {activeSection === "server" && ( - - )} - {activeSection === "codex" && ( - { - void refreshDefaultModels(); - }} - codexPathDraft={codexPathDraft} - codexArgsDraft={codexArgsDraft} - codexDirty={codexDirty} - isSavingSettings={isSavingSettings} - doctorState={doctorState} - codexUpdateState={codexUpdateState} - globalAgentsMeta={globalAgentsMeta} - globalAgentsError={globalAgentsError} - globalAgentsContent={globalAgentsContent} - globalAgentsLoading={globalAgentsLoading} - globalAgentsRefreshDisabled={globalAgentsRefreshDisabled} - globalAgentsSaveDisabled={globalAgentsSaveDisabled} - globalAgentsSaveLabel={globalAgentsSaveLabel} - globalConfigMeta={globalConfigMeta} - globalConfigError={globalConfigError} - globalConfigContent={globalConfigContent} - globalConfigLoading={globalConfigLoading} - globalConfigRefreshDisabled={globalConfigRefreshDisabled} - globalConfigSaveDisabled={globalConfigSaveDisabled} - globalConfigSaveLabel={globalConfigSaveLabel} - projects={projects} - codexBinOverrideDrafts={codexBinOverrideDrafts} - codexHomeOverrideDrafts={codexHomeOverrideDrafts} - codexArgsOverrideDrafts={codexArgsOverrideDrafts} - onSetCodexPathDraft={setCodexPathDraft} - onSetCodexArgsDraft={setCodexArgsDraft} - onSetGlobalAgentsContent={setGlobalAgentsContent} - onSetGlobalConfigContent={setGlobalConfigContent} - onSetCodexBinOverrideDrafts={setCodexBinOverrideDrafts} - onSetCodexHomeOverrideDrafts={setCodexHomeOverrideDrafts} - onSetCodexArgsOverrideDrafts={setCodexArgsOverrideDrafts} - onBrowseCodex={handleBrowseCodex} - onSaveCodexSettings={handleSaveCodexSettings} - onRunDoctor={handleRunDoctor} - onRunCodexUpdate={handleRunCodexUpdate} - onRefreshGlobalAgents={() => { - void refreshGlobalAgents(); - }} - onSaveGlobalAgents={() => { - void saveGlobalAgents(); - }} - onRefreshGlobalConfig={() => { - void refreshGlobalConfig(); - }} - onSaveGlobalConfig={() => { - void saveGlobalConfig(); - }} - onUpdateWorkspaceCodexBin={onUpdateWorkspaceCodexBin} - onUpdateWorkspaceSettings={onUpdateWorkspaceSettings} - /> - )} - {activeSection === "features" && ( - { - void handleOpenConfig(); - }} - onUpdateAppSettings={onUpdateAppSettings} - /> - )} +
)} -
+
); } diff --git a/src/features/settings/components/sections/SettingsCodexSection.tsx b/src/features/settings/components/sections/SettingsCodexSection.tsx index 2f12cb63e..84ebdd243 100644 --- a/src/features/settings/components/sections/SettingsCodexSection.tsx +++ b/src/features/settings/components/sections/SettingsCodexSection.tsx @@ -7,8 +7,8 @@ import type { CodexUpdateResult, ModelOption, WorkspaceInfo, -} from "../../../../types"; -import { FileEditorCard } from "../../../shared/components/FileEditorCard"; +} from "@/types"; +import { FileEditorCard } from "@/features/shared/components/FileEditorCard"; type SettingsCodexSectionProps = { appSettings: AppSettings; diff --git a/src/features/settings/components/sections/SettingsComposerSection.tsx b/src/features/settings/components/sections/SettingsComposerSection.tsx index 49db34f62..f80abe26d 100644 --- a/src/features/settings/components/sections/SettingsComposerSection.tsx +++ b/src/features/settings/components/sections/SettingsComposerSection.tsx @@ -1,4 +1,4 @@ -import type { AppSettings } from "../../../../types"; +import type { AppSettings } from "@/types"; type ComposerPreset = AppSettings["composerEditorPreset"]; diff --git a/src/features/settings/components/sections/SettingsDictationSection.tsx b/src/features/settings/components/sections/SettingsDictationSection.tsx index e5e7e7ef7..e0b37058e 100644 --- a/src/features/settings/components/sections/SettingsDictationSection.tsx +++ b/src/features/settings/components/sections/SettingsDictationSection.tsx @@ -1,5 +1,5 @@ -import type { AppSettings, DictationModelStatus } from "../../../../types"; -import { formatDownloadSize } from "../../../../utils/formatting"; +import type { AppSettings, DictationModelStatus } from "@/types"; +import { formatDownloadSize } from "@utils/formatting"; type DictationModelOption = { id: string; diff --git a/src/features/settings/components/sections/SettingsDisplaySection.test.tsx b/src/features/settings/components/sections/SettingsDisplaySection.test.tsx index 3fe3741c8..add5aa410 100644 --- a/src/features/settings/components/sections/SettingsDisplaySection.test.tsx +++ b/src/features/settings/components/sections/SettingsDisplaySection.test.tsx @@ -1,7 +1,7 @@ // @vitest-environment jsdom import { fireEvent, render, screen, within } from "@testing-library/react"; import { describe, expect, it, vi } from "vitest"; -import type { AppSettings } from "../../../../types"; +import type { AppSettings } from "@/types"; import { SettingsDisplaySection } from "./SettingsDisplaySection"; describe("SettingsDisplaySection", () => { diff --git a/src/features/settings/components/sections/SettingsDisplaySection.tsx b/src/features/settings/components/sections/SettingsDisplaySection.tsx index ac7b0757d..df1c44c05 100644 --- a/src/features/settings/components/sections/SettingsDisplaySection.tsx +++ b/src/features/settings/components/sections/SettingsDisplaySection.tsx @@ -1,12 +1,12 @@ import type { Dispatch, SetStateAction } from "react"; -import type { AppSettings } from "../../../../types"; +import type { AppSettings } from "@/types"; import { CODE_FONT_SIZE_MAX, CODE_FONT_SIZE_MIN, CODE_FONT_SIZE_DEFAULT, DEFAULT_CODE_FONT_FAMILY, DEFAULT_UI_FONT_FAMILY, -} from "../../../../utils/fonts"; +} from "@utils/fonts"; type SettingsDisplaySectionProps = { appSettings: AppSettings; diff --git a/src/features/settings/components/sections/SettingsEnvironmentsSection.tsx b/src/features/settings/components/sections/SettingsEnvironmentsSection.tsx index fd06faa81..8998f2f22 100644 
--- a/src/features/settings/components/sections/SettingsEnvironmentsSection.tsx +++ b/src/features/settings/components/sections/SettingsEnvironmentsSection.tsx @@ -1,6 +1,6 @@ import type { Dispatch, SetStateAction } from "react"; -import type { WorkspaceInfo } from "../../../../types"; -import { pushErrorToast } from "../../../../services/toasts"; +import type { WorkspaceInfo } from "@/types"; +import { pushErrorToast } from "@services/toasts"; type SettingsEnvironmentsSectionProps = { mainWorkspaces: WorkspaceInfo[]; diff --git a/src/features/settings/components/sections/SettingsFeaturesSection.tsx b/src/features/settings/components/sections/SettingsFeaturesSection.tsx index c0bd35dd9..853097aae 100644 --- a/src/features/settings/components/sections/SettingsFeaturesSection.tsx +++ b/src/features/settings/components/sections/SettingsFeaturesSection.tsx @@ -1,5 +1,5 @@ -import type { AppSettings } from "../../../../types"; -import { fileManagerName, openInFileManagerLabel } from "../../../../utils/platformPaths"; +import type { AppSettings } from "@/types"; +import { fileManagerName, openInFileManagerLabel } from "@utils/platformPaths"; type SettingsFeaturesSectionProps = { appSettings: AppSettings; diff --git a/src/features/settings/components/sections/SettingsGitSection.tsx b/src/features/settings/components/sections/SettingsGitSection.tsx index 2e138f3d7..8a6b1c166 100644 --- a/src/features/settings/components/sections/SettingsGitSection.tsx +++ b/src/features/settings/components/sections/SettingsGitSection.tsx @@ -1,4 +1,4 @@ -import type { AppSettings } from "../../../../types"; +import type { AppSettings } from "@/types"; type SettingsGitSectionProps = { appSettings: AppSettings; diff --git a/src/features/settings/components/sections/SettingsOpenAppsSection.tsx b/src/features/settings/components/sections/SettingsOpenAppsSection.tsx index 0916ed5cd..50df69dfa 100644 --- a/src/features/settings/components/sections/SettingsOpenAppsSection.tsx +++ b/src/features/settings/components/sections/SettingsOpenAppsSection.tsx @@ -1,16 +1,16 @@ import ChevronDown from "lucide-react/dist/esm/icons/chevron-down"; import ChevronUp from "lucide-react/dist/esm/icons/chevron-up"; import Trash2 from "lucide-react/dist/esm/icons/trash-2"; -import type { OpenAppTarget } from "../../../../types"; +import type { OpenAppTarget } from "@/types"; import { fileManagerName, isMacPlatform, -} from "../../../../utils/platformPaths"; +} from "@utils/platformPaths"; import { GENERIC_APP_ICON, getKnownOpenAppIcon, -} from "../../../app/utils/openAppIcons"; -import type { OpenAppDraft } from "../settingsTypes"; +} from "@app/utils/openAppIcons"; +import type { OpenAppDraft } from "@settings/components/settingsTypes"; type SettingsOpenAppsSectionProps = { openAppDrafts: OpenAppDraft[]; diff --git a/src/features/settings/components/sections/SettingsProjectsSection.tsx b/src/features/settings/components/sections/SettingsProjectsSection.tsx index 0adc0b2a7..7316ece47 100644 --- a/src/features/settings/components/sections/SettingsProjectsSection.tsx +++ b/src/features/settings/components/sections/SettingsProjectsSection.tsx @@ -2,7 +2,7 @@ import ChevronDown from "lucide-react/dist/esm/icons/chevron-down"; import ChevronUp from "lucide-react/dist/esm/icons/chevron-up"; import Trash2 from "lucide-react/dist/esm/icons/trash-2"; import type { Dispatch, SetStateAction } from "react"; -import type { WorkspaceGroup, WorkspaceInfo } from "../../../../types"; +import type { WorkspaceGroup, WorkspaceInfo } from "@/types"; type 
GroupedWorkspaces = Array<{ id: string | null; diff --git a/src/features/settings/components/sections/SettingsSectionContainers.tsx b/src/features/settings/components/sections/SettingsSectionContainers.tsx new file mode 100644 index 000000000..0ef434fb9 --- /dev/null +++ b/src/features/settings/components/sections/SettingsSectionContainers.tsx @@ -0,0 +1,58 @@ +import { SettingsCodexSection } from "./SettingsCodexSection"; +import { SettingsComposerSection } from "./SettingsComposerSection"; +import { SettingsDictationSection } from "./SettingsDictationSection"; +import { SettingsDisplaySection } from "./SettingsDisplaySection"; +import { SettingsEnvironmentsSection } from "./SettingsEnvironmentsSection"; +import { SettingsFeaturesSection } from "./SettingsFeaturesSection"; +import { SettingsGitSection } from "./SettingsGitSection"; +import { SettingsOpenAppsSection } from "./SettingsOpenAppsSection"; +import { SettingsProjectsSection } from "./SettingsProjectsSection"; +import { SettingsServerSection } from "./SettingsServerSection"; +import { SettingsShortcutsSection } from "./SettingsShortcutsSection"; +import type { CodexSection } from "@settings/components/settingsTypes"; +import type { SettingsViewOrchestration } from "@settings/hooks/useSettingsViewOrchestration"; + +type SettingsSectionContainersProps = { + activeSection: CodexSection; + orchestration: SettingsViewOrchestration; +}; + +export function SettingsSectionContainers({ + activeSection, + orchestration, +}: SettingsSectionContainersProps) { + if (activeSection === "projects") { + return ; + } + if (activeSection === "environments") { + return ; + } + if (activeSection === "display") { + return ; + } + if (activeSection === "composer") { + return ; + } + if (activeSection === "dictation") { + return ; + } + if (activeSection === "shortcuts") { + return ; + } + if (activeSection === "open-apps") { + return ; + } + if (activeSection === "git") { + return ; + } + if (activeSection === "server") { + return ; + } + if (activeSection === "codex") { + return ; + } + if (activeSection === "features") { + return ; + } + return null; +} diff --git a/src/features/settings/components/sections/SettingsServerSection.tsx b/src/features/settings/components/sections/SettingsServerSection.tsx index 7c8c93d50..d52e35c8d 100644 --- a/src/features/settings/components/sections/SettingsServerSection.tsx +++ b/src/features/settings/components/sections/SettingsServerSection.tsx @@ -4,7 +4,7 @@ import type { TailscaleDaemonCommandPreview, TailscaleStatus, TcpDaemonStatus, -} from "../../../../types"; +} from "@/types"; type SettingsServerSectionProps = { appSettings: AppSettings; diff --git a/src/features/settings/components/sections/SettingsShortcutsSection.tsx b/src/features/settings/components/sections/SettingsShortcutsSection.tsx index 3b910173f..614664a6a 100644 --- a/src/features/settings/components/sections/SettingsShortcutsSection.tsx +++ b/src/features/settings/components/sections/SettingsShortcutsSection.tsx @@ -1,11 +1,11 @@ import type { KeyboardEvent } from "react"; -import { formatShortcut, getDefaultInterruptShortcut } from "../../../../utils/shortcuts"; -import { isMacPlatform } from "../../../../utils/platformPaths"; +import { formatShortcut, getDefaultInterruptShortcut } from "@utils/shortcuts"; +import { isMacPlatform } from "@utils/platformPaths"; import type { ShortcutDraftKey, ShortcutDrafts, ShortcutSettingKey, -} from "../settingsTypes"; +} from "@settings/components/settingsTypes"; type ShortcutItem = { label: string; diff 
--git a/src/features/settings/components/settingsTypes.ts b/src/features/settings/components/settingsTypes.ts index 921d93fec..3c17f8f81 100644 --- a/src/features/settings/components/settingsTypes.ts +++ b/src/features/settings/components/settingsTypes.ts @@ -5,9 +5,9 @@ import type { OrbitRunnerStatus, OrbitSignInPollResult, OrbitSignOutResult, -} from "../../../types"; +} from "@/types"; -export type SettingsSection = +type SettingsSection = | "projects" | "environments" | "display" diff --git a/src/features/settings/components/settingsViewConstants.ts b/src/features/settings/components/settingsViewConstants.ts index 4289f3177..9abb40dc7 100644 --- a/src/features/settings/components/settingsViewConstants.ts +++ b/src/features/settings/components/settingsViewConstants.ts @@ -1,4 +1,4 @@ -import type { AppSettings } from "../../../types"; +import type { AppSettings } from "@/types"; import { orbitConnectTest, orbitRunnerStart, @@ -7,7 +7,7 @@ import { orbitSignInPoll, orbitSignInStart, orbitSignOut, -} from "../../../services/tauri"; +} from "@services/tauri"; import type { CodexSection, OrbitServiceClient, diff --git a/src/features/settings/components/settingsViewHelpers.ts b/src/features/settings/components/settingsViewHelpers.ts index 00286ae88..ffdc926ea 100644 --- a/src/features/settings/components/settingsViewHelpers.ts +++ b/src/features/settings/components/settingsViewHelpers.ts @@ -6,7 +6,7 @@ import type { OrbitSignInPollResult, OrbitSignOutResult, WorkspaceInfo, -} from "../../../types"; +} from "@/types"; import type { OpenAppDraft, ShortcutDrafts } from "./settingsTypes"; import { SETTINGS_MOBILE_BREAKPOINT_PX } from "./settingsViewConstants"; diff --git a/src/features/settings/hooks/settingsSectionTypes.ts b/src/features/settings/hooks/settingsSectionTypes.ts new file mode 100644 index 000000000..7051af97f --- /dev/null +++ b/src/features/settings/hooks/settingsSectionTypes.ts @@ -0,0 +1,7 @@ +import type { WorkspaceInfo } from "@/types"; + +export type GroupedWorkspaces = Array<{ + id: string | null; + name: string; + workspaces: WorkspaceInfo[]; +}>; diff --git a/src/features/settings/hooks/useAppSettings.test.ts b/src/features/settings/hooks/useAppSettings.test.ts index bac9399c0..c6eb4221e 100644 --- a/src/features/settings/hooks/useAppSettings.test.ts +++ b/src/features/settings/hooks/useAppSettings.test.ts @@ -1,16 +1,16 @@ // @vitest-environment jsdom import { act, cleanup, renderHook, waitFor } from "@testing-library/react"; import { afterEach, beforeEach, describe, expect, it, vi } from "vitest"; -import type { AppSettings, CodexDoctorResult } from "../../../types"; +import type { AppSettings, CodexDoctorResult } from "@/types"; import { useAppSettings } from "./useAppSettings"; import { getAppSettings, runCodexDoctor, updateAppSettings, -} from "../../../services/tauri"; -import { UI_SCALE_DEFAULT, UI_SCALE_MAX } from "../../../utils/uiScale"; +} from "@services/tauri"; +import { UI_SCALE_DEFAULT, UI_SCALE_MAX } from "@utils/uiScale"; -vi.mock("../../../services/tauri", () => ({ +vi.mock("@services/tauri", () => ({ getAppSettings: vi.fn(), updateAppSettings: vi.fn(), runCodexDoctor: vi.fn(), diff --git a/src/features/settings/hooks/useAppSettings.ts b/src/features/settings/hooks/useAppSettings.ts index c9b4c7112..684d1e1a7 100644 --- a/src/features/settings/hooks/useAppSettings.ts +++ b/src/features/settings/hooks/useAppSettings.ts @@ -1,23 +1,23 @@ import { useCallback, useEffect, useMemo, useState } from "react"; -import type { AppSettings } from "../../../types"; 
-import { getAppSettings, runCodexDoctor, updateAppSettings } from "../../../services/tauri"; -import { clampUiScale, UI_SCALE_DEFAULT } from "../../../utils/uiScale"; +import type { AppSettings } from "@/types"; +import { getAppSettings, runCodexDoctor, updateAppSettings } from "@services/tauri"; +import { clampUiScale, UI_SCALE_DEFAULT } from "@utils/uiScale"; import { DEFAULT_CODE_FONT_FAMILY, DEFAULT_UI_FONT_FAMILY, CODE_FONT_SIZE_DEFAULT, clampCodeFontSize, normalizeFontFamily, -} from "../../../utils/fonts"; +} from "@utils/fonts"; import { DEFAULT_OPEN_APP_ID, DEFAULT_OPEN_APP_TARGETS, OPEN_APP_STORAGE_KEY, -} from "../../app/constants"; -import { normalizeOpenAppTargets } from "../../app/utils/openApp"; -import { getDefaultInterruptShortcut, isMacPlatform } from "../../../utils/shortcuts"; -import { isMobilePlatform } from "../../../utils/platformPaths"; -import { DEFAULT_COMMIT_MESSAGE_PROMPT } from "../../../utils/commitMessagePrompt"; +} from "@app/constants"; +import { normalizeOpenAppTargets } from "@app/utils/openApp"; +import { getDefaultInterruptShortcut, isMacPlatform } from "@utils/shortcuts"; +import { isMobilePlatform } from "@utils/platformPaths"; +import { DEFAULT_COMMIT_MESSAGE_PROMPT } from "@utils/commitMessagePrompt"; const allowedThemes = new Set(["system", "light", "dark", "dim"]); const allowedPersonality = new Set(["friendly", "pragmatic"]); diff --git a/src/features/settings/hooks/useGlobalAgentsMd.ts b/src/features/settings/hooks/useGlobalAgentsMd.ts index aaace7d45..5eb44e3e0 100644 --- a/src/features/settings/hooks/useGlobalAgentsMd.ts +++ b/src/features/settings/hooks/useGlobalAgentsMd.ts @@ -1,5 +1,5 @@ -import { readGlobalAgentsMd, writeGlobalAgentsMd } from "../../../services/tauri"; -import { useFileEditor } from "../../shared/hooks/useFileEditor"; +import { readGlobalAgentsMd, writeGlobalAgentsMd } from "@services/tauri"; +import { useFileEditor } from "@/features/shared/hooks/useFileEditor"; export function useGlobalAgentsMd() { return useFileEditor({ diff --git a/src/features/settings/hooks/useGlobalCodexConfigToml.ts b/src/features/settings/hooks/useGlobalCodexConfigToml.ts index 228099cbf..f6743ef66 100644 --- a/src/features/settings/hooks/useGlobalCodexConfigToml.ts +++ b/src/features/settings/hooks/useGlobalCodexConfigToml.ts @@ -1,5 +1,5 @@ -import { readGlobalCodexConfigToml, writeGlobalCodexConfigToml } from "../../../services/tauri"; -import { useFileEditor } from "../../shared/hooks/useFileEditor"; +import { readGlobalCodexConfigToml, writeGlobalCodexConfigToml } from "@services/tauri"; +import { useFileEditor } from "@/features/shared/hooks/useFileEditor"; export function useGlobalCodexConfigToml() { return useFileEditor({ diff --git a/src/features/settings/hooks/useSettingsCodexSection.ts b/src/features/settings/hooks/useSettingsCodexSection.ts new file mode 100644 index 000000000..d755c4303 --- /dev/null +++ b/src/features/settings/hooks/useSettingsCodexSection.ts @@ -0,0 +1,355 @@ +import { useEffect, useState } from "react"; +import type { Dispatch, SetStateAction } from "react"; +import { open } from "@tauri-apps/plugin-dialog"; +import type { + AppSettings, + CodexDoctorResult, + CodexUpdateResult, + WorkspaceSettings, + WorkspaceInfo, +} from "@/types"; +import { useGlobalAgentsMd } from "./useGlobalAgentsMd"; +import { useGlobalCodexConfigToml } from "./useGlobalCodexConfigToml"; +import { useSettingsDefaultModels } from "./useSettingsDefaultModels"; +import { + buildEditorContentMeta, + buildWorkspaceOverrideDrafts, +} from 
"@settings/components/settingsViewHelpers"; + +type UseSettingsCodexSectionArgs = { + appSettings: AppSettings; + projects: WorkspaceInfo[]; + onUpdateAppSettings: (next: AppSettings) => Promise; + onRunDoctor: ( + codexBin: string | null, + codexArgs: string | null, + ) => Promise; + onRunCodexUpdate?: ( + codexBin: string | null, + codexArgs: string | null, + ) => Promise; + onUpdateWorkspaceCodexBin: (id: string, codexBin: string | null) => Promise; + onUpdateWorkspaceSettings: ( + id: string, + settings: Partial, + ) => Promise; +}; + +export type SettingsCodexSectionProps = { + appSettings: AppSettings; + onUpdateAppSettings: (next: AppSettings) => Promise; + defaultModels: ReturnType["models"]; + defaultModelsLoading: boolean; + defaultModelsError: string | null; + defaultModelsConnectedWorkspaceCount: number; + onRefreshDefaultModels: () => void; + codexPathDraft: string; + codexArgsDraft: string; + codexDirty: boolean; + isSavingSettings: boolean; + doctorState: { + status: "idle" | "running" | "done"; + result: CodexDoctorResult | null; + }; + codexUpdateState: { + status: "idle" | "running" | "done"; + result: CodexUpdateResult | null; + }; + globalAgentsMeta: string; + globalAgentsError: string | null; + globalAgentsContent: string; + globalAgentsLoading: boolean; + globalAgentsRefreshDisabled: boolean; + globalAgentsSaveDisabled: boolean; + globalAgentsSaveLabel: string; + globalConfigMeta: string; + globalConfigError: string | null; + globalConfigContent: string; + globalConfigLoading: boolean; + globalConfigRefreshDisabled: boolean; + globalConfigSaveDisabled: boolean; + globalConfigSaveLabel: string; + projects: WorkspaceInfo[]; + codexBinOverrideDrafts: Record; + codexHomeOverrideDrafts: Record; + codexArgsOverrideDrafts: Record; + onSetCodexPathDraft: Dispatch>; + onSetCodexArgsDraft: Dispatch>; + onSetGlobalAgentsContent: (value: string) => void; + onSetGlobalConfigContent: (value: string) => void; + onSetCodexBinOverrideDrafts: Dispatch>>; + onSetCodexHomeOverrideDrafts: Dispatch>>; + onSetCodexArgsOverrideDrafts: Dispatch>>; + onBrowseCodex: () => Promise; + onSaveCodexSettings: () => Promise; + onRunDoctor: () => Promise; + onRunCodexUpdate: () => Promise; + onRefreshGlobalAgents: () => void; + onSaveGlobalAgents: () => void; + onRefreshGlobalConfig: () => void; + onSaveGlobalConfig: () => void; + onUpdateWorkspaceCodexBin: (id: string, codexBin: string | null) => Promise; + onUpdateWorkspaceSettings: ( + id: string, + settings: Partial, + ) => Promise; +}; + +export const useSettingsCodexSection = ({ + appSettings, + projects, + onUpdateAppSettings, + onRunDoctor, + onRunCodexUpdate, + onUpdateWorkspaceCodexBin, + onUpdateWorkspaceSettings, +}: UseSettingsCodexSectionArgs): SettingsCodexSectionProps => { + const [codexPathDraft, setCodexPathDraft] = useState(appSettings.codexBin ?? ""); + const [codexArgsDraft, setCodexArgsDraft] = useState(appSettings.codexArgs ?? 
""); + const [codexBinOverrideDrafts, setCodexBinOverrideDrafts] = useState< + Record + >({}); + const [codexHomeOverrideDrafts, setCodexHomeOverrideDrafts] = useState< + Record + >({}); + const [codexArgsOverrideDrafts, setCodexArgsOverrideDrafts] = useState< + Record + >({}); + const [isSavingSettings, setIsSavingSettings] = useState(false); + const [doctorState, setDoctorState] = useState<{ + status: "idle" | "running" | "done"; + result: CodexDoctorResult | null; + }>({ status: "idle", result: null }); + const [codexUpdateState, setCodexUpdateState] = useState<{ + status: "idle" | "running" | "done"; + result: CodexUpdateResult | null; + }>({ status: "idle", result: null }); + + const { + models: defaultModels, + isLoading: defaultModelsLoading, + error: defaultModelsError, + connectedWorkspaceCount: defaultModelsConnectedWorkspaceCount, + refresh: refreshDefaultModels, + } = useSettingsDefaultModels(projects); + + const { + content: globalAgentsContent, + exists: globalAgentsExists, + truncated: globalAgentsTruncated, + isLoading: globalAgentsLoading, + isSaving: globalAgentsSaving, + error: globalAgentsError, + isDirty: globalAgentsDirty, + setContent: setGlobalAgentsContent, + refresh: refreshGlobalAgents, + save: saveGlobalAgents, + } = useGlobalAgentsMd(); + + const { + content: globalConfigContent, + exists: globalConfigExists, + truncated: globalConfigTruncated, + isLoading: globalConfigLoading, + isSaving: globalConfigSaving, + error: globalConfigError, + isDirty: globalConfigDirty, + setContent: setGlobalConfigContent, + refresh: refreshGlobalConfig, + save: saveGlobalConfig, + } = useGlobalCodexConfigToml(); + + const globalAgentsEditorMeta = buildEditorContentMeta({ + isLoading: globalAgentsLoading, + isSaving: globalAgentsSaving, + exists: globalAgentsExists, + truncated: globalAgentsTruncated, + isDirty: globalAgentsDirty, + }); + + const globalConfigEditorMeta = buildEditorContentMeta({ + isLoading: globalConfigLoading, + isSaving: globalConfigSaving, + exists: globalConfigExists, + truncated: globalConfigTruncated, + isDirty: globalConfigDirty, + }); + + useEffect(() => { + setCodexPathDraft(appSettings.codexBin ?? ""); + }, [appSettings.codexBin]); + + useEffect(() => { + setCodexArgsDraft(appSettings.codexArgs ?? ""); + }, [appSettings.codexArgs]); + + useEffect(() => { + setCodexBinOverrideDrafts((prev) => + buildWorkspaceOverrideDrafts(projects, prev, (workspace) => workspace.codex_bin ?? null), + ); + setCodexHomeOverrideDrafts((prev) => + buildWorkspaceOverrideDrafts( + projects, + prev, + (workspace) => workspace.settings.codexHome ?? null, + ), + ); + setCodexArgsOverrideDrafts((prev) => + buildWorkspaceOverrideDrafts( + projects, + prev, + (workspace) => workspace.settings.codexArgs ?? null, + ), + ); + }, [projects]); + + const nextCodexBin = codexPathDraft.trim() ? codexPathDraft.trim() : null; + const nextCodexArgs = codexArgsDraft.trim() ? codexArgsDraft.trim() : null; + const codexDirty = + nextCodexBin !== (appSettings.codexBin ?? null) || + nextCodexArgs !== (appSettings.codexArgs ?? 
null); + + const handleBrowseCodex = async () => { + const selection = await open({ multiple: false, directory: false }); + if (!selection || Array.isArray(selection)) { + return; + } + setCodexPathDraft(selection); + }; + + const handleSaveCodexSettings = async () => { + setIsSavingSettings(true); + try { + await onUpdateAppSettings({ + ...appSettings, + codexBin: nextCodexBin, + codexArgs: nextCodexArgs, + }); + } finally { + setIsSavingSettings(false); + } + }; + + const handleRunDoctor = async () => { + setDoctorState({ status: "running", result: null }); + try { + const result = await onRunDoctor(nextCodexBin, nextCodexArgs); + setDoctorState({ status: "done", result }); + } catch (error) { + setDoctorState({ + status: "done", + result: { + ok: false, + codexBin: nextCodexBin, + version: null, + appServerOk: false, + details: error instanceof Error ? error.message : String(error), + path: null, + nodeOk: false, + nodeVersion: null, + nodeDetails: null, + }, + }); + } + }; + + const handleRunCodexUpdate = async () => { + setCodexUpdateState({ status: "running", result: null }); + try { + if (!onRunCodexUpdate) { + setCodexUpdateState({ + status: "done", + result: { + ok: false, + method: "unknown", + package: null, + beforeVersion: null, + afterVersion: null, + upgraded: false, + output: null, + details: "Codex updates are not available in this build.", + }, + }); + return; + } + + const result = await onRunCodexUpdate(nextCodexBin, nextCodexArgs); + setCodexUpdateState({ status: "done", result }); + } catch (error) { + setCodexUpdateState({ + status: "done", + result: { + ok: false, + method: "unknown", + package: null, + beforeVersion: null, + afterVersion: null, + upgraded: false, + output: null, + details: error instanceof Error ? error.message : String(error), + }, + }); + } + }; + + return { + appSettings, + onUpdateAppSettings, + defaultModels, + defaultModelsLoading, + defaultModelsError, + defaultModelsConnectedWorkspaceCount, + onRefreshDefaultModels: () => { + void refreshDefaultModels(); + }, + codexPathDraft, + codexArgsDraft, + codexDirty, + isSavingSettings, + doctorState, + codexUpdateState, + globalAgentsMeta: globalAgentsEditorMeta.meta, + globalAgentsError, + globalAgentsContent, + globalAgentsLoading, + globalAgentsRefreshDisabled: globalAgentsEditorMeta.refreshDisabled, + globalAgentsSaveDisabled: globalAgentsEditorMeta.saveDisabled, + globalAgentsSaveLabel: globalAgentsEditorMeta.saveLabel, + globalConfigMeta: globalConfigEditorMeta.meta, + globalConfigError, + globalConfigContent, + globalConfigLoading, + globalConfigRefreshDisabled: globalConfigEditorMeta.refreshDisabled, + globalConfigSaveDisabled: globalConfigEditorMeta.saveDisabled, + globalConfigSaveLabel: globalConfigEditorMeta.saveLabel, + projects, + codexBinOverrideDrafts, + codexHomeOverrideDrafts, + codexArgsOverrideDrafts, + onSetCodexPathDraft: setCodexPathDraft, + onSetCodexArgsDraft: setCodexArgsDraft, + onSetGlobalAgentsContent: setGlobalAgentsContent, + onSetGlobalConfigContent: setGlobalConfigContent, + onSetCodexBinOverrideDrafts: setCodexBinOverrideDrafts, + onSetCodexHomeOverrideDrafts: setCodexHomeOverrideDrafts, + onSetCodexArgsOverrideDrafts: setCodexArgsOverrideDrafts, + onBrowseCodex: handleBrowseCodex, + onSaveCodexSettings: handleSaveCodexSettings, + onRunDoctor: handleRunDoctor, + onRunCodexUpdate: handleRunCodexUpdate, + onRefreshGlobalAgents: () => { + void refreshGlobalAgents(); + }, + onSaveGlobalAgents: () => { + void saveGlobalAgents(); + }, + onRefreshGlobalConfig: () => { + 
void refreshGlobalConfig(); + }, + onSaveGlobalConfig: () => { + void saveGlobalConfig(); + }, + onUpdateWorkspaceCodexBin, + onUpdateWorkspaceSettings, + }; +}; diff --git a/src/features/settings/hooks/useSettingsDefaultModels.test.tsx b/src/features/settings/hooks/useSettingsDefaultModels.test.tsx index 36640a518..99e6c0d7c 100644 --- a/src/features/settings/hooks/useSettingsDefaultModels.test.tsx +++ b/src/features/settings/hooks/useSettingsDefaultModels.test.tsx @@ -1,11 +1,11 @@ // @vitest-environment jsdom import { act, renderHook, waitFor } from "@testing-library/react"; import { afterEach, describe, expect, it, vi } from "vitest"; -import type { WorkspaceInfo } from "../../../types"; -import { getModelList } from "../../../services/tauri"; +import type { WorkspaceInfo } from "@/types"; +import { getModelList } from "@services/tauri"; import { useSettingsDefaultModels } from "./useSettingsDefaultModels"; -vi.mock("../../../services/tauri", () => ({ +vi.mock("@services/tauri", () => ({ getModelList: vi.fn(), })); diff --git a/src/features/settings/hooks/useSettingsDefaultModels.ts b/src/features/settings/hooks/useSettingsDefaultModels.ts index b88d25c5d..6ac0a218c 100644 --- a/src/features/settings/hooks/useSettingsDefaultModels.ts +++ b/src/features/settings/hooks/useSettingsDefaultModels.ts @@ -1,7 +1,7 @@ import { useCallback, useEffect, useMemo, useRef, useState } from "react"; -import type { ModelOption, WorkspaceInfo } from "../../../types"; -import { getModelList } from "../../../services/tauri"; -import { parseModelListResponse } from "../../models/utils/modelListResponse"; +import type { ModelOption, WorkspaceInfo } from "@/types"; +import { getModelList } from "@services/tauri"; +import { parseModelListResponse } from "@/features/models/utils/modelListResponse"; type SettingsDefaultModelsState = { models: ModelOption[]; diff --git a/src/features/settings/hooks/useSettingsDisplaySection.ts b/src/features/settings/hooks/useSettingsDisplaySection.ts new file mode 100644 index 000000000..6af03244d --- /dev/null +++ b/src/features/settings/hooks/useSettingsDisplaySection.ts @@ -0,0 +1,173 @@ +import { useEffect, useState } from "react"; +import type { Dispatch, SetStateAction } from "react"; +import type { AppSettings } from "@/types"; +import { clampUiScale } from "@utils/uiScale"; +import { + DEFAULT_CODE_FONT_FAMILY, + DEFAULT_UI_FONT_FAMILY, + clampCodeFontSize, + normalizeFontFamily, +} from "@utils/fonts"; + +type UseSettingsDisplaySectionArgs = { + appSettings: AppSettings; + reduceTransparency: boolean; + onToggleTransparency: (value: boolean) => void; + onUpdateAppSettings: (next: AppSettings) => Promise<void>; + scaleShortcutTitle: string; + scaleShortcutText: string; + onTestNotificationSound: () => void; + onTestSystemNotification: () => void; +}; + +export type SettingsDisplaySectionProps = { + appSettings: AppSettings; + reduceTransparency: boolean; + scaleShortcutTitle: string; + scaleShortcutText: string; + scaleDraft: string; + uiFontDraft: string; + codeFontDraft: string; + codeFontSizeDraft: number; + onUpdateAppSettings: (next: AppSettings) => Promise<void>; + onToggleTransparency: (value: boolean) => void; + onSetScaleDraft: Dispatch<SetStateAction<string>>; + onCommitScale: () => Promise<void>; + onResetScale: () => Promise<void>; + onSetUiFontDraft: Dispatch<SetStateAction<string>>; + onCommitUiFont: () => Promise<void>; + onSetCodeFontDraft: Dispatch<SetStateAction<string>>; + onCommitCodeFont: () => Promise<void>; + onSetCodeFontSizeDraft: Dispatch<SetStateAction<number>>; + onCommitCodeFontSize: (nextSize: number) => Promise<void>; + onTestNotificationSound: () => void; + 
onTestSystemNotification: () => void; +}; + +export const useSettingsDisplaySection = ({ + appSettings, + reduceTransparency, + onToggleTransparency, + onUpdateAppSettings, + scaleShortcutTitle, + scaleShortcutText, + onTestNotificationSound, + onTestSystemNotification, +}: UseSettingsDisplaySectionArgs): SettingsDisplaySectionProps => { + const [scaleDraft, setScaleDraft] = useState( + `${Math.round(clampUiScale(appSettings.uiScale) * 100)}%`, + ); + const [uiFontDraft, setUiFontDraft] = useState(appSettings.uiFontFamily); + const [codeFontDraft, setCodeFontDraft] = useState(appSettings.codeFontFamily); + const [codeFontSizeDraft, setCodeFontSizeDraft] = useState(appSettings.codeFontSize); + + useEffect(() => { + setScaleDraft(`${Math.round(clampUiScale(appSettings.uiScale) * 100)}%`); + }, [appSettings.uiScale]); + + useEffect(() => { + setUiFontDraft(appSettings.uiFontFamily); + }, [appSettings.uiFontFamily]); + + useEffect(() => { + setCodeFontDraft(appSettings.codeFontFamily); + }, [appSettings.codeFontFamily]); + + useEffect(() => { + setCodeFontSizeDraft(appSettings.codeFontSize); + }, [appSettings.codeFontSize]); + + const trimmedScale = scaleDraft.trim(); + const parsedPercent = trimmedScale + ? Number(trimmedScale.replace("%", "")) + : Number.NaN; + const parsedScale = Number.isFinite(parsedPercent) ? parsedPercent / 100 : null; + + const handleCommitScale = async () => { + if (parsedScale === null) { + setScaleDraft(`${Math.round(clampUiScale(appSettings.uiScale) * 100)}%`); + return; + } + const nextScale = clampUiScale(parsedScale); + setScaleDraft(`${Math.round(nextScale * 100)}%`); + if (nextScale === appSettings.uiScale) { + return; + } + await onUpdateAppSettings({ + ...appSettings, + uiScale: nextScale, + }); + }; + + const handleResetScale = async () => { + if (appSettings.uiScale === 1) { + setScaleDraft("100%"); + return; + } + setScaleDraft("100%"); + await onUpdateAppSettings({ + ...appSettings, + uiScale: 1, + }); + }; + + const handleCommitUiFont = async () => { + const nextFont = normalizeFontFamily(uiFontDraft, DEFAULT_UI_FONT_FAMILY); + setUiFontDraft(nextFont); + if (nextFont === appSettings.uiFontFamily) { + return; + } + await onUpdateAppSettings({ + ...appSettings, + uiFontFamily: nextFont, + }); + }; + + const handleCommitCodeFont = async () => { + const nextFont = normalizeFontFamily(codeFontDraft, DEFAULT_CODE_FONT_FAMILY); + setCodeFontDraft(nextFont); + if (nextFont === appSettings.codeFontFamily) { + return; + } + await onUpdateAppSettings({ + ...appSettings, + codeFontFamily: nextFont, + }); + }; + + const handleCommitCodeFontSize = async (nextSize: number) => { + const clampedSize = clampCodeFontSize(nextSize); + setCodeFontSizeDraft(clampedSize); + if (clampedSize === appSettings.codeFontSize) { + return; + } + await onUpdateAppSettings({ + ...appSettings, + codeFontSize: clampedSize, + }); + }; + + return { + appSettings, + reduceTransparency, + scaleShortcutTitle, + scaleShortcutText, + scaleDraft, + uiFontDraft, + codeFontDraft, + codeFontSizeDraft, + onUpdateAppSettings, + onToggleTransparency, + onSetScaleDraft: setScaleDraft, + onCommitScale: handleCommitScale, + onResetScale: handleResetScale, + onSetUiFontDraft: setUiFontDraft, + onCommitUiFont: handleCommitUiFont, + onSetCodeFontDraft: setCodeFontDraft, + onCommitCodeFont: handleCommitCodeFont, + onSetCodeFontSizeDraft: setCodeFontSizeDraft, + onCommitCodeFontSize: handleCommitCodeFontSize, + onTestNotificationSound, + onTestSystemNotification, + }; +}; diff --git 
a/src/features/settings/hooks/useSettingsEnvironmentsSection.ts b/src/features/settings/hooks/useSettingsEnvironmentsSection.ts new file mode 100644 index 000000000..118ba5251 --- /dev/null +++ b/src/features/settings/hooks/useSettingsEnvironmentsSection.ts @@ -0,0 +1,141 @@ +import { useEffect, useMemo, useState } from "react"; +import type { Dispatch, SetStateAction } from "react"; +import type { WorkspaceInfo } from "@/types"; +import { normalizeWorktreeSetupScript } from "@settings/components/settingsViewHelpers"; + +type UseSettingsEnvironmentsSectionArgs = { + mainWorkspaces: WorkspaceInfo[]; + onUpdateWorkspaceSettings: ( + id: string, + settings: Partial<WorkspaceInfo["settings"]>, + ) => Promise<void>; +}; + +export type SettingsEnvironmentsSectionProps = { + mainWorkspaces: WorkspaceInfo[]; + environmentWorkspace: WorkspaceInfo | null; + environmentSaving: boolean; + environmentError: string | null; + environmentDraftScript: string; + environmentSavedScript: string | null; + environmentDirty: boolean; + onSetEnvironmentWorkspaceId: Dispatch<SetStateAction<string | null>>; + onSetEnvironmentDraftScript: Dispatch<SetStateAction<string>>; + onSaveEnvironmentSetup: () => Promise<void>; +}; + +export const useSettingsEnvironmentsSection = ({ + mainWorkspaces, + onUpdateWorkspaceSettings, +}: UseSettingsEnvironmentsSectionArgs): SettingsEnvironmentsSectionProps => { + const [environmentWorkspaceId, setEnvironmentWorkspaceId] = useState<string | null>( + null, + ); + const [environmentDraftScript, setEnvironmentDraftScript] = useState(""); + const [environmentSavedScript, setEnvironmentSavedScript] = useState<string | null>( + null, + ); + const [environmentLoadedWorkspaceId, setEnvironmentLoadedWorkspaceId] = useState< + string | null + >(null); + const [environmentError, setEnvironmentError] = useState<string | null>(null); + const [environmentSaving, setEnvironmentSaving] = useState(false); + + const environmentWorkspace = useMemo(() => { + if (mainWorkspaces.length === 0) { + return null; + } + if (environmentWorkspaceId) { + const found = mainWorkspaces.find((workspace) => workspace.id === environmentWorkspaceId); + if (found) { + return found; + } + } + return mainWorkspaces[0] ?? null; + }, [environmentWorkspaceId, mainWorkspaces]); + + const environmentSavedScriptFromWorkspace = useMemo(() => { + return normalizeWorktreeSetupScript(environmentWorkspace?.settings.worktreeSetupScript); + }, [environmentWorkspace?.settings.worktreeSetupScript]); + + const environmentDraftNormalized = useMemo(() => { + return normalizeWorktreeSetupScript(environmentDraftScript); + }, [environmentDraftScript]); + + const environmentDirty = environmentDraftNormalized !== environmentSavedScript; + + useEffect(() => { + if (!environmentWorkspace) { + setEnvironmentWorkspaceId(null); + setEnvironmentLoadedWorkspaceId(null); + setEnvironmentSavedScript(null); + setEnvironmentDraftScript(""); + setEnvironmentError(null); + setEnvironmentSaving(false); + return; + } + + if (environmentWorkspaceId !== environmentWorkspace.id) { + setEnvironmentWorkspaceId(environmentWorkspace.id); + } + }, [environmentWorkspace, environmentWorkspaceId]); + + useEffect(() => { + if (!environmentWorkspace) { + return; + } + + if (environmentLoadedWorkspaceId !== environmentWorkspace.id) { + setEnvironmentLoadedWorkspaceId(environmentWorkspace.id); + setEnvironmentSavedScript(environmentSavedScriptFromWorkspace); + setEnvironmentDraftScript(environmentSavedScriptFromWorkspace ?? 
""); + setEnvironmentError(null); + return; + } + + if (!environmentDirty && environmentSavedScript !== environmentSavedScriptFromWorkspace) { + setEnvironmentSavedScript(environmentSavedScriptFromWorkspace); + setEnvironmentDraftScript(environmentSavedScriptFromWorkspace ?? ""); + setEnvironmentError(null); + } + }, [ + environmentDirty, + environmentLoadedWorkspaceId, + environmentSavedScript, + environmentSavedScriptFromWorkspace, + environmentWorkspace, + ]); + + const handleSaveEnvironmentSetup = async () => { + if (!environmentWorkspace || environmentSaving) { + return; + } + const nextScript = environmentDraftNormalized; + setEnvironmentSaving(true); + setEnvironmentError(null); + try { + await onUpdateWorkspaceSettings(environmentWorkspace.id, { + worktreeSetupScript: nextScript, + }); + setEnvironmentSavedScript(nextScript); + setEnvironmentDraftScript(nextScript ?? ""); + } catch (error) { + setEnvironmentError(error instanceof Error ? error.message : String(error)); + } finally { + setEnvironmentSaving(false); + } + }; + + return { + mainWorkspaces, + environmentWorkspace, + environmentSaving, + environmentError, + environmentDraftScript, + environmentSavedScript, + environmentDirty, + onSetEnvironmentWorkspaceId: setEnvironmentWorkspaceId, + onSetEnvironmentDraftScript: setEnvironmentDraftScript, + onSaveEnvironmentSetup: handleSaveEnvironmentSetup, + }; +}; diff --git a/src/features/settings/hooks/useSettingsFeaturesSection.ts b/src/features/settings/hooks/useSettingsFeaturesSection.ts new file mode 100644 index 000000000..f93c71c7f --- /dev/null +++ b/src/features/settings/hooks/useSettingsFeaturesSection.ts @@ -0,0 +1,48 @@ +import { useCallback, useState } from "react"; +import { revealItemInDir } from "@tauri-apps/plugin-opener"; +import type { AppSettings } from "@/types"; +import { getCodexConfigPath } from "@services/tauri"; + +type UseSettingsFeaturesSectionArgs = { + appSettings: AppSettings; + hasCodexHomeOverrides: boolean; + onUpdateAppSettings: (next: AppSettings) => Promise; +}; + +export type SettingsFeaturesSectionProps = { + appSettings: AppSettings; + hasCodexHomeOverrides: boolean; + openConfigError: string | null; + onOpenConfig: () => void; + onUpdateAppSettings: (next: AppSettings) => Promise; +}; + +export const useSettingsFeaturesSection = ({ + appSettings, + hasCodexHomeOverrides, + onUpdateAppSettings, +}: UseSettingsFeaturesSectionArgs): SettingsFeaturesSectionProps => { + const [openConfigError, setOpenConfigError] = useState(null); + + const handleOpenConfig = useCallback(async () => { + setOpenConfigError(null); + try { + const configPath = await getCodexConfigPath(); + await revealItemInDir(configPath); + } catch (error) { + setOpenConfigError( + error instanceof Error ? 
error.message : "Unable to open config.", + ); + } + }, []); + + return { + appSettings, + hasCodexHomeOverrides, + openConfigError, + onOpenConfig: () => { + void handleOpenConfig(); + }, + onUpdateAppSettings, + }; +}; diff --git a/src/features/settings/hooks/useSettingsGitSection.ts b/src/features/settings/hooks/useSettingsGitSection.ts new file mode 100644 index 000000000..1713ed1e4 --- /dev/null +++ b/src/features/settings/hooks/useSettingsGitSection.ts @@ -0,0 +1,84 @@ +import { useCallback, useEffect, useState } from "react"; +import type { AppSettings } from "@/types"; +import { DEFAULT_COMMIT_MESSAGE_PROMPT } from "@utils/commitMessagePrompt"; + +type UseSettingsGitSectionArgs = { + appSettings: AppSettings; + onUpdateAppSettings: (next: AppSettings) => Promise; +}; + +export type SettingsGitSectionProps = { + appSettings: AppSettings; + onUpdateAppSettings: (next: AppSettings) => Promise; + commitMessagePromptDraft: string; + commitMessagePromptDirty: boolean; + commitMessagePromptSaving: boolean; + onSetCommitMessagePromptDraft: (value: string) => void; + onSaveCommitMessagePrompt: () => Promise; + onResetCommitMessagePrompt: () => Promise; +}; + +export const useSettingsGitSection = ({ + appSettings, + onUpdateAppSettings, +}: UseSettingsGitSectionArgs): SettingsGitSectionProps => { + const [commitMessagePromptDraft, setCommitMessagePromptDraft] = useState( + appSettings.commitMessagePrompt, + ); + const [commitMessagePromptSaving, setCommitMessagePromptSaving] = useState(false); + + useEffect(() => { + setCommitMessagePromptDraft(appSettings.commitMessagePrompt); + }, [appSettings.commitMessagePrompt]); + + const commitMessagePromptDirty = + commitMessagePromptDraft !== appSettings.commitMessagePrompt; + + const handleSaveCommitMessagePrompt = useCallback(async () => { + if (commitMessagePromptSaving || !commitMessagePromptDirty) { + return; + } + setCommitMessagePromptSaving(true); + try { + await onUpdateAppSettings({ + ...appSettings, + commitMessagePrompt: commitMessagePromptDraft, + }); + } finally { + setCommitMessagePromptSaving(false); + } + }, [ + appSettings, + commitMessagePromptDirty, + commitMessagePromptDraft, + commitMessagePromptSaving, + onUpdateAppSettings, + ]); + + const handleResetCommitMessagePrompt = useCallback(async () => { + if (commitMessagePromptSaving) { + return; + } + setCommitMessagePromptDraft(DEFAULT_COMMIT_MESSAGE_PROMPT); + setCommitMessagePromptSaving(true); + try { + await onUpdateAppSettings({ + ...appSettings, + commitMessagePrompt: DEFAULT_COMMIT_MESSAGE_PROMPT, + }); + } finally { + setCommitMessagePromptSaving(false); + } + }, [appSettings, commitMessagePromptSaving, onUpdateAppSettings]); + + return { + appSettings, + onUpdateAppSettings, + commitMessagePromptDraft, + commitMessagePromptDirty, + commitMessagePromptSaving, + onSetCommitMessagePromptDraft: setCommitMessagePromptDraft, + onSaveCommitMessagePrompt: handleSaveCommitMessagePrompt, + onResetCommitMessagePrompt: handleResetCommitMessagePrompt, + }; +}; diff --git a/src/features/settings/hooks/useSettingsOpenAppDrafts.ts b/src/features/settings/hooks/useSettingsOpenAppDrafts.ts index bf35bce70..25249a261 100644 --- a/src/features/settings/hooks/useSettingsOpenAppDrafts.ts +++ b/src/features/settings/hooks/useSettingsOpenAppDrafts.ts @@ -1,14 +1,14 @@ import { useCallback, useEffect, useState } from "react"; -import type { AppSettings, OpenAppTarget } from "../../../types"; -import { DEFAULT_OPEN_APP_ID, OPEN_APP_STORAGE_KEY } from "../../app/constants"; -import type { 
OpenAppDraft } from "../components/settingsTypes"; +import type { AppSettings, OpenAppTarget } from "@/types"; +import { DEFAULT_OPEN_APP_ID, OPEN_APP_STORAGE_KEY } from "@app/constants"; +import type { OpenAppDraft } from "@settings/components/settingsTypes"; import { buildOpenAppDrafts, createOpenAppId, isOpenAppDraftComplete, isOpenAppTargetComplete, normalizeOpenAppTargets, -} from "../components/settingsViewHelpers"; +} from "@settings/components/settingsViewHelpers"; type UseSettingsOpenAppDraftsParams = { appSettings: AppSettings; diff --git a/src/features/settings/hooks/useSettingsProjectsSection.ts b/src/features/settings/hooks/useSettingsProjectsSection.ts new file mode 100644 index 000000000..a040883ab --- /dev/null +++ b/src/features/settings/hooks/useSettingsProjectsSection.ts @@ -0,0 +1,194 @@ +import { useEffect, useMemo, useState } from "react"; +import type { Dispatch, SetStateAction } from "react"; +import { ask, open } from "@tauri-apps/plugin-dialog"; +import type { AppSettings, WorkspaceGroup, WorkspaceInfo } from "@/types"; +import type { GroupedWorkspaces } from "./settingsSectionTypes"; + +type UseSettingsProjectsSectionArgs = { + appSettings: AppSettings; + workspaceGroups: WorkspaceGroup[]; + groupedWorkspaces: GroupedWorkspaces; + ungroupedLabel: string; + projects: WorkspaceInfo[]; + onUpdateAppSettings: (next: AppSettings) => Promise<void>; + onMoveWorkspace: (id: string, direction: "up" | "down") => void; + onDeleteWorkspace: (id: string) => void; + onCreateWorkspaceGroup: (name: string) => Promise; + onRenameWorkspaceGroup: (id: string, name: string) => Promise<void>; + onMoveWorkspaceGroup: (id: string, direction: "up" | "down") => Promise<void>; + onDeleteWorkspaceGroup: (id: string) => Promise<void>; + onAssignWorkspaceGroup: ( + workspaceId: string, + groupId: string | null, + ) => Promise<void>; +}; + +export type SettingsProjectsSectionProps = { + workspaceGroups: WorkspaceGroup[]; + groupedWorkspaces: GroupedWorkspaces; + ungroupedLabel: string; + groupDrafts: Record<string, string>; + newGroupName: string; + groupError: string | null; + projects: WorkspaceInfo[]; + canCreateGroup: boolean; + onSetNewGroupName: Dispatch<SetStateAction<string>>; + onSetGroupDrafts: Dispatch<SetStateAction<Record<string, string>>>; + onCreateGroup: () => Promise<void>; + onRenameGroup: (group: WorkspaceGroup) => Promise<void>; + onMoveWorkspaceGroup: (id: string, direction: "up" | "down") => Promise<void>; + onDeleteGroup: (group: WorkspaceGroup) => Promise<void>; + onChooseGroupCopiesFolder: (group: WorkspaceGroup) => Promise<void>; + onClearGroupCopiesFolder: (group: WorkspaceGroup) => Promise<void>; + onAssignWorkspaceGroup: ( + workspaceId: string, + groupId: string | null, + ) => Promise<void>; + onMoveWorkspace: (id: string, direction: "up" | "down") => void; + onDeleteWorkspace: (id: string) => void; +}; + +export const useSettingsProjectsSection = ({ + appSettings, + workspaceGroups, + groupedWorkspaces, + ungroupedLabel, + projects, + onUpdateAppSettings, + onMoveWorkspace, + onDeleteWorkspace, + onCreateWorkspaceGroup, + onRenameWorkspaceGroup, + onMoveWorkspaceGroup, + onDeleteWorkspaceGroup, + onAssignWorkspaceGroup, +}: UseSettingsProjectsSectionArgs): SettingsProjectsSectionProps => { + const [groupDrafts, setGroupDrafts] = useState<Record<string, string>>({}); + const [newGroupName, setNewGroupName] = useState(""); + const [groupError, setGroupError] = useState<string | null>(null); + + useEffect(() => { + setGroupDrafts((prev) => { + const next: Record<string, string> = {}; + workspaceGroups.forEach((group) => { + next[group.id] = prev[group.id] ?? 
group.name; + }); + return next; + }); + }, [workspaceGroups]); + + const trimmedGroupName = useMemo(() => newGroupName.trim(), [newGroupName]); + const canCreateGroup = Boolean(trimmedGroupName); + + const handleCreateGroup = async () => { + setGroupError(null); + try { + const created = await onCreateWorkspaceGroup(newGroupName); + if (created) { + setNewGroupName(""); + } + } catch (error) { + setGroupError(error instanceof Error ? error.message : String(error)); + } + }; + + const handleRenameGroup = async (group: WorkspaceGroup) => { + const draft = groupDrafts[group.id] ?? ""; + const trimmed = draft.trim(); + if (!trimmed || trimmed === group.name) { + setGroupDrafts((prev) => ({ + ...prev, + [group.id]: group.name, + })); + return; + } + setGroupError(null); + try { + await onRenameWorkspaceGroup(group.id, trimmed); + } catch (error) { + setGroupError(error instanceof Error ? error.message : String(error)); + setGroupDrafts((prev) => ({ + ...prev, + [group.id]: group.name, + })); + } + }; + + const updateGroupCopiesFolder = async ( + groupId: string, + copiesFolder: string | null, + ) => { + setGroupError(null); + try { + await onUpdateAppSettings({ + ...appSettings, + workspaceGroups: appSettings.workspaceGroups.map((entry) => + entry.id === groupId ? { ...entry, copiesFolder } : entry, + ), + }); + } catch (error) { + setGroupError(error instanceof Error ? error.message : String(error)); + } + }; + + const handleChooseGroupCopiesFolder = async (group: WorkspaceGroup) => { + const selection = await open({ multiple: false, directory: true }); + if (!selection || Array.isArray(selection)) { + return; + } + await updateGroupCopiesFolder(group.id, selection); + }; + + const handleClearGroupCopiesFolder = async (group: WorkspaceGroup) => { + if (!group.copiesFolder) { + return; + } + await updateGroupCopiesFolder(group.id, null); + }; + + const handleDeleteGroup = async (group: WorkspaceGroup) => { + const groupProjects = + groupedWorkspaces.find((entry) => entry.id === group.id)?.workspaces ?? []; + const detail = + groupProjects.length > 0 + ? `\n\nProjects in this group will move to "${ungroupedLabel}".` + : ""; + const confirmed = await ask(`Delete "${group.name}"?${detail}`, { + title: "Delete Group", + kind: "warning", + okLabel: "Delete", + cancelLabel: "Cancel", + }); + if (!confirmed) { + return; + } + setGroupError(null); + try { + await onDeleteWorkspaceGroup(group.id); + } catch (error) { + setGroupError(error instanceof Error ? 
error.message : String(error)); + } + }; + + return { + workspaceGroups, + groupedWorkspaces, + ungroupedLabel, + groupDrafts, + newGroupName, + groupError, + projects, + canCreateGroup, + onSetNewGroupName: setNewGroupName, + onSetGroupDrafts: setGroupDrafts, + onCreateGroup: handleCreateGroup, + onRenameGroup: handleRenameGroup, + onMoveWorkspaceGroup, + onDeleteGroup: handleDeleteGroup, + onChooseGroupCopiesFolder: handleChooseGroupCopiesFolder, + onClearGroupCopiesFolder: handleClearGroupCopiesFolder, + onAssignWorkspaceGroup, + onMoveWorkspace, + onDeleteWorkspace, + }; +}; diff --git a/src/features/settings/hooks/useSettingsServerSection.ts b/src/features/settings/hooks/useSettingsServerSection.ts new file mode 100644 index 000000000..100e17d22 --- /dev/null +++ b/src/features/settings/hooks/useSettingsServerSection.ts @@ -0,0 +1,693 @@ +import { useCallback, useEffect, useMemo, useRef, useState } from "react"; +import type { Dispatch, SetStateAction } from "react"; +import type { + AppSettings, + TailscaleDaemonCommandPreview, + TailscaleStatus, + TcpDaemonStatus, +} from "@/types"; +import { + listWorkspaces, + tailscaleDaemonCommandPreview as fetchTailscaleDaemonCommandPreview, + tailscaleDaemonStart, + tailscaleDaemonStatus, + tailscaleDaemonStop, + tailscaleStatus as fetchTailscaleStatus, +} from "@services/tauri"; +import { isMobilePlatform } from "@utils/platformPaths"; +import type { OrbitServiceClient } from "@settings/components/settingsTypes"; +import { + DEFAULT_REMOTE_HOST, + ORBIT_DEFAULT_POLL_INTERVAL_SECONDS, + ORBIT_MAX_INLINE_POLL_SECONDS, +} from "@settings/components/settingsViewConstants"; +import { + delay, + getOrbitStatusText, + normalizeOverrideValue, + type OrbitActionResult, +} from "@settings/components/settingsViewHelpers"; + +type UseSettingsServerSectionArgs = { + appSettings: AppSettings; + onUpdateAppSettings: (next: AppSettings) => Promise; + onMobileConnectSuccess?: () => Promise | void; + orbitServiceClient: OrbitServiceClient; +}; + +export type SettingsServerSectionProps = { + appSettings: AppSettings; + onUpdateAppSettings: (next: AppSettings) => Promise; + isMobilePlatform: boolean; + mobileConnectBusy: boolean; + mobileConnectStatusText: string | null; + mobileConnectStatusError: boolean; + remoteHostDraft: string; + remoteTokenDraft: string; + orbitWsUrlDraft: string; + orbitAuthUrlDraft: string; + orbitRunnerNameDraft: string; + orbitAccessClientIdDraft: string; + orbitAccessClientSecretRefDraft: string; + orbitStatusText: string | null; + orbitAuthCode: string | null; + orbitVerificationUrl: string | null; + orbitBusyAction: string | null; + tailscaleStatus: TailscaleStatus | null; + tailscaleStatusBusy: boolean; + tailscaleStatusError: string | null; + tailscaleCommandPreview: TailscaleDaemonCommandPreview | null; + tailscaleCommandBusy: boolean; + tailscaleCommandError: string | null; + tcpDaemonStatus: TcpDaemonStatus | null; + tcpDaemonBusyAction: "start" | "stop" | "status" | null; + onSetRemoteHostDraft: Dispatch>; + onSetRemoteTokenDraft: Dispatch>; + onSetOrbitWsUrlDraft: Dispatch>; + onSetOrbitAuthUrlDraft: Dispatch>; + onSetOrbitRunnerNameDraft: Dispatch>; + onSetOrbitAccessClientIdDraft: Dispatch>; + onSetOrbitAccessClientSecretRefDraft: Dispatch>; + onCommitRemoteHost: () => Promise; + onCommitRemoteToken: () => Promise; + onChangeRemoteProvider: (provider: AppSettings["remoteBackendProvider"]) => Promise; + onRefreshTailscaleStatus: () => void; + onRefreshTailscaleCommandPreview: () => void; + onUseSuggestedTailscaleHost: () => 
Promise<void>; + onTcpDaemonStart: () => Promise<void>; + onTcpDaemonStop: () => Promise<void>; + onTcpDaemonStatus: () => Promise<void>; + onCommitOrbitWsUrl: () => Promise<void>; + onCommitOrbitAuthUrl: () => Promise<void>; + onCommitOrbitRunnerName: () => Promise<void>; + onCommitOrbitAccessClientId: () => Promise<void>; + onCommitOrbitAccessClientSecretRef: () => Promise<void>; + onOrbitConnectTest: () => void; + onOrbitSignIn: () => void; + onOrbitSignOut: () => void; + onOrbitRunnerStart: () => void; + onOrbitRunnerStop: () => void; + onOrbitRunnerStatus: () => void; + onMobileConnectTest: () => void; +}; + +const formatErrorMessage = (error: unknown, fallback: string) => { + if (error instanceof Error) { + return error.message; + } + if (typeof error === "string") { + return error; + } + if (error && typeof error === "object" && "message" in error) { + const message = (error as { message?: unknown }).message; + if (typeof message === "string") { + return message; + } + } + return fallback; +}; + +export const useSettingsServerSection = ({ + appSettings, + onUpdateAppSettings, + onMobileConnectSuccess, + orbitServiceClient, +}: UseSettingsServerSectionArgs): SettingsServerSectionProps => { + const [remoteHostDraft, setRemoteHostDraft] = useState(appSettings.remoteBackendHost); + const [remoteTokenDraft, setRemoteTokenDraft] = useState(appSettings.remoteBackendToken ?? ""); + const [orbitWsUrlDraft, setOrbitWsUrlDraft] = useState(appSettings.orbitWsUrl ?? ""); + const [orbitAuthUrlDraft, setOrbitAuthUrlDraft] = useState(appSettings.orbitAuthUrl ?? ""); + const [orbitRunnerNameDraft, setOrbitRunnerNameDraft] = useState( + appSettings.orbitRunnerName ?? "", + ); + const [orbitAccessClientIdDraft, setOrbitAccessClientIdDraft] = useState( + appSettings.orbitAccessClientId ?? "", + ); + const [orbitAccessClientSecretRefDraft, setOrbitAccessClientSecretRefDraft] = + useState(appSettings.orbitAccessClientSecretRef ?? ""); + const [orbitStatusText, setOrbitStatusText] = useState<string | null>(null); + const [orbitAuthCode, setOrbitAuthCode] = useState<string | null>(null); + const [orbitVerificationUrl, setOrbitVerificationUrl] = useState<string | null>( + null, + ); + const [orbitBusyAction, setOrbitBusyAction] = useState<string | null>(null); + const [tailscaleStatus, setTailscaleStatus] = useState<TailscaleStatus | null>(null); + const [tailscaleStatusBusy, setTailscaleStatusBusy] = useState(false); + const [tailscaleStatusError, setTailscaleStatusError] = useState<string | null>(null); + const [tailscaleCommandPreview, setTailscaleCommandPreview] = + useState<TailscaleDaemonCommandPreview | null>(null); + const [tailscaleCommandBusy, setTailscaleCommandBusy] = useState(false); + const [tailscaleCommandError, setTailscaleCommandError] = useState<string | null>(null); + const [tcpDaemonStatus, setTcpDaemonStatus] = useState<TcpDaemonStatus | null>(null); + const [tcpDaemonBusyAction, setTcpDaemonBusyAction] = useState< + "start" | "stop" | "status" | null + >(null); + const [mobileConnectBusy, setMobileConnectBusy] = useState(false); + const [mobileConnectStatusText, setMobileConnectStatusText] = useState<string | null>( + null, + ); + const [mobileConnectStatusError, setMobileConnectStatusError] = useState(false); + const mobilePlatform = useMemo(() => isMobilePlatform(), []); + + const latestSettingsRef = useRef(appSettings); + + useEffect(() => { + latestSettingsRef.current = appSettings; + }, [appSettings]); + + useEffect(() => { + setRemoteHostDraft(appSettings.remoteBackendHost); + }, [appSettings.remoteBackendHost]); + + useEffect(() => { + setRemoteTokenDraft(appSettings.remoteBackendToken ?? ""); + }, [appSettings.remoteBackendToken]); + + useEffect(() => { + setOrbitWsUrlDraft(appSettings.orbitWsUrl ?? 
""); + }, [appSettings.orbitWsUrl]); + + useEffect(() => { + setOrbitAuthUrlDraft(appSettings.orbitAuthUrl ?? ""); + }, [appSettings.orbitAuthUrl]); + + useEffect(() => { + setOrbitRunnerNameDraft(appSettings.orbitRunnerName ?? ""); + }, [appSettings.orbitRunnerName]); + + useEffect(() => { + setOrbitAccessClientIdDraft(appSettings.orbitAccessClientId ?? ""); + }, [appSettings.orbitAccessClientId]); + + useEffect(() => { + setOrbitAccessClientSecretRefDraft(appSettings.orbitAccessClientSecretRef ?? ""); + }, [appSettings.orbitAccessClientSecretRef]); + + const updateRemoteBackendSettings = useCallback( + async ({ + host, + token, + provider, + orbitWsUrl, + }: { + host?: string; + token?: string | null; + provider?: AppSettings["remoteBackendProvider"]; + orbitWsUrl?: string | null; + }) => { + const latestSettings = latestSettingsRef.current; + const nextHost = host ?? latestSettings.remoteBackendHost; + const nextToken = + token === undefined ? latestSettings.remoteBackendToken : token; + const nextProvider = provider ?? latestSettings.remoteBackendProvider; + const nextOrbitWsUrl = + orbitWsUrl === undefined ? latestSettings.orbitWsUrl : orbitWsUrl; + const nextSettings: AppSettings = { + ...latestSettings, + remoteBackendHost: nextHost, + remoteBackendToken: nextToken, + remoteBackendProvider: nextProvider, + orbitWsUrl: nextOrbitWsUrl, + ...(mobilePlatform + ? { + backendMode: "remote", + } + : {}), + }; + const unchanged = + nextSettings.remoteBackendHost === latestSettings.remoteBackendHost && + nextSettings.remoteBackendToken === latestSettings.remoteBackendToken && + nextSettings.orbitWsUrl === latestSettings.orbitWsUrl && + nextSettings.backendMode === latestSettings.backendMode && + nextSettings.remoteBackendProvider === latestSettings.remoteBackendProvider; + if (unchanged) { + return; + } + await onUpdateAppSettings(nextSettings); + latestSettingsRef.current = nextSettings; + }, + [mobilePlatform, onUpdateAppSettings], + ); + + const applyRemoteHost = async (rawValue: string) => { + const nextHost = rawValue.trim() || DEFAULT_REMOTE_HOST; + setRemoteHostDraft(nextHost); + await updateRemoteBackendSettings({ host: nextHost }); + }; + + const handleCommitRemoteHost = async () => { + await applyRemoteHost(remoteHostDraft); + }; + + const handleCommitRemoteToken = async () => { + const nextToken = remoteTokenDraft.trim() ? remoteTokenDraft.trim() : null; + setRemoteTokenDraft(nextToken ?? ""); + await updateRemoteBackendSettings({ token: nextToken }); + }; + + const handleMobileConnectTest = () => { + void (async () => { + const provider = latestSettingsRef.current.remoteBackendProvider; + const nextToken = remoteTokenDraft.trim() ? remoteTokenDraft.trim() : null; + setRemoteTokenDraft(nextToken ?? ""); + setMobileConnectBusy(true); + setMobileConnectStatusText(null); + setMobileConnectStatusError(false); + try { + if (provider === "tcp") { + const nextHost = remoteHostDraft.trim() || DEFAULT_REMOTE_HOST; + setRemoteHostDraft(nextHost); + await updateRemoteBackendSettings({ + host: nextHost, + token: nextToken, + }); + } else { + const nextOrbitWsUrl = normalizeOverrideValue(orbitWsUrlDraft); + setOrbitWsUrlDraft(nextOrbitWsUrl ?? ""); + if (!nextOrbitWsUrl) { + throw new Error("Orbit websocket URL is required."); + } + await updateRemoteBackendSettings({ + token: nextToken, + orbitWsUrl: nextOrbitWsUrl, + }); + } + const workspaces = await listWorkspaces(); + const workspaceCount = workspaces.length; + const workspaceWord = workspaceCount === 1 ? 
"workspace" : "workspaces"; + setMobileConnectStatusText( + `Connected. ${workspaceCount} ${workspaceWord} reachable on the remote backend.`, + ); + await onMobileConnectSuccess?.(); + } catch (error) { + setMobileConnectStatusError(true); + setMobileConnectStatusText( + error instanceof Error ? error.message : "Unable to connect to remote backend.", + ); + } finally { + setMobileConnectBusy(false); + } + })(); + }; + + useEffect(() => { + if (!mobilePlatform) { + return; + } + setMobileConnectStatusText(null); + setMobileConnectStatusError(false); + }, [ + appSettings.remoteBackendProvider, + mobilePlatform, + orbitWsUrlDraft, + remoteHostDraft, + remoteTokenDraft, + ]); + + const handleChangeRemoteProvider = async ( + provider: AppSettings["remoteBackendProvider"], + ) => { + if (provider === latestSettingsRef.current.remoteBackendProvider) { + return; + } + await updateRemoteBackendSettings({ + provider, + }); + }; + + const handleRefreshTailscaleStatus = useCallback(() => { + void (async () => { + setTailscaleStatusBusy(true); + setTailscaleStatusError(null); + try { + const status = await fetchTailscaleStatus(); + setTailscaleStatus(status); + } catch (error) { + setTailscaleStatusError( + formatErrorMessage(error, "Unable to load Tailscale status."), + ); + } finally { + setTailscaleStatusBusy(false); + } + })(); + }, []); + + const handleRefreshTailscaleCommandPreview = useCallback(() => { + void (async () => { + setTailscaleCommandBusy(true); + setTailscaleCommandError(null); + try { + const preview = await fetchTailscaleDaemonCommandPreview(); + setTailscaleCommandPreview(preview); + } catch (error) { + setTailscaleCommandError( + formatErrorMessage(error, "Unable to build Tailscale daemon command."), + ); + } finally { + setTailscaleCommandBusy(false); + } + })(); + }, []); + + const handleUseSuggestedTailscaleHost = async () => { + const suggestedHost = tailscaleStatus?.suggestedRemoteHost ?? null; + if (!suggestedHost) { + return; + } + await applyRemoteHost(suggestedHost); + }; + + const runTcpDaemonAction = useCallback( + async ( + action: "start" | "stop" | "status", + run: () => Promise, + ) => { + setTcpDaemonBusyAction(action); + try { + const status = await run(); + setTcpDaemonStatus(status); + } catch (error) { + const errorMessage = + error instanceof Error + ? error.message + : typeof error === "string" + ? error + : "Unable to update mobile access daemon status."; + setTcpDaemonStatus((prev) => ({ + state: "error", + pid: null, + startedAtMs: null, + lastError: errorMessage, + listenAddr: prev?.listenAddr ?? null, + })); + } finally { + setTcpDaemonBusyAction(null); + } + }, + [], + ); + + const handleTcpDaemonStart = useCallback(async () => { + await runTcpDaemonAction("start", tailscaleDaemonStart); + }, [runTcpDaemonAction]); + + const handleTcpDaemonStop = useCallback(async () => { + await runTcpDaemonAction("stop", tailscaleDaemonStop); + }, [runTcpDaemonAction]); + + const handleTcpDaemonStatus = useCallback(async () => { + await runTcpDaemonAction("status", tailscaleDaemonStatus); + }, [runTcpDaemonAction]); + + const handleCommitOrbitWsUrl = async () => { + const nextValue = normalizeOverrideValue(orbitWsUrlDraft); + setOrbitWsUrlDraft(nextValue ?? ""); + await updateRemoteBackendSettings({ + orbitWsUrl: nextValue, + }); + }; + + const handleCommitOrbitAuthUrl = async () => { + const nextValue = normalizeOverrideValue(orbitAuthUrlDraft); + setOrbitAuthUrlDraft(nextValue ?? 
""); + if (nextValue === appSettings.orbitAuthUrl) { + return; + } + await onUpdateAppSettings({ + ...appSettings, + orbitAuthUrl: nextValue, + }); + }; + + const handleCommitOrbitRunnerName = async () => { + const nextValue = normalizeOverrideValue(orbitRunnerNameDraft); + setOrbitRunnerNameDraft(nextValue ?? ""); + if (nextValue === appSettings.orbitRunnerName) { + return; + } + await onUpdateAppSettings({ + ...appSettings, + orbitRunnerName: nextValue, + }); + }; + + const handleCommitOrbitAccessClientId = async () => { + const nextValue = normalizeOverrideValue(orbitAccessClientIdDraft); + setOrbitAccessClientIdDraft(nextValue ?? ""); + if (nextValue === appSettings.orbitAccessClientId) { + return; + } + await onUpdateAppSettings({ + ...appSettings, + orbitAccessClientId: nextValue, + }); + }; + + const handleCommitOrbitAccessClientSecretRef = async () => { + const nextValue = normalizeOverrideValue(orbitAccessClientSecretRefDraft); + setOrbitAccessClientSecretRefDraft(nextValue ?? ""); + if (nextValue === appSettings.orbitAccessClientSecretRef) { + return; + } + await onUpdateAppSettings({ + ...appSettings, + orbitAccessClientSecretRef: nextValue, + }); + }; + + const runOrbitAction = async ( + actionKey: string, + actionLabel: string, + action: () => Promise, + successFallback: string, + ): Promise => { + setOrbitBusyAction(actionKey); + setOrbitStatusText(`${actionLabel}...`); + try { + const result = await action(); + setOrbitStatusText(getOrbitStatusText(result, successFallback)); + return result; + } catch (error) { + const message = error instanceof Error ? error.message : "Unknown Orbit error"; + setOrbitStatusText(`${actionLabel} failed: ${message}`); + return null; + } finally { + setOrbitBusyAction(null); + } + }; + + const syncRemoteBackendToken = async (nextToken: string | null) => { + const normalizedToken = nextToken?.trim() ? nextToken.trim() : null; + setRemoteTokenDraft(normalizedToken ?? ""); + const latestSettings = latestSettingsRef.current; + if (normalizedToken === latestSettings.remoteBackendToken) { + return; + } + const nextSettings = { + ...latestSettings, + remoteBackendToken: normalizedToken, + }; + await onUpdateAppSettings({ + ...nextSettings, + }); + latestSettingsRef.current = nextSettings; + }; + + const handleOrbitConnectTest = () => { + void runOrbitAction( + "connect-test", + "Connect test", + orbitServiceClient.orbitConnectTest, + "Orbit connection test succeeded.", + ); + }; + + const handleOrbitSignIn = () => { + void (async () => { + setOrbitBusyAction("sign-in"); + setOrbitStatusText("Starting Orbit sign in..."); + setOrbitAuthCode(null); + setOrbitVerificationUrl(null); + try { + const startResult = await orbitServiceClient.orbitSignInStart(); + setOrbitAuthCode(startResult.userCode ?? startResult.deviceCode); + setOrbitVerificationUrl( + startResult.verificationUriComplete ?? startResult.verificationUri, + ); + setOrbitStatusText( + "Orbit sign in started. 
Finish authorization in the browser window, then keep this dialog open while we poll for completion.", + ); + + const maxPollWindowSeconds = Math.max( + 1, + Math.min(startResult.expiresInSeconds, ORBIT_MAX_INLINE_POLL_SECONDS), + ); + const deadlineMs = Date.now() + maxPollWindowSeconds * 1000; + let pollIntervalSeconds = Math.max( + 1, + startResult.intervalSeconds || ORBIT_DEFAULT_POLL_INTERVAL_SECONDS, + ); + + while (Date.now() < deadlineMs) { + await delay(pollIntervalSeconds * 1000); + const pollResult = await orbitServiceClient.orbitSignInPoll( + startResult.deviceCode, + ); + setOrbitStatusText( + getOrbitStatusText(pollResult, "Orbit sign in status refreshed."), + ); + + if (pollResult.status === "pending") { + if (typeof pollResult.intervalSeconds === "number") { + pollIntervalSeconds = Math.max(1, pollResult.intervalSeconds); + } + continue; + } + + if (pollResult.status === "authorized") { + if (pollResult.token) { + await syncRemoteBackendToken(pollResult.token); + } + } + return; + } + + setOrbitStatusText( + "Orbit sign in is still pending. Leave this window open and try Sign In again if authorization just completed.", + ); + } catch (error) { + const message = error instanceof Error ? error.message : "Unknown Orbit error"; + setOrbitStatusText(`Sign In failed: ${message}`); + } finally { + setOrbitBusyAction(null); + } + })(); + }; + + const handleOrbitSignOut = () => { + void (async () => { + const result = await runOrbitAction( + "sign-out", + "Sign Out", + orbitServiceClient.orbitSignOut, + "Signed out from Orbit.", + ); + if (result !== null) { + try { + await syncRemoteBackendToken(null); + setOrbitAuthCode(null); + setOrbitVerificationUrl(null); + } catch (error) { + const message = error instanceof Error ? error.message : "Unknown Orbit error"; + setOrbitStatusText(`Sign Out failed: ${message}`); + } + } + })(); + }; + + const handleOrbitRunnerStart = () => { + void runOrbitAction( + "runner-start", + "Start Runner", + orbitServiceClient.orbitRunnerStart, + "Orbit runner started.", + ); + }; + + const handleOrbitRunnerStop = () => { + void runOrbitAction( + "runner-stop", + "Stop Runner", + orbitServiceClient.orbitRunnerStop, + "Orbit runner stopped.", + ); + }; + + const handleOrbitRunnerStatus = () => { + void runOrbitAction( + "runner-status", + "Refresh Status", + orbitServiceClient.orbitRunnerStatus, + "Orbit runner status refreshed.", + ); + }; + + useEffect(() => { + if (appSettings.remoteBackendProvider !== "tcp") { + return; + } + if (!mobilePlatform) { + handleRefreshTailscaleCommandPreview(); + void handleTcpDaemonStatus(); + } + if (tailscaleStatus === null && !tailscaleStatusBusy && !tailscaleStatusError) { + handleRefreshTailscaleStatus(); + } + }, [ + appSettings.remoteBackendProvider, + appSettings.remoteBackendToken, + handleRefreshTailscaleCommandPreview, + handleRefreshTailscaleStatus, + handleTcpDaemonStatus, + mobilePlatform, + tailscaleStatus, + tailscaleStatusBusy, + tailscaleStatusError, + ]); + + return { + appSettings, + onUpdateAppSettings, + remoteHostDraft, + remoteTokenDraft, + orbitWsUrlDraft, + orbitAuthUrlDraft, + orbitRunnerNameDraft, + orbitAccessClientIdDraft, + orbitAccessClientSecretRefDraft, + orbitStatusText, + orbitAuthCode, + orbitVerificationUrl, + orbitBusyAction, + tailscaleStatus, + tailscaleStatusBusy, + tailscaleStatusError, + tailscaleCommandPreview, + tailscaleCommandBusy, + tailscaleCommandError, + tcpDaemonStatus, + tcpDaemonBusyAction, + onSetRemoteHostDraft: setRemoteHostDraft, + onSetRemoteTokenDraft: 
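handleOrbitSignIn above follows a device-authorization flow: start the sign-in, surface the user code and verification URL, then poll on the server-suggested interval until the grant is authorized or the inline window (capped by `ORBIT_MAX_INLINE_POLL_SECONDS`) closes. A condensed sketch of that polling loop, with the result type and helper names simplified for illustration:

```ts
// Simplified device-code polling loop; the PollResult shape and helper names are illustrative.
type PollResult =
  | { status: "pending"; intervalSeconds?: number }
  | { status: "authorized"; token?: string }
  | { status: "denied" };

const delay = (ms: number) => new Promise<void>((resolve) => setTimeout(resolve, ms));

async function pollForAuthorization(
  poll: () => Promise<PollResult>,
  options: { intervalSeconds: number; maxSeconds: number },
): Promise<string | null> {
  const deadline = Date.now() + options.maxSeconds * 1000;
  let interval = Math.max(1, options.intervalSeconds);
  while (Date.now() < deadline) {
    await delay(interval * 1000);
    const result = await poll();
    if (result.status === "pending") {
      // The server may ask us to slow down; honor its suggested interval.
      if (typeof result.intervalSeconds === "number") {
        interval = Math.max(1, result.intervalSeconds);
      }
      continue;
    }
    return result.status === "authorized" ? result.token ?? null : null;
  }
  return null; // still pending when the inline window closed
}
```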
setRemoteTokenDraft, + onSetOrbitWsUrlDraft: setOrbitWsUrlDraft, + onSetOrbitAuthUrlDraft: setOrbitAuthUrlDraft, + onSetOrbitRunnerNameDraft: setOrbitRunnerNameDraft, + onSetOrbitAccessClientIdDraft: setOrbitAccessClientIdDraft, + onSetOrbitAccessClientSecretRefDraft: setOrbitAccessClientSecretRefDraft, + onCommitRemoteHost: handleCommitRemoteHost, + onCommitRemoteToken: handleCommitRemoteToken, + onChangeRemoteProvider: handleChangeRemoteProvider, + onRefreshTailscaleStatus: handleRefreshTailscaleStatus, + onRefreshTailscaleCommandPreview: handleRefreshTailscaleCommandPreview, + onUseSuggestedTailscaleHost: handleUseSuggestedTailscaleHost, + onTcpDaemonStart: handleTcpDaemonStart, + onTcpDaemonStop: handleTcpDaemonStop, + onTcpDaemonStatus: handleTcpDaemonStatus, + onCommitOrbitWsUrl: handleCommitOrbitWsUrl, + onCommitOrbitAuthUrl: handleCommitOrbitAuthUrl, + onCommitOrbitRunnerName: handleCommitOrbitRunnerName, + onCommitOrbitAccessClientId: handleCommitOrbitAccessClientId, + onCommitOrbitAccessClientSecretRef: handleCommitOrbitAccessClientSecretRef, + onOrbitConnectTest: handleOrbitConnectTest, + onOrbitSignIn: handleOrbitSignIn, + onOrbitSignOut: handleOrbitSignOut, + onOrbitRunnerStart: handleOrbitRunnerStart, + onOrbitRunnerStop: handleOrbitRunnerStop, + onOrbitRunnerStatus: handleOrbitRunnerStatus, + isMobilePlatform: mobilePlatform, + mobileConnectBusy, + mobileConnectStatusText, + mobileConnectStatusError, + onMobileConnectTest: handleMobileConnectTest, + }; +}; diff --git a/src/features/settings/hooks/useSettingsShortcutDrafts.ts b/src/features/settings/hooks/useSettingsShortcutDrafts.ts index 26b86428c..74e876175 100644 --- a/src/features/settings/hooks/useSettingsShortcutDrafts.ts +++ b/src/features/settings/hooks/useSettingsShortcutDrafts.ts @@ -1,10 +1,10 @@ import { useEffect, useState } from "react"; import type { KeyboardEvent as ReactKeyboardEvent } from "react"; -import type { AppSettings } from "../../../types"; -import { buildShortcutValue } from "../../../utils/shortcuts"; -import type { ShortcutSettingKey } from "../components/settingsTypes"; -import { SHORTCUT_DRAFT_KEY_BY_SETTING } from "../components/settingsViewConstants"; -import { buildShortcutDrafts } from "../components/settingsViewHelpers"; +import type { AppSettings } from "@/types"; +import { buildShortcutValue } from "@utils/shortcuts"; +import type { ShortcutSettingKey } from "@settings/components/settingsTypes"; +import { SHORTCUT_DRAFT_KEY_BY_SETTING } from "@settings/components/settingsViewConstants"; +import { buildShortcutDrafts } from "@settings/components/settingsViewHelpers"; type UseSettingsShortcutDraftsParams = { appSettings: AppSettings; diff --git a/src/features/settings/hooks/useSettingsViewNavigation.ts b/src/features/settings/hooks/useSettingsViewNavigation.ts index 206005959..1450c5a86 100644 --- a/src/features/settings/hooks/useSettingsViewNavigation.ts +++ b/src/features/settings/hooks/useSettingsViewNavigation.ts @@ -1,7 +1,7 @@ import { useCallback, useEffect, useState } from "react"; -import type { CodexSection } from "../components/settingsTypes"; -import { SETTINGS_MOBILE_BREAKPOINT_PX } from "../components/settingsViewConstants"; -import { isNarrowSettingsViewport } from "../components/settingsViewHelpers"; +import type { CodexSection } from "@settings/components/settingsTypes"; +import { SETTINGS_MOBILE_BREAKPOINT_PX } from "@settings/components/settingsViewConstants"; +import { isNarrowSettingsViewport } from "@settings/components/settingsViewHelpers"; type 
UseSettingsViewNavigationParams = { initialSection?: CodexSection; diff --git a/src/features/settings/hooks/useSettingsViewOrchestration.ts b/src/features/settings/hooks/useSettingsViewOrchestration.ts new file mode 100644 index 000000000..1c1b8a7bf --- /dev/null +++ b/src/features/settings/hooks/useSettingsViewOrchestration.ts @@ -0,0 +1,271 @@ +import { useMemo } from "react"; +import type { + AppSettings, + CodexDoctorResult, + CodexUpdateResult, + DictationModelStatus, + WorkspaceGroup, + WorkspaceSettings, +} from "@/types"; +import { isMacPlatform, isWindowsPlatform } from "@utils/platformPaths"; +import { useSettingsOpenAppDrafts } from "./useSettingsOpenAppDrafts"; +import { useSettingsShortcutDrafts } from "./useSettingsShortcutDrafts"; +import { useSettingsCodexSection } from "./useSettingsCodexSection"; +import { useSettingsDisplaySection } from "./useSettingsDisplaySection"; +import { useSettingsEnvironmentsSection } from "./useSettingsEnvironmentsSection"; +import { useSettingsFeaturesSection } from "./useSettingsFeaturesSection"; +import { useSettingsGitSection } from "./useSettingsGitSection"; +import { useSettingsProjectsSection } from "./useSettingsProjectsSection"; +import { useSettingsServerSection } from "./useSettingsServerSection"; +import type { GroupedWorkspaces } from "./settingsSectionTypes"; +import type { OrbitServiceClient } from "@settings/components/settingsTypes"; +import { + COMPOSER_PRESET_CONFIGS, + COMPOSER_PRESET_LABELS, + DICTATION_MODELS, +} from "@settings/components/settingsViewConstants"; + +type UseSettingsViewOrchestrationArgs = { + workspaceGroups: WorkspaceGroup[]; + groupedWorkspaces: GroupedWorkspaces; + ungroupedLabel: string; + reduceTransparency: boolean; + onToggleTransparency: (value: boolean) => void; + appSettings: AppSettings; + openAppIconById: Record; + onUpdateAppSettings: (next: AppSettings) => Promise; + onRunDoctor: ( + codexBin: string | null, + codexArgs: string | null, + ) => Promise; + onRunCodexUpdate?: ( + codexBin: string | null, + codexArgs: string | null, + ) => Promise; + onUpdateWorkspaceCodexBin: (id: string, codexBin: string | null) => Promise; + onUpdateWorkspaceSettings: ( + id: string, + settings: Partial, + ) => Promise; + scaleShortcutTitle: string; + scaleShortcutText: string; + onTestNotificationSound: () => void; + onTestSystemNotification: () => void; + onMobileConnectSuccess?: () => Promise | void; + onMoveWorkspace: (id: string, direction: "up" | "down") => void; + onDeleteWorkspace: (id: string) => void; + onCreateWorkspaceGroup: (name: string) => Promise; + onRenameWorkspaceGroup: (id: string, name: string) => Promise; + onMoveWorkspaceGroup: (id: string, direction: "up" | "down") => Promise; + onDeleteWorkspaceGroup: (id: string) => Promise; + onAssignWorkspaceGroup: ( + workspaceId: string, + groupId: string | null, + ) => Promise; + dictationModelStatus?: DictationModelStatus | null; + onDownloadDictationModel?: () => void; + onCancelDictationDownload?: () => void; + onRemoveDictationModel?: () => void; + orbitServiceClient: OrbitServiceClient; +}; + +export function useSettingsViewOrchestration({ + workspaceGroups, + groupedWorkspaces, + ungroupedLabel, + reduceTransparency, + onToggleTransparency, + appSettings, + openAppIconById, + onUpdateAppSettings, + onRunDoctor, + onRunCodexUpdate, + onUpdateWorkspaceCodexBin, + onUpdateWorkspaceSettings, + scaleShortcutTitle, + scaleShortcutText, + onTestNotificationSound, + onTestSystemNotification, + onMobileConnectSuccess, + onMoveWorkspace, + 
onDeleteWorkspace, + onCreateWorkspaceGroup, + onRenameWorkspaceGroup, + onMoveWorkspaceGroup, + onDeleteWorkspaceGroup, + onAssignWorkspaceGroup, + dictationModelStatus, + onDownloadDictationModel, + onCancelDictationDownload, + onRemoveDictationModel, + orbitServiceClient, +}: UseSettingsViewOrchestrationArgs) { + const projects = useMemo( + () => groupedWorkspaces.flatMap((group) => group.workspaces), + [groupedWorkspaces], + ); + const mainWorkspaces = useMemo( + () => projects.filter((workspace) => (workspace.kind ?? "main") !== "worktree"), + [projects], + ); + const hasCodexHomeOverrides = useMemo( + () => projects.some((workspace) => workspace.settings.codexHome != null), + [projects], + ); + + const optionKeyLabel = isMacPlatform() ? "Option" : "Alt"; + const metaKeyLabel = isMacPlatform() + ? "Command" + : isWindowsPlatform() + ? "Windows" + : "Meta"; + + const selectedDictationModel = useMemo(() => { + return ( + DICTATION_MODELS.find( + (model) => model.id === appSettings.dictationModelId, + ) ?? DICTATION_MODELS[1] + ); + }, [appSettings.dictationModelId]); + + const dictationReady = dictationModelStatus?.state === "ready"; + + const { + openAppDrafts, + openAppSelectedId, + handleOpenAppDraftChange, + handleOpenAppKindChange, + handleCommitOpenAppsDrafts, + handleMoveOpenApp, + handleDeleteOpenApp, + handleAddOpenApp, + handleSelectOpenAppDefault, + } = useSettingsOpenAppDrafts({ + appSettings, + onUpdateAppSettings, + }); + + const { shortcutDrafts, handleShortcutKeyDown, clearShortcut } = + useSettingsShortcutDrafts({ + appSettings, + onUpdateAppSettings, + }); + + const projectsSectionProps = useSettingsProjectsSection({ + appSettings, + workspaceGroups, + groupedWorkspaces, + ungroupedLabel, + projects, + onUpdateAppSettings, + onMoveWorkspace, + onDeleteWorkspace, + onCreateWorkspaceGroup, + onRenameWorkspaceGroup, + onMoveWorkspaceGroup, + onDeleteWorkspaceGroup, + onAssignWorkspaceGroup, + }); + + const environmentsSectionProps = useSettingsEnvironmentsSection({ + mainWorkspaces, + onUpdateWorkspaceSettings, + }); + + const displaySectionProps = useSettingsDisplaySection({ + appSettings, + reduceTransparency, + onToggleTransparency, + onUpdateAppSettings, + scaleShortcutTitle, + scaleShortcutText, + onTestNotificationSound, + onTestSystemNotification, + }); + + const gitSectionProps = useSettingsGitSection({ + appSettings, + onUpdateAppSettings, + }); + + const serverSectionProps = useSettingsServerSection({ + appSettings, + onUpdateAppSettings, + onMobileConnectSuccess, + orbitServiceClient, + }); + + const codexSectionProps = useSettingsCodexSection({ + appSettings, + projects, + onUpdateAppSettings, + onRunDoctor, + onRunCodexUpdate, + onUpdateWorkspaceCodexBin, + onUpdateWorkspaceSettings, + }); + + const featuresSectionProps = useSettingsFeaturesSection({ + appSettings, + hasCodexHomeOverrides, + onUpdateAppSettings, + }); + + return { + projectsSectionProps, + environmentsSectionProps, + displaySectionProps, + composerSectionProps: { + appSettings, + optionKeyLabel, + composerPresetLabels: COMPOSER_PRESET_LABELS, + onComposerPresetChange: ( + preset: AppSettings["composerEditorPreset"], + ) => { + const config = COMPOSER_PRESET_CONFIGS[preset]; + void onUpdateAppSettings({ + ...appSettings, + composerEditorPreset: preset, + ...config, + }); + }, + onUpdateAppSettings, + }, + dictationSectionProps: { + appSettings, + optionKeyLabel, + metaKeyLabel, + dictationModels: DICTATION_MODELS, + selectedDictationModel, + dictationModelStatus, + dictationReady, + 
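`onComposerPresetChange` above resolves a preset id to a bundle of setting overrides and spreads it over the current settings in a single update. The shape of `COMPOSER_PRESET_CONFIGS` is not visible in this diff; a hypothetical map with invented fields illustrates the pattern:

```ts
// Hypothetical preset map: each preset expands to a partial settings patch.
// The field names below are invented for illustration only.
type EditorSettings = {
  composerEditorPreset: "plain" | "markdown";
  composerMonospace: boolean;
  composerSpellcheck: boolean;
};

const PRESET_CONFIGS: Record<EditorSettings["composerEditorPreset"], Partial<EditorSettings>> = {
  plain: { composerMonospace: false, composerSpellcheck: true },
  markdown: { composerMonospace: true, composerSpellcheck: false },
};

function applyPreset(
  current: EditorSettings,
  preset: EditorSettings["composerEditorPreset"],
): EditorSettings {
  // Later spreads win, so the preset's overrides replace the current values.
  return { ...current, composerEditorPreset: preset, ...PRESET_CONFIGS[preset] };
}
```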
onUpdateAppSettings,
+      onDownloadDictationModel,
+      onCancelDictationDownload,
+      onRemoveDictationModel,
+    },
+    shortcutsSectionProps: {
+      shortcutDrafts,
+      onShortcutKeyDown: handleShortcutKeyDown,
+      onClearShortcut: clearShortcut,
+    },
+    openAppsSectionProps: {
+      openAppDrafts,
+      openAppSelectedId,
+      openAppIconById,
+      onOpenAppDraftChange: handleOpenAppDraftChange,
+      onOpenAppKindChange: handleOpenAppKindChange,
+      onCommitOpenApps: handleCommitOpenAppsDrafts,
+      onMoveOpenApp: handleMoveOpenApp,
+      onDeleteOpenApp: handleDeleteOpenApp,
+      onAddOpenApp: handleAddOpenApp,
+      onSelectOpenAppDefault: handleSelectOpenAppDefault,
+    },
+    gitSectionProps,
+    serverSectionProps,
+    codexSectionProps,
+    featuresSectionProps,
+  };
+}
+
+export type SettingsViewOrchestration = ReturnType<typeof useSettingsViewOrchestration>;
diff --git a/src/features/threads/hooks/threadReducer/common.ts b/src/features/threads/hooks/threadReducer/common.ts
new file mode 100644
index 000000000..011f1ba39
--- /dev/null
+++ b/src/features/threads/hooks/threadReducer/common.ts
@@ -0,0 +1,200 @@
+import type { ConversationItem } from "@/types";
+import type { ThreadState } from "../useThreadsReducer";
+
+const MAX_THREAD_NAME_LENGTH = 38;
+
+function formatThreadName(text: string) {
+  const trimmed = text.trim();
+  if (!trimmed) {
+    return null;
+  }
+  return trimmed.length > MAX_THREAD_NAME_LENGTH
+    ? `${trimmed.slice(0, MAX_THREAD_NAME_LENGTH)}…`
+    : trimmed;
+}
+
+export function looksAutoGeneratedThreadName(name: string) {
+  return name === "New Agent" || name.startsWith("Agent ") || /^[a-f0-9]{4,8}$/i.test(name);
+}
+
+export function extractRenameText(text: string) {
+  if (!text) {
+    return "";
+  }
+  const withoutImages = text.replace(/\[image(?: x\d+)?\]/gi, " ");
+  const withoutSkills = withoutImages.replace(/(^|\s)\$[A-Za-z0-9_-]+(?=\s|$)/g, " ");
+  return withoutSkills.replace(/\s+/g, " ").trim();
+}
+
+function getAssistantTextForRename(
+  items: ConversationItem[],
+  itemId?: string,
+): string {
+  if (itemId) {
+    const match = items.find(
+      (item) =>
+        item.kind === "message" &&
+        item.role === "assistant" &&
+        item.id === itemId,
+    );
+    if (match && match.kind === "message") {
+      return match.text;
+    }
+  }
+  for (let index = items.length - 1; index >= 0; index -= 1) {
+    const item = items[index];
+    if (item.kind === "message" && item.role === "assistant") {
+      return item.text;
+    }
+  }
+  return "";
+}
+
+export function maybeRenameThreadFromAgent({
+  workspaceId,
+  threadId,
+  items,
+  itemId,
+  hasCustomName,
+  threadsByWorkspace,
+}: {
+  workspaceId: string;
+  threadId: string;
+  items: ConversationItem[];
+  itemId?: string;
+  hasCustomName: boolean;
+  threadsByWorkspace: ThreadState["threadsByWorkspace"];
+}) {
+  const threads = threadsByWorkspace[workspaceId] ?? [];
+  if (!threads.length) {
+    return threadsByWorkspace;
+  }
+  const hasUserMessage = items.some(
+    (item) => item.kind === "message" && item.role === "user",
+  );
+  if (hasUserMessage) {
+    return threadsByWorkspace;
+  }
+  if (hasCustomName) {
+    return threadsByWorkspace;
+  }
+  const nextName = formatThreadName(getAssistantTextForRename(items, itemId));
+  if (!nextName) {
+    return threadsByWorkspace;
+  }
+  let didChange = false;
+  const nextThreads = threads.map((thread) => {
+    if (
+      thread.id !== threadId ||
+      thread.name === nextName ||
+      !looksAutoGeneratedThreadName(thread.name)
+    ) {
+      return thread;
+    }
+    didChange = true;
+    return { ...thread, name: nextName };
+  });
+  return didChange
+    ?
{ ...threadsByWorkspace, [workspaceId]: nextThreads } + : threadsByWorkspace; +} + +export function mergeStreamingText(existing: string, delta: string) { + if (!delta) { + return existing; + } + if (!existing) { + return delta; + } + if (delta === existing) { + return existing; + } + if (delta.startsWith(existing)) { + return delta; + } + if (existing.startsWith(delta)) { + return existing; + } + const maxOverlap = Math.min(existing.length, delta.length); + for (let length = maxOverlap; length > 0; length -= 1) { + if (existing.endsWith(delta.slice(0, length))) { + return `${existing}${delta.slice(length)}`; + } + } + return `${existing}${delta}`; +} + +export function addSummaryBoundary(existing: string) { + if (!existing) { + return existing; + } + if (existing.endsWith("\n\n")) { + return existing; + } + if (existing.endsWith("\n")) { + return `${existing}\n`; + } + return `${existing}\n\n`; +} + +export function dropLatestLocalReviewStart(list: ConversationItem[]) { + for (let index = list.length - 1; index >= 0; index -= 1) { + const item = list[index]; + if ( + item.kind === "review" && + item.state === "started" && + item.id.startsWith("review-start-") + ) { + return [...list.slice(0, index), ...list.slice(index + 1)]; + } + } + return list; +} + +export function findMatchingReview( + list: ConversationItem[], + target: Extract, +) { + const normalizedText = target.text.trim(); + return list.find( + (item) => + item.kind === "review" && + item.state === target.state && + item.text.trim() === normalizedText, + ); +} + +export function ensureUniqueReviewId(list: ConversationItem[], item: ConversationItem) { + if (item.kind !== "review") { + return item; + } + if (!list.some((entry) => entry.id === item.id)) { + return item; + } + const existingIds = new Set(list.map((entry) => entry.id)); + let suffix = 1; + let candidate = `${item.id}-${suffix}`; + while (existingIds.has(candidate)) { + suffix += 1; + candidate = `${item.id}-${suffix}`; + } + return { ...item, id: candidate }; +} + +export function isDuplicateReviewById( + list: ConversationItem[], + target: Extract, +) { + const normalizedText = target.text.trim(); + return list.some( + (item) => + item.kind === "review" && + item.id === target.id && + item.state === target.state && + item.text.trim() === normalizedText, + ); +} + +export function prefersUpdatedSort(state: ThreadState, workspaceId: string) { + return (state.threadSortKeyByWorkspace[workspaceId] ?? "updated_at") === "updated_at"; +} diff --git a/src/features/threads/hooks/threadReducer/threadItemsSlice.ts b/src/features/threads/hooks/threadReducer/threadItemsSlice.ts new file mode 100644 index 000000000..86673a314 --- /dev/null +++ b/src/features/threads/hooks/threadReducer/threadItemsSlice.ts @@ -0,0 +1,333 @@ +import type { ConversationItem } from "@/types"; +import { normalizeItem, prepareThreadItems, upsertItem } from "@utils/threadItems"; +import type { ThreadAction, ThreadState } from "../useThreadsReducer"; +import { + addSummaryBoundary, + dropLatestLocalReviewStart, + ensureUniqueReviewId, + extractRenameText, + findMatchingReview, + isDuplicateReviewById, + looksAutoGeneratedThreadName, + maybeRenameThreadFromAgent, + mergeStreamingText, + prefersUpdatedSort, +} from "./common"; + +export function reduceThreadItems(state: ThreadState, action: ThreadAction): ThreadState { + switch (action.type) { + case "addAssistantMessage": { + const list = state.itemsByThread[action.threadId] ?? 
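`mergeStreamingText` above is the overlap-aware merge used for streamed deltas: identical or already-contained deltas are dropped, and the longest prefix-of-delta / suffix-of-existing overlap is collapsed before appending. Assuming the function behaves as written, a few traced calls:

```ts
import { mergeStreamingText } from "./threadReducer/common";

// Results traced from the implementation above
// (relative path assumes a file in src/features/threads/hooks/).
console.log(mergeStreamingText("", "Hello"));               // "Hello"        – no existing text
console.log(mergeStreamingText("Hello", "Hello wor"));      // "Hello wor"    – delta extends the existing prefix
console.log(mergeStreamingText("Hello wor", "world!"));     // "Hello world!" – 3-char overlap collapsed
console.log(mergeStreamingText("Hello world!", "world!"));  // "Hello world!" – full-suffix overlap, nothing appended
```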
[]; + const message: ConversationItem = { + id: `${Date.now()}-assistant`, + kind: "message", + role: "assistant", + text: action.text, + }; + return { + ...state, + itemsByThread: { + ...state.itemsByThread, + [action.threadId]: prepareThreadItems([...list, message]), + }, + }; + } + case "appendAgentDelta": { + const list = [...(state.itemsByThread[action.threadId] ?? [])]; + const index = list.findIndex((msg) => msg.id === action.itemId); + if (index >= 0 && list[index].kind === "message") { + const existing = list[index]; + list[index] = { + ...existing, + text: mergeStreamingText(existing.text, action.delta), + }; + } else { + list.push({ + id: action.itemId, + kind: "message", + role: "assistant", + text: action.delta, + }); + } + const updatedItems = prepareThreadItems(list); + const nextThreadsByWorkspace = maybeRenameThreadFromAgent({ + workspaceId: action.workspaceId, + threadId: action.threadId, + items: updatedItems, + itemId: action.itemId, + hasCustomName: action.hasCustomName, + threadsByWorkspace: state.threadsByWorkspace, + }); + return { + ...state, + itemsByThread: { + ...state.itemsByThread, + [action.threadId]: updatedItems, + }, + threadsByWorkspace: nextThreadsByWorkspace, + }; + } + case "completeAgentMessage": { + const list = [...(state.itemsByThread[action.threadId] ?? [])]; + const index = list.findIndex((msg) => msg.id === action.itemId); + if (index >= 0 && list[index].kind === "message") { + const existing = list[index]; + list[index] = { + ...existing, + text: action.text || existing.text, + }; + } else { + list.push({ + id: action.itemId, + kind: "message", + role: "assistant", + text: action.text, + }); + } + const updatedItems = prepareThreadItems(list); + const nextThreadsByWorkspace = maybeRenameThreadFromAgent({ + workspaceId: action.workspaceId, + threadId: action.threadId, + items: updatedItems, + itemId: action.itemId, + hasCustomName: action.hasCustomName, + threadsByWorkspace: state.threadsByWorkspace, + }); + return { + ...state, + itemsByThread: { + ...state.itemsByThread, + [action.threadId]: updatedItems, + }, + threadsByWorkspace: nextThreadsByWorkspace, + }; + } + case "upsertItem": { + let list = state.itemsByThread[action.threadId] ?? []; + const item = normalizeItem(action.item); + const isUserMessage = item.kind === "message" && item.role === "user"; + const hadUserMessage = isUserMessage + ? list.some((entry) => entry.kind === "message" && entry.role === "user") + : false; + const renameText = isUserMessage ? extractRenameText(item.text) : ""; + if ( + item.kind === "review" && + item.state === "started" && + !item.id.startsWith("review-start-") + ) { + list = dropLatestLocalReviewStart(list); + } + if (item.kind === "review" && isDuplicateReviewById(list, item)) { + return state; + } + if (item.kind === "review") { + const existing = findMatchingReview(list, item); + if (existing && existing.id !== item.id) { + return state; + } + } + const nextItem = ensureUniqueReviewId(list, item); + const updatedItems = prepareThreadItems(upsertItem(list, nextItem)); + let nextThreadsByWorkspace = state.threadsByWorkspace; + if (isUserMessage) { + const threads = state.threadsByWorkspace[action.workspaceId] ?? 
[]; + const textValue = renameText; + const updatedThreads = threads.map((thread) => { + if (thread.id !== action.threadId) { + return thread; + } + const looksAutoGenerated = looksAutoGeneratedThreadName(thread.name); + const shouldRename = + !hadUserMessage && + textValue.length > 0 && + looksAutoGenerated && + !action.hasCustomName; + const nextName = + shouldRename && textValue.length > 38 + ? `${textValue.slice(0, 38)}…` + : shouldRename + ? textValue + : thread.name; + return { ...thread, name: nextName }; + }); + const bumpedThreads = + prefersUpdatedSort(state, action.workspaceId) && updatedThreads.length + ? [ + ...updatedThreads.filter((thread) => thread.id === action.threadId), + ...updatedThreads.filter((thread) => thread.id !== action.threadId), + ] + : updatedThreads; + nextThreadsByWorkspace = { + ...state.threadsByWorkspace, + [action.workspaceId]: bumpedThreads, + }; + } + return { + ...state, + itemsByThread: { + ...state.itemsByThread, + [action.threadId]: updatedItems, + }, + threadsByWorkspace: nextThreadsByWorkspace, + }; + } + case "setThreadItems": + return { + ...state, + itemsByThread: { + ...state.itemsByThread, + [action.threadId]: prepareThreadItems(action.items), + }, + }; + case "appendReasoningSummary": { + const list = state.itemsByThread[action.threadId] ?? []; + const index = list.findIndex((entry) => entry.id === action.itemId); + const base = + index >= 0 && list[index].kind === "reasoning" + ? (list[index] as ConversationItem) + : { + id: action.itemId, + kind: "reasoning", + summary: "", + content: "", + }; + const updated: ConversationItem = { + ...base, + summary: mergeStreamingText( + "summary" in base ? base.summary : "", + action.delta, + ), + } as ConversationItem; + const next = index >= 0 ? [...list] : [...list, updated]; + if (index >= 0) { + next[index] = updated; + } + return { + ...state, + itemsByThread: { + ...state.itemsByThread, + [action.threadId]: prepareThreadItems(next), + }, + }; + } + case "appendReasoningSummaryBoundary": { + const list = state.itemsByThread[action.threadId] ?? []; + const index = list.findIndex((entry) => entry.id === action.itemId); + const base = + index >= 0 && list[index].kind === "reasoning" + ? (list[index] as ConversationItem) + : { + id: action.itemId, + kind: "reasoning", + summary: "", + content: "", + }; + const updated: ConversationItem = { + ...base, + summary: addSummaryBoundary("summary" in base ? base.summary : ""), + } as ConversationItem; + const next = index >= 0 ? [...list] : [...list, updated]; + if (index >= 0) { + next[index] = updated; + } + return { + ...state, + itemsByThread: { + ...state.itemsByThread, + [action.threadId]: prepareThreadItems(next), + }, + }; + } + case "appendReasoningContent": { + const list = state.itemsByThread[action.threadId] ?? []; + const index = list.findIndex((entry) => entry.id === action.itemId); + const base = + index >= 0 && list[index].kind === "reasoning" + ? (list[index] as ConversationItem) + : { + id: action.itemId, + kind: "reasoning", + summary: "", + content: "", + }; + const updated: ConversationItem = { + ...base, + content: mergeStreamingText( + "content" in base ? base.content : "", + action.delta, + ), + } as ConversationItem; + const next = index >= 0 ? [...list] : [...list, updated]; + if (index >= 0) { + next[index] = updated; + } + return { + ...state, + itemsByThread: { + ...state.itemsByThread, + [action.threadId]: prepareThreadItems(next), + }, + }; + } + case "appendPlanDelta": { + const list = state.itemsByThread[action.threadId] ?? 
[]; + const index = list.findIndex((entry) => entry.id === action.itemId); + const base = + index >= 0 && list[index].kind === "tool" + ? list[index] + : { + id: action.itemId, + kind: "tool", + toolType: "plan", + title: "Plan", + detail: "", + status: "in_progress", + output: "", + }; + const existingOutput = base.kind === "tool" ? (base.output ?? "") : ""; + const updated: ConversationItem = { + ...(base as ConversationItem), + kind: "tool", + toolType: "plan", + title: "Plan", + detail: "Generating plan...", + status: "in_progress", + output: mergeStreamingText(existingOutput, action.delta), + } as ConversationItem; + const next = index >= 0 ? [...list] : [...list, updated]; + if (index >= 0) { + next[index] = updated; + } + return { + ...state, + itemsByThread: { + ...state.itemsByThread, + [action.threadId]: prepareThreadItems(next), + }, + }; + } + case "appendToolOutput": { + const list = state.itemsByThread[action.threadId] ?? []; + const index = list.findIndex((entry) => entry.id === action.itemId); + if (index < 0 || list[index].kind !== "tool") { + return state; + } + const existing = list[index]; + const updated: ConversationItem = { + ...existing, + output: mergeStreamingText(existing.output ?? "", action.delta), + } as ConversationItem; + const next = [...list]; + next[index] = updated; + return { + ...state, + itemsByThread: { + ...state.itemsByThread, + [action.threadId]: prepareThreadItems(next), + }, + }; + } + default: + return state; + } +} diff --git a/src/features/threads/hooks/threadReducer/threadLifecycleSlice.ts b/src/features/threads/hooks/threadReducer/threadLifecycleSlice.ts new file mode 100644 index 000000000..71999b619 --- /dev/null +++ b/src/features/threads/hooks/threadReducer/threadLifecycleSlice.ts @@ -0,0 +1,362 @@ +import type { ThreadSummary } from "@/types"; +import type { ThreadAction, ThreadState } from "../useThreadsReducer"; +import { prefersUpdatedSort } from "./common"; + +type ThreadStatus = ThreadState["threadStatusById"][string]; + +function statusEquals(previous: ThreadStatus, nextStatus: ThreadStatus) { + return ( + previous.isProcessing === nextStatus.isProcessing && + previous.hasUnread === nextStatus.hasUnread && + previous.isReviewing === nextStatus.isReviewing && + previous.processingStartedAt === nextStatus.processingStartedAt && + previous.lastDurationMs === nextStatus.lastDurationMs + ); +} + +export function reduceThreadLifecycle( + state: ThreadState, + action: ThreadAction, +): ThreadState { + switch (action.type) { + case "setActiveThreadId": + return { + ...state, + activeThreadIdByWorkspace: { + ...state.activeThreadIdByWorkspace, + [action.workspaceId]: action.threadId, + }, + threadStatusById: action.threadId + ? { + ...state.threadStatusById, + [action.threadId]: { + isProcessing: + state.threadStatusById[action.threadId]?.isProcessing ?? false, + hasUnread: false, + isReviewing: + state.threadStatusById[action.threadId]?.isReviewing ?? false, + processingStartedAt: + state.threadStatusById[action.threadId]?.processingStartedAt ?? + null, + lastDurationMs: + state.threadStatusById[action.threadId]?.lastDurationMs ?? null, + }, + } + : state.threadStatusById, + }; + case "ensureThread": { + const hidden = + state.hiddenThreadIdsByWorkspace[action.workspaceId]?.[action.threadId] ?? + false; + if (hidden) { + return state; + } + const list = state.threadsByWorkspace[action.workspaceId] ?? 
[]; + if (list.some((thread) => thread.id === action.threadId)) { + return state; + } + const thread: ThreadSummary = { + id: action.threadId, + name: "New Agent", + updatedAt: 0, + }; + return { + ...state, + threadsByWorkspace: { + ...state.threadsByWorkspace, + [action.workspaceId]: [thread, ...list], + }, + threadStatusById: { + ...state.threadStatusById, + [action.threadId]: { + isProcessing: false, + hasUnread: false, + isReviewing: false, + processingStartedAt: null, + lastDurationMs: null, + }, + }, + activeThreadIdByWorkspace: { + ...state.activeThreadIdByWorkspace, + [action.workspaceId]: + state.activeThreadIdByWorkspace[action.workspaceId] ?? action.threadId, + }, + }; + } + case "hideThread": { + const hiddenForWorkspace = + state.hiddenThreadIdsByWorkspace[action.workspaceId] ?? {}; + if (hiddenForWorkspace[action.threadId]) { + return state; + } + + const nextHiddenForWorkspace = { + ...hiddenForWorkspace, + [action.threadId]: true as const, + }; + + const list = state.threadsByWorkspace[action.workspaceId] ?? []; + const filtered = list.filter((thread) => thread.id !== action.threadId); + const nextActive = + state.activeThreadIdByWorkspace[action.workspaceId] === action.threadId + ? filtered[0]?.id ?? null + : state.activeThreadIdByWorkspace[action.workspaceId] ?? null; + + return { + ...state, + hiddenThreadIdsByWorkspace: { + ...state.hiddenThreadIdsByWorkspace, + [action.workspaceId]: nextHiddenForWorkspace, + }, + threadsByWorkspace: { + ...state.threadsByWorkspace, + [action.workspaceId]: filtered, + }, + activeThreadIdByWorkspace: { + ...state.activeThreadIdByWorkspace, + [action.workspaceId]: nextActive, + }, + }; + } + case "removeThread": { + const list = state.threadsByWorkspace[action.workspaceId] ?? []; + const filtered = list.filter((thread) => thread.id !== action.threadId); + const nextActive = + state.activeThreadIdByWorkspace[action.workspaceId] === action.threadId + ? filtered[0]?.id ?? null + : state.activeThreadIdByWorkspace[action.workspaceId] ?? null; + const { [action.threadId]: _, ...restItems } = state.itemsByThread; + const { [action.threadId]: __, ...restStatus } = state.threadStatusById; + const { [action.threadId]: ___, ...restTurns } = state.activeTurnIdByThread; + const { [action.threadId]: ____, ...restDiffs } = state.turnDiffByThread; + const { [action.threadId]: _____, ...restPlans } = state.planByThread; + const { [action.threadId]: ______, ...restParents } = state.threadParentById; + return { + ...state, + threadsByWorkspace: { + ...state.threadsByWorkspace, + [action.workspaceId]: filtered, + }, + itemsByThread: restItems, + threadStatusById: restStatus, + activeTurnIdByThread: restTurns, + turnDiffByThread: restDiffs, + planByThread: restPlans, + threadParentById: restParents, + activeThreadIdByWorkspace: { + ...state.activeThreadIdByWorkspace, + [action.workspaceId]: nextActive, + }, + }; + } + case "setThreadParent": { + if (!action.parentId || action.parentId === action.threadId) { + return state; + } + if (state.threadParentById[action.threadId] === action.parentId) { + return state; + } + return { + ...state, + threadParentById: { + ...state.threadParentById, + [action.threadId]: action.parentId, + }, + }; + } + case "markProcessing": { + const previous = state.threadStatusById[action.threadId]; + const wasProcessing = previous?.isProcessing ?? false; + const startedAt = previous?.processingStartedAt ?? null; + const lastDurationMs = previous?.lastDurationMs ?? null; + const hasUnread = previous?.hasUnread ?? 
false; + const isReviewing = previous?.isReviewing ?? false; + if (action.isProcessing) { + const nextStartedAt = + wasProcessing && startedAt ? startedAt : action.timestamp; + const nextStatus: ThreadStatus = { + isProcessing: true, + hasUnread, + isReviewing, + processingStartedAt: nextStartedAt, + lastDurationMs, + }; + if (previous && statusEquals(previous, nextStatus)) { + return state; + } + return { + ...state, + threadStatusById: { + ...state.threadStatusById, + [action.threadId]: nextStatus, + }, + }; + } + const nextDuration = + wasProcessing && startedAt + ? Math.max(0, action.timestamp - startedAt) + : lastDurationMs ?? null; + const nextStatus: ThreadStatus = { + isProcessing: false, + hasUnread, + isReviewing, + processingStartedAt: null, + lastDurationMs: nextDuration, + }; + if (previous && statusEquals(previous, nextStatus)) { + return state; + } + return { + ...state, + threadStatusById: { + ...state.threadStatusById, + [action.threadId]: nextStatus, + }, + }; + } + case "setActiveTurnId": + return { + ...state, + activeTurnIdByThread: { + ...state.activeTurnIdByThread, + [action.threadId]: action.turnId, + }, + }; + case "markReviewing": { + const previous = state.threadStatusById[action.threadId]; + const nextStatus: ThreadStatus = { + isProcessing: previous?.isProcessing ?? false, + hasUnread: previous?.hasUnread ?? false, + isReviewing: action.isReviewing, + processingStartedAt: previous?.processingStartedAt ?? null, + lastDurationMs: previous?.lastDurationMs ?? null, + }; + if (previous && statusEquals(previous, nextStatus)) { + return state; + } + return { + ...state, + threadStatusById: { + ...state.threadStatusById, + [action.threadId]: nextStatus, + }, + }; + } + case "markUnread": { + const previous = state.threadStatusById[action.threadId]; + const nextStatus: ThreadStatus = { + isProcessing: previous?.isProcessing ?? false, + hasUnread: action.hasUnread, + isReviewing: previous?.isReviewing ?? false, + processingStartedAt: previous?.processingStartedAt ?? null, + lastDurationMs: previous?.lastDurationMs ?? null, + }; + if (previous && statusEquals(previous, nextStatus)) { + return state; + } + return { + ...state, + threadStatusById: { + ...state.threadStatusById, + [action.threadId]: nextStatus, + }, + }; + } + case "setThreadName": { + const list = state.threadsByWorkspace[action.workspaceId] ?? []; + const next = list.map((thread) => + thread.id === action.threadId ? { ...thread, name: action.name } : thread, + ); + return { + ...state, + threadsByWorkspace: { + ...state.threadsByWorkspace, + [action.workspaceId]: next, + }, + }; + } + case "setThreadTimestamp": { + const list = state.threadsByWorkspace[action.workspaceId] ?? []; + if (!list.length) { + return state; + } + let didChange = false; + const next = list.map((thread) => { + if (thread.id !== action.threadId) { + return thread; + } + const current = thread.updatedAt ?? 0; + if (current >= action.timestamp) { + return thread; + } + didChange = true; + return { ...thread, updatedAt: action.timestamp }; + }); + if (!didChange) { + return state; + } + const sorted = prefersUpdatedSort(state, action.workspaceId) + ? [ + ...next.filter((thread) => thread.id === action.threadId), + ...next.filter((thread) => thread.id !== action.threadId), + ] + : next; + return { + ...state, + threadsByWorkspace: { + ...state.threadsByWorkspace, + [action.workspaceId]: sorted, + }, + }; + } + case "setThreads": { + const hidden = state.hiddenThreadIdsByWorkspace[action.workspaceId] ?? 
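The `markProcessing` case above also keeps per-thread timing: the first transition into processing records `processingStartedAt`, repeated processing updates preserve it, and the transition back to idle folds it into `lastDurationMs`. A condensed restatement of just that bookkeeping (a sketch, not the reducer itself):

```ts
// Sketch of the processing-status transition, omitting hasUnread/isReviewing.
type Status = {
  isProcessing: boolean;
  processingStartedAt: number | null;
  lastDurationMs: number | null;
};

function markProcessing(prev: Status, isProcessing: boolean, timestamp: number): Status {
  if (isProcessing) {
    // Keep the original start time across repeated "processing" updates.
    const startedAt =
      prev.isProcessing && prev.processingStartedAt ? prev.processingStartedAt : timestamp;
    return { isProcessing: true, processingStartedAt: startedAt, lastDurationMs: prev.lastDurationMs };
  }
  const lastDurationMs =
    prev.isProcessing && prev.processingStartedAt
      ? Math.max(0, timestamp - prev.processingStartedAt)
      : prev.lastDurationMs;
  return { isProcessing: false, processingStartedAt: null, lastDurationMs };
}

// e.g. start at t = 1_000, finish at t = 4_500 → lastDurationMs === 3_500
```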
{}; + const visibleThreads = action.threads.filter((thread) => !hidden[thread.id]); + return { + ...state, + threadsByWorkspace: { + ...state.threadsByWorkspace, + [action.workspaceId]: visibleThreads, + }, + threadSortKeyByWorkspace: { + ...state.threadSortKeyByWorkspace, + [action.workspaceId]: action.sortKey, + }, + }; + } + case "setThreadListLoading": + return { + ...state, + threadListLoadingByWorkspace: { + ...state.threadListLoadingByWorkspace, + [action.workspaceId]: action.isLoading, + }, + }; + case "setThreadResumeLoading": + return { + ...state, + threadResumeLoadingById: { + ...state.threadResumeLoadingById, + [action.threadId]: action.isLoading, + }, + }; + case "setThreadListPaging": + return { + ...state, + threadListPagingByWorkspace: { + ...state.threadListPagingByWorkspace, + [action.workspaceId]: action.isLoading, + }, + }; + case "setThreadListCursor": + return { + ...state, + threadListCursorByWorkspace: { + ...state.threadListCursorByWorkspace, + [action.workspaceId]: action.cursor, + }, + }; + default: + return state; + } +} diff --git a/src/features/threads/hooks/threadReducer/threadQueueSlice.ts b/src/features/threads/hooks/threadReducer/threadQueueSlice.ts new file mode 100644 index 000000000..2e3c3ead4 --- /dev/null +++ b/src/features/threads/hooks/threadReducer/threadQueueSlice.ts @@ -0,0 +1,51 @@ +import type { ThreadAction, ThreadState } from "../useThreadsReducer"; + +export function reduceThreadQueue(state: ThreadState, action: ThreadAction): ThreadState { + switch (action.type) { + case "addApproval": { + const exists = state.approvals.some( + (item) => + item.request_id === action.approval.request_id && + item.workspace_id === action.approval.workspace_id, + ); + if (exists) { + return state; + } + return { ...state, approvals: [...state.approvals, action.approval] }; + } + case "removeApproval": + return { + ...state, + approvals: state.approvals.filter( + (item) => + item.request_id !== action.requestId || + item.workspace_id !== action.workspaceId, + ), + }; + case "addUserInputRequest": { + const exists = state.userInputRequests.some( + (item) => + item.request_id === action.request.request_id && + item.workspace_id === action.request.workspace_id, + ); + if (exists) { + return state; + } + return { + ...state, + userInputRequests: [...state.userInputRequests, action.request], + }; + } + case "removeUserInputRequest": + return { + ...state, + userInputRequests: state.userInputRequests.filter( + (item) => + item.request_id !== action.requestId || + item.workspace_id !== action.workspaceId, + ), + }; + default: + return state; + } +} diff --git a/src/features/threads/hooks/threadReducer/threadSnapshotSlice.ts b/src/features/threads/hooks/threadReducer/threadSnapshotSlice.ts new file mode 100644 index 000000000..86799d005 --- /dev/null +++ b/src/features/threads/hooks/threadReducer/threadSnapshotSlice.ts @@ -0,0 +1,72 @@ +import type { ThreadAction, ThreadState } from "../useThreadsReducer"; + +export function reduceThreadSnapshots( + state: ThreadState, + action: ThreadAction, +): ThreadState { + switch (action.type) { + case "setLastAgentMessage": + if ( + state.lastAgentMessageByThread[action.threadId]?.timestamp >= action.timestamp + ) { + return state; + } + return { + ...state, + lastAgentMessageByThread: { + ...state.lastAgentMessageByThread, + [action.threadId]: { text: action.text, timestamp: action.timestamp }, + }, + }; + case "setThreadTokenUsage": + return { + ...state, + tokenUsageByThread: { + ...state.tokenUsageByThread, + 
[action.threadId]: action.tokenUsage, + }, + }; + case "setRateLimits": + return { + ...state, + rateLimitsByWorkspace: { + ...state.rateLimitsByWorkspace, + [action.workspaceId]: action.rateLimits, + }, + }; + case "setAccountInfo": + return { + ...state, + accountByWorkspace: { + ...state.accountByWorkspace, + [action.workspaceId]: action.account, + }, + }; + case "setThreadTurnDiff": + return { + ...state, + turnDiffByThread: { + ...state.turnDiffByThread, + [action.threadId]: action.diff, + }, + }; + case "setThreadPlan": + return { + ...state, + planByThread: { + ...state.planByThread, + [action.threadId]: action.plan, + }, + }; + case "clearThreadPlan": + return { + ...state, + planByThread: { + ...state.planByThread, + [action.threadId]: null, + }, + }; + default: + return state; + } +} diff --git a/src/features/threads/hooks/useCopyThread.ts b/src/features/threads/hooks/useCopyThread.ts index 1d9453a60..e43ca1219 100644 --- a/src/features/threads/hooks/useCopyThread.ts +++ b/src/features/threads/hooks/useCopyThread.ts @@ -1,6 +1,6 @@ import { useCallback } from "react"; -import { buildThreadTranscript } from "../../../utils/threadText"; -import type { ConversationItem, DebugEntry } from "../../../types"; +import { buildThreadTranscript } from "@utils/threadText"; +import type { ConversationItem, DebugEntry } from "@/types"; type CopyThreadOptions = { activeItems: ConversationItem[]; diff --git a/src/features/threads/hooks/useQueuedSend.test.tsx b/src/features/threads/hooks/useQueuedSend.test.tsx index 46e0c289c..7ddc49c07 100644 --- a/src/features/threads/hooks/useQueuedSend.test.tsx +++ b/src/features/threads/hooks/useQueuedSend.test.tsx @@ -1,7 +1,7 @@ // @vitest-environment jsdom import { act, renderHook } from "@testing-library/react"; import { describe, expect, it, vi } from "vitest"; -import type { WorkspaceInfo } from "../../../types"; +import type { WorkspaceInfo } from "@/types"; import { useQueuedSend } from "./useQueuedSend"; const workspace: WorkspaceInfo = { diff --git a/src/features/threads/hooks/useQueuedSend.ts b/src/features/threads/hooks/useQueuedSend.ts index 335809a9d..f6208788b 100644 --- a/src/features/threads/hooks/useQueuedSend.ts +++ b/src/features/threads/hooks/useQueuedSend.ts @@ -1,5 +1,5 @@ import { useCallback, useEffect, useMemo, useState } from "react"; -import type { AppMention, QueuedMessage, WorkspaceInfo } from "../../../types"; +import type { AppMention, QueuedMessage, WorkspaceInfo } from "@/types"; type UseQueuedSendOptions = { activeThreadId: string | null; diff --git a/src/features/threads/hooks/useRenameThreadPrompt.ts b/src/features/threads/hooks/useRenameThreadPrompt.ts index 23d9630e8..5e158665a 100644 --- a/src/features/threads/hooks/useRenameThreadPrompt.ts +++ b/src/features/threads/hooks/useRenameThreadPrompt.ts @@ -1,5 +1,5 @@ import { useCallback, useState } from "react"; -import type { ThreadSummary } from "../../../types"; +import type { ThreadSummary } from "@/types"; type RenamePromptState = { workspaceId: string; diff --git a/src/features/threads/hooks/useReviewPrompt.ts b/src/features/threads/hooks/useReviewPrompt.ts index a69b5dfcd..38fa63568 100644 --- a/src/features/threads/hooks/useReviewPrompt.ts +++ b/src/features/threads/hooks/useReviewPrompt.ts @@ -5,8 +5,8 @@ import type { GitLogEntry, ReviewTarget, WorkspaceInfo, -} from "../../../types"; -import { getGitLog, listGitBranches } from "../../../services/tauri"; +} from "@/types"; +import { getGitLog, listGitBranches } from "@services/tauri"; export type 
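Each slice reducer introduced in this change (items, lifecycle, queue, snapshots) returns the incoming state unchanged for actions it does not handle, so a root reducer can thread the state through all of them in sequence. How `useThreadsReducer` actually composes them is not shown in this section; one plausible sketch, assuming simple chaining:

```ts
// Hypothetical composition inside src/features/threads/hooks/: each slice either
// handles the action or passes the state through untouched.
import { reduceThreadItems } from "./threadReducer/threadItemsSlice";
import { reduceThreadLifecycle } from "./threadReducer/threadLifecycleSlice";
import { reduceThreadQueue } from "./threadReducer/threadQueueSlice";
import { reduceThreadSnapshots } from "./threadReducer/threadSnapshotSlice";
import type { ThreadAction, ThreadState } from "./useThreadsReducer";

export function reduceThreads(state: ThreadState, action: ThreadAction): ThreadState {
  return [reduceThreadItems, reduceThreadLifecycle, reduceThreadQueue, reduceThreadSnapshots].reduce(
    (nextState, slice) => slice(nextState, action),
    state,
  );
}
```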
ReviewPromptStep = "preset" | "baseBranch" | "commit" | "custom"; diff --git a/src/features/threads/hooks/useThreadAccountInfo.test.tsx b/src/features/threads/hooks/useThreadAccountInfo.test.tsx index 353dda9af..2ff241a3d 100644 --- a/src/features/threads/hooks/useThreadAccountInfo.test.tsx +++ b/src/features/threads/hooks/useThreadAccountInfo.test.tsx @@ -1,10 +1,10 @@ // @vitest-environment jsdom import { renderHook, waitFor } from "@testing-library/react"; import { describe, expect, it, vi } from "vitest"; -import { getAccountInfo } from "../../../services/tauri"; +import { getAccountInfo } from "@services/tauri"; import { useThreadAccountInfo } from "./useThreadAccountInfo"; -vi.mock("../../../services/tauri", () => ({ +vi.mock("@services/tauri", () => ({ getAccountInfo: vi.fn(), })); diff --git a/src/features/threads/hooks/useThreadAccountInfo.ts b/src/features/threads/hooks/useThreadAccountInfo.ts index 02dc08772..8795ff02f 100644 --- a/src/features/threads/hooks/useThreadAccountInfo.ts +++ b/src/features/threads/hooks/useThreadAccountInfo.ts @@ -1,6 +1,6 @@ import { useCallback, useEffect } from "react"; -import type { AccountSnapshot, DebugEntry } from "../../../types"; -import { getAccountInfo } from "../../../services/tauri"; +import type { AccountSnapshot, DebugEntry } from "@/types"; +import { getAccountInfo } from "@services/tauri"; import type { ThreadAction } from "./useThreadsReducer"; type UseThreadAccountInfoOptions = { diff --git a/src/features/threads/hooks/useThreadActions.test.tsx b/src/features/threads/hooks/useThreadActions.test.tsx index 9e70d8bbe..fd602cc4c 100644 --- a/src/features/threads/hooks/useThreadActions.test.tsx +++ b/src/features/threads/hooks/useThreadActions.test.tsx @@ -1,14 +1,14 @@ // @vitest-environment jsdom import { act, renderHook } from "@testing-library/react"; import { beforeEach, describe, expect, it, vi } from "vitest"; -import type { ConversationItem, WorkspaceInfo } from "../../../types"; +import type { ConversationItem, WorkspaceInfo } from "@/types"; import { archiveThread, forkThread, listThreads, resumeThread, startThread, -} from "../../../services/tauri"; +} from "@services/tauri"; import { buildItemsFromThread, getThreadCreatedTimestamp, @@ -16,11 +16,11 @@ import { isReviewingFromThread, mergeThreadItems, previewThreadName, -} from "../../../utils/threadItems"; -import { saveThreadActivity } from "../utils/threadStorage"; +} from "@utils/threadItems"; +import { saveThreadActivity } from "@threads/utils/threadStorage"; import { useThreadActions } from "./useThreadActions"; -vi.mock("../../../services/tauri", () => ({ +vi.mock("@services/tauri", () => ({ startThread: vi.fn(), forkThread: vi.fn(), resumeThread: vi.fn(), @@ -28,7 +28,7 @@ vi.mock("../../../services/tauri", () => ({ archiveThread: vi.fn(), })); -vi.mock("../../../utils/threadItems", () => ({ +vi.mock("@utils/threadItems", () => ({ buildItemsFromThread: vi.fn(), getThreadCreatedTimestamp: vi.fn(), getThreadTimestamp: vi.fn(), @@ -37,7 +37,7 @@ vi.mock("../../../utils/threadItems", () => ({ previewThreadName: vi.fn(), })); -vi.mock("../utils/threadStorage", () => ({ +vi.mock("@threads/utils/threadStorage", () => ({ saveThreadActivity: vi.fn(), })); diff --git a/src/features/threads/hooks/useThreadActions.ts b/src/features/threads/hooks/useThreadActions.ts index 21c003f74..9c5719652 100644 --- a/src/features/threads/hooks/useThreadActions.ts +++ b/src/features/threads/hooks/useThreadActions.ts @@ -6,14 +6,14 @@ import type { ThreadListSortKey, ThreadSummary, 
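The import rewrites in these hunks (for example `../../../types` to `@/types` and `../utils/threadStorage` to `@threads/utils/threadStorage`) presuppose matching path aliases in the build and test tooling; the alias definitions themselves are outside this diff. A sketch of the Vite side of such a mapping, with the same entries assumed to exist as `paths` in `tsconfig.json` and in the Vitest config so that mocked module ids like `@services/tauri` resolve:

```ts
// vite.config.ts (sketch): alias names taken from the rewritten imports, target paths assumed.
import { fileURLToPath, URL } from "node:url";
import { defineConfig } from "vite";

export default defineConfig({
  resolve: {
    alias: {
      "@": fileURLToPath(new URL("./src", import.meta.url)),
      "@services": fileURLToPath(new URL("./src/services", import.meta.url)),
      "@utils": fileURLToPath(new URL("./src/utils", import.meta.url)),
      "@settings": fileURLToPath(new URL("./src/features/settings", import.meta.url)),
      "@threads": fileURLToPath(new URL("./src/features/threads", import.meta.url)),
    },
  },
});
```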
WorkspaceInfo, -} from "../../../types"; +} from "@/types"; import { archiveThread as archiveThreadService, forkThread as forkThreadService, listThreads as listThreadsService, resumeThread as resumeThreadService, startThread as startThreadService, -} from "../../../services/tauri"; +} from "@services/tauri"; import { buildItemsFromThread, getThreadCreatedTimestamp, @@ -21,12 +21,16 @@ import { isReviewingFromThread, mergeThreadItems, previewThreadName, -} from "../../../utils/threadItems"; +} from "@utils/threadItems"; import { asString, normalizeRootPath, -} from "../utils/threadNormalize"; -import { saveThreadActivity } from "../utils/threadStorage"; +} from "@threads/utils/threadNormalize"; +import { + getParentThreadIdFromSource, + getResumedActiveTurnId, +} from "@threads/utils/threadRpc"; +import { saveThreadActivity } from "@threads/utils/threadStorage"; import type { ThreadAction, ThreadState } from "./useThreadsReducer"; const THREAD_LIST_TARGET_COUNT = 20; @@ -35,68 +39,6 @@ const THREAD_LIST_MAX_PAGES_WITH_ACTIVITY = 8; const THREAD_LIST_MAX_PAGES_WITHOUT_ACTIVITY = 3; const THREAD_LIST_MAX_PAGES_OLDER = 6; -function asRecord(value: unknown): Record | null { - if (!value || typeof value !== "object") { - return null; - } - return value as Record; -} - -function getParentThreadIdFromSource(source: unknown): string | null { - const sourceRecord = asRecord(source); - if (!sourceRecord) { - return null; - } - const subAgent = asRecord(sourceRecord.subAgent ?? sourceRecord.sub_agent); - if (!subAgent) { - return null; - } - const threadSpawn = asRecord(subAgent.thread_spawn ?? subAgent.threadSpawn); - if (!threadSpawn) { - return null; - } - const parentId = asString( - threadSpawn.parent_thread_id ?? threadSpawn.parentThreadId, - ); - return parentId || null; -} - -function normalizeTurnStatus(value: unknown): string { - return String(value ?? "") - .trim() - .toLowerCase() - .replace(/[\s_-]/g, ""); -} - -function getResumedActiveTurnId(thread: Record): string | null { - const turns = Array.isArray(thread.turns) - ? (thread.turns as Array>) - : []; - for (let index = turns.length - 1; index >= 0; index -= 1) { - const turn = turns[index]; - if (!turn || typeof turn !== "object") { - continue; - } - const status = normalizeTurnStatus( - turn.status ?? turn.turnStatus ?? turn.turn_status, - ); - const isInProgress = - status === "inprogress" || - status === "running" || - status === "processing" || - status === "pending" || - status === "started"; - if (!isInProgress) { - continue; - } - const turnId = asString(turn.id ?? turn.turnId ?? 
turn.turn_id); - if (turnId) { - return turnId; - } - } - return null; -} - type UseThreadActionsOptions = { dispatch: Dispatch; itemsByThread: ThreadState["itemsByThread"]; diff --git a/src/features/threads/hooks/useThreadApprovalEvents.test.tsx b/src/features/threads/hooks/useThreadApprovalEvents.test.tsx index 33152036e..ff7442388 100644 --- a/src/features/threads/hooks/useThreadApprovalEvents.test.tsx +++ b/src/features/threads/hooks/useThreadApprovalEvents.test.tsx @@ -1,19 +1,19 @@ // @vitest-environment jsdom import { act, renderHook } from "@testing-library/react"; import { beforeEach, describe, expect, it, vi } from "vitest"; -import type { ApprovalRequest } from "../../../types"; -import { respondToServerRequest } from "../../../services/tauri"; +import type { ApprovalRequest } from "@/types"; +import { respondToServerRequest } from "@services/tauri"; import { getApprovalCommandInfo, matchesCommandPrefix, -} from "../../../utils/approvalRules"; +} from "@utils/approvalRules"; import { useThreadApprovalEvents } from "./useThreadApprovalEvents"; -vi.mock("../../../services/tauri", () => ({ +vi.mock("@services/tauri", () => ({ respondToServerRequest: vi.fn(), })); -vi.mock("../../../utils/approvalRules", () => ({ +vi.mock("@utils/approvalRules", () => ({ getApprovalCommandInfo: vi.fn(), matchesCommandPrefix: vi.fn(), })); diff --git a/src/features/threads/hooks/useThreadApprovalEvents.ts b/src/features/threads/hooks/useThreadApprovalEvents.ts index 5cea2709d..038bf3892 100644 --- a/src/features/threads/hooks/useThreadApprovalEvents.ts +++ b/src/features/threads/hooks/useThreadApprovalEvents.ts @@ -1,11 +1,11 @@ import { useCallback } from "react"; import type { Dispatch, MutableRefObject } from "react"; -import type { ApprovalRequest } from "../../../types"; +import type { ApprovalRequest } from "@/types"; import { getApprovalCommandInfo, matchesCommandPrefix, -} from "../../../utils/approvalRules"; -import { respondToServerRequest } from "../../../services/tauri"; +} from "@utils/approvalRules"; +import { respondToServerRequest } from "@services/tauri"; import type { ThreadAction } from "./useThreadsReducer"; type UseThreadApprovalEventsOptions = { diff --git a/src/features/threads/hooks/useThreadApprovals.ts b/src/features/threads/hooks/useThreadApprovals.ts index 5d18f2a2a..2aa0d9bd6 100644 --- a/src/features/threads/hooks/useThreadApprovals.ts +++ b/src/features/threads/hooks/useThreadApprovals.ts @@ -1,11 +1,11 @@ import { useCallback, useRef } from "react"; import type { Dispatch } from "react"; -import type { ApprovalRequest, DebugEntry } from "../../../types"; -import { normalizeCommandTokens } from "../../../utils/approvalRules"; +import type { ApprovalRequest, DebugEntry } from "@/types"; +import { normalizeCommandTokens } from "@utils/approvalRules"; import { rememberApprovalRule, respondToServerRequest, -} from "../../../services/tauri"; +} from "@services/tauri"; import type { ThreadAction } from "./useThreadsReducer"; type UseThreadApprovalsOptions = { diff --git a/src/features/threads/hooks/useThreadCodexParams.test.tsx b/src/features/threads/hooks/useThreadCodexParams.test.tsx index 397b2eb0d..037c27e01 100644 --- a/src/features/threads/hooks/useThreadCodexParams.test.tsx +++ b/src/features/threads/hooks/useThreadCodexParams.test.tsx @@ -1,7 +1,7 @@ // @vitest-environment jsdom import { act, renderHook, waitFor } from "@testing-library/react"; import { beforeEach, describe, expect, it } from "vitest"; -import { STORAGE_KEY_THREAD_CODEX_PARAMS } from 
"../utils/threadStorage"; +import { STORAGE_KEY_THREAD_CODEX_PARAMS } from "@threads/utils/threadStorage"; import { useThreadCodexParams } from "./useThreadCodexParams"; describe("useThreadCodexParams", () => { diff --git a/src/features/threads/hooks/useThreadCodexParams.ts b/src/features/threads/hooks/useThreadCodexParams.ts index 2ca063b1f..9bdd4cbfa 100644 --- a/src/features/threads/hooks/useThreadCodexParams.ts +++ b/src/features/threads/hooks/useThreadCodexParams.ts @@ -1,5 +1,5 @@ import { useCallback, useEffect, useMemo, useRef, useState } from "react"; -import type { AccessMode } from "../../../types"; +import type { AccessMode } from "@/types"; import { STORAGE_KEY_THREAD_CODEX_PARAMS, type ThreadCodexParams, @@ -7,9 +7,9 @@ import { loadThreadCodexParams, makeThreadCodexParamsKey, saveThreadCodexParams, -} from "../utils/threadStorage"; +} from "@threads/utils/threadStorage"; -export type ThreadCodexParamsPatch = Partial< +type ThreadCodexParamsPatch = Partial< Pick >; @@ -122,4 +122,3 @@ export function useThreadCodexParams(): UseThreadCodexParamsResult { [deleteThreadCodexParams, getThreadCodexParams, patchThreadCodexParams, version], ); } - diff --git a/src/features/threads/hooks/useThreadEventHandlers.ts b/src/features/threads/hooks/useThreadEventHandlers.ts index 714fa4ef0..a4044fd08 100644 --- a/src/features/threads/hooks/useThreadEventHandlers.ts +++ b/src/features/threads/hooks/useThreadEventHandlers.ts @@ -1,7 +1,7 @@ import { useCallback, useMemo } from "react"; import type { Dispatch, MutableRefObject } from "react"; -import type { AppServerEvent, DebugEntry, TurnPlan } from "../../../types"; -import { getAppServerRawMethod } from "../../../utils/appServerEvents"; +import type { AppServerEvent, DebugEntry, TurnPlan } from "@/types"; +import { getAppServerRawMethod } from "@utils/appServerEvents"; import { useThreadApprovalEvents } from "./useThreadApprovalEvents"; import { useThreadItemEvents } from "./useThreadItemEvents"; import { useThreadTurnEvents } from "./useThreadTurnEvents"; diff --git a/src/features/threads/hooks/useThreadItemEvents.test.ts b/src/features/threads/hooks/useThreadItemEvents.test.ts index 0ad8216dd..98c7bca16 100644 --- a/src/features/threads/hooks/useThreadItemEvents.test.ts +++ b/src/features/threads/hooks/useThreadItemEvents.test.ts @@ -1,10 +1,10 @@ // @vitest-environment jsdom import { act, renderHook } from "@testing-library/react"; import { beforeEach, describe, expect, it, vi } from "vitest"; -import { buildConversationItem } from "../../../utils/threadItems"; +import { buildConversationItem } from "@utils/threadItems"; import { useThreadItemEvents } from "./useThreadItemEvents"; -vi.mock("../../../utils/threadItems", () => ({ +vi.mock("@utils/threadItems", () => ({ buildConversationItem: vi.fn(), })); diff --git a/src/features/threads/hooks/useThreadItemEvents.ts b/src/features/threads/hooks/useThreadItemEvents.ts index 133c155aa..e3b69d44c 100644 --- a/src/features/threads/hooks/useThreadItemEvents.ts +++ b/src/features/threads/hooks/useThreadItemEvents.ts @@ -1,7 +1,7 @@ import { useCallback } from "react"; import type { Dispatch } from "react"; -import { buildConversationItem } from "../../../utils/threadItems"; -import { asString } from "../utils/threadNormalize"; +import { buildConversationItem } from "@utils/threadItems"; +import { asString } from "@threads/utils/threadNormalize"; import type { ThreadAction } from "./useThreadsReducer"; type UseThreadItemEventsOptions = { diff --git a/src/features/threads/hooks/useThreadLinking.ts 
b/src/features/threads/hooks/useThreadLinking.ts index e9b2666ce..adbc64d75 100644 --- a/src/features/threads/hooks/useThreadLinking.ts +++ b/src/features/threads/hooks/useThreadLinking.ts @@ -1,7 +1,7 @@ import { useCallback } from "react"; import type { Dispatch } from "react"; import type { ThreadAction } from "./useThreadsReducer"; -import { asString, normalizeStringList } from "../utils/threadNormalize"; +import { asString, normalizeStringList } from "@threads/utils/threadNormalize"; type UseThreadLinkingOptions = { dispatch: Dispatch; diff --git a/src/features/threads/hooks/useThreadMessaging.test.tsx b/src/features/threads/hooks/useThreadMessaging.test.tsx index de219d1f2..2d726c325 100644 --- a/src/features/threads/hooks/useThreadMessaging.test.tsx +++ b/src/features/threads/hooks/useThreadMessaging.test.tsx @@ -10,8 +10,8 @@ import { getAppsList as getAppsListService, listMcpServerStatus as listMcpServerStatusService, compactThread as compactThreadService, -} from "../../../services/tauri"; -import type { WorkspaceInfo } from "../../../types"; +} from "@services/tauri"; +import type { WorkspaceInfo } from "@/types"; import { useThreadMessaging } from "./useThreadMessaging"; vi.mock("@sentry/react", () => ({ @@ -20,7 +20,7 @@ vi.mock("@sentry/react", () => ({ }, })); -vi.mock("../../../services/tauri", () => ({ +vi.mock("@services/tauri", () => ({ sendUserMessage: vi.fn(), steerTurn: vi.fn(), startReview: vi.fn(), diff --git a/src/features/threads/hooks/useThreadMessaging.ts b/src/features/threads/hooks/useThreadMessaging.ts index 9efac4907..2b7fdeeb8 100644 --- a/src/features/threads/hooks/useThreadMessaging.ts +++ b/src/features/threads/hooks/useThreadMessaging.ts @@ -9,7 +9,7 @@ import type { DebugEntry, ReviewTarget, WorkspaceInfo, -} from "../../../types"; +} from "@/types"; import { compactThread as compactThreadService, sendUserMessage as sendUserMessageService, @@ -18,17 +18,18 @@ import { interruptTurn as interruptTurnService, getAppsList as getAppsListService, listMcpServerStatus as listMcpServerStatusService, -} from "../../../services/tauri"; -import { expandCustomPromptText } from "../../../utils/customPrompts"; +} from "@services/tauri"; +import { expandCustomPromptText } from "@utils/customPrompts"; import { asString, extractReviewThreadId, extractRpcErrorMessage, parseReviewTarget, -} from "../utils/threadNormalize"; +} from "@threads/utils/threadNormalize"; +import { isUnsupportedTurnSteerError } from "@threads/utils/threadRpc"; import type { ThreadAction, ThreadState } from "./useThreadsReducer"; import { useReviewPrompt } from "./useReviewPrompt"; -import { formatRelativeTime } from "../../../utils/time"; +import { formatRelativeTime } from "@utils/time"; type SendMessageOptions = { skipPromptExpansion?: boolean; @@ -78,16 +79,6 @@ type UseThreadMessagingOptions = { ) => void; }; -function isUnsupportedTurnSteerError(message: string): boolean { - const normalized = message.toLowerCase(); - const mentionsSteerMethod = - normalized.includes("turn/steer") || normalized.includes("turn_steer"); - return normalized.includes("unknown variant `turn/steer`") - || normalized.includes("unknown variant \"turn/steer\"") - || (normalized.includes("unknown request") && mentionsSteerMethod) - || (normalized.includes("unknown method") && mentionsSteerMethod); -} - export function useThreadMessaging({ activeWorkspace, activeThreadId, diff --git a/src/features/threads/hooks/useThreadRateLimits.test.tsx b/src/features/threads/hooks/useThreadRateLimits.test.tsx index 
3af511b97..d04b6619a 100644 --- a/src/features/threads/hooks/useThreadRateLimits.test.tsx +++ b/src/features/threads/hooks/useThreadRateLimits.test.tsx @@ -1,11 +1,11 @@ // @vitest-environment jsdom import { act, renderHook, waitFor } from "@testing-library/react"; import { beforeEach, describe, expect, it, vi } from "vitest"; -import { getAccountRateLimits } from "../../../services/tauri"; -import { normalizeRateLimits } from "../utils/threadNormalize"; +import { getAccountRateLimits } from "@services/tauri"; +import { normalizeRateLimits } from "@threads/utils/threadNormalize"; import { useThreadRateLimits } from "./useThreadRateLimits"; -vi.mock("../../../services/tauri", () => ({ +vi.mock("@services/tauri", () => ({ getAccountRateLimits: vi.fn(), })); diff --git a/src/features/threads/hooks/useThreadRateLimits.ts b/src/features/threads/hooks/useThreadRateLimits.ts index 47d14e29b..707f73e70 100644 --- a/src/features/threads/hooks/useThreadRateLimits.ts +++ b/src/features/threads/hooks/useThreadRateLimits.ts @@ -1,7 +1,7 @@ import { useCallback, useEffect } from "react"; -import type { DebugEntry } from "../../../types"; -import { getAccountRateLimits } from "../../../services/tauri"; -import { normalizeRateLimits } from "../utils/threadNormalize"; +import type { DebugEntry } from "@/types"; +import { getAccountRateLimits } from "@services/tauri"; +import { normalizeRateLimits } from "@threads/utils/threadNormalize"; import type { ThreadAction } from "./useThreadsReducer"; type UseThreadRateLimitsOptions = { diff --git a/src/features/threads/hooks/useThreadSelectors.test.tsx b/src/features/threads/hooks/useThreadSelectors.test.tsx index 3063bc01e..6b5647f0c 100644 --- a/src/features/threads/hooks/useThreadSelectors.test.tsx +++ b/src/features/threads/hooks/useThreadSelectors.test.tsx @@ -1,7 +1,7 @@ // @vitest-environment jsdom import { renderHook } from "@testing-library/react"; import { describe, expect, it } from "vitest"; -import type { ConversationItem } from "../../../types"; +import type { ConversationItem } from "@/types"; import { useThreadSelectors } from "./useThreadSelectors"; const messageItem: ConversationItem = { diff --git a/src/features/threads/hooks/useThreadSelectors.ts b/src/features/threads/hooks/useThreadSelectors.ts index a8b7c5c67..a63a588ef 100644 --- a/src/features/threads/hooks/useThreadSelectors.ts +++ b/src/features/threads/hooks/useThreadSelectors.ts @@ -1,5 +1,5 @@ import { useMemo } from "react"; -import type { ConversationItem } from "../../../types"; +import type { ConversationItem } from "@/types"; import type { ThreadState } from "./useThreadsReducer"; type UseThreadSelectorsOptions = { diff --git a/src/features/threads/hooks/useThreadStorage.test.tsx b/src/features/threads/hooks/useThreadStorage.test.tsx index a3fc11e27..60deac8df 100644 --- a/src/features/threads/hooks/useThreadStorage.test.tsx +++ b/src/features/threads/hooks/useThreadStorage.test.tsx @@ -9,10 +9,10 @@ import { loadThreadActivity, savePinnedThreads, saveThreadActivity, -} from "../utils/threadStorage"; +} from "@threads/utils/threadStorage"; import { useThreadStorage } from "./useThreadStorage"; -vi.mock("../utils/threadStorage", () => ({ +vi.mock("@threads/utils/threadStorage", () => ({ MAX_PINS_SOFT_LIMIT: 2, STORAGE_KEY_CUSTOM_NAMES: "custom-names", STORAGE_KEY_PINNED_THREADS: "pinned-threads", diff --git a/src/features/threads/hooks/useThreadStorage.ts b/src/features/threads/hooks/useThreadStorage.ts index 4dbd00fc9..54812e236 100644 --- 
a/src/features/threads/hooks/useThreadStorage.ts +++ b/src/features/threads/hooks/useThreadStorage.ts @@ -14,9 +14,9 @@ import { makePinKey, savePinnedThreads, saveThreadActivity, -} from "../utils/threadStorage"; +} from "@threads/utils/threadStorage"; -export type UseThreadStorageResult = { +type UseThreadStorageResult = { customNamesRef: MutableRefObject; pinnedThreadsRef: MutableRefObject; threadActivityRef: MutableRefObject; diff --git a/src/features/threads/hooks/useThreadTitleAutogeneration.test.tsx b/src/features/threads/hooks/useThreadTitleAutogeneration.test.tsx index 98fcad5e0..982a44d6e 100644 --- a/src/features/threads/hooks/useThreadTitleAutogeneration.test.tsx +++ b/src/features/threads/hooks/useThreadTitleAutogeneration.test.tsx @@ -1,11 +1,11 @@ // @vitest-environment jsdom import { act, renderHook } from "@testing-library/react"; import { describe, expect, it, vi, beforeEach } from "vitest"; -import type { ConversationItem, ThreadSummary } from "../../../types"; -import { generateRunMetadata } from "../../../services/tauri"; +import type { ConversationItem, ThreadSummary } from "@/types"; +import { generateRunMetadata } from "@services/tauri"; import { useThreadTitleAutogeneration } from "./useThreadTitleAutogeneration"; -vi.mock("../../../services/tauri", () => ({ +vi.mock("@services/tauri", () => ({ generateRunMetadata: vi.fn(), })); diff --git a/src/features/threads/hooks/useThreadTitleAutogeneration.ts b/src/features/threads/hooks/useThreadTitleAutogeneration.ts index 3891f7a42..e90bc141e 100644 --- a/src/features/threads/hooks/useThreadTitleAutogeneration.ts +++ b/src/features/threads/hooks/useThreadTitleAutogeneration.ts @@ -1,7 +1,7 @@ import { useCallback, useRef } from "react"; import type { MutableRefObject } from "react"; -import type { ConversationItem, DebugEntry, ThreadSummary } from "../../../types"; -import { generateRunMetadata } from "../../../services/tauri"; +import type { ConversationItem, DebugEntry, ThreadSummary } from "@/types"; +import { generateRunMetadata } from "@services/tauri"; const MAX_THREAD_NAME_LENGTH = 38; const MAX_PROMPT_CHARS = 1200; diff --git a/src/features/threads/hooks/useThreadTurnEvents.test.tsx b/src/features/threads/hooks/useThreadTurnEvents.test.tsx index 2212a4b5d..5513cb88a 100644 --- a/src/features/threads/hooks/useThreadTurnEvents.test.tsx +++ b/src/features/threads/hooks/useThreadTurnEvents.test.tsx @@ -1,20 +1,20 @@ // @vitest-environment jsdom import { act, renderHook } from "@testing-library/react"; import { beforeEach, describe, expect, it, vi } from "vitest"; -import type { TurnPlan } from "../../../types"; -import { interruptTurn } from "../../../services/tauri"; +import type { TurnPlan } from "@/types"; +import { interruptTurn } from "@services/tauri"; import { normalizePlanUpdate, normalizeRateLimits, normalizeTokenUsage, -} from "../utils/threadNormalize"; +} from "@threads/utils/threadNormalize"; import { useThreadTurnEvents } from "./useThreadTurnEvents"; -vi.mock("../../../services/tauri", () => ({ +vi.mock("@services/tauri", () => ({ interruptTurn: vi.fn(), })); -vi.mock("../utils/threadNormalize", () => ({ +vi.mock("@threads/utils/threadNormalize", () => ({ asString: (value: unknown) => typeof value === "string" ? value : value ? 
String(value) : "", normalizePlanUpdate: vi.fn(), diff --git a/src/features/threads/hooks/useThreadTurnEvents.ts b/src/features/threads/hooks/useThreadTurnEvents.ts index dac3dbda5..f3cec47a2 100644 --- a/src/features/threads/hooks/useThreadTurnEvents.ts +++ b/src/features/threads/hooks/useThreadTurnEvents.ts @@ -1,14 +1,14 @@ import { useCallback } from "react"; import type { Dispatch, MutableRefObject } from "react"; -import type { TurnPlan } from "../../../types"; -import { interruptTurn as interruptTurnService } from "../../../services/tauri"; -import { getThreadTimestamp } from "../../../utils/threadItems"; +import type { TurnPlan } from "@/types"; +import { interruptTurn as interruptTurnService } from "@services/tauri"; +import { getThreadTimestamp } from "@utils/threadItems"; import { asString, normalizePlanUpdate, normalizeRateLimits, normalizeTokenUsage, -} from "../utils/threadNormalize"; +} from "@threads/utils/threadNormalize"; import type { ThreadAction } from "./useThreadsReducer"; type UseThreadTurnEventsOptions = { diff --git a/src/features/threads/hooks/useThreadUserInput.ts b/src/features/threads/hooks/useThreadUserInput.ts index db5bf193f..fd69ca5ba 100644 --- a/src/features/threads/hooks/useThreadUserInput.ts +++ b/src/features/threads/hooks/useThreadUserInput.ts @@ -1,7 +1,7 @@ import { useCallback } from "react"; import type { Dispatch } from "react"; -import type { RequestUserInputRequest, RequestUserInputResponse } from "../../../types"; -import { respondToUserInputRequest } from "../../../services/tauri"; +import type { RequestUserInputRequest, RequestUserInputResponse } from "@/types"; +import { respondToUserInputRequest } from "@services/tauri"; import type { ThreadAction } from "./useThreadsReducer"; type UseThreadUserInputOptions = { diff --git a/src/features/threads/hooks/useThreadUserInputEvents.ts b/src/features/threads/hooks/useThreadUserInputEvents.ts index c2a164af3..6c772e218 100644 --- a/src/features/threads/hooks/useThreadUserInputEvents.ts +++ b/src/features/threads/hooks/useThreadUserInputEvents.ts @@ -1,6 +1,6 @@ import { useCallback } from "react"; import type { Dispatch } from "react"; -import type { RequestUserInputRequest } from "../../../types"; +import type { RequestUserInputRequest } from "@/types"; import type { ThreadAction } from "./useThreadsReducer"; type UseThreadUserInputEventsOptions = { diff --git a/src/features/threads/hooks/useThreads.integration.test.tsx b/src/features/threads/hooks/useThreads.integration.test.tsx index ba0e25e7b..c3e3fa263 100644 --- a/src/features/threads/hooks/useThreads.integration.test.tsx +++ b/src/features/threads/hooks/useThreads.integration.test.tsx @@ -1,30 +1,30 @@ // @vitest-environment jsdom import { act, renderHook, waitFor } from "@testing-library/react"; import { afterEach, beforeEach, describe, expect, it, vi } from "vitest"; -import type { WorkspaceInfo } from "../../../types"; -import type { useAppServerEvents } from "../../app/hooks/useAppServerEvents"; -import { useThreadRows } from "../../app/hooks/useThreadRows"; +import type { WorkspaceInfo } from "@/types"; +import type { useAppServerEvents } from "@app/hooks/useAppServerEvents"; +import { useThreadRows } from "@app/hooks/useThreadRows"; import { interruptTurn, listThreads, resumeThread, setThreadName, startReview, -} from "../../../services/tauri"; -import { STORAGE_KEY_DETACHED_REVIEW_LINKS } from "../utils/threadStorage"; +} from "@services/tauri"; +import { STORAGE_KEY_DETACHED_REVIEW_LINKS } from "@threads/utils/threadStorage"; import 
{ useThreads } from "./useThreads"; type AppServerHandlers = Parameters[0]; let handlers: AppServerHandlers | null = null; -vi.mock("../../app/hooks/useAppServerEvents", () => ({ +vi.mock("@app/hooks/useAppServerEvents", () => ({ useAppServerEvents: (incoming: AppServerHandlers) => { handlers = incoming; }, })); -vi.mock("../../../services/tauri", () => ({ +vi.mock("@services/tauri", () => ({ respondToServerRequest: vi.fn(), respondToUserInputRequest: vi.fn(), rememberApprovalRule: vi.fn(), diff --git a/src/features/threads/hooks/useThreads.ts b/src/features/threads/hooks/useThreads.ts index e6a3d39c8..b0aa72edc 100644 --- a/src/features/threads/hooks/useThreads.ts +++ b/src/features/threads/hooks/useThreads.ts @@ -5,8 +5,8 @@ import type { DebugEntry, ThreadListSortKey, WorkspaceInfo, -} from "../../../types"; -import { useAppServerEvents } from "../../app/hooks/useAppServerEvents"; +} from "@/types"; +import { useAppServerEvents } from "@app/hooks/useAppServerEvents"; import { initialState, threadReducer } from "./useThreadsReducer"; import { useThreadStorage } from "./useThreadStorage"; import { useThreadLinking } from "./useThreadLinking"; @@ -20,13 +20,13 @@ import { useThreadSelectors } from "./useThreadSelectors"; import { useThreadStatus } from "./useThreadStatus"; import { useThreadUserInput } from "./useThreadUserInput"; import { useThreadTitleAutogeneration } from "./useThreadTitleAutogeneration"; -import { setThreadName as setThreadNameService } from "../../../services/tauri"; +import { setThreadName as setThreadNameService } from "@services/tauri"; import { loadDetachedReviewLinks, makeCustomNameKey, saveCustomName, saveDetachedReviewLinks, -} from "../utils/threadStorage"; +} from "@threads/utils/threadStorage"; type UseThreadsOptions = { activeWorkspace: WorkspaceInfo | null; diff --git a/src/features/threads/hooks/useThreadsReducer.test.ts b/src/features/threads/hooks/useThreadsReducer.test.ts index 5744e652f..a438668a6 100644 --- a/src/features/threads/hooks/useThreadsReducer.test.ts +++ b/src/features/threads/hooks/useThreadsReducer.test.ts @@ -1,5 +1,5 @@ import { describe, expect, it } from "vitest"; -import type { ConversationItem, ThreadSummary } from "../../../types"; +import type { ConversationItem, ThreadSummary } from "@/types"; import { initialState, threadReducer } from "./useThreadsReducer"; import type { ThreadState } from "./useThreadsReducer"; diff --git a/src/features/threads/hooks/useThreadsReducer.ts b/src/features/threads/hooks/useThreadsReducer.ts index 71b33733b..34c2cf1af 100644 --- a/src/features/threads/hooks/useThreadsReducer.ts +++ b/src/features/threads/hooks/useThreadsReducer.ts @@ -8,106 +8,11 @@ import type { ThreadSummary, ThreadTokenUsage, TurnPlan, -} from "../../../types"; -import { normalizeItem, prepareThreadItems, upsertItem } from "../../../utils/threadItems"; - -const MAX_THREAD_NAME_LENGTH = 38; - -function formatThreadName(text: string) { - const trimmed = text.trim(); - if (!trimmed) { - return null; - } - return trimmed.length > MAX_THREAD_NAME_LENGTH - ? 
`${trimmed.slice(0, MAX_THREAD_NAME_LENGTH)}…` - : trimmed; -} - -function looksAutoGeneratedThreadName(name: string) { - return name === "New Agent" || name.startsWith("Agent ") || /^[a-f0-9]{4,8}$/i.test(name); -} - -function extractRenameText(text: string) { - if (!text) { - return ""; - } - const withoutImages = text.replace(/\[image(?: x\d+)?\]/gi, " "); - const withoutSkills = withoutImages.replace(/(^|\s)\$[A-Za-z0-9_-]+(?=\s|$)/g, " "); - return withoutSkills.replace(/\s+/g, " ").trim(); -} - -function getAssistantTextForRename( - items: ConversationItem[], - itemId?: string, -): string { - if (itemId) { - const match = items.find( - (item) => - item.kind === "message" && - item.role === "assistant" && - item.id === itemId, - ); - if (match && match.kind === "message") { - return match.text; - } - } - for (let index = items.length - 1; index >= 0; index -= 1) { - const item = items[index]; - if (item.kind === "message" && item.role === "assistant") { - return item.text; - } - } - return ""; -} - -function maybeRenameThreadFromAgent({ - workspaceId, - threadId, - items, - itemId, - hasCustomName, - threadsByWorkspace, -}: { - workspaceId: string; - threadId: string; - items: ConversationItem[]; - itemId?: string; - hasCustomName: boolean; - threadsByWorkspace: Record; -}) { - const threads = threadsByWorkspace[workspaceId] ?? []; - if (!threads.length) { - return threadsByWorkspace; - } - const hasUserMessage = items.some( - (item) => item.kind === "message" && item.role === "user", - ); - if (hasUserMessage) { - return threadsByWorkspace; - } - if (hasCustomName) { - return threadsByWorkspace; - } - const nextName = formatThreadName(getAssistantTextForRename(items, itemId)); - if (!nextName) { - return threadsByWorkspace; - } - let didChange = false; - const nextThreads = threads.map((thread) => { - if ( - thread.id !== threadId || - thread.name === nextName || - !looksAutoGeneratedThreadName(thread.name) - ) { - return thread; - } - didChange = true; - return { ...thread, name: nextName }; - }); - return didChange - ? 
{ ...threadsByWorkspace, [workspaceId]: nextThreads } - : threadsByWorkspace; -} +} from "@/types"; +import { reduceThreadItems } from "./threadReducer/threadItemsSlice"; +import { reduceThreadLifecycle } from "./threadReducer/threadLifecycleSlice"; +import { reduceThreadQueue } from "./threadReducer/threadQueueSlice"; +import { reduceThreadSnapshots } from "./threadReducer/threadSnapshotSlice"; type ThreadActivityStatus = { isProcessing: boolean; @@ -281,890 +186,21 @@ export const initialState: ThreadState = { lastAgentMessageByThread: {}, }; -function mergeStreamingText(existing: string, delta: string) { - if (!delta) { - return existing; - } - if (!existing) { - return delta; - } - if (delta === existing) { - return existing; - } - if (delta.startsWith(existing)) { - return delta; - } - if (existing.startsWith(delta)) { - return existing; - } - const maxOverlap = Math.min(existing.length, delta.length); - for (let length = maxOverlap; length > 0; length -= 1) { - if (existing.endsWith(delta.slice(0, length))) { - return `${existing}${delta.slice(length)}`; - } - } - return `${existing}${delta}`; -} +type ThreadSliceReducer = (state: ThreadState, action: ThreadAction) => ThreadState; -function addSummaryBoundary(existing: string) { - if (!existing) { - return existing; - } - if (existing.endsWith("\n\n")) { - return existing; - } - if (existing.endsWith("\n")) { - return `${existing}\n`; - } - return `${existing}\n\n`; -} - -function dropLatestLocalReviewStart(list: ConversationItem[]) { - for (let index = list.length - 1; index >= 0; index -= 1) { - const item = list[index]; - if ( - item.kind === "review" && - item.state === "started" && - item.id.startsWith("review-start-") - ) { - return [...list.slice(0, index), ...list.slice(index + 1)]; - } - } - return list; -} - -function findMatchingReview( - list: ConversationItem[], - target: Extract, -) { - const normalizedText = target.text.trim(); - return list.find( - (item) => - item.kind === "review" && - item.state === target.state && - item.text.trim() === normalizedText, - ); -} - -function ensureUniqueReviewId(list: ConversationItem[], item: ConversationItem) { - if (item.kind !== "review") { - return item; - } - if (!list.some((entry) => entry.id === item.id)) { - return item; - } - const existingIds = new Set(list.map((entry) => entry.id)); - let suffix = 1; - let candidate = `${item.id}-${suffix}`; - while (existingIds.has(candidate)) { - suffix += 1; - candidate = `${item.id}-${suffix}`; - } - return { ...item, id: candidate }; -} - -function isDuplicateReviewById( - list: ConversationItem[], - target: Extract, -) { - const normalizedText = target.text.trim(); - return list.some( - (item) => - item.kind === "review" && - item.id === target.id && - item.state === target.state && - item.text.trim() === normalizedText, - ); -} - -function prefersUpdatedSort(state: ThreadState, workspaceId: string) { - return (state.threadSortKeyByWorkspace[workspaceId] ?? "updated_at") === "updated_at"; -} +const threadSliceReducers: ThreadSliceReducer[] = [ + reduceThreadLifecycle, + reduceThreadItems, + reduceThreadQueue, + reduceThreadSnapshots, +]; export function threadReducer(state: ThreadState, action: ThreadAction): ThreadState { - switch (action.type) { - case "setActiveThreadId": - return { - ...state, - activeThreadIdByWorkspace: { - ...state.activeThreadIdByWorkspace, - [action.workspaceId]: action.threadId, - }, - threadStatusById: action.threadId - ? 
{ - ...state.threadStatusById, - [action.threadId]: { - isProcessing: - state.threadStatusById[action.threadId]?.isProcessing ?? false, - hasUnread: false, - isReviewing: - state.threadStatusById[action.threadId]?.isReviewing ?? false, - processingStartedAt: - state.threadStatusById[action.threadId]?.processingStartedAt ?? - null, - lastDurationMs: - state.threadStatusById[action.threadId]?.lastDurationMs ?? null, - }, - } - : state.threadStatusById, - }; - case "ensureThread": { - const hidden = - state.hiddenThreadIdsByWorkspace[action.workspaceId]?.[action.threadId] ?? - false; - if (hidden) { - return state; - } - const list = state.threadsByWorkspace[action.workspaceId] ?? []; - if (list.some((thread) => thread.id === action.threadId)) { - return state; - } - const thread: ThreadSummary = { - id: action.threadId, - name: "New Agent", - updatedAt: 0, - }; - return { - ...state, - threadsByWorkspace: { - ...state.threadsByWorkspace, - [action.workspaceId]: [thread, ...list], - }, - threadStatusById: { - ...state.threadStatusById, - [action.threadId]: { - isProcessing: false, - hasUnread: false, - isReviewing: false, - processingStartedAt: null, - lastDurationMs: null, - }, - }, - activeThreadIdByWorkspace: { - ...state.activeThreadIdByWorkspace, - [action.workspaceId]: - state.activeThreadIdByWorkspace[action.workspaceId] ?? action.threadId, - }, - }; - } - case "hideThread": { - const hiddenForWorkspace = - state.hiddenThreadIdsByWorkspace[action.workspaceId] ?? {}; - if (hiddenForWorkspace[action.threadId]) { - return state; - } - - const nextHiddenForWorkspace = { - ...hiddenForWorkspace, - [action.threadId]: true as const, - }; - - const list = state.threadsByWorkspace[action.workspaceId] ?? []; - const filtered = list.filter((thread) => thread.id !== action.threadId); - const nextActive = - state.activeThreadIdByWorkspace[action.workspaceId] === action.threadId - ? filtered[0]?.id ?? null - : state.activeThreadIdByWorkspace[action.workspaceId] ?? null; - - return { - ...state, - hiddenThreadIdsByWorkspace: { - ...state.hiddenThreadIdsByWorkspace, - [action.workspaceId]: nextHiddenForWorkspace, - }, - threadsByWorkspace: { - ...state.threadsByWorkspace, - [action.workspaceId]: filtered, - }, - activeThreadIdByWorkspace: { - ...state.activeThreadIdByWorkspace, - [action.workspaceId]: nextActive, - }, - }; - } - case "removeThread": { - const list = state.threadsByWorkspace[action.workspaceId] ?? []; - const filtered = list.filter((thread) => thread.id !== action.threadId); - const nextActive = - state.activeThreadIdByWorkspace[action.workspaceId] === action.threadId - ? filtered[0]?.id ?? null - : state.activeThreadIdByWorkspace[action.workspaceId] ?? 
null; - const { [action.threadId]: _, ...restItems } = state.itemsByThread; - const { [action.threadId]: __, ...restStatus } = state.threadStatusById; - const { [action.threadId]: ___, ...restTurns } = state.activeTurnIdByThread; - const { [action.threadId]: ____, ...restDiffs } = state.turnDiffByThread; - const { [action.threadId]: _____, ...restPlans } = state.planByThread; - const { [action.threadId]: ______, ...restParents } = state.threadParentById; - return { - ...state, - threadsByWorkspace: { - ...state.threadsByWorkspace, - [action.workspaceId]: filtered, - }, - itemsByThread: restItems, - threadStatusById: restStatus, - activeTurnIdByThread: restTurns, - turnDiffByThread: restDiffs, - planByThread: restPlans, - threadParentById: restParents, - activeThreadIdByWorkspace: { - ...state.activeThreadIdByWorkspace, - [action.workspaceId]: nextActive, - }, - }; - } - case "setThreadParent": { - if (!action.parentId || action.parentId === action.threadId) { - return state; - } - if (state.threadParentById[action.threadId] === action.parentId) { - return state; - } - return { - ...state, - threadParentById: { - ...state.threadParentById, - [action.threadId]: action.parentId, - }, - }; - } - case "markProcessing": { - const previous = state.threadStatusById[action.threadId]; - const wasProcessing = previous?.isProcessing ?? false; - const startedAt = previous?.processingStartedAt ?? null; - const lastDurationMs = previous?.lastDurationMs ?? null; - const hasUnread = previous?.hasUnread ?? false; - const isReviewing = previous?.isReviewing ?? false; - if (action.isProcessing) { - const nextStartedAt = - wasProcessing && startedAt ? startedAt : action.timestamp; - const nextStatus: ThreadActivityStatus = { - isProcessing: true, - hasUnread, - isReviewing, - processingStartedAt: nextStartedAt, - lastDurationMs, - }; - if ( - previous && - previous.isProcessing === nextStatus.isProcessing && - previous.hasUnread === nextStatus.hasUnread && - previous.isReviewing === nextStatus.isReviewing && - previous.processingStartedAt === nextStatus.processingStartedAt && - previous.lastDurationMs === nextStatus.lastDurationMs - ) { - return state; - } - return { - ...state, - threadStatusById: { - ...state.threadStatusById, - [action.threadId]: nextStatus, - }, - }; - } - const nextDuration = - wasProcessing && startedAt - ? Math.max(0, action.timestamp - startedAt) - : lastDurationMs ?? null; - const nextStatus: ThreadActivityStatus = { - isProcessing: false, - hasUnread, - isReviewing, - processingStartedAt: null, - lastDurationMs: nextDuration, - }; - if ( - previous && - previous.isProcessing === nextStatus.isProcessing && - previous.hasUnread === nextStatus.hasUnread && - previous.isReviewing === nextStatus.isReviewing && - previous.processingStartedAt === nextStatus.processingStartedAt && - previous.lastDurationMs === nextStatus.lastDurationMs - ) { - return state; - } - return { - ...state, - threadStatusById: { - ...state.threadStatusById, - [action.threadId]: nextStatus, - }, - }; - } - case "setActiveTurnId": - return { - ...state, - activeTurnIdByThread: { - ...state.activeTurnIdByThread, - [action.threadId]: action.turnId, - }, - }; - case "markReviewing": { - const previous = state.threadStatusById[action.threadId]; - const nextStatus: ThreadActivityStatus = { - isProcessing: previous?.isProcessing ?? false, - hasUnread: previous?.hasUnread ?? false, - isReviewing: action.isReviewing, - processingStartedAt: previous?.processingStartedAt ?? null, - lastDurationMs: previous?.lastDurationMs ?? 
null, - }; - if ( - previous && - previous.isProcessing === nextStatus.isProcessing && - previous.hasUnread === nextStatus.hasUnread && - previous.isReviewing === nextStatus.isReviewing && - previous.processingStartedAt === nextStatus.processingStartedAt && - previous.lastDurationMs === nextStatus.lastDurationMs - ) { - return state; - } - return { - ...state, - threadStatusById: { - ...state.threadStatusById, - [action.threadId]: nextStatus, - }, - }; - } - case "markUnread": { - const previous = state.threadStatusById[action.threadId]; - const nextStatus: ThreadActivityStatus = { - isProcessing: previous?.isProcessing ?? false, - hasUnread: action.hasUnread, - isReviewing: previous?.isReviewing ?? false, - processingStartedAt: previous?.processingStartedAt ?? null, - lastDurationMs: previous?.lastDurationMs ?? null, - }; - if ( - previous && - previous.isProcessing === nextStatus.isProcessing && - previous.hasUnread === nextStatus.hasUnread && - previous.isReviewing === nextStatus.isReviewing && - previous.processingStartedAt === nextStatus.processingStartedAt && - previous.lastDurationMs === nextStatus.lastDurationMs - ) { - return state; - } - return { - ...state, - threadStatusById: { - ...state.threadStatusById, - [action.threadId]: nextStatus, - }, - }; - } - case "addAssistantMessage": { - const list = state.itemsByThread[action.threadId] ?? []; - const message: ConversationItem = { - id: `${Date.now()}-assistant`, - kind: "message", - role: "assistant", - text: action.text, - }; - return { - ...state, - itemsByThread: { - ...state.itemsByThread, - [action.threadId]: prepareThreadItems([...list, message]), - }, - }; - } - case "setThreadName": { - const list = state.threadsByWorkspace[action.workspaceId] ?? []; - const next = list.map((thread) => - thread.id === action.threadId ? { ...thread, name: action.name } : thread, - ); - return { - ...state, - threadsByWorkspace: { - ...state.threadsByWorkspace, - [action.workspaceId]: next, - }, - }; - } - case "setThreadTimestamp": { - const list = state.threadsByWorkspace[action.workspaceId] ?? []; - if (!list.length) { - return state; - } - let didChange = false; - const next = list.map((thread) => { - if (thread.id !== action.threadId) { - return thread; - } - const current = thread.updatedAt ?? 0; - if (current >= action.timestamp) { - return thread; - } - didChange = true; - return { ...thread, updatedAt: action.timestamp }; - }); - if (!didChange) { - return state; - } - const sorted = prefersUpdatedSort(state, action.workspaceId) - ? [ - ...next.filter((thread) => thread.id === action.threadId), - ...next.filter((thread) => thread.id !== action.threadId), - ] - : next; - return { - ...state, - threadsByWorkspace: { - ...state.threadsByWorkspace, - [action.workspaceId]: sorted, - }, - }; - } - case "appendAgentDelta": { - const list = [...(state.itemsByThread[action.threadId] ?? 
[])]; - const index = list.findIndex((msg) => msg.id === action.itemId); - if (index >= 0 && list[index].kind === "message") { - const existing = list[index]; - list[index] = { - ...existing, - text: mergeStreamingText(existing.text, action.delta), - }; - } else { - list.push({ - id: action.itemId, - kind: "message", - role: "assistant", - text: action.delta, - }); - } - const updatedItems = prepareThreadItems(list); - const nextThreadsByWorkspace = maybeRenameThreadFromAgent({ - workspaceId: action.workspaceId, - threadId: action.threadId, - items: updatedItems, - itemId: action.itemId, - hasCustomName: action.hasCustomName, - threadsByWorkspace: state.threadsByWorkspace, - }); - return { - ...state, - itemsByThread: { - ...state.itemsByThread, - [action.threadId]: updatedItems, - }, - threadsByWorkspace: nextThreadsByWorkspace, - }; - } - case "completeAgentMessage": { - const list = [...(state.itemsByThread[action.threadId] ?? [])]; - const index = list.findIndex((msg) => msg.id === action.itemId); - if (index >= 0 && list[index].kind === "message") { - const existing = list[index]; - list[index] = { - ...existing, - text: action.text || existing.text, - }; - } else { - list.push({ - id: action.itemId, - kind: "message", - role: "assistant", - text: action.text, - }); - } - const updatedItems = prepareThreadItems(list); - const nextThreadsByWorkspace = maybeRenameThreadFromAgent({ - workspaceId: action.workspaceId, - threadId: action.threadId, - items: updatedItems, - itemId: action.itemId, - hasCustomName: action.hasCustomName, - threadsByWorkspace: state.threadsByWorkspace, - }); - return { - ...state, - itemsByThread: { - ...state.itemsByThread, - [action.threadId]: updatedItems, - }, - threadsByWorkspace: nextThreadsByWorkspace, - }; - } - case "upsertItem": { - let list = state.itemsByThread[action.threadId] ?? []; - const item = normalizeItem(action.item); - const isUserMessage = item.kind === "message" && item.role === "user"; - const hadUserMessage = isUserMessage - ? list.some((entry) => entry.kind === "message" && entry.role === "user") - : false; - const renameText = isUserMessage ? extractRenameText(item.text) : ""; - if ( - item.kind === "review" && - item.state === "started" && - !item.id.startsWith("review-start-") - ) { - list = dropLatestLocalReviewStart(list); - } - if (item.kind === "review" && isDuplicateReviewById(list, item)) { - return state; - } - if (item.kind === "review") { - const existing = findMatchingReview(list, item); - if (existing && existing.id !== item.id) { - return state; - } - } - const nextItem = ensureUniqueReviewId(list, item); - const updatedItems = prepareThreadItems(upsertItem(list, nextItem)); - let nextThreadsByWorkspace = state.threadsByWorkspace; - if (isUserMessage) { - const threads = state.threadsByWorkspace[action.workspaceId] ?? []; - const textValue = renameText; - const updatedThreads = threads.map((thread) => { - if (thread.id !== action.threadId) { - return thread; - } - const looksAutoGenerated = looksAutoGeneratedThreadName(thread.name); - const shouldRename = - !hadUserMessage && - textValue.length > 0 && - looksAutoGenerated && - !action.hasCustomName; - const nextName = - shouldRename && textValue.length > 38 - ? `${textValue.slice(0, 38)}…` - : shouldRename - ? textValue - : thread.name; - return { ...thread, name: nextName }; - }); - const bumpedThreads = - prefersUpdatedSort(state, action.workspaceId) && updatedThreads.length - ? 
[ - ...updatedThreads.filter((thread) => thread.id === action.threadId), - ...updatedThreads.filter((thread) => thread.id !== action.threadId), - ] - : updatedThreads; - nextThreadsByWorkspace = { - ...state.threadsByWorkspace, - [action.workspaceId]: bumpedThreads, - }; - } - return { - ...state, - itemsByThread: { - ...state.itemsByThread, - [action.threadId]: updatedItems, - }, - threadsByWorkspace: nextThreadsByWorkspace, - }; - } - case "setThreadItems": - return { - ...state, - itemsByThread: { - ...state.itemsByThread, - [action.threadId]: prepareThreadItems(action.items), - }, - }; - case "setLastAgentMessage": - if ( - state.lastAgentMessageByThread[action.threadId]?.timestamp >= action.timestamp - ) { - return state; - } - return { - ...state, - lastAgentMessageByThread: { - ...state.lastAgentMessageByThread, - [action.threadId]: { text: action.text, timestamp: action.timestamp }, - }, - }; - case "appendReasoningSummary": { - const list = state.itemsByThread[action.threadId] ?? []; - const index = list.findIndex((entry) => entry.id === action.itemId); - const base = - index >= 0 && list[index].kind === "reasoning" - ? (list[index] as ConversationItem) - : { - id: action.itemId, - kind: "reasoning", - summary: "", - content: "", - }; - const updated: ConversationItem = { - ...base, - summary: mergeStreamingText( - "summary" in base ? base.summary : "", - action.delta, - ), - } as ConversationItem; - const next = index >= 0 ? [...list] : [...list, updated]; - if (index >= 0) { - next[index] = updated; - } - return { - ...state, - itemsByThread: { - ...state.itemsByThread, - [action.threadId]: prepareThreadItems(next), - }, - }; - } - case "appendReasoningSummaryBoundary": { - const list = state.itemsByThread[action.threadId] ?? []; - const index = list.findIndex((entry) => entry.id === action.itemId); - const base = - index >= 0 && list[index].kind === "reasoning" - ? (list[index] as ConversationItem) - : { - id: action.itemId, - kind: "reasoning", - summary: "", - content: "", - }; - const updated: ConversationItem = { - ...base, - summary: addSummaryBoundary("summary" in base ? base.summary : ""), - } as ConversationItem; - const next = index >= 0 ? [...list] : [...list, updated]; - if (index >= 0) { - next[index] = updated; - } - return { - ...state, - itemsByThread: { - ...state.itemsByThread, - [action.threadId]: prepareThreadItems(next), - }, - }; - } - case "appendReasoningContent": { - const list = state.itemsByThread[action.threadId] ?? []; - const index = list.findIndex((entry) => entry.id === action.itemId); - const base = - index >= 0 && list[index].kind === "reasoning" - ? (list[index] as ConversationItem) - : { - id: action.itemId, - kind: "reasoning", - summary: "", - content: "", - }; - const updated: ConversationItem = { - ...base, - content: mergeStreamingText( - "content" in base ? base.content : "", - action.delta, - ), - } as ConversationItem; - const next = index >= 0 ? [...list] : [...list, updated]; - if (index >= 0) { - next[index] = updated; - } - return { - ...state, - itemsByThread: { - ...state.itemsByThread, - [action.threadId]: prepareThreadItems(next), - }, - }; - } - case "appendPlanDelta": { - const list = state.itemsByThread[action.threadId] ?? []; - const index = list.findIndex((entry) => entry.id === action.itemId); - const base = - index >= 0 && list[index].kind === "tool" - ? 
list[index] - : { - id: action.itemId, - kind: "tool", - toolType: "plan", - title: "Plan", - detail: "", - status: "in_progress", - output: "", - }; - const existingOutput = - base.kind === "tool" ? (base.output ?? "") : ""; - const updated: ConversationItem = { - ...(base as ConversationItem), - kind: "tool", - toolType: "plan", - title: "Plan", - detail: "Generating plan...", - status: "in_progress", - output: mergeStreamingText(existingOutput, action.delta), - } as ConversationItem; - const next = index >= 0 ? [...list] : [...list, updated]; - if (index >= 0) { - next[index] = updated; - } - return { - ...state, - itemsByThread: { - ...state.itemsByThread, - [action.threadId]: prepareThreadItems(next), - }, - }; - } - case "appendToolOutput": { - const list = state.itemsByThread[action.threadId] ?? []; - const index = list.findIndex((entry) => entry.id === action.itemId); - if (index < 0 || list[index].kind !== "tool") { - return state; - } - const existing = list[index]; - const updated: ConversationItem = { - ...existing, - output: mergeStreamingText(existing.output ?? "", action.delta), - } as ConversationItem; - const next = [...list]; - next[index] = updated; - return { - ...state, - itemsByThread: { - ...state.itemsByThread, - [action.threadId]: prepareThreadItems(next), - }, - }; - } - case "addApproval": { - const exists = state.approvals.some( - (item) => - item.request_id === action.approval.request_id && - item.workspace_id === action.approval.workspace_id, - ); - if (exists) { - return state; - } - return { ...state, approvals: [...state.approvals, action.approval] }; - } - case "removeApproval": - return { - ...state, - approvals: state.approvals.filter( - (item) => - item.request_id !== action.requestId || - item.workspace_id !== action.workspaceId, - ), - }; - case "addUserInputRequest": { - const exists = state.userInputRequests.some( - (item) => - item.request_id === action.request.request_id && - item.workspace_id === action.request.workspace_id, - ); - if (exists) { - return state; - } - return { - ...state, - userInputRequests: [...state.userInputRequests, action.request], - }; - } - case "removeUserInputRequest": - return { - ...state, - userInputRequests: state.userInputRequests.filter( - (item) => - item.request_id !== action.requestId || - item.workspace_id !== action.workspaceId, - ), - }; - case "setThreads": { - const hidden = state.hiddenThreadIdsByWorkspace[action.workspaceId] ?? 
{}; - const visibleThreads = action.threads.filter((thread) => !hidden[thread.id]); - return { - ...state, - threadsByWorkspace: { - ...state.threadsByWorkspace, - [action.workspaceId]: visibleThreads, - }, - threadSortKeyByWorkspace: { - ...state.threadSortKeyByWorkspace, - [action.workspaceId]: action.sortKey, - }, - }; + for (const reduceSlice of threadSliceReducers) { + const nextState = reduceSlice(state, action); + if (nextState !== state) { + return nextState; } - case "setThreadListLoading": - return { - ...state, - threadListLoadingByWorkspace: { - ...state.threadListLoadingByWorkspace, - [action.workspaceId]: action.isLoading, - }, - }; - case "setThreadResumeLoading": - return { - ...state, - threadResumeLoadingById: { - ...state.threadResumeLoadingById, - [action.threadId]: action.isLoading, - }, - }; - case "setThreadListPaging": - return { - ...state, - threadListPagingByWorkspace: { - ...state.threadListPagingByWorkspace, - [action.workspaceId]: action.isLoading, - }, - }; - case "setThreadListCursor": - return { - ...state, - threadListCursorByWorkspace: { - ...state.threadListCursorByWorkspace, - [action.workspaceId]: action.cursor, - }, - }; - case "setThreadTokenUsage": - return { - ...state, - tokenUsageByThread: { - ...state.tokenUsageByThread, - [action.threadId]: action.tokenUsage, - }, - }; - case "setRateLimits": - return { - ...state, - rateLimitsByWorkspace: { - ...state.rateLimitsByWorkspace, - [action.workspaceId]: action.rateLimits, - }, - }; - case "setAccountInfo": - return { - ...state, - accountByWorkspace: { - ...state.accountByWorkspace, - [action.workspaceId]: action.account, - }, - }; - case "setThreadTurnDiff": - return { - ...state, - turnDiffByThread: { - ...state.turnDiffByThread, - [action.threadId]: action.diff, - }, - }; - case "setThreadPlan": - return { - ...state, - planByThread: { - ...state.planByThread, - [action.threadId]: action.plan, - }, - }; - case "clearThreadPlan": - return { - ...state, - planByThread: { - ...state.planByThread, - [action.threadId]: null, - }, - }; - default: - return state; } + return state; } diff --git a/src/features/threads/utils/threadCodexParamsSeed.ts b/src/features/threads/utils/threadCodexParamsSeed.ts index 761f2a406..de9fa728a 100644 --- a/src/features/threads/utils/threadCodexParamsSeed.ts +++ b/src/features/threads/utils/threadCodexParamsSeed.ts @@ -1,8 +1,8 @@ -import type { AccessMode } from "../../../types"; +import type { AccessMode } from "@/types"; import type { ThreadCodexParams } from "./threadStorage"; import { makeThreadCodexParamsKey } from "./threadStorage"; -export const NO_THREAD_SCOPE_SUFFIX = "__no_thread__"; +const NO_THREAD_SCOPE_SUFFIX = "__no_thread__"; export type PendingNewThreadSeed = { workspaceId: string; @@ -20,7 +20,7 @@ type ResolveThreadCodexStateInput = { pendingSeed: PendingNewThreadSeed | null; }; -export type ResolvedThreadCodexState = { +type ResolvedThreadCodexState = { scopeKey: string; accessMode: AccessMode; preferredModelId: string | null; @@ -28,7 +28,7 @@ export type ResolvedThreadCodexState = { preferredCollabModeId: string | null; }; -export type ThreadCodexSeedPatch = { +type ThreadCodexSeedPatch = { modelId: string | null; effort: string | null; accessMode: AccessMode; diff --git a/src/features/threads/utils/threadNormalize.test.ts b/src/features/threads/utils/threadNormalize.test.ts index 5cf50491f..5f1e83eb9 100644 --- a/src/features/threads/utils/threadNormalize.test.ts +++ b/src/features/threads/utils/threadNormalize.test.ts @@ -29,4 +29,3 @@ 
describe("normalizePlanUpdate", () => { expect(normalizePlanUpdate("turn-3", "", { steps: [] })).toBeNull(); }); }); - diff --git a/src/features/threads/utils/threadNormalize.ts b/src/features/threads/utils/threadNormalize.ts index 4a62be296..87d257cff 100644 --- a/src/features/threads/utils/threadNormalize.ts +++ b/src/features/threads/utils/threadNormalize.ts @@ -5,13 +5,13 @@ import type { TurnPlan, TurnPlanStep, TurnPlanStepStatus, -} from "../../../types"; +} from "@/types"; export function asString(value: unknown) { return typeof value === "string" ? value : value ? String(value) : ""; } -export function asNumber(value: unknown): number { +function asNumber(value: unknown): number { if (typeof value === "number" && Number.isFinite(value)) { return value; } @@ -189,7 +189,7 @@ export function normalizeRateLimits(raw: Record): RateLimitSnap }; } -export function normalizePlanStepStatus(value: unknown): TurnPlanStepStatus { +function normalizePlanStepStatus(value: unknown): TurnPlanStepStatus { const raw = typeof value === "string" ? value : ""; const normalized = raw.replace(/[_\s-]/g, "").toLowerCase(); if (normalized === "inprogress") { diff --git a/src/features/threads/utils/threadRpc.ts b/src/features/threads/utils/threadRpc.ts new file mode 100644 index 000000000..47230e4bf --- /dev/null +++ b/src/features/threads/utils/threadRpc.ts @@ -0,0 +1,73 @@ +import { asString } from "./threadNormalize"; + +function asRecord(value: unknown): Record | null { + if (!value || typeof value !== "object") { + return null; + } + return value as Record; +} + +export function getParentThreadIdFromSource(source: unknown): string | null { + const sourceRecord = asRecord(source); + if (!sourceRecord) { + return null; + } + const subAgent = asRecord(sourceRecord.subAgent ?? sourceRecord.sub_agent); + if (!subAgent) { + return null; + } + const threadSpawn = asRecord(subAgent.thread_spawn ?? subAgent.threadSpawn); + if (!threadSpawn) { + return null; + } + const parentId = asString( + threadSpawn.parent_thread_id ?? threadSpawn.parentThreadId, + ); + return parentId || null; +} + +function normalizeTurnStatus(value: unknown): string { + return String(value ?? "") + .trim() + .toLowerCase() + .replace(/[\s_-]/g, ""); +} + +export function getResumedActiveTurnId(thread: Record): string | null { + const turns = Array.isArray(thread.turns) + ? (thread.turns as Array>) + : []; + for (let index = turns.length - 1; index >= 0; index -= 1) { + const turn = turns[index]; + if (!turn || typeof turn !== "object") { + continue; + } + const status = normalizeTurnStatus( + turn.status ?? turn.turnStatus ?? turn.turn_status, + ); + const isInProgress = + status === "inprogress" || + status === "running" || + status === "processing" || + status === "pending" || + status === "started"; + if (!isInProgress) { + continue; + } + const turnId = asString(turn.id ?? turn.turnId ?? 
turn.turn_id); + if (turnId) { + return turnId; + } + } + return null; +} + +export function isUnsupportedTurnSteerError(message: string): boolean { + const normalized = message.toLowerCase(); + const mentionsSteerMethod = + normalized.includes("turn/steer") || normalized.includes("turn_steer"); + return normalized.includes("unknown variant `turn/steer`") + || normalized.includes("unknown variant \"turn/steer\"") + || (normalized.includes("unknown request") && mentionsSteerMethod) + || (normalized.includes("unknown method") && mentionsSteerMethod); +} diff --git a/src/features/threads/utils/threadStorage.ts b/src/features/threads/utils/threadStorage.ts index e24cdbc5e..df0dd154a 100644 --- a/src/features/threads/utils/threadStorage.ts +++ b/src/features/threads/utils/threadStorage.ts @@ -1,6 +1,6 @@ -import type { AccessMode } from "../../../types"; +import type { AccessMode } from "@/types"; -export const STORAGE_KEY_THREAD_ACTIVITY = "codexmonitor.threadLastUserActivity"; +const STORAGE_KEY_THREAD_ACTIVITY = "codexmonitor.threadLastUserActivity"; export const STORAGE_KEY_PINNED_THREADS = "codexmonitor.pinnedThreads"; export const STORAGE_KEY_CUSTOM_NAMES = "codexmonitor.threadCustomNames"; export const STORAGE_KEY_THREAD_CODEX_PARAMS = "codexmonitor.threadCodexParams"; @@ -10,7 +10,7 @@ export const MAX_PINS_SOFT_LIMIT = 5; export type ThreadActivityMap = Record>; export type PinnedThreadsMap = Record; export type CustomNamesMap = Record; -export type DetachedReviewLinksMap = Record>; +type DetachedReviewLinksMap = Record>; // Per-thread Codex parameter overrides. Keyed by `${workspaceId}:${threadId}`. // These are UI-level preferences (not server state) and are best-effort persisted. diff --git a/src/features/workspaces/components/WorkspaceHome.tsx b/src/features/workspaces/components/WorkspaceHome.tsx index 89aa6c69e..6108481e9 100644 --- a/src/features/workspaces/components/WorkspaceHome.tsx +++ b/src/features/workspaces/components/WorkspaceHome.tsx @@ -30,6 +30,7 @@ import { isComposingEvent } from "../../../utils/keys"; import { FileEditorCard } from "../../shared/components/FileEditorCard"; import { WorkspaceHomeRunControls } from "./WorkspaceHomeRunControls"; import { WorkspaceHomeHistory } from "./WorkspaceHomeHistory"; +import { WorkspaceHomeGitInitBanner } from "./WorkspaceHomeGitInitBanner"; import { buildIconPath } from "./workspaceHomeHelpers"; import { useWorkspaceHomeSuggestionsStyle } from "../hooks/useWorkspaceHomeSuggestionsStyle"; @@ -40,6 +41,9 @@ type ThreadStatus = { type WorkspaceHomeProps = { workspace: WorkspaceInfo; + showGitInitBanner: boolean; + initGitRepoLoading: boolean; + onInitGitRepo: () => void | Promise; runs: WorkspaceHomeRun[]; recentThreadInstances: WorkspaceHomeRunInstance[]; recentThreadsUpdatedAt: number | null; @@ -99,6 +103,9 @@ type WorkspaceHomeProps = { export function WorkspaceHome({ workspace, + showGitInitBanner, + initGitRepoLoading, + onInitGitRepo, runs, recentThreadInstances, recentThreadsUpdatedAt, @@ -359,6 +366,13 @@ export function WorkspaceHome({
+          {showGitInitBanner && (
+            <WorkspaceHomeGitInitBanner
+              isLoading={initGitRepoLoading}
+              onInitGitRepo={onInitGitRepo}
+            />
+          )}
+
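The banner's `onInitGitRepo` callback appears to be backed by the new `initGitRepo` wrapper and its `InitGitRepoResponse` union added to `src/services/tauri.ts` in this change. A minimal sketch of how a caller might wire the two together follows; the hook name `useGitInitBanner`, the `window.confirm` fallback, and the hard-coded `"main"` branch are illustrative assumptions (the app presumably routes confirmation through the git-init modal styled in `git-init-modal.css`):

```ts
// Sketch only: a hypothetical hook wiring the banner props to the new
// initGitRepo wrapper. Hook name, window.confirm, and the "main" branch
// are assumptions, not part of this change.
import { useCallback, useState } from "react";
import { initGitRepo, type InitGitRepoResponse } from "@services/tauri";

export function useGitInitBanner(workspaceId: string) {
  const [isLoading, setIsLoading] = useState(false);

  const onInitGitRepo = useCallback(async () => {
    setIsLoading(true);
    try {
      let result: InitGitRepoResponse = await initGitRepo(workspaceId, "main");
      if (result.status === "needs_confirmation") {
        // The folder already has entries; ask before re-running with force.
        const ok = window.confirm(
          `Initialize Git in a folder with ${result.entryCount} existing entries?`,
        );
        if (!ok) {
          return;
        }
        result = await initGitRepo(workspaceId, "main", true);
      }
      if (result.status === "initialized" && result.commitError) {
        // Repo was created but the initial commit failed; surface it somewhere.
        console.warn("Initial commit failed:", result.commitError);
      }
    } finally {
      setIsLoading(false);
    }
  }, [workspaceId]);

  return { isLoading, onInitGitRepo };
}
```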
{ + it("calls onInitGitRepo when clicked", () => { + const onInitGitRepo = vi.fn(); + render(); + + fireEvent.click(screen.getByRole("button", { name: "Initialize Git" })); + expect(onInitGitRepo).toHaveBeenCalledTimes(1); + }); + + it("disables the button when loading", () => { + render(); + + const button = screen.getByRole("button", { name: "Initializing..." }); + expect((button as HTMLButtonElement).disabled).toBe(true); + }); +}); + diff --git a/src/features/workspaces/components/WorkspaceHomeGitInitBanner.tsx b/src/features/workspaces/components/WorkspaceHomeGitInitBanner.tsx new file mode 100644 index 000000000..fa2c77716 --- /dev/null +++ b/src/features/workspaces/components/WorkspaceHomeGitInitBanner.tsx @@ -0,0 +1,28 @@ +type WorkspaceHomeGitInitBannerProps = { + isLoading: boolean; + onInitGitRepo: () => void | Promise; +}; + +export function WorkspaceHomeGitInitBanner({ + isLoading, + onInitGitRepo, +}: WorkspaceHomeGitInitBannerProps) { + return ( +
+    <div className="workspace-home-git-banner">
+      <div className="workspace-home-git-banner-title">
+        Git is not initialized for this project.
+      </div>
+      <div className="workspace-home-git-banner-actions">
+        <button
+          type="button"
+          onClick={onInitGitRepo}
+          disabled={isLoading}
+        >
+          {isLoading ? "Initializing..." : "Initialize Git"}
+        </button>
+      </div>
+    </div>
+  );
+}
+
diff --git a/src/services/tauri.test.ts b/src/services/tauri.test.ts
index b13acf6f4..a75e58d22 100644
--- a/src/services/tauri.test.ts
+++ b/src/services/tauri.test.ts
@@ -5,6 +5,7 @@ import * as notification from "@tauri-apps/plugin-notification";
 import {
   addWorkspace,
   compactThread,
+  createGitHubRepo,
   fetchGit,
   forkThread,
   getAppsList,
@@ -121,6 +122,20 @@ describe("tauri invoke wrappers", () => {
     });
   });
 
+  it("maps args for createGitHubRepo", async () => {
+    const invokeMock = vi.mocked(invoke);
+    invokeMock.mockResolvedValueOnce({ status: "ok", repo: "me/repo" });
+
+    await createGitHubRepo("ws-77", "me/repo", "private", "main");
+
+    expect(invokeMock).toHaveBeenCalledWith("create_github_repo", {
+      workspaceId: "ws-77",
+      repo: "me/repo",
+      visibility: "private",
+      branch: "main",
+    });
+  });
+
   it("maps workspace_id to workspaceId for GitHub issues", async () => {
     const invokeMock = vi.mocked(invoke);
     invokeMock.mockResolvedValueOnce({ total: 0, issues: [] });
diff --git a/src/services/tauri.ts b/src/services/tauri.ts
index 0efe0c3b7..bf5ae8135 100644
--- a/src/services/tauri.ts
+++ b/src/services/tauri.ts
@@ -387,6 +387,43 @@ export async function getGitStatus(workspace_id: string): Promise<{
   return invoke("get_git_status", { workspaceId: workspace_id });
 }
 
+export type InitGitRepoResponse =
+  | { status: "initialized"; commitError?: string }
+  | { status: "already_initialized" }
+  | { status: "needs_confirmation"; entryCount: number };
+
+export async function initGitRepo(
+  workspaceId: string,
+  branch: string,
+  force = false,
+): Promise<InitGitRepoResponse> {
+  return invoke("init_git_repo", { workspaceId, branch, force });
+}
+
+export type CreateGitHubRepoResponse =
+  | { status: "ok"; repo: string; remoteUrl?: string | null }
+  | {
+      status: "partial";
+      repo: string;
+      remoteUrl?: string | null;
+      pushError?: string | null;
+      defaultBranchError?: string | null;
+    };
+
+export async function createGitHubRepo(
+  workspaceId: string,
+  repo: string,
+  visibility: "private" | "public",
+  branch?: string | null,
+): Promise<CreateGitHubRepoResponse> {
+  return invoke("create_github_repo", {
+    workspaceId,
+    repo,
+    visibility,
+    branch,
+  });
+}
+
 export async function listGitRoots(
   workspace_id: string,
   depth: number,
diff --git a/src/styles/compact-phone.css b/src/styles/compact-phone.css
index cee1af255..326120d2a 100644
--- a/src/styles/compact-phone.css
+++ b/src/styles/compact-phone.css
@@ -21,6 +21,7 @@
   overflow-x: hidden;
   padding-top: calc(32px + env(safe-area-inset-top));
   padding-bottom: 16px;
+  background: var(--surface-messages);
 }
 
 .app.layout-phone .compact-topbar {
@@ -31,6 +32,10 @@
   padding-top: calc(44px + env(safe-area-inset-top));
 }
 
+.app.layout-phone .debug-panel.full .debug-header {
+  padding-top: calc(10px + env(safe-area-inset-top));
+}
+
 .app.layout-phone .messages-full {
   padding: 12px 16px 12px;
 }
@@ -87,9 +92,11 @@ html[data-mobile-composer-focus="true"] .app.layout-phone {
   overflow: hidden;
 }
 
-.compact-git-list > .ds-panel {
+.app.layout-phone .compact-git-list > .ds-panel {
   flex: 1;
   min-height: 0;
+  border-radius: 0;
+  border: none;
 }
 
 .compact-git-viewer {
@@ -98,8 +105,10 @@ html[data-mobile-composer-focus="true"] .app.layout-phone {
   overflow: hidden;
 }
 
-.compact-git-viewer .diff-viewer {
+.app.layout-phone .compact-git-viewer .diff-viewer {
   height: 100%;
+  border-top-left-radius: 0;
+  border-top-right-radius: 0;
 }
 
 .compact-git-back {
@@ -111,6 +120,10 @@ html[data-mobile-composer-focus="true"] .app.layout-phone {
   background: var(--surface-topbar);
 }
 
+.app.layout-phone .compact-git-back {
+  border-bottom: none;
+}
+
 .app.layout-phone .compact-panel > .compact-git-back:first-child {
   padding-top: calc(8px + env(safe-area-inset-top));
 }
diff --git a/src/styles/diff.css b/src/styles/diff.css
index 5fe60f08c..8ceab98df 100644
--- a/src/styles/diff.css
+++ b/src/styles/diff.css
@@ -141,6 +141,10 @@
   color: var(--text-strong);
 }
 
+.git-root-primary-action {
+  display: flex;
+}
+
 .git-root-actions {
   display: flex;
   flex-wrap: wrap;
diff --git a/src/styles/ds-modal.css b/src/styles/ds-modal.css
index d18b7bd12..98ef07dfd 100644
--- a/src/styles/ds-modal.css
+++ b/src/styles/ds-modal.css
@@ -1,7 +1,7 @@
 .ds-modal {
   position: fixed;
   inset: 0;
-  z-index: 40;
+  z-index: var(--ds-layer-modal, 40);
 }
 
 .ds-modal-backdrop {
diff --git a/src/styles/ds-tokens.css b/src/styles/ds-tokens.css
index 76a0a9bd4..b7965aa6f 100644
--- a/src/styles/ds-tokens.css
+++ b/src/styles/ds-tokens.css
@@ -37,4 +37,8 @@
   --ds-diff-lib-bg-dark: rgba(10, 12, 16, 0.35);
   --ds-diff-lib-bg-system-light: rgba(255, 255, 255, 0.35);
   --ds-diff-lib-bg-system-dark: rgba(10, 12, 16, 0.35);
+
+  /* Global layer scale (keep numeric so it works with z-index + calc()). */
+  --ds-layer-modal: 10000;
+  --ds-layer-toast: 11000;
 }
diff --git a/src/styles/error-toasts.css b/src/styles/error-toasts.css
index d1177e825..b3bf75a7b 100644
--- a/src/styles/error-toasts.css
+++ b/src/styles/error-toasts.css
@@ -3,7 +3,7 @@
   top: 16px;
   left: 50%;
   transform: translateX(-50%);
-  z-index: 60;
+  z-index: var(--ds-layer-toast, 60);
   gap: 8px;
   pointer-events: none;
 }
diff --git a/src/styles/git-init-modal.css b/src/styles/git-init-modal.css
new file mode 100644
index 000000000..fd1d9939c
--- /dev/null
+++ b/src/styles/git-init-modal.css
@@ -0,0 +1,57 @@
+.git-init-modal .ds-modal-card {
+  width: min(520px, calc(100vw - 48px));
+  border-radius: 16px;
+  padding: 18px 20px;
+  display: flex;
+  flex-direction: column;
+  gap: 12px;
+  /* Make this modal fully opaque (no glass/transparency). */
+  background: var(--surface-sidebar-opaque);
+}
+
+.git-init-modal .ds-modal-backdrop {
+  /* Fully opaque backdrop; no blur/transparency so nothing bleeds through. */
+  background: color-mix(in srgb, var(--surface-sidebar-opaque) 18%, black);
+  backdrop-filter: none;
+  -webkit-backdrop-filter: none;
+}
+
+.git-init-modal-checkbox-row {
+  display: flex;
+  align-items: flex-start;
+  gap: 10px;
+  font-size: 12px;
+  color: var(--ds-text-subtle);
+}
+
+.git-init-modal-checkbox-row code {
+  font-family: var(--code-font-family, Menlo, Monaco, "Courier New", monospace);
+  color: var(--ds-text-strong);
+}
+
+.git-init-modal-checkbox-row--nested {
+  margin-top: 2px;
+  padding-left: 22px;
+}
+
+.git-init-modal-checkbox {
+  margin-top: 2px;
+}
+
+.git-init-modal-remote {
+  display: flex;
+  flex-direction: column;
+  gap: 10px;
+  padding: 12px 12px;
+  border-radius: 14px;
+  border: 1px solid var(--ds-border-subtle);
+  background: var(--surface-sidebar-opaque);
+}
+
+.git-init-modal-actions {
+  margin-top: 8px;
+}
+
+.git-init-modal-button {
+  white-space: nowrap;
+}
diff --git a/src/styles/mobile-setup-wizard.css b/src/styles/mobile-setup-wizard.css
index 162477f91..9d7d9719c 100644
--- a/src/styles/mobile-setup-wizard.css
+++ b/src/styles/mobile-setup-wizard.css
@@ -1,5 +1,5 @@
 .mobile-setup-wizard-overlay {
-  z-index: 120;
+  z-index: calc(var(--ds-layer-modal, 40) + 80);
 }
 
 .mobile-setup-wizard-overlay .ds-modal-backdrop {
diff --git a/src/styles/workspace-home.css b/src/styles/workspace-home.css
index 64242ea17..2903b24f0 100644
--- a/src/styles/workspace-home.css
+++ b/src/styles/workspace-home.css
@@ -42,6 +42,33 @@
   word-break: break-all;
 }
 
+.workspace-home-git-banner {
+  display: flex;
+  align-items: center;
+  justify-content: space-between;
+  gap: 12px;
+  padding: 12px;
+  border-radius: 12px;
+  border: 1px solid var(--border-subtle);
+  background: var(--surface-card);
+}
+
+.workspace-home-git-banner-title {
+  font-size: 12px;
+  color: var(--text-strong);
+}
+
+.workspace-home-git-banner-actions {
+  display: inline-flex;
+  align-items: center;
+  gap: 8px;
+}
+
+.workspace-home-git-banner-actions button {
+  padding: 8px 12px;
+  font-size: 12px;
+}
+
 .workspace-home-composer {
   display: flex;
   flex-direction: column;
diff --git a/tsconfig.json b/tsconfig.json
index a7fc6fbf2..7857950da 100644
--- a/tsconfig.json
+++ b/tsconfig.json
@@ -8,6 +8,15 @@
 
     /* Bundler mode */
     "moduleResolution": "bundler",
+    "baseUrl": ".",
+    "paths": {
+      "@/*": ["src/*"],
+      "@app/*": ["src/features/app/*"],
+      "@settings/*": ["src/features/settings/*"],
+      "@threads/*": ["src/features/threads/*"],
+      "@services/*": ["src/services/*"],
+      "@utils/*": ["src/utils/*"]
+    },
     "allowImportingTsExtensions": true,
     "resolveJsonModule": true,
     "isolatedModules": true,
diff --git a/tsconfig.node.json b/tsconfig.node.json
index 42872c59f..deefc15a6 100644
--- a/tsconfig.node.json
+++ b/tsconfig.node.json
@@ -4,6 +4,15 @@
     "skipLibCheck": true,
     "module": "ESNext",
     "moduleResolution": "bundler",
+    "baseUrl": ".",
+    "paths": {
+      "@/*": ["src/*"],
+      "@app/*": ["src/features/app/*"],
+      "@settings/*": ["src/features/settings/*"],
+      "@threads/*": ["src/features/threads/*"],
+      "@services/*": ["src/services/*"],
+      "@utils/*": ["src/utils/*"]
+    },
     "allowSyntheticDefaultImports": true
   },
   "include": ["vite.config.ts"]
diff --git a/vite.config.ts b/vite.config.ts
index db531818d..ea13de779 100644
--- a/vite.config.ts
+++ b/vite.config.ts
@@ -1,4 +1,5 @@
 import { readFileSync } from "node:fs";
+import { fileURLToPath, URL } from "node:url";
 
 import { defineConfig } from "vitest/config";
 import react from "@vitejs/plugin-react";
@@ -14,6 +15,16 @@ const packageJson = JSON.parse(
 // https://vite.dev/config/
 export default defineConfig(async () => ({
   plugins: [react()],
+  resolve: {
+    alias: {
+      "@": fileURLToPath(new URL("./src", import.meta.url)),
+      "@app": fileURLToPath(new URL("./src/features/app", import.meta.url)),
+      "@settings": fileURLToPath(new URL("./src/features/settings", import.meta.url)),
+      "@threads": fileURLToPath(new URL("./src/features/threads", import.meta.url)),
+      "@services": fileURLToPath(new URL("./src/services", import.meta.url)),
+      "@utils": fileURLToPath(new URL("./src/utils", import.meta.url)),
+    },
+  },
   worker: {
     format: "es",
   },