fix(chat): fix deactivated LLM config errors and config selection bug #317
Conversation
Claude finished @zaira-bibi's task in 2m 25s — Review: PR #317 — fix(chat): fix deactivated LLM config errors and config selection bug
Overview

Clean, well-scoped PR that fixes four related bugs in the chat config flow. The changes are minimal and targeted. The backend adds a two-step validation (exists → active); the frontend improves error display, fixes the single-config selection bug, and disables controls for inactive configs.

Findings by Priority

Medium
The PR description states "postprocess still shows the config name (read path unchanged)", but the frontend's … Recommendation: either use …

Low (Suggestions)
What Looks Good
Test Coverage Assessment

The new …

Security

No concerns. The changes maintain existing authorization checks (`user_id` filtering), and the new …

Overall Assessment

Approve with one non-blocking suggestion. The code is clean and the fixes are correct. The main point to verify is whether the …
Force push: 446d634 to 00d9f9f
Claude finished @zaira-bibi's task in 2m 19s — Re-review: PR #317 — fix(chat): fix deactivated LLM config errors and config selection bug
Overview

Re-reviewed after force push. The changes remain clean and well-scoped. The PR fixes four related bugs in the chat config flow with minimal, targeted modifications. One new observation on …

Previous Review Comments
Findings by Priority

Medium
The frontend works around this because … Recommendation: either use …

Low
The error message at …

What Looks Good
Test Coverage

The new test covers the backend change well. One minor gap (noted in previous review): no test for a deactivated + deleted config (…).

Overall Assessment

Approve with non-blocking suggestions. The code is clean, well-tested, and the fixes are correct. The …
Force push: 00d9f9f to 124ea70
Claude finished @zaira-bibi's task in 1m 42s — Re-review: PR #317 — fix(chat): fix deactivated LLM config errors and config selection bug
Overview

Re-reviewed after force push. The PR is clean, well-scoped, and all four bugs are correctly addressed. The …

Previous Review Comments
Findings by Priority

Medium
The frontend works around this because … Recommendation: either use …

Low
The message …

What Looks Good
Test Coverage

The new …

Security

No concerns. The new …

Overall Assessment

Approve with non-blocking suggestions. The code is clean, well-tested, and all four bugs are correctly fixed. The …
- preprocess_config now distinguishes "not found" vs "deactivated" by
querying without the is_active filter and checking the flag in Python
- model override dropdown is disabled when selected config is deactivated
- Save button disabled when selected config is deactivated
- API error response JSON is parsed to extract detail before surfacing
to the user, removing the raw {"detail":"..."} noise
- Select placeholder option is no longer marked disabled, fixing the
browser/React value mismatch that prevented onChange from firing when
only one config existed and nothing was yet selected
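The two-step validation summarized above (query without the `is_active` filter, then check the flag) can be sketched as follows. This is an illustrative sketch, not the PR's actual code: the real `_fetch_llm_config_any` queries the database, and `LLMConfigInactiveError` and the dict-backed store here are hypothetical stand-ins.

```python
from dataclasses import dataclass
from typing import Optional


class LLMConfigNotFoundError(Exception):
    """No config row exists for this (config_id, user_id) pair."""


class LLMConfigInactiveError(Exception):
    """The config exists but has been deactivated."""


@dataclass
class LLMConfig:
    id: int
    user_id: int
    is_active: bool


# In-memory stand-in for the DB table; the real code runs a query
# that keeps the user_id filter but drops the is_active filter.
_CONFIGS = {
    (1, 42): LLMConfig(id=1, user_id=42, is_active=True),
    (2, 42): LLMConfig(id=2, user_id=42, is_active=False),
}


def _fetch_llm_config_any(config_id: int, user_id: int) -> Optional[LLMConfig]:
    # No is_active filter: deactivated rows are still returned.
    return _CONFIGS.get((config_id, user_id))


def preprocess_config(config_id: int, user_id: int) -> LLMConfig:
    config = _fetch_llm_config_any(config_id, user_id)
    if config is None:
        # Truly missing (or belongs to another user).
        raise LLMConfigNotFoundError(
            f"LLMConfig {config_id} not found or does not belong to user {user_id}"
        )
    if not config.is_active:
        # Exists, but deactivated: a distinct, meaningful error.
        raise LLMConfigInactiveError(f"LLMConfig {config_id} is deactivated")
    return config
```

Keeping the existence check first preserves the authorization behavior (a config owned by another user still reads as "not found") while letting deactivation surface its own message.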
Force push: 124ea70 to 3ff17e5
Claude finished @zaira-bibi's task in 1m 30s — Re-review: PR #317 — fix(chat): fix deactivated LLM config errors and config selection bug
Overview

Re-reviewed after force push. The PR is clean, well-scoped, and all four bugs are correctly addressed. The …

Previous Review Comments
Findings by Priority

Low
The error message ends with a double period (..).

Medium (Carried forward — non-blocking)
This was flagged in previous reviews because the earlier code still used the active-only filter in …

What Looks Good
Test Coverage

Good. One minor gap (noted in previous reviews): no test for a deactivated + soft-deleted config (…).

Security

No concerns. The new …

Overall Assessment

Approve with one trivial nit (double period in the error message).
```diff
 except LLMConfigNotFoundError as exc:
     logger.warning("LLMConfig %s not found for user %s: %s", request.llm_config_id, current_user.id, exc)
-    detail = "No AI Key found for the current user."
+    detail = "No AI Key found for the current user. Please configure an AI key in your chat plugin settings.."
```
Low — Double period at end of sentence
Typo: the message ends with `..` instead of `.`.
Suggested change:

```diff
-    detail = "No AI Key found for the current user. Please configure an AI key in your chat plugin settings.."
+    detail = "No AI Key found for the current user. Please configure an AI key in your chat plugin settings."
```
What
Four related bugs in the chat plugin LLM config flow.
1. Misleading backend error for deactivated configs

`preprocess_config` filtered by `is_active=True`, so saving with a deactivated config raised "not found or does not belong to this user" instead of a meaningful message.

2. Raw JSON error display

The frontend surfaced the full API response body (`Failed to update config: {"detail":"…"}`) instead of the human-readable `detail` string.

3. Single-config selection broken

When chat had no saved LLM config and exactly one config existed, the browser visually auto-selected it (the placeholder `<option>` was `disabled`, causing a React/DOM value mismatch), but `onChange` never fired. The config was never actually selected and models never loaded. The workaround was creating a second config, selecting it, then switching back.

4. Deactivated config — Save and model override still enabled

Selecting a deactivated config showed a warning banner but left both the Save button and the model override dropdown active.
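Bug 2 comes down to surfacing the response body verbatim instead of its `detail` field. The actual fix lives in the TypeScript frontend (`frontend/lib/plugins/context.tsx`); the extraction logic is sketched here in Python for illustration only, with `extract_error_detail` as a hypothetical name:

```python
import json


def extract_error_detail(body: str) -> str:
    """Return the human-readable message from an API error body.

    Falls back to the raw body when it is not JSON or has no
    string "detail" field, so nothing is ever swallowed.
    """
    try:
        parsed = json.loads(body)
    except (json.JSONDecodeError, TypeError):
        return body  # not JSON: show it as-is
    detail = parsed.get("detail") if isinstance(parsed, dict) else None
    return detail if isinstance(detail, str) else body
```

With this, `extract_error_detail('{"detail":"No AI Key found for the current user."}')` yields just the message, removing the raw `{"detail":"…"}` noise described above.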
Changes
- `app/llm/plugin_adapter.py`: add `_fetch_llm_config_any` (no `is_active` filter); `preprocess_config` checks existence, then activeness, with distinct errors
- `tests/llm/test_plugin_adapter.py`: add `test_preprocess_deactivated_llm_config_id_raises`
- `frontend/lib/plugins/context.tsx`: surface the `detail` field from API error responses
- `frontend/components/ui/Select.tsx`: remove `disabled` from the placeholder `<option>`
- `frontend/plugins/chat/components/ChatConfigModal.tsx`

Test plan
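The shape of the new test named above can be sketched as follows. This is a self-contained illustration: the stub `preprocess_config`, the error classes, and the dict-backed row store are hypothetical stand-ins for the real adapter code, and the real test likely uses the project's own fixtures.

```python
class LLMConfigNotFoundError(Exception):
    """No config row exists for this user."""


class LLMConfigInactiveError(Exception):
    """The config exists but has been deactivated."""


# Stand-in data: config 7 exists for user 1 but is deactivated, so the
# two-step check must report "inactive" rather than the generic "not found".
_ROWS = {7: {"user_id": 1, "is_active": False}}


def preprocess_config(config_id, user_id):
    row = _ROWS.get(config_id)
    if row is None or row["user_id"] != user_id:
        raise LLMConfigNotFoundError(f"LLMConfig {config_id} not found")
    if not row["is_active"]:
        raise LLMConfigInactiveError(f"LLMConfig {config_id} is deactivated")
    return row


def test_preprocess_deactivated_llm_config_id_raises():
    # A deactivated config must raise the distinct "inactive" error.
    try:
        preprocess_config(7, user_id=1)
    except LLMConfigInactiveError:
        pass  # expected
    else:
        raise AssertionError("expected LLMConfigInactiveError")
```

The gap noted in the reviews (a config that is both deactivated and soft-deleted) would be a second test in the same style, asserting which of the two errors wins.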