OpenAI Assistant: Allow temperature to be 0 (#369)
Walkthrough

Adjusts assistant temperature handling: validation models now allow temperature >= 0 (previously > 0), and sync logic preserves an explicit 0 by defaulting only when temperature is None. No other functional changes.
Sequence Diagram(s)

```mermaid
sequenceDiagram
    autonumber
    participant Client
    participant CRUD as sync_assistant
    participant OA as OpenAIAssistant
    participant DB
    Client->>CRUD: Request sync
    CRUD->>OA: Fetch assistant data
    OA-->>CRUD: assistant.temperature (may be None or number)
    alt temperature is None
        note right of CRUD: Default to 0
        CRUD->>DB: Save temperature = 0
    else temperature provided (incl. 0)
        note right of CRUD: Preserve provided value
        CRUD->>DB: Save temperature = provided
    end
    DB-->>CRUD: Persisted assistant
    CRUD-->>Client: Sync result
```
Estimated code review effort: 🎯 2 (Simple) | ⏱️ ~10 minutes
Codecov Report: ✅ All modified and coverable lines are covered by tests.
Actionable comments posted: 0
🧹 Nitpick comments (6)
backend/app/models/assistants.py (3)
69-70: Clarify "inclusive" in description

Minor doc tweak to avoid ambiguity on bounds.
```diff
-        default=0.1, ge=0, le=2, description="Sampling temperature between 0 and 2"
+        default=0.1, ge=0, le=2, description="Sampling temperature between 0 and 2 (inclusive)"
```
94-95: Mirror the "inclusive" phrasing in the update model

Keep descriptions consistent.
```diff
-        default=None, ge=0, le=2, description="Sampling temperature between 0 and 2"
+        default=None, ge=0, le=2, description="Sampling temperature between 0 and 2 (inclusive)"
```
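A standalone sketch of what these `ge`/`le` bounds do (Pydantic v2); the model here is illustrative, though the field mirrors the one in the PR:

```python
# Standalone sketch of the ge/le bounds (Pydantic v2); the model is
# illustrative, though the field mirrors the one in the PR.
from pydantic import BaseModel, Field, ValidationError

class AssistantCreate(BaseModel):
    temperature: float = Field(
        default=0.1,
        ge=0,  # ge (>=) admits the 0 boundary that gt (>) rejected
        le=2,
        description="Sampling temperature between 0 and 2 (inclusive)",
    )

print(AssistantCreate(temperature=0).temperature)  # 0.0 -- now valid
try:
    AssistantCreate(temperature=2.5)
except ValidationError:
    print("rejected: above upper bound")
```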
69-70: Unify default temperature across codepaths
Default temperature is 0.1 in the Assistant model (backend/app/models/assistants.py), falls back to 0 in the CRUD layer (backend/app/crud/assistants.py), and defaults to 1e-6 in the Collections route (backend/app/api/routes/collections.py). Pick one canonical value (e.g. 0.1) and update all three to use it.

backend/app/crud/assistants.py (3)
150-152: Default fallback changed to 0; verify intent and align with models default (0.1)

If OpenAI returns None, this now stores 0, diverging from the AssistantBase/AssistantCreate default of 0.1. Recommend keeping a single default (likely 0.1) unless product has decided zero is the canonical fallback.
```diff
-            temperature=openai_assistant.temperature
-            if openai_assistant.temperature is not None
-            else 0,
+            temperature=openai_assistant.temperature
+            if openai_assistant.temperature is not None
+            else 0.1,
```

If you prefer 0 as the global default, change the AssistantBase/AssistantCreate defaults and the API AssistantOptions default to 0 for consistency.
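One way to keep the three codepaths from drifting again, sketched under an assumed module layout: hoist the default into a single shared constant.

```python
# Sketch only: one module-level constant that models, crud, and routes all
# import, so the default cannot drift again. Module layout is assumed.
DEFAULT_TEMPERATURE = 0.1

def crud_fallback(fetched):
    """CRUD-layer fallback using the shared constant instead of a literal 0."""
    return fetched if fetched is not None else DEFAULT_TEMPERATURE

# models:  Field(default=DEFAULT_TEMPERATURE, ge=0, le=2, ...)
# routes:  AssistantOptions would default to DEFAULT_TEMPERATURE instead of 1e-6
print(crud_fallback(None))  # 0.1
print(crud_fallback(0))     # 0
```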
144-156: Guard against out-of-range temperature from OpenAI (optional)

OpenAI should return 0-2, but adding a lightweight check prevents bad data persisting through this bypass of the Pydantic constraints on the create/update models.
```diff
 db_assistant = Assistant(
     assistant_id=openai_assistant.id,
     name=openai_assistant.name or openai_assistant.id,
     instructions=openai_assistant.instructions,
     model=openai_assistant.model,
     vector_store_ids=vector_store_ids,
-    temperature=openai_assistant.temperature
-    if openai_assistant.temperature is not None
-    else 0,
+    temperature=(
+        openai_assistant.temperature
+        if (openai_assistant.temperature is not None)
+        else 0.1
+    ),
     max_num_results=max_num_results,
     project_id=project_id,
     organization_id=organization_id,
 )
+# Optional: enforce range if upstream ever drifts
+if not (0 <= db_assistant.temperature <= 2):
+    raise HTTPException(status_code=400, detail="Temperature must be between 0 and 2.")
```
150-152: Add tests for explicit 0 and None fallback

Ensure: (1) sync preserves 0 exactly; (2) None falls back to the chosen default; (3) out-of-range raises 400 if you add the guard.
I can add parametrized tests for these cases if you want a quick follow-up commit.
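A hedged sketch of what those parametrized tests could look like; `resolve_temperature` stands in for the sync logic and is not the repo's actual function:

```python
# Hypothetical parametrized tests for the three cases above; the helper
# stands in for the sync logic and is not the repo's actual function.
import pytest

def resolve_temperature(fetched, default=0.1):
    """Illustrative stand-in: None falls back, out-of-range is rejected."""
    if fetched is not None and not (0 <= fetched <= 2):
        raise ValueError("Temperature must be between 0 and 2.")
    return fetched if fetched is not None else default

@pytest.mark.parametrize(
    "fetched, expected",
    [
        (0.0, 0.0),   # explicit zero is preserved exactly
        (None, 0.1),  # None falls back to the chosen default
        (1.5, 1.5),   # in-range values pass through unchanged
    ],
)
def test_resolve_temperature(fetched, expected):
    assert resolve_temperature(fetched) == expected

def test_out_of_range_rejected():
    with pytest.raises(ValueError):
        resolve_temperature(3.0)
```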
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
💡 Knowledge Base configuration:
- MCP integration is disabled by default for public repositories
- Jira integration is disabled by default for public repositories
- Linear integration is disabled by default for public repositories
You can enable these sources in your CodeRabbit configuration.
📒 Files selected for processing (2)
- backend/app/crud/assistants.py (1 hunks)
- backend/app/models/assistants.py (2 hunks)
🧰 Additional context used
🧬 Code graph analysis (2)
backend/app/models/assistants.py (1)
backend/app/api/routes/collections.py (1)
AssistantOptions(108-131)
backend/app/crud/assistants.py (2)
backend/app/tests/crud/test_assistants.py (1)
test_sync_assistant_success(27-47)
backend/app/api/routes/collections.py (1)
AssistantOptions(108-131)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (1)
- GitHub Check: checks (3.11.7, 6)
🔇 Additional comments (3)
backend/app/models/assistants.py (2)
69-70: Allowing temperature=0 in create: LGTM

Changing `gt=0` to `ge=0` correctly permits zero. No functional concerns.
94-95: Allowing temperature=0 in update: LGTM

The update model now accepts zero; matches the PR intent.
backend/app/crud/assistants.py (1)
150-152: Preserves explicit temperature=0: LGTM

Switching from a truthy check to an explicit None-check fixes the bug where 0 was overwritten.
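The difference between the two checks is easy to demonstrate with illustrative values:

```python
# Illustrative values showing why the old truthy check lost an explicit 0.
fetched = 0.0  # what OpenAI returned

buggy = fetched if fetched else 0.1              # 0.0 is falsy, so this picks 0.1
fixed = fetched if fetched is not None else 0.1  # only None triggers the default

print(buggy)  # 0.1 -- explicit zero silently overwritten
print(fixed)  # 0.0 -- explicit zero preserved
```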
Summary
Previously, the temperature field validation prevented setting temperature to 0, which is a valid value for OpenAI assistants. The sync logic also incorrectly treated 0 as a falsy value and defaulted to 0.1.
Checklist
Before submitting a pull request, please ensure that you mark these tasks.
Run `fastapi run --reload app/main.py` or `docker compose up` in the repository root and test.

Notes
Please add here if any other information is required for the reviewer.