docs: update LLM model demo to remove OpenAI API key requirement and simplify prerequisites #61
Conversation
Summary of Changes

Hello @KUASWoodyLIN, I'm Gemini Code Assist! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed! This pull request aims to enhance the accessibility and ease of use of the LLM model demo by eliminating the need for an OpenAI API key and the associated OpenAI Python library dependency.
Code Review
This pull request updates the LLM model demo documentation to remove the dependency on an OpenAI API key and to simplify the prerequisites and examples. The changes make the documentation clearer and easier to follow by removing complex, multi-tab examples and focusing on a single, updated script. My review includes a suggestion to add an example value for a configuration variable to further enhance usability.
The diff hunk under discussion:

```python
MODEL_NAME = "<your_model_name>"  # e.g., llm-demo
MODEL_ID = "RedHatAI/Llama-3.2-1B-Instruct-FP8"
NAME = "<your_name>"  # e.g., llm-demo
MODEL_NAME = "<your_model_name>"
```
Pull request overview
This PR simplifies the LLM model demo documentation by removing the OpenAI API key dependency and streamlining the testing prerequisites. The changes make the demo more straightforward by focusing on direct API interaction with the deployed LLM service rather than through OpenAI's SDK.
Key changes:
- Removed OpenAI API key requirement from prerequisites
- Simplified Python dependencies to only require the `requests` library
- Consolidated testing examples from multiple tabs into a single, clearer example (sketched below)
- Updated connection information section to reference OtterScale UI fields directly
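For orientation, a consolidated `requests`-only test along these lines might look like the following minimal sketch. The `OtterScale-Model-Name` header and the `model` payload field are taken from the review comments below; the `/v1/chat/completions` path, the placeholder values, and the response handling are assumptions, not details confirmed by this PR.

```python
import requests

# Values copied from the OtterScale UI (see the Connection Information
# discussion below); all three are placeholders.
SERVICE_URL = "http://<service-url>"   # "Service URL" from the Service card
NAME = "<your_name>"                   # "Name" field in the model table
MODEL_NAME = "<your_model_name>"       # "Model Name" field in the model table

# The chat-completions path is an assumption; check the demo docs for the
# actual endpoint exposed by the deployed service.
response = requests.post(
    f"{SERVICE_URL}/v1/chat/completions",
    headers={"OtterScale-Model-Name": NAME},   # NAME travels in this header
    json={
        "model": MODEL_NAME,                   # MODEL_NAME fills the payload's "model" field
        "messages": [{"role": "user", "content": "Hello!"}],
    },
    timeout=60,
)
response.raise_for_status()
print(response.json())
```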
The lines in question:

```python
NAME = "<your_name>"  # e.g., llm-demo
MODEL_NAME = "<your_model_name>"
```
Copilot AI commented on Jan 7, 2026
The variable naming and usage here are confusing. Based on the Connection Information section, there are two separate fields from the UI: "Name" and "Model Name". However, the variable names `NAME` and `MODEL_NAME` don't clearly indicate their purposes:

- `NAME` is used in the `OtterScale-Model-Name` header
- `MODEL_NAME` is used as the `model` field in the payload

Consider renaming these variables to be more descriptive of their purposes, such as `OTTERSCALE_MODEL_NAME` and `MODEL_ID`, or add inline comments explaining what each represents to help users correctly map the UI fields to the code variables.
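As an illustration only, a minimal sketch of the suggested renaming, assuming the request shape described above (the `OtterScale-Model-Name` header plus a `model` payload field); the new names are just one possible choice, not part of the PR:

```python
# Illustrative renaming: names mirror the UI fields they come from.
OTTERSCALE_MODEL_NAME = "<your_name>"   # UI "Name" field -> OtterScale-Model-Name header
MODEL_ID = "<your_model_name>"          # UI "Model Name" field -> "model" in the payload

headers = {"OtterScale-Model-Name": OTTERSCALE_MODEL_NAME}
payload = {
    "model": MODEL_ID,
    "messages": [{"role": "user", "content": "Hello!"}],
}
```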
> Before running the test scripts, you'll need to find the following information from the `<url>/scope/<scope-name>/models/llm` page:
Copilot AI commented on Jan 7, 2026
The documentation mentions finding information from the page <url>/scope/<scope-name>/models/llm but doesn't explain what <url> and <scope-name> should be replaced with, or where users can find these values. This could leave users uncertain about where to navigate. Consider providing more context about what these placeholders represent or linking to documentation about URL structure and scope names.
Suggested change:

```diff
- Before running the test scripts, you'll need to find the following information from the `<url>/scope/<scope-name>/models/llm` page:
+ Before running the test scripts, you'll need to find the following information from the Models page in your OtterScale cluster. You can open it by going to your cluster URL (for example, `https://your-cluster.example.com`), then navigating to the appropriate scope (namespace) and opening the **Models** page for your LLM, which corresponds to the `<url>/scope/<scope-name>/models/llm` path (where `<url>` is your cluster URL and `<scope-name>` is the name of the scope/namespace you selected):
```
> Before running the test scripts, you'll need to find the following information from the `<url>/scope/<scope-name>/models/llm` page:
> - **Service URL**: The URL information from the Service card
> - **Name**: The `name` field in the model table
> - **Model Name**: The `Model Name` field in the model table
Copilot AI commented on Jan 7, 2026
The naming convention for these configuration fields is confusing and inconsistent with the code example below. The documentation mentions three fields: "Service URL", "Name", and "Model Name", but the code example uses variables named `SERVICE_URL`, `NAME`, and `MODEL_NAME`. The relationship between these is unclear:

- "Name" maps to the `NAME` variable, which is used in the `OtterScale-Model-Name` header
- "Model Name" maps to the `MODEL_NAME` variable, which is used as the `model` field in the payload

This creates ambiguity about which UI field corresponds to which variable. Consider using clearer naming that explicitly distinguishes between the model identifier used in the header and the one used in the API payload, or provide a more explicit mapping between the UI fields and the code variables, as in the sketch below.
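Alternatively, keeping the existing names but making the mapping explicit with comments; a rough sketch, with the UI field labels taken from the excerpt quoted above:

```python
SERVICE_URL = "<service_url>"     # "Service URL" from the Service card
NAME = "<your_name>"              # "Name" field in the model table; sent in the
                                  # OtterScale-Model-Name request header
MODEL_NAME = "<your_model_name>"  # "Model Name" field in the model table; sent as
                                  # the "model" field in the request payload
```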