Force-pushed from 557370d to 48115b7.
LlmConfig is no longer abstract. It can be used directly with model_id, provider, api_provider, api_type, url, and api_key to describe any LLM without a dedicated subclass. Existing subclasses remain unchanged. All framework adapters dispatch bare LlmConfig instances via api_provider. Schema generation handles concrete-with-subclasses via _include_subclasses_in_schema on Component. Documentation, JSON spec, and language spec updated accordingly.
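To illustrate the new direct usage, here is a minimal sketch; the dataclass below is a stand-in mirroring only the fields listed above (the real `LlmConfig` is a Pydantic `Component` with validation, so names and defaults here are assumptions for illustration):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class LlmConfig:  # stand-in mirroring the fields described above
    model_id: str
    provider: Optional[str] = None       # model provider, e.g. "openai"
    api_provider: Optional[str] = None   # adapters dispatch on this field
    api_type: Optional[str] = None       # wire protocol, e.g. "chat_completions"
    url: Optional[str] = None
    api_key: Optional[str] = None

# Describe an LLM without a dedicated subclass:
cfg = LlmConfig(
    model_id="gpt-4o",
    provider="openai",
    api_provider="openai",
    api_type="chat_completions",
)
print(cfg.api_provider)  # → openai
```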
Force-pushed from 48115b7 to 86e4440.
Thank you @spichen for following up.
```python
api_provider: Optional[str]
api_type: Optional[str]
url: Optional[str]
api_key: Optional[str]
```
Please mark `api_key` as sensitive:
```diff
- api_key: Optional[str]
+ api_key: SensitiveField[Optional[str]]
```
There's also a table at the end of this file listing all the sensitive fields; please update that too.
```python
model_id: str
provider: Optional[str]
api_provider: Optional[str]
api_type: Optional[str]
```
Some fields are missing; please align this definition with the one in the spec.
```
The ``provider`` field is optional and identifies the model provider (e.g. ``"openai"``, ``"meta"``, ``"anthropic"``, ``"cohere"``).
The ``api_provider`` field is optional and identifies the API provider serving the model (e.g. ``"openai"``, ``"oci"``, ``"vllm"``, ``"ollama"``, ``"aws_bedrock"``, ``"vertex_ai"``).
The ``api_type`` field is optional and identifies the wire protocol to use (e.g. ``"chat_completions"``, ``"responses"``).
The ``url`` field is optional and specifies the URL of the API endpoint (e.g. ``"https://api.openai.com/v1"``).
```
I would specify that, if null, the default API URL of the API provider (if any) should be used.
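The fallback this comment suggests could be sketched as follows; the function name and the default-URL mapping are hypothetical (only the OpenAI URL comes from the spec example above, the Ollama entry is an assumed default for illustration):

```python
from typing import Optional

# Hypothetical per-API-provider defaults; not part of the actual spec.
DEFAULT_URLS = {
    "openai": "https://api.openai.com/v1",  # from the spec example above
    "ollama": "http://localhost:11434",     # assumed default, illustration only
}

def resolve_url(url: Optional[str], api_provider: Optional[str]) -> Optional[str]:
    """Return the explicit url if set; otherwise the provider's default, if any."""
    if url is not None:
        return url
    return DEFAULT_URLS.get(api_provider)

print(resolve_url(None, "openai"))  # → https://api.openai.com/v1
```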
```python
all_subclasses = cls._get_all_subclasses(only_core_components=only_core_components)
adapter = TypeAdapter(Union[all_subclasses])  # type: ignore
json_schema = adapter.json_schema(by_alias=by_alias, mode=mode)
elif getattr(cls, "_include_subclasses_in_schema", False):
```
Could you help me understand why this new logic is needed?
```
When a concrete class also serves as a base (e.g. ``LlmConfig``), Pydantic
only generates the parent schema. Phase 1 (abstract resolution) skips it
because it is not abstract, so subclass schemas are never added to ``$defs``.
```
Could you double-check this? I think we already have this pattern in the class tree (e.g., OllamaConfig is a child of OpenAiCompatibleConfig, which is not abstract, and it shows up in the JSON schema).
Maybe I misunderstood the issue?
Or maybe the problem arises when the starting class from which we call model_json_schema is itself not abstract?
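For reference, subclass discovery itself works the same whether the base is abstract or concrete; a minimal stdlib sketch (class names borrowed from this thread, the walk function hypothetical and much simpler than the framework's `_get_all_subclasses`):

```python
# Stub classes standing in for the real Pydantic components.
class LlmConfig: ...
class OpenAiCompatibleConfig(LlmConfig): ...  # concrete, yet also a base
class OllamaConfig(OpenAiCompatibleConfig): ...

def get_all_subclasses(cls: type) -> list[type]:
    """Collect direct and transitive subclasses, depth-first."""
    out: list[type] = []
    for sub in cls.__subclasses__():
        out.append(sub)
        out.extend(get_all_subclasses(sub))
    return out

print([c.__name__ for c in get_all_subclasses(LlmConfig)])
# → ['OpenAiCompatibleConfig', 'OllamaConfig']
```

So whether subclass schemas land in ``$defs`` depends on whether schema generation runs this walk for the class in question, not on the walk itself.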
```python
return LlmConfig(name="test", model_id="some-model", api_provider="unsupported_provider")
```

```python
class TestOpenAiAgentsDispatch:
```
For adapter-specific tests we have dedicated folders in tests/adapters, and tests in those folders are skipped automatically if the required extra dependencies are not installed.
Please split these tests into the respective adapter folders.
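The automatic skip this comment mentions can be sketched with stdlib-only logic (the helper name is hypothetical; the real mechanism likely lives in a conftest.py and may use pytest's own skipping helpers instead):

```python
import importlib.util

def adapter_available(module_name: str) -> bool:
    """True if the adapter's optional dependency can be imported."""
    return importlib.util.find_spec(module_name) is not None

# A stdlib module is always importable; a missing extra returns False,
# which a conftest.py could turn into a whole-folder skip.
print(adapter_available("json"))                    # → True
print(adapter_available("nonexistent_adapter_xyz")) # → False
```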
Implementation of RFC #114