fix: use LLM.from_url() when llm_server_url is provided#262

Open
giwaov wants to merge 1 commit into OpenGradient:main from giwaov:fix/llm-server-url
Conversation

@giwaov
Contributor

@giwaov giwaov commented Apr 8, 2026

Problem

`OpenGradientChatModel` in `og_langchain.py` passes `llm_server_url` as a keyword argument to `LLM()`, but `LLM.__init__()` only accepts `private_key`, `rpc_url`, and `tee_registry_address`.

This causes a `TypeError: __init__() got an unexpected keyword argument 'llm_server_url'` at runtime when a developer tries to use the `llm_server_url` parameter through `langchain_adapter()` or `OpenGradientChatModel()`.

The correct API for creating an LLM client with a hardcoded TEE endpoint is `LLM.from_url(private_key, llm_server_url)`.
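The failure mode can be reproduced with a minimal stand-in class whose constructor has the shape described above. This is a sketch only; the real `LLM` class lives in the OpenGradient SDK, and the parameter names here mirror the bug report, not the SDK internals.

```python
class LLM:
    """Stand-in mimicking the constructor shape described in the report."""

    def __init__(self, private_key, rpc_url=None, tee_registry_address=None):
        self.private_key = private_key
        self.llm_server_url = None

    @classmethod
    def from_url(cls, private_key, llm_server_url):
        # Alternate constructor for a hardcoded TEE endpoint.
        client = cls(private_key)
        client.llm_server_url = llm_server_url
        return client


# Passing llm_server_url as a kwarg to __init__ fails, as in the report:
try:
    LLM("0xKEY", llm_server_url="https://tee.example.com")
except TypeError as exc:
    print(exc)  # ...got an unexpected keyword argument 'llm_server_url'

# The supported path goes through the alternate constructor:
client = LLM.from_url("0xKEY", "https://tee.example.com")
```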

Fix

When `llm_server_url` is provided, call `LLM.from_url(private_key, llm_server_url)` instead of passing it as a kwarg to `LLM()`. The regular `LLM()` constructor path is preserved for the `rpc_url`/`tee_registry_address` case.
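The branch described above can be sketched as follows. The `LLM` stand-in and the `build_llm` helper are hypothetical names for illustration; the actual change lives inside `OpenGradientChatModel` in `og_langchain.py`.

```python
class LLM:
    """Stand-in with the constructor shape described in this PR (sketch)."""

    def __init__(self, private_key, rpc_url=None, tee_registry_address=None):
        self.private_key = private_key
        self.rpc_url = rpc_url
        self.llm_server_url = None

    @classmethod
    def from_url(cls, private_key, llm_server_url):
        client = cls(private_key)
        client.llm_server_url = llm_server_url
        return client


def build_llm(private_key, llm_server_url=None, **llm_kwargs):
    # Route the hardcoded-TEE-endpoint case through the alternate
    # constructor; everything else keeps the plain LLM() path.
    if llm_server_url is not None:
        return LLM.from_url(private_key, llm_server_url)
    return LLM(private_key, **llm_kwargs)


tee_client = build_llm("0xKEY", llm_server_url="https://tee.example.com")
rpc_client = build_llm("0xKEY", rpc_url="https://rpc.example.com")
```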

Changes

  • **`src/opengradient/agents/og_langchain.py`**: branch on `llm_server_url` to call the correct constructor

Closes #248

OpenGradientChatModel passed llm_server_url as a kwarg to LLM(),
but LLM.__init__() does not accept that parameter, causing a
TypeError at runtime. The correct API is LLM.from_url().

When llm_server_url is provided, call LLM.from_url(private_key,
llm_server_url) instead of LLM(private_key, **llm_kwargs).

Closes OpenGradient#248

Development

Successfully merging this pull request may close these issues.

llm_server_url not working in LLM()