Support Dynamic Runtime Tools for Foundry-Hosted Agents (GetAIAgentAsync)
Language
.NET
Type of Issue
Feature Request
Description
Overview
When using a Foundry-hosted agent (retrieved via GetAIAgentAsync) that has a remote Azure Search Index tool configured server-side (as an example), there is no way to supply additional dynamic, local function-based tools (AIFunctions/FunctionTool) at run time such that their schemas are advertised to the model. For us this is a blocker that prevents us from utilizing Foundry-hosted agents.
This capability works correctly with locally-created agents via CreateAIAgent, but breaks with Foundry-hosted agents due to how AzureAIProjectChatClient handles tool options.
Background: Relationship to #3083
Issue #3083 describes the pattern of dynamically injecting tools mid-run by mutating FunctionInvokingChatClient.CurrentContext.Options.Tools. This was discussed, validated, and a fix was shipped in Microsoft.Extensions.AI to support it.
However, the #3083 discussion and fix targeted locally-created agents (CreateAIAgent backed by an AzureOpenAIClient chat client), where the client fully controls tool advertisement via ChatOptions.Tools sent per-request to the Chat Completions API. The Foundry-hosted agent case (GetAIAgentAsync) was never tested or discussed in that issue.
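For reference, the #3083 pattern looks roughly like this (a hedged sketch; `DiscoverTools` and `GetWeather` are hypothetical illustration functions, and the exact `FunctionInvocationContext` shape may vary across Microsoft.Extensions.AI versions):

```csharp
// Sketch of the #3083 pattern: a tool that, when invoked mid-run, injects
// additional tools into the current FunctionInvokingChatClient loop.
AITool discoverTools = AIFunctionFactory.Create(() =>
{
    FunctionInvocationContext? context = FunctionInvokingChatClient.CurrentContext;
    if (context?.Options?.Tools is IList<AITool> tools)
    {
        // With CreateAIAgent, tools added here are advertised to the model
        // on the next loop iteration.
        tools.Add(AIFunctionFactory.Create(
            (string city) => $"Sunny in {city}",
            name: "GetWeather",
            description: "Gets the weather for a city."));
    }
    return "Additional tools are now available.";
}, name: "DiscoverTools", description: "Loads more tools on demand.");
```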
Why it works with CreateAIAgent but not GetAIAgentAsync
With CreateAIAgent (locally-created agents):
- Tools are added to `ChatOptions.Tools`
- `FunctionInvokingChatClient` passes them to the inner `IChatClient` on each loop iteration
- The inner client (a plain `AzureOpenAIClient` chat client) passes them through to the API as-is
- The model sees the new tool schemas and can call them
With GetAIAgentAsync (Foundry-hosted agents):
- Tools are added to `ChatOptions.Tools` (via context provider or `CurrentContext.Options.Tools` mutation)
- `FunctionInvokingChatClient` passes them to the inner `IChatClient` on the next iteration
- `AzureAIProjectChatClient.GetAgentEnabledChatOptions()` explicitly sets `Tools = null` before making the API call
- The API call goes out without the dynamically added tool schemas
- The model never sees them and cannot call them
The stripping happens in AzureAIProjectChatClient.cs in the GetAgentEnabledChatOptions method, which nulls out Tools, Instructions, Temperature, TopP, PresencePenalty, and ResponseFormat — preserving only ConversationId.
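A simplified paraphrase of that behavior (not the actual SDK source; the real method may differ) is:

```csharp
// Simplified illustration of what GetAgentEnabledChatOptions effectively
// does for hosted agents: per-request settings are dropped so that the
// server-side agent definition governs the call.
private static ChatOptions GetAgentEnabledChatOptions(ChatOptions? options) => new()
{
    ConversationId = options?.ConversationId,
    // Tools, Instructions, Temperature, TopP, PresencePenalty, and
    // ResponseFormat from the caller are all discarded (left null).
};
```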
Our Scenario
- Hosted Agent Configuration: Azure Foundry agent created with Azure Search Index tool defined in its server-side manifest
- At runtime: We want to inject local functions (as tools) based on user context/session data. These may differ per session/request.
- Expected behavior: Both the pre-configured cloud Search tool and the local dynamic tools should have their schemas surfaced to the model for calling.
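Concretely, the per-session local tools we want to surface look like the following sketch (`UserSession` and its `LookupOrder` method are hypothetical stand-ins for our session-scoped data):

```csharp
// Sketch: building session-specific tools at run time, before invoking the
// hosted agent. These differ per user/session and cannot be registered in
// the server-side agent definition up front.
static IList<AITool> BuildSessionTools(UserSession session) =>
[
    AIFunctionFactory.Create(
        (string orderId) => session.LookupOrder(orderId),
        name: "LookupOrder",
        description: "Looks up an order for the signed-in user."),
];
```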
Approaches Attempted and Results
1. AIContextProvider Approach
We tried providing the local tools via a custom AIContextProvider that returns the union of remote and local tools in AIContext.Tools.
- Result: The context provider is invoked and updates `chatOptions.Tools` as expected.
- Problem: `AzureAIProjectChatClient.GetAgentEnabledChatOptions()` nulls out `.Tools` before the API call, so the dynamic tool schemas are never sent to the Responses API.
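The provider we used was essentially the following sketch (the exact `AIContextProvider` base-class shape may differ across Microsoft.Agents.AI preview versions; `localTools` is the hypothetical per-session tool list):

```csharp
// Sketch of the context-provider approach: return the local dynamic tools
// in AIContext.Tools so they are merged into chatOptions.Tools for the run.
internal sealed class DynamicToolsProvider(IList<AITool> localTools) : AIContextProvider
{
    public override ValueTask<AIContext> InvokingAsync(
        InvokingContext context, CancellationToken cancellationToken = default) =>
        new(new AIContext { Tools = localTools });
}
```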
2. Mutation via FunctionInvokingChatClient.CurrentContext.Options.Tools
We attempted to mutate FunctionInvokingChatClient.CurrentContext.Options.Tools as recommended in #3083.
- Result: The mutation is accepted locally, and `FunctionInvokingChatClient` would dispatch calls to these tools if the model requested them. However, the same `GetAgentEnabledChatOptions()` stripping means the tool schemas are never included in API requests, so the model is never aware the tools exist.
Note: This approach does work correctly for locally-created agents via CreateAIAgent as demonstrated in #3083. The gap is specifically in the AzureAIProjectChatClient path used by Foundry-hosted agents.
Root Cause
AzureAIProjectChatClient is designed so that Foundry-hosted agents use their server-side agent definition for tool schemas, and per-request ChatOptions.Tools are intentionally discarded. There is currently no mechanism to merge client-side tool schemas with the server-defined tools in the API request.
Why This Is Important
- The ability to compose both cloud/remote tools (declared at agent creation) and per-run, per-session local tools is critical for safe, tailored, and context-aware use cases — especially where local data/functions cannot be registered up front but must be injected based on runtime context or user identity.
- Today, this is only possible if we create a new agent definition per tool set (not scalable) or fall back to non-hosted agents (losing the benefits of Foundry-hosted agent management).
Request
We are requesting a way to supply additional tools (AIFunctions/FunctionTool definitions) at run time to a Foundry-hosted agent such that their schemas are exposed to the model in the API request body alongside the persisted, server-side tools.
Possible approaches:
- Allow `AzureAIProjectChatClient` to merge per-request `ChatOptions.Tools` with the server-defined tools instead of discarding them
- Provide an explicit API for registering additional per-run tool schemas that survive `GetAgentEnabledChatOptions()`
- Support an `AdditionalTools` concept at the `AzureAIProjectChatClient` level that gets merged into the API request
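As a sketch of the first option, the desired behavior (hypothetical, not current SDK behavior) would be to forward rather than discard the client-side schemas:

```csharp
// Hypothetical change to AzureAIProjectChatClient (proposal, not real code):
// preserve per-request tool schemas so the service can combine them with
// the tools persisted in the server-side agent definition.
private static ChatOptions GetAgentEnabledChatOptions(ChatOptions? options) => new()
{
    ConversationId = options?.ConversationId,
    // Proposed: forward client-side tool schemas instead of nulling them.
    Tools = options?.Tools,
};
```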
Without this capability, combining remote tools and local tools in a single agent is extremely difficult and requires awkward workarounds. It also makes it impossible to implement progressive tool discovery, which is important for context management.
References
- Related: .NET: [Issue]: Enable agentic, in-run dynamic loading of tools #3083 (dynamic tool discovery — works for `CreateAIAgent`, not for `GetAIAgentAsync`)
- `AzureAIProjectChatClient.GetAgentEnabledChatOptions()` — the method that strips per-request tools
- `AzureAIProjectChatClient.cs` — source
Code Sample
Language/SDK
.NET