
.NET: [Feature]: Support Dynamic Runtime Tools for Foundry-Hosted Agents (GetAIAgentAsync) #4918

@egorpavlikhin

Description


Support Dynamic Runtime Tools for Foundry-Hosted Agents (GetAIAgentAsync)

Language

.NET

Type of Issue

Feature Request

Description

Overview

When using a Foundry-hosted agent (retrieved via GetAIAgentAsync) that has a remote tool configured server-side (for example, an Azure Search Index tool), there is no way to supply additional dynamic, local function-based tools (AIFunctions/FunctionTool) at run time such that their schemas are advertised to the model. For us this is a blocker to adopting Foundry-hosted agents.

This capability works correctly with locally-created agents via CreateAIAgent, but breaks with Foundry-hosted agents due to how AzureAIProjectChatClient handles tool options.

Background: Relationship to #3083

Issue #3083 describes the pattern of dynamically injecting tools mid-run by mutating FunctionInvokingChatClient.CurrentContext.Options.Tools. This was discussed and validated, and a fix was shipped in Microsoft.Extensions.AI to support it.

However, the #3083 discussion and fix targeted locally-created agents (CreateAIAgent backed by an AzureOpenAIClient chat client), where the client fully controls tool advertisement via ChatOptions.Tools sent per-request to the Chat Completions API. The Foundry-hosted agent case (GetAIAgentAsync) was never tested or discussed in that issue.

Why it works with CreateAIAgent but not GetAIAgentAsync

With CreateAIAgent (locally-created agents):

  1. Tools are added to ChatOptions.Tools
  2. FunctionInvokingChatClient passes them to the inner IChatClient on each loop iteration
  3. The inner client (a plain AzureOpenAIClient chat client) passes them through to the API as-is
  4. The model sees the new tool schemas and can call them
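For reference, the working path above looks roughly like this. This is a minimal sketch, not the full agent setup; `GetWeather` is a hypothetical local function, and the key point is that `ChatOptions.Tools` flows through `FunctionInvokingChatClient` to the request unchanged:

```csharp
using Microsoft.Extensions.AI;
using System.ComponentModel;

// Hypothetical local function to be advertised per request.
[Description("Gets the current weather for a city.")]
static string GetWeather(string city) => $"Sunny in {city}";

// With a locally-created agent (CreateAIAgent over an AzureOpenAIClient
// chat client), tools supplied here are forwarded as-is on every loop
// iteration, so the model sees their schemas and can call them.
var options = new ChatOptions
{
    Tools = [AIFunctionFactory.Create(GetWeather)],
};
```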

With GetAIAgentAsync (Foundry-hosted agents):

  1. Tools are added to ChatOptions.Tools (via context provider or CurrentContext.Options.Tools mutation)
  2. FunctionInvokingChatClient passes them to the inner IChatClient on the next iteration
  3. AzureAIProjectChatClient.GetAgentEnabledChatOptions() explicitly sets Tools = null before making the API call
  4. The API call goes out without the dynamically added tool schemas
  5. The model never sees them and cannot call them

The stripping happens in AzureAIProjectChatClient.cs in the GetAgentEnabledChatOptions method, which nulls out Tools, Instructions, Temperature, TopP, PresencePenalty, and ResponseFormat — preserving only ConversationId.
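A simplified sketch of that stripping behavior, reconstructed from the description above (not the actual source, just the observable effect):

```csharp
using Microsoft.Extensions.AI;

// Effectively what GetAgentEnabledChatOptions does today: every
// per-request option except ConversationId is dropped, so any
// dynamically injected AIFunction schemas never reach the API.
static ChatOptions GetAgentEnabledChatOptions(ChatOptions? options) => new()
{
    ConversationId = options?.ConversationId,
    // Tools, Instructions, Temperature, TopP, PresencePenalty,
    // and ResponseFormat from the caller are all discarded here.
};
```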

Our Scenario

  1. Hosted Agent Configuration: Azure Foundry agent created with Azure Search Index tool defined in its server-side manifest
  2. At runtime: We want to inject local functions (as tools) based on user context/session data. These may differ per session/request.
  3. Expected behavior: Both the pre-configured cloud Search tool and the local dynamic tools should have their schemas surfaced to the model for calling.

Approaches Attempted and Results

1. AIContextProvider Approach

We tried providing the local tools via a custom AIContextProvider that returns the union of remote and local tools in AIContext.Tools.

  • Result: The context provider is invoked and updates chatOptions.Tools as expected.
  • Problem: AzureAIProjectChatClient.GetAgentEnabledChatOptions() nulls out .Tools before the API call, so the dynamic tool schemas are never sent to the Responses API.
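The provider we used looks roughly like the following sketch. Signatures are approximate to the Agent Framework preview (`SessionToolsProvider` is our own type; the override name and `AIContext` shape may differ slightly across versions):

```csharp
using Microsoft.Agents.AI;
using Microsoft.Extensions.AI;

// Returns the session-specific local tools so the framework merges them
// into chatOptions.Tools before each model invocation.
sealed class SessionToolsProvider(IEnumerable<AITool> sessionTools) : AIContextProvider
{
    public override ValueTask<AIContext> InvokingAsync(
        InvokingContext context, CancellationToken cancellationToken = default) =>
        new(new AIContext { Tools = [.. sessionTools] });
}
```

The merge into chatOptions.Tools happens as expected; the schemas are only lost later, inside AzureAIProjectChatClient.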
2. Mutation via FunctionInvokingChatClient.CurrentContext.Options.Tools

We attempted to mutate FunctionInvokingChatClient.CurrentContext.Options.Tools as recommended in #3083.

  • Result: The mutation is accepted locally, and FunctionInvokingChatClient would dispatch calls to these tools if the model requested them. However, the same GetAgentEnabledChatOptions() stripping means the tool schemas are never included in API requests, so the model is never aware the tools exist.

Note: This approach does work correctly for locally-created agents via CreateAIAgent as demonstrated in #3083. The gap is specifically in the AzureAIProjectChatClient path used by Foundry-hosted agents.
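For completeness, the #3083 mutation pattern we attempted looks roughly like this (a sketch; `search_local` and the lambda body are illustrative placeholders):

```csharp
using Microsoft.Extensions.AI;
using System.ComponentModel;

// From inside an executing tool, add more tools so the next loop
// iteration of FunctionInvokingChatClient advertises them to the model.
[Description("Loads additional tools for the current user session.")]
static string LoadUserTools()
{
    if (FunctionInvokingChatClient.CurrentContext?.Options is { } options)
    {
        options.Tools = [.. options.Tools ?? [],
            AIFunctionFactory.Create((string query) => $"result for {query}",
                "search_local")];
    }
    return "Additional tools registered.";
}
```

With a Foundry-hosted agent the mutation succeeds locally, but the schemas are stripped before the request is sent, so the model can never issue a call that FunctionInvokingChatClient would dispatch.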

Root Cause

AzureAIProjectChatClient is designed so that Foundry-hosted agents use their server-side agent definition for tool schemas, and per-request ChatOptions.Tools are intentionally discarded. There is currently no mechanism to merge client-side tool schemas with the server-defined tools in the API request.

Why This Is Important

  • The ability to compose both cloud/remote tools (declared at agent creation) and per-run, per-session local tools is critical for safe, tailored, and context-aware use cases — especially where local data/functions cannot be registered up front but must be injected based on runtime context or user identity.
  • Today, this is only possible if we create a new agent definition per tool set (not scalable) or fall back to non-hosted agents (losing the benefits of Foundry-hosted agent management).

Request

We are requesting a way to supply additional tools (AIFunctions/FunctionTool definitions) at run time to a Foundry-hosted agent such that their schemas are exposed to the model in the API request body alongside the persisted, server-side tools.

Possible approaches:

  • Allow AzureAIProjectChatClient to merge per-request ChatOptions.Tools with the server-defined tools instead of discarding them
  • Provide an explicit API for registering additional per-run tool schemas that survive GetAgentEnabledChatOptions()
  • Support an AdditionalTools concept at the AzureAIProjectChatClient level that gets merged into the API request
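As one illustration, the first option could be as small as forwarding the caller's tools instead of nulling them (a hypothetical change, sketched against the behavior described earlier; the service would merge these with the agent's server-defined tools):

```csharp
using Microsoft.Extensions.AI;

// Hypothetical revision of GetAgentEnabledChatOptions: preserve
// client-supplied tool schemas so they appear in the API request body
// alongside the persisted, server-side tools.
static ChatOptions GetAgentEnabledChatOptions(ChatOptions? options) => new()
{
    ConversationId = options?.ConversationId,
    Tools = options?.Tools, // forward instead of discarding
};
```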

Without this capability, combining remote tools and local tools in a single agent becomes extremely difficult and requires awkward workarounds. It also makes it impossible to implement progressive tool discovery, which is important for context management.

References

Code Sample

Language/SDK

.NET
