Azure AI Foundry Blog

Dynamic Tool Discovery: Azure AI Agent Service + MCP Server Integration

srikantan
May 11, 2025

At the time of this writing, Azure AI Agent Service does not offer turnkey integration with Model Context Protocol (MCP) Servers. This post walks through a solution that lets you leverage MCP's powerful capabilities while working within the Azure ecosystem.

The integration approach piggybacks on the Function integration capability in the Azure AI Agent Service. By utilizing an MCP Client to discover and register tools from an MCP Server as Functions with the Agent Service, we create a seamless integration layer between the two systems.

Built using the Microsoft Bot Framework, this application can be published as an AI Assistant across numerous channels, including Microsoft Teams and Slack. For development and testing, we've used the Bot Framework Emulator to run and validate the application locally.
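
For orientation, here is a minimal sketch of the handler shape that the assistant logic plugs into, using botbuilder-core; the class name and the ask_agent helper are illustrative, and the real state_management_bot.py does considerably more (state management, routing of tool calls):

# Minimal Bot Framework handler sketch; MyAssistantBot and ask_agent are illustrative names
from botbuilder.core import ActivityHandler, MessageFactory, TurnContext

class MyAssistantBot(ActivityHandler):
    async def on_message_activity(self, turn_context: TurnContext):
        # Hand the user's utterance to the Azure AI Agent Service and relay the reply
        reply = await ask_agent(turn_context.activity.text)  # hypothetical helper
        await turn_context.send_activity(MessageFactory.text(reply))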

 

Architecture Overview

The solution architecture consists of several key components:

  1. MCP Server: Hosted in Azure Container Apps, the MCP Server connects to Azure Blob Storage using Managed Identity, providing secure, token-based authentication without the need for stored credentials (a sketch of this follows the list).
  2. Azure AI Agent Service: The core intelligence platform that powers our AI Assistant. It leverages various tools including:
    • Native Bing Search tool for retrieving news content
    • Dynamically registered MCP tools for storage operations
    • GPT-4o model for natural language understanding and generation
  3. Custom AI Assistant Application: Built with the Microsoft Bot Framework, this application runs locally during development but could be hosted in Azure Container Apps for production use. It serves as the bridge between user interactions and the Azure AI Agent Service.
  4. Integration Layer: The MCP client within our application discovers available tools from the MCP Server and registers them with the Azure AI Agent Service, enabling seamless function calling between these systems.
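
As noted in item 1, the MCP Server reaches Blob Storage with its Managed Identity rather than stored keys. A minimal sketch of what that looks like on the server side, assuming the azure-identity and azure-storage-blob packages; the account URL and container name are placeholders:

# Sketch of Managed Identity access to Blob Storage; account URL and names are placeholders
from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobServiceClient

# DefaultAzureCredential picks up the Container App's Managed Identity at runtime
credential = DefaultAzureCredential()
blob_service = BlobServiceClient(
    account_url="https://<storage-account>.blob.core.windows.net",
    credential=credential,
)

# Example operation: write a summary into a container
container_client = blob_service.get_container_client("news-summaries")
container_client.upload_blob(name="summary.txt", data="...", overwrite=True)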

Technical Implementation

MCP Tool Discovery and Registration

The core of our integration lies in how we discover MCP tools and register them with the Azure AI Agent Service. Let's explore the key components of this process.

Tool Discovery Process

The agent.py file contains the logic for connecting to the MCP Server, discovering available tools, and registering them with the Azure AI Agent Service:

import asyncio

# ServerConnection is the MCP client wrapper defined elsewhere in this repo;
# FunctionTool comes from the Azure AI Agent Service (Azure AI Projects) SDK.

# Fetch tool schemas from MCP Server
async def fetch_tools():
    conn = ServerConnection(mcp_server_url)
    await conn.connect()
    tools = await conn.list_tools()
    await conn.cleanup()
    return tools

tools = asyncio.run(fetch_tools())

# Build a function for each tool
def make_tool_func(tool_name):
    def tool_func(**kwargs):
        async def call_tool():
            conn = ServerConnection(mcp_server_url)
            await conn.connect()
            result = await conn.execute_tool(tool_name, kwargs)
            await conn.cleanup()
            return result

        return asyncio.run(call_tool())

    tool_func.__name__ = tool_name
    return tool_func

# Wrap every generated stub in a single FunctionTool for registration with the agent
functions_dict = {tool["name"]: make_tool_func(tool["name"]) for tool in tools}
mcp_function_tool = FunctionTool(functions=list(functions_dict.values()))

This approach dynamically creates Python function stubs for each MCP tool, which can then be registered with the Azure AI Agent Service.
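
For illustration, if the MCP Server exposed a hypothetical tool named save_summary, the generated stub could be called like any ordinary synchronous Python function, which is exactly what the Agent Service's function-calling machinery expects:

# Hypothetical usage of a generated stub; the tool name and arguments are placeholders
save_summary = functions_dict["save_summary"]
result = save_summary(container="news-summaries", content="Today's AI headlines ...")
print(result)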

Agent Creation and Registration

Once we have our function stubs, we register them with the Azure AI Agent Service:

# Initialize agent with tools
toolset = ToolSet()
toolset.add(mcp_function_tool)
toolset.add(bing)  # 'bing' is the native Bing Search (grounding) tool, configured elsewhere in agent.py

# Create or update agent with the toolset
agent = project_client.agents.create_agent(
    model=config.aoai_model_name,
    name=agent_name,
    instructions=agent_instructions,
    tools=toolset.definitions
)

The advantage of this approach is that it allows for dynamic discovery and registration: when the MCP Server adds or updates tools, you can simply run the agent creation process again to refresh the registered functions.
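
If you want the re-run to update the existing agent rather than create a duplicate, a create-or-update flow along these lines can be used; this is a sketch only, and list_agents/update_agent plus the lookup by name are assumptions about the Azure AI Projects agents API rather than code from the repo:

# Hypothetical create-or-update flow; method names are assumptions about the SDK surface
existing = next(
    (a for a in project_client.agents.list_agents().data if a.name == agent_name),
    None,
)
if existing:
    # Refresh the tool definitions on the existing agent after re-discovering MCP tools
    agent = project_client.agents.update_agent(
        existing.id,
        model=config.aoai_model_name,
        instructions=agent_instructions,
        tools=toolset.definitions,
    )
else:
    agent = project_client.agents.create_agent(
        model=config.aoai_model_name,
        name=agent_name,
        instructions=agent_instructions,
        tools=toolset.definitions,
    )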

 

The picture below shows how the tool actions discovered from the MCP Server are registered as Functions in the Azure AI Agent Service when the Agent is created or updated.

 

Executing Requests Using the MCP Client

When a user interacts with the bot, state_management_bot.py handles the function calls and routes each one to the appropriate handler:

# Process each tool call
tool_outputs = []
for tool_call in tool_calls:
    if isinstance(tool_call, RequiredFunctionToolCall):
        # Get function name and arguments
        function_name = tool_call.function.name
        args_json = tool_call.function.arguments
        arguments = json.loads(args_json) if args_json else {}
        
        # Check if this is an MCP function
        if is_mcp_function(function_name):
            # Direct MCP execution using our specialized handler
            output = await execute_mcp_tool_async(function_name, arguments)
        else:
            # Use FunctionTool as fallback
            output = functions.execute(tool_call)

        # Collect the result so it can be submitted back to the run
        tool_outputs.append({"tool_call_id": tool_call.id, "output": output})

The system is designed to be loosely coupled - the Agent only knows about the tool signatures and how to call them, while the MCP Server handles the implementation details of interacting with Azure Storage.
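
Once the loop above completes, the collected outputs are handed back to the run that requested them so the agent can continue reasoning. A minimal sketch of that step, assuming the Azure AI Projects SDK's submit_tool_outputs_to_run and that thread_id and run are tracked in the bot's conversation state:

# Return the tool results to the waiting run; the agent resumes once it receives them
project_client.agents.submit_tool_outputs_to_run(
    thread_id=thread_id,
    run_id=run.id,
    tool_outputs=tool_outputs,
)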

 

Running the Application

The application workflow consists of two main steps:

1. Creating/Updating the Agent

This step discovers available tools from the MCP Server and registers them with the Azure AI Agent Service:

python agent.py

This process:

  1. Connects to the MCP Server
  2. Retrieves the schema of all available tools
  3. Creates function stubs for each tool
  4. Registers these stubs with the Azure AI Agent Service

2. Running the AI Assistant

Once the agent is configured with the appropriate tools, you can run the application:

python app.py

Users interact with the AI Assistant through the Bot Framework Emulator using natural language. The assistant can:

  • Search for news using Bing Search
  • Summarize the findings
  • Store and organize summaries in Azure Blob Storage via the MCP Server

 

References

Here is the GitHub Repo for this App. It has references to relevant documentation on the subject.

Here is the GitHub Repo of the MCP Server.

Here is a video demonstrating this Application in action.

 

Conclusion

This implementation demonstrates a practical approach to integrating Azure AI Agent Service with MCP Servers. By leveraging the Function integration capability, we've created a bridge that allows these technologies to work together seamlessly.

The architecture is:

  • Flexible: New tools can be added to the MCP Server and automatically discovered
  • Maintainable: Changes to storage operations can be made without modifying the agent
  • Scalable: Additional capabilities can be easily added through new MCP tools

As Azure AI Agent Service evolves, we may see native integration with MCP Servers in the future. Until then, this approach provides a robust solution for developers looking to combine these powerful technologies.

 

 
