
Microsoft Foundry Blog

Bring Your Own Model to Foundry Agent Service Is Now Generally Available

budzynski
Microsoft
Apr 27, 2026

Enterprise teams building AI agents often need to route model requests through their own infrastructure — whether for compliance, governance, or other controls provided by gateways. Today, we are excited to announce the general availability of Bring Your Own Model (BYOM) for Foundry Agent Service, which lets you connect prompt agents to models hosted behind Azure API Management or any third-party AI model gateway.

This means you can build agents in Foundry while keeping full control over how and where model traffic flows.

 

What This Unlocks

BYOM support in Foundry Agent Service enables organizations to:

  • Route agent requests through existing enterprise gateways — use Azure API Management or third-party gateways you already operate.
  • Enforce compliance and governance at the gateway layer — apply your existing security policies, rate limits, and audit controls without duplicating them in Foundry.
  • Use any model compatible with the Chat Completions API — connect to any model that implements the OpenAI Chat Completions API, regardless of provider.
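Because the only contract is Chat Completions compatibility, the request that ultimately reaches a gateway-hosted model is the familiar messages payload. A minimal sketch (the deployment name below is a placeholder, not a real Foundry identifier):

```python
import json

# Minimal Chat Completions payload, as defined by the OpenAI API.
# Any gateway-hosted model that accepts this shape can back a Foundry agent.
def build_chat_payload(model: str, messages: list[dict]) -> str:
    return json.dumps({"model": model, "messages": messages})

payload = build_chat_payload(
    "my-gateway-deployment",  # placeholder deployment name
    [{"role": "user", "content": "Summarize yesterday's incident report."}],
)
```

Providers only need to honor this request shape; everything else about where and how the model runs stays behind your gateway.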

 

How It Works

Setting up BYOM takes just two steps:

 

1. Create a model connection

In the Foundry portal, go to Operate > Admin, select your project's parent resource, and add a model connection under the Admin-connected models tab. Choose either Azure API Management or Other source as your connection type, configure authentication, and define one or more models.

You can also deploy connections programmatically using the Azure CLI with the Bicep templates in the Foundry samples repository.
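The CLI path is easy to script. A small sketch, assuming the `az` CLI is installed; the template file name and parameter below are hypothetical stand-ins for the actual files in the Foundry samples repository, while the flags shown are standard `az deployment group create` options:

```python
import subprocess

def build_deploy_command(resource_group: str, template_file: str,
                         parameters: dict[str, str]) -> list[str]:
    """Assemble an `az deployment group create` invocation for a Bicep template."""
    cmd = [
        "az", "deployment", "group", "create",
        "--resource-group", resource_group,
        "--template-file", template_file,
    ]
    for key, value in parameters.items():
        cmd += ["--parameters", f"{key}={value}"]
    return cmd

# Hypothetical template and parameter names -- substitute the ones from the
# Foundry samples repository.
cmd = build_deploy_command(
    "my-rg", "byom-connection.bicep", {"connectionName": "contoso-gateway"}
)
# subprocess.run(cmd, check=True)  # uncomment to execute the deployment
```

Wrapping the command this way makes it straightforward to drop connection deployment into existing CI/CD pipelines.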

 

2. Create a prompt agent

In the Foundry portal, go to Build > Agents, create a new agent, and select a model you added through BYOM. You can then test the agent in the playground.

 

Core Capabilities

BYOM is built around a set of capabilities designed to fit enterprise model platforms:

  • Two connection types for the gateways you already run. Choose between Azure API Management and a third-party gateway. The API Management connection type offers defaults tuned to common routing and authentication patterns in API Management.
  • Authentication that matches your security posture. Connect with an API key, managed identity with a configurable audience, or OAuth 2.0 client credentials.
  • Routing that adapts to your gateway's URL shape. Choose whether Foundry includes the deployment name in the request path — supporting both Azure OpenAI-style (/deployments/{deploymentName}/chat/completions) and OpenAI-style routes. Add static headers when your gateway expects them.
  • Multiple models per connection. Register as many model deployments as you need under a single gateway connection. Each gets its own deployment name and display name, and appears as a distinct model in Foundry.
  • First-class agent integration. BYOM deployments show up in the agent model picker and are addressable from the SDK.
  • Public and private networking. Both public networking and network isolation are supported.
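The routing options above boil down to a small amount of request shaping. The sketch below is illustrative only — the URL fragments and header names are assumptions about a generic gateway, not Foundry's actual implementation — but it shows the two supported path shapes, API-key auth, and static headers side by side:

```python
def build_request_url(base_url: str, deployment: str,
                      include_deployment_in_path: bool) -> str:
    """Return the chat-completions URL in either supported shape."""
    if include_deployment_in_path:
        # Azure OpenAI-style route
        return f"{base_url}/deployments/{deployment}/chat/completions"
    # OpenAI-style route: the model name travels in the request body instead
    return f"{base_url}/chat/completions"

def build_headers(api_key: str, static_headers: dict[str, str]) -> dict[str, str]:
    """API-key auth plus any static headers the gateway expects."""
    headers = {"api-key": api_key}  # header name varies by gateway
    headers.update(static_headers)
    return headers

azure_style = build_request_url("https://gw.example.com/openai", "gpt-4o", True)
openai_style = build_request_url("https://gw.example.com/v1", "gpt-4o", False)
headers = build_headers("<key>", {"x-route-tier": "prod"})  # hypothetical header
```

Matching the path shape to what your gateway already exposes means no URL rewriting rules are needed on the gateway side.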

 

Get Started

BYOM for Foundry Agent Service is available today.

 

If you're already running agents in Foundry, adding a gateway connection does not require a re-architecture — just connect your gateway and configure your agent to use a newly added model.

 

Note: When you use a third-party model, you are directly responsible for implementing your own responsible AI mitigations, ensuring that your use satisfies your data handling requirements, and complying with the model’s license. You are also responsible for the use of such models, as their data handling practices may differ from Microsoft's standards.
