Copilot Studio
Implementing MCP Remote Servers with Azure Function App and GitHub Copilot Integration
Introduction
In the evolving landscape of AI-driven applications, the ability to seamlessly connect large language models (LLMs) with external tools and data sources is becoming a cornerstone of intelligent system design. Model Context Protocol (MCP) is a specification that enables AI agents to discover and invoke tools dynamically, based on context. While MCP is powerful, implementing it from scratch can be daunting. That's where Azure Functions comes in handy. With its event-driven, serverless architecture, Azure Functions now supports a preview extension for MCP, allowing developers to build remote MCP servers that are scalable, secure, and cloud-native. Furthermore, GitHub Copilot Chat in VS Code, running in Agent Mode, can connect to your deployed Azure Function App acting as an MCP server and leverage the tools and services it exposes.

Why Use Azure Functions for MCP?
Serverless Simplicity: Deploy MCP endpoints without managing infrastructure.
Secure by Design: Leverage HTTPS, system keys, and OAuth via EasyAuth or API Management.
Language Flexibility: Build in .NET, Python, or Node.js using QuickStart templates.
AI Integration: Enable GitHub Copilot, VS Code, or other AI agents to invoke your tools via SSE endpoints.

Prerequisites
Python 3.11 or higher
Azure Functions Core Tools >= 4.0.7030
Azure Developer CLI
To run and debug locally: Visual Studio Code and the Azure Functions extension
A storage emulator is needed when developing an Azure Function App in VS Code; you can install the Azurite extension in VS Code to meet this requirement.

You can run Azurite from VS Code as shown below.

C:\Program Files\Microsoft Visual Studio\2022\Enterprise\Common7\IDE\Extensions\Microsoft\Azure Storage Emulator> .\azurite.exe

Alternatively, you can run Azurite in a Docker container:

docker run -p 10000:10000 -p 10001:10001 -p 10002:10002 \ mcr.microsoft.com/azure-storage/azurite

For more information about setting up Azurite, visit Use Azurite emulator for local Azure Storage development | Microsoft Learn

GitHub Repositories
The following GitHub repos are needed to set up this PoC.
Repository for the MCP server using Azure Function App: https://github.com/mafzal786/mcp-azure-functions-python.git
Repository for the AI Foundry agent as MCP client: https://github.com/mafzal786/ai-foundry-agent-with-remote-mcp-using-azure-functionapp.git

Clone the repository
Run the following command to clone the repository and start building your MCP server using an Azure Function App.

git clone https://github.com/mafzal786/mcp-azure-functions-python.git

Run the MCP server in VS Code
Once cloned, open the folder in VS Code and create a virtual environment. In a new terminal window, change directory to "src", install the Python dependencies, and start the function host locally:

cd src
pip install -r requirements.txt
func start

Note: by default this uses the webhooks route /runtime/webhooks/mcp/sse. Later, in Azure, we will use this route and set the key on client/host calls: /runtime/webhooks/mcp/sse?code=<system_key>
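For orientation before you start testing, the tools you are about to invoke (hello_mcp, get_stockprice, get_weatheralerts) are defined inside this function app. The actual code lives in the cloned repo; the following is only a rough, hedged sketch of what a tool built on the preview MCP tool trigger tends to look like, and binding names or property schemas may differ from the repo.

import azure.functions as func

app = func.FunctionApp(http_auth_level=func.AuthLevel.FUNCTION)

# Hedged sketch: the preview MCP extension surfaces tools through a generic
# trigger of type "mcpToolTrigger"; the cloned repo may structure this differently.
@app.generic_trigger(
    arg_name="context",
    type="mcpToolTrigger",
    toolName="hello_mcp",
    description="Simple hello-world tool to verify the MCP server is reachable.",
    toolProperties="[]",
)
def hello_mcp(context: str) -> str:
    # The trigger passes a JSON payload; this tool ignores it and returns a greeting.
    return "Hello from the Azure Functions MCP server!"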
MCP Inspector
In a new terminal window, install and run MCP Inspector:

npx @modelcontextprotocol/inspector

Open the inspector at http://127.0.0.1:6274/#resources and provide the generated proxy session token. In the URL field, type http://localhost:7071/runtime/webhooks/mcp/sse and click "Connect". Once connected, click List Tools under Tools, select the "hello_mcp" tool, and click "Run Tool" to test it. Select another tool, such as get_stockprice, and run it as well.

Deploy Function App to Azure from VS Code
To deploy the function app to Azure from VS Code, make sure the Azure Tools extension is enabled. To learn more about the Azure Tools extension, visit Azure Extensions. If your VS Code environment is not set up for Azure development, follow Configure Visual Studio Code for Azure development with .NET — .NET | Microsoft Learn.

Once Azure Tools is set up, sign in to your Azure account. After sign-in completes, you should see all of your existing resources in the Resources view; these resources can be managed directly in VS Code. Look for Function App under Resources, right-click it, and click "Deploy to Function App". If you already have it deployed, you will get a confirmation pop-up; click "Deploy". This starts deploying your function app to Azure, and the Azure tab in VS Code shows the progress. Once the deployment completes, you can view the function app and all of its tools in the Azure portal.

Get the mcp_extension key from Functions → App Keys in the Function App. This mcp_extension key is needed in the mcp.json file in VS Code if you would like to test the MCP server using GitHub Copilot. Your entries in mcp.json will look like the following example.

{
  "inputs": [
    {
      "type": "promptString",
      "id": "functions-mcp-extension-system-key",
      "description": "Azure Functions MCP Extension System Key",
      "password": true
    },
    {
      "type": "promptString",
      "id": "functionapp-name",
      "description": "Azure Functions App Name"
    }
  ],
  "servers": {
    "remote-mcp-function": {
      "type": "sse",
      "url": "https://${input:functionapp-name}.azurewebsites.net/runtime/webhooks/mcp/sse",
      "headers": {
        "x-functions-key": "${input:functions-mcp-extension-system-key}"
      }
    },
    "local-mcp-function": {
      "type": "sse",
      "url": "http://0.0.0.0:7071/runtime/webhooks/mcp/sse"
    }
  }
}

Test Azure Function MCP Server in MCP Inspector
Launch MCP Inspector and provide the Azure Function URL as the MCP Inspector URL. Provide authentication: the bearer token is the mcp_extension key.
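If you prefer the command line to the portal for retrieving the key, the host keys can also be listed with the Azure CLI. This is a hedged aside: the MCP extension key should appear under systemKeys once the extension is active, and its exact name may vary.

az functionapp keys list \
  --resource-group <RESOURCE_GROUP_NAME> \
  --name <FUNCTION_APP_NAME>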
Testing an MCP server with GitHub Copilot
Testing an MCP server with GitHub Copilot involves configuring and using the server within your development environment to provide enhanced context and capabilities to Copilot Chat.

Steps to test an MCP server with GitHub Copilot:
Ensure Agent Mode is enabled: Open Copilot Chat in Visual Studio Code and select "Agent" mode. This mode allows Copilot to interact with external tools and services, including MCP servers.
Add the MCP server: Open the Command Palette (Ctrl+Shift+P or Cmd+Shift+P) and run the command MCP: Add Server. Follow the prompts to configure the server. You can choose to add it to your workspace settings (creating a .vscode/mcp.json file).
Select HTTP or Server-Sent Events as the transport.
Specify the URL and press Enter.
Provide a name of your choice.
Select the scope as Global or Workspace (Workspace is used here).
This generates an mcp.json file in .vscode, or adds a new entry if mcp.json already exists. Click Start to start the server, and make sure your Azure Function App is running locally with the func start command. Now type a prompt, try another tool, and check the VS Code terminal output for reference.

Testing an MCP server with Claude Desktop
Claude Desktop is a standalone AI application that allows users to interact with Claude AI models directly from their desktop. You can download it at Download Claude. In this article, Claude Desktop is used as another client to test the MCP server running in the Azure Function App. Modify claude_desktop_config.json with the following; on Windows, this file is located at C:\Users\<username>\AppData\Roaming\Claude.

{
  "mcpServers": {
    "my mcp": {
      "command": "npx",
      "args": [
        "mcp-remote",
        "http://localhost:7071/runtime/webhooks/mcp/sse"
      ]
    }
  }
}

Note: If claude_desktop_config.json does not exist, open Settings in Claude Desktop under your user and visit the Developer tab.

You will see your MCP server in Claude Desktop. Type a prompt such as "What is the stock price of Tesla". After submitting, you will notice that it invokes the "get_stockprice" tool from the MCP server running locally and configured in the JSON file earlier. Click Allow once or Allow always, and the output will be displayed. Now try a weather-related prompt; as you can see, it invokes the "get_weatheralerts" tool from the MCP server.

Azure AI Foundry agent as MCP client
Use the following GitHub repo to set up an Azure AI Foundry agent as an MCP client.

git clone https://github.com/mafzal786/ai-foundry-agent-with-remote-mcp-using-azure-functionapp.git

Open the code in VS Code and follow the instructions in the README.md file of the GitHub repo. Once you execute the code, the output will show up in VS Code. In this code, the message is hard coded. Change the content to "what is weather advisory for Florida" and rerun the program; it will call the get_weatheralerts tool.
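The hard-coded user message is what decides which tool the agent calls. The following is a hypothetical excerpt only; the real call lives in the cloned repo, and its exact shape depends on the azure-ai-projects preview SDK version in use.

# Hypothetical excerpt -- swap the prompt string and rerun to exercise a different tool.
message = project_client.agents.create_message(
    thread_id=thread.id,
    role="user",
    content="What is the weather advisory for Florida?",
)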
Conclusion
The integration of Model Context Protocol (MCP) with Azure Functions marks a pivotal step in democratizing AI agent development. By leveraging Azure's serverless architecture, developers can now build remote MCP servers that scale automatically, integrate seamlessly with other Azure services, and expose modular tools to intelligent agents like GitHub Copilot. This setup not only simplifies the deployment and management of MCP servers but also enhances the developer experience, allowing tools to be invoked contextually by AI agents in environments like VS Code, GitHub Codespaces, or Copilot Studio. Whether you're building a tool to query logs, calculate metrics, or manage data, Azure Functions provides the flexibility, security, and scalability needed to bring your AI-powered workflows to life. As the MCP spec continues to evolve and GitHub Copilot expands its agentic capabilities, this architecture positions you to stay ahead, offering a robust foundation for cloud-native AI tooling that is both powerful and future-proof.

Deployment Guide: Copilot Studio agent with MCP Server exposed by API Management using OAuth 2.0
Introduction
In today's enterprise landscape, enabling AI agents to interact with backend systems securely and at scale is critical. By exposing MCP servers through Azure API Management (APIM), organizations can provide controlled access to these services. Combined with the OAuth 2.0 authorization code flow, this setup delivers robust, enterprise-grade security for AI agents built in Copilot Studio, empowering intelligent automation while maintaining strict access governance.

Disclaimer & Caveats
This article explores how to configure an MCP tool, exposed as an MCP server via APIM, for secure consumption by AI agents built in Copilot Studio. Leveraging the OAuth 2.0 Authorization Code Flow, this setup ensures enterprise-grade security by enabling delegated access without exposing user credentials. With Azure API Management now supporting MCP server capabilities in public preview, developers can expose REST APIs as MCP tools using a standardized JSON-RPC interface. This allows AI agents to invoke backend services securely and at scale, without the need to rebuild existing APIs. Copilot Studio, also in preview for MCP integration, empowers organizations to orchestrate intelligent agents that interact with these tools in real time. While this guide provides a foundational approach, every environment is unique. You can enhance security further by implementing app roles, conditional access policies, and extending your integration logic with custom Python code for advanced scenarios.

⚠️ Note: Both MCP server support in APIM and MCP tool integration in Copilot Studio are currently in public preview. As these platforms evolve rapidly, expect changes and improvements over time. Always refer to https://learn.microsoft.com/en-us/azure/api-management/export-rest-mcp-server for the latest updates. This article is about consuming remote MCP servers; in Azure, managed identity can also be leveraged for APIM integration.

What is Authorization Code Flow?
The Authorization Code Flow is designed for applications that can securely store a client secret (like server-side apps). It allows the app to obtain an access token on behalf of the user without exposing their credentials. This flow uses an intermediate authorization code that is exchanged for tokens, adding an extra layer of security.

Steps in the Flow
User Authentication: The user is redirected to the Authorization Server (in this case, Azure AD) to log in and grant consent.
Authorization Code Issued: After successful login, the Authorization Server sends an authorization code to the app via the redirect URI.
Token Exchange: The app sends the authorization code (plus client credentials) to the Token Endpoint to get an Access Token (for API calls) and a Refresh Token (to renew access without user interaction).
API Access: The app uses the Access Token to call protected resources.

For details, see Microsoft identity platform and OAuth 2.0 authorization code flow — Microsoft identity platform | Microsoft Learn.

High Level Architecture
This architecture can also be implemented with the APIM backend app registration only; however, be careful to configure redirect URIs appropriately.

Remote MCP Servers using APIM Architecture
APIM exposes remote MCP servers, enabling AI agents, such as those built in Copilot Studio, to securely access backend services using standardized JSON-RPC interfaces. This integration offers a robust, scalable, and secure way to connect AI tools with enterprise APIs.
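To make the two legs of the flow described above concrete before wiring up APIM and Copilot Studio, here is a rough, hedged Python sketch against the standard Microsoft identity platform v2.0 endpoints. Tenant, client IDs, secret, and redirect URI are placeholders; once the connector is configured, Copilot Studio performs these steps for you.

import urllib.parse
import requests

TENANT = "<tenant-id>"
CLIENT_ID = "<client-app-registration-id>"        # e.g. the Copilot Studio client app
CLIENT_SECRET = "<client-secret>"
REDIRECT_URI = "https://<redirect-uri-from-connector>"
SCOPE = "api://<apim-backend-app-id>/.default"    # audience exposed by the backend API

# Leg 1: send the user to the authorize endpoint to sign in and grant consent.
authorize_url = (
    f"https://login.microsoftonline.com/{TENANT}/oauth2/v2.0/authorize?"
    + urllib.parse.urlencode({
        "client_id": CLIENT_ID,
        "response_type": "code",
        "redirect_uri": REDIRECT_URI,
        "scope": SCOPE,
    })
)
print("Open in a browser:", authorize_url)

# Leg 2: exchange the authorization code returned on the redirect for tokens.
auth_code = "<code-returned-on-the-redirect>"
token_response = requests.post(
    f"https://login.microsoftonline.com/{TENANT}/oauth2/v2.0/token",
    data={
        "client_id": CLIENT_ID,
        "client_secret": CLIENT_SECRET,
        "grant_type": "authorization_code",
        "code": auth_code,
        "redirect_uri": REDIRECT_URI,
        "scope": SCOPE,
    },
)
print(token_response.json().get("access_token", "no token returned"))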
Key Capabilities:
Secure Gateway: APIM acts as an intelligent gateway, handling the OAuth 2.0 Authorization Code Flow, authentication, and request routing.
Monitoring & Observability: Integration with Azure Log Analytics and Application Insights enables deep visibility into API usage, performance, and errors.
Policy Enforcement: APIM's policy engine allows for custom rules, including token validation, header manipulation, and response transformation.
Rate Limiting & Throttling: Built-in support for rate limits, quotas, and IP filtering helps protect backend services from abuse and ensures fair usage.
Managed Identity & Entra ID: Secure service-to-service communication is enabled via system-assigned and user-assigned managed identities, with Entra ID handling identity and access management.
Flexible Deployment: MCP servers can be hosted in Azure Functions, App Services, or Container Apps, and exposed via APIM with minimal changes to existing APIs.
To learn more, visit https://learn.microsoft.com/en-us/samples/azure-samples/remote-mcp-apim-functions-python/remote-mcp-apim-functions-python/

Develop MCP server in VS Code
This deployment guide provides sample MCP code written in Python for ease of use; it is available in the following GitHub repo. However, you can also use your own MCP server. Clone the repository and open it in VS Code.

git clone https://github.com/mafzal786/mcp-server.git

Run the following to execute it locally.

cd mcp-server
uv venv
uv sync
uv run mcpserver.py

Deploy MCP Server as Azure Container App
In this deployment guide, the MCP server is deployed as an Azure Container App (it could also be deployed as an Azure App Service). Deploy the MCP server by running the following command. It can be deployed in various other ways, such as via VS Code or a CI/CD pipeline; the Azure CLI is used here for simplicity.

az containerapp up \
  --resource-group <RESOURCE_GROUP_NAME> \
  --name streamable-mcp-server2 \
  --environment mcp \
  --location <REGION> \
  --source .

Configure Authentication for Azure Container App
1. Sign in to the Azure portal, visit the Container App, and click "Authentication". For more details, visit Enable authentication and authorization in Azure Container Apps with Microsoft Entra ID | Microsoft Learn. Click Add Identity Provider.
2. Select Microsoft from the drop-down and leave everything else as is.
3. This will create a new app registration for the Container App. As soon as authentication is configured, the Container App becomes inaccessible except via OAuth.
Note: If you already have an app registration configured for the Azure Container App, use it by selecting the "pick an existing app registration in this directory" option.

Review App Registration of Container App — Backend
Visit App registrations and click streamable-mcp-server2 (in this case). Click the Authentication tab and verify the Redirect URIs; you should see a redirect URI for the Container App ending with /.auth/login/aad/callback. Now click "Expose an API" and confirm the Application ID URI is configured with a scope; its format is api://<client id>, and the scope "user_impersonation" is created. Verify API permissions and make sure you Grant admin consent for your tenant.
More scopes can be created depending on the data-access requirements.
Note: Make sure to "Grant admin consent" before proceeding to the next step.

Create App registration representing the APIM API
Launch the Azure portal, visit App registrations, and click New registration. Create a new app registration, for example "apim-mcp-backend-api" in this case. Click "Expose an API", configure the Application ID URI, and add a scope such as user_impersonation. Click "App roles" and create the role shown below. More roles can be created case by case, depending on requirements; here an app role is created to illustrate the concept and how it is used in the APIM inbound policies in the coming sections.

Create App Registration for Client App — Copilot Studio
In these steps, we configure the app registration for the client app; in this case Copilot Studio acts as the client app. This is also shown in the high-level architecture diagram earlier in this article.
1. Launch the Azure portal, visit App registrations, and click New registration.
2. Create a new app registration. Leave the Redirect URL empty for now; we will configure it later, as it is provided by Copilot Studio when configuring the custom MCP connector.
3. Click "API permissions" and click "Add a permission". Click Microsoft Graph, then "Delegated permissions", and select email, openid, and profile.
4. Make sure to Grant admin consent.
5. Create a secret: click "Certificates & secrets" and create a new client secret by clicking "New client secret". Store the value, as it will be masked after some time; if that happens, you can always delete it and create a new secret.
6. Capture the following, as you will need them when configuring the MCP tool in Copilot Studio: the Client ID from the Overview tab of the app registration, and the client secret from the "Certificates & secrets" tab.
7. Configure API permissions for the APIM API, i.e. "apim-mcp-backend-api" in this case. Click the "API permissions" tab, click "Add a permission", click the "My APIs" tab, and select "apim-mcp-backend-api". Note: If you don't see the app registration under "My APIs", go to the app registration, click "Owners", and add your AD account as an owner.
8. Select "Delegated permissions", then select the permission.
9. Select the Application permissions and select the app role created in the apim-mcp-backend-api registration, such as mcp.read in this case. You MUST "Grant admin consent" as the final step; it is very important, and without it nothing will work.
10. The end result of this client app registration should look like the figure below.

Configure permissions for the Container App registration
Launch the Azure portal, visit App registrations, and select the app registration of the Azure Container App, such as streamable-mcp-server2 in this case. Select API permissions and add the required delegated and application permissions. Note: Don't forget to Grant admin consent.

Configure allowed token audiences for the Container App
This setting defines which audience values (aud claim) in a token are considered valid for your app. When a client app requests an access token from Microsoft Entra ID (Azure AD), the token includes an aud claim that identifies the intended recipient. Your Container App will only accept tokens where the aud claim matches one of the values in the Allowed Token Audiences list. This is important because it ensures that only tokens issued for your API or app are accepted, prevents misuse of tokens intended for other resources, and adds an extra layer of security.
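For illustration only (all claim values below are hypothetical), the decoded payload of an acceptable access token would carry the backend's Application ID URI in aud, plus the scope and the app role that the APIM policy checks later:

{
  "aud": "api://<apim-mcp-backend-api-client-id>",
  "appid": "<copilot-studio-client-id>",
  "scp": "user_impersonation",
  "roles": ["mcp.read"],
  "name": "Jane Doe",
  "oid": "00000000-0000-0000-0000-000000000000"
}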
In the Azure portal, visit the Azure Container App, i.e. streamable-mcp-server2. Click "Authentication", click "Edit" under the identity provider, and under "Allowed token audiences" add the Application ID URI of "apim-mcp-backend-api", as this will be included as an audience in the access token.

Best Practices
Only include trusted client app IDs.
Avoid using overly broad values like "allow all" (not recommended).
Validate tokens using Microsoft libraries (MSAL) or built-in auth features.

Configure MCP server in API Management
Note: Provisioning an API Management resource is outside the scope of this document. If you do not already have an API Management instance, follow this QuickStart: https://learn.microsoft.com/en-us/azure/api-management/get-started-create-service-instance
The following service tiers are available for the preview: classic Basic, Standard, Premium, and Basic v2, Standard v2, Premium v2. For the classic Basic, Standard, or Premium tiers, you must join the AI Gateway Early Update group to enable MCP server features; allow up to 2 hours for the update to take effect.

Expose an existing MCP server
Follow these steps to expose an existing MCP server in API Management:
In the Azure portal, navigate to your API Management instance.
In the left-hand menu, under APIs, select MCP servers > + Create MCP server.
Select Expose an existing MCP server.
In Backend MCP server, enter the existing MCP server base URL. Example: https://streamable-mcp-serverv2.kdhg489457dslkjgn.eastus2.azurecontainerapps.io/mcp for the Azure Container App hosting the MCP server. In Transport type, Streamable HTTP is selected by default.
In New MCP server, enter a Name for the MCP server in API Management. In Base path, enter a route prefix for the tools, for example mcptools. Optionally, enter a Description for the MCP server.
Select Create.

Configure policies for MCP server
Configure one or more API Management policies to help manage the MCP server. The policies are applied to all API operations exposed as tools in the MCP server and can be used to control access, authentication, and other aspects of the tools. To configure policies for the MCP server:
In the Azure portal, navigate to your API Management instance.
In the left-hand menu, under APIs, select MCP servers and select an MCP server from the list.
In the left menu, under MCP, select Policies.
In the policy editor, add or edit the policies you want to apply to the MCP server's tools. The policies are defined in XML format.

<!--
  - Policies are applied in the order they appear.
  - Position <base/> inside a section to inherit policies from the outer scope.
  - Comments within policies are not preserved.
-->
<!-- Add policies as children to the <inbound>, <outbound>, <backend>, and <on-error> elements -->
<policies>
    <!-- Throttle, authorize, validate, cache, or transform the requests -->
    <inbound>
        <base />
        <set-variable name="accessToken" value="@(context.Request.Headers.GetValueOrDefault("Authorization", "").Replace("Bearer ", ""))" />
        <!-- Log the captured access token to the trace logs -->
        <trace source="Access Token Debug" severity="information">
            <message>@("Access Token: " + (string)context.Variables["accessToken"])</message>
        </trace>
        <set-variable name="userId" value="@(context.Request.Headers.GetValueOrDefault("Authorization", "Bearer ").Split(' ')[1].AsJwt().Claims["oid"].FirstOrDefault())" />
        <set-variable name="userName" value="@(context.Request.Headers.GetValueOrDefault("Authorization", "Bearer ").Split(' ')[1].AsJwt().Claims["name"].FirstOrDefault())" />
        <trace source="User Name Debug" severity="information">
            <message>@("username: " + (string)context.Variables["userName"])</message>
        </trace>
        <set-variable name="scp" value="@(context.Request.Headers.GetValueOrDefault("Authorization", "Bearer ").Split(' ')[1].AsJwt().Claims["scp"].FirstOrDefault())" />
        <trace source="Scope Debug" severity="information">
            <message>@("scope: " + (string)context.Variables["scp"])</message>
        </trace>
        <set-variable name="roles" value="@(context.Request.Headers.GetValueOrDefault("Authorization", "Bearer ").Split(' ')[1].AsJwt().Claims["roles"].FirstOrDefault())" />
        <trace source="Role Debug" severity="information">
            <message>@("Roles: " + (string)context.Variables["roles"])</message>
        </trace>
        <!--
        <set-variable name="requestBody" value="@{ return context.Request.Body.As<string>(preserveContent:true); }" />
        <trace source="Request Body information" severity="information">
            <message>@("Request body: " + (string)context.Variables["requestBody"])</message>
        </trace>
        -->
        <validate-azure-ad-token tenant-id="{{tenant-id}}" header-name="Authorization" failed-validation-httpcode="401" failed-validation-error-message="Unauthorized. Access token is missing or invalid.">
            <client-application-ids>
                <application-id>{{client-application-id}}</application-id>
            </client-application-ids>
            <audiences>
                <audience>{{audience}}</audience>
            </audiences>
            <required-claims>
                <claim name="roles" match="any">
                    <value>mcp.read</value>
                </claim>
            </required-claims>
        </validate-azure-ad-token>
    </inbound>
    <!-- Control if and how the requests are forwarded to services -->
    <backend>
        <base />
    </backend>
    <!-- Customize the responses -->
    <outbound>
        <base />
    </outbound>
    <!-- Handle exceptions and customize error responses -->
    <on-error>
        <base />
        <trace source="Role Debug" severity="error">
            <message>@("username: " + (string)context.Variables["userName"] + " has error in accessing the MCP server, could be auth or role related...")</message>
        </trace>
        <return-response>
            <set-status code="403" reason="Forbidden" />
            <set-body>
                {"error":"Missing required scope or role"}
            </set-body>
        </return-response>
    </on-error>
</policies>

Note: Update the above inbound policy with the tenant ID, client application ID, and audience for your environment. It is recommended to use APIM "Named values" instead of hard-coding values inside the policy. To learn more, visit Use named values in Azure API Management policies.

Configure Diagnostics for APIM
In this solution, APIM diagnostics are configured to forward log data to Log Analytics; testing and validation will be carried out using insights from Log Analytics. Note: Setting up diagnostics is outside the scope of this article. For more information, visit https://learn.microsoft.com/en-us/azure/api-management/api-management-howto-use-azure-monitor. The diagnostic settings determine which logs are sent to the Log Analytics workspace.
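Once diagnostics are flowing, a query along these lines (used again in the validation section later) pulls recent gateway requests together with the TraceRecords emitted by the inbound policy. This is a hedged sketch; the exact table and column set depend on how the diagnostic setting was created.

ApiManagementGatewayLogs
| where TimeGenerated > ago(30m)
| project TimeGenerated, Method, Url, BackendUrl, ResponseCode, TraceRecords
| order by TimeGenerated desc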
MCP Tool configuration in Copilot Studio
Launch Copilot Studio at https://copilotstudio.microsoft.com/. Configuration of the environment and agent is beyond the scope of this article; it is assumed you already have an environment set up and an agent created. The following link explains how to create an agent in Copilot Studio: Quickstart: Create and deploy an agent — Microsoft Copilot Studio | Microsoft Learn
1. Inside the agent configuration, click "Add tool".
2. Click New tool.
3. Select Model Context Protocol.
4. Provide all relevant information for the MCP server. Make sure your server URL ends with your MCP path; in this case, it is the APIM MCP server URL with the base path configured in APIM at the end. Provide a server name and server description, and select the OAuth 2.0 radio button.
5. Provide the following in the OAuth 2.0 section:
Client ID of the client app registration (copilot-studio-client, as configured earlier).
Client secret of the copilot-studio-client app registration.
Authorization URL: https://login.microsoftonline.com/common/oauth2/v2.0/authorize
Token URL template & Refresh URL: https://login.microsoftonline.com/oauth2/v2.0/token
Scopes: openid, profile, email, which we selected earlier for the Microsoft Graph permissions.
6. Click "Create". This provides a Redirect URL; you need to configure this redirect URL in the client app registration (copilot-studio-client in this case).

Configure Redirect URI in Client App Registration
Visit the client app registration, i.e. copilot-studio-client. Click the Authentication tab and provide the Web Redirect URIs. Note: The redirect URIs MUST be configured in the app registration; otherwise, authorization will not complete and sign-in will fail.

Configure redirect URI in APIM API app registration
Also configure the apim-mcp-backend-api app registration with the same redirect URI.

Modify MCP connector in Power Apps
Now visit https://make.powerapps.com and open the newly created connector. Select the Security tab and modify the Resource URL with the Application ID URI of apim-mcp-backend-api, configured earlier under "Expose an API" in the app registration. Add .default in the scope. Provide the secret of the client app registration, as the connector cannot be updated without it; this is an extra security measure for updating connectors in Power Apps. Click Update connector.

CORS Configuration
CORS configuration is a must, since our Azure Container App is a remote MCP server with a totally different domain (origin).

Power Apps and CORS for External Domains — Brief Overview
When embedding or integrating Power Apps with external web applications or APIs, Cross-Origin Resource Sharing (CORS) becomes a critical consideration. CORS is a browser security feature that restricts web pages from making requests to a different domain than the one that served the page, unless explicitly allowed.
Key Points:
Power Apps hosted on *.powerapps.com or within Microsoft 365 domains will block calls to external APIs unless those APIs include the proper CORS headers.
The external API must return:
Access-Control-Allow-Origin: https://apps.powerapps.com (or * for all origins, though not recommended for production)
Access-Control-Allow-Methods: GET, POST, OPTIONS (or as needed)
Access-Control-Allow-Headers: Content-Type, Authorization (and any custom headers)
If the API requires authentication (e.g., OAuth 2.0), ensure preflight OPTIONS requests are handled correctly. For scenarios where you cannot modify the external API, consider using Power Automate flows as a proxy, or Azure API Management or Azure Functions to inject CORS headers. Always validate the security implications before enabling wide-open CORS.

If CORS is not set up, you will encounter a "CORS policy" error blocking the Container App in Copilot Studio after pressing F12 (browser developer tools). Azure Container Apps provides a very efficient way of configuring CORS in the Azure portal:
1. Launch the Azure portal and visit the Azure Container App, i.e. streamable-mcp-server2 in this case.
2. Click CORS under the Networking section.
3. Configure the Allowed Origins section. localhost is added to make it work from a local laptop, although it is not required for Copilot Studio.
4. Click the "Allowed Methods" tab and provide the required methods.
5. Provide the wildcard "*" in the "Allowed Headers" tab. This is not recommended for production systems; it is done here for simplicity. Restrict the headers for added security.
6. Click "Apply". This configures CORS for the remote application.

Test the MCP custom connector
We are in the final stages of configuring the connector; it is time to test whether everything is configured correctly and works.
1. Launch https://make.powerapps.com, click "Custom connectors", select your configured connector, and click the "5. Test" tab. You will see Selected Connection as blank if you are running it for the first time; click "+ New connection".
2. The new connection launches the authorization flow, and a browser dialog pops up to request the authorization code.
3. Click "Create".
4. Complete the login process. This creates a successful connection.
5. Click "Test operation". A 406 response means everything is configured correctly.

Solution validation
Add user in Enterprise Application for app roles
Roles have been defined under the required claims in the APIM inbound policy and also configured in the apim-mcp-backend-api app registration. As a result, any request from Copilot Studio will be denied if this role is not properly assigned. This role is included in the JWT access token, which we will validate in the following sections. To assign the role, perform the following steps:
1. Visit the Azure portal.
2. Visit Enterprise applications and select the APIM backend application, in this case apim-mcp-backend-api.
3. Click "Users and groups".
4. Select "Add user/group".
5. Select the user or group who should have access to the role.
6. Click "Assign".
Note: Role assignment for users or groups is an important step. If it is not configured, MCP server tests will fail in Copilot Studio.

Test MCP server in Copilot Studio
Launch Copilot Studio, click the agent you created in earlier steps, and click the "Tools" tab. Select your MCP tool and make sure it is "Enabled"; if you have other tools attached to the same agent, disable them for now for testing. Make sure you have the connection available that we created while testing the custom connector in the earlier step.
You can also initiate a fresh connection by clicking the drop-down under "Connection". Refreshing the tools shows all the tools available in this MCP server. Provide a sample prompt such as "Give me the stock price of Tesla"; this triggers the MCP server and calls the respective tool to retrieve the stock price of Tesla. Now try a weather-related question, which invokes the weather forecast tool in the MCP server.

APIM Monitoring with Log Analytics
We previously configured APIM diagnostic settings to forward log data to Log Analytics. In this section, we'll review that data, as the inbound policy in APIM sends valuable information to Log Analytics. Run the Kusto query to retrieve data from the last 30 minutes. As shown, the logs capture the APIM API endpoint URL and the backend URL, which corresponds to the Azure Container App endpoint. Scrolling further, we find the TraceRecords section, which contains the information captured by the APIM inbound policy and sent to Log Analytics. In the inbound policy, we configured it to extract details from the access token, such as the token itself, username, scope, and roles, and forward them to Log Analytics.

Now copy the access token to the clipboard, launch https://jwt.io (a JSON Web Token debugger), and paste the access token into the ENCODED VALUE box. Note the following information:
aud: shows the Application ID URI of apim-mcp-backend-api, confirming the access token was requested for that audience.
appid: shows the client ID of the copilot-studio-client app registration.
You can also see the roles and scope; these roles are the ones specified in the APIM inbound policy.
Note: Roles are included in the access token, and if the role is not assigned in the enterprise application for "apim-mcp-backend-api", all requests will be denied by the APIM inbound policy configured earlier.

Perform a test using another Azure AD account that does not have the app role assigned
Now let's try the Copilot Studio agent by logging in with another account that is not assigned the "mcp.read" role. Logged in as demo and trying to access the MCP tool in the Copilot Studio agent, the request fails with the error "Missing required scope or roles"; this comes from the APIM policy configured earlier in <on-error>. Reviewing Log Analytics, you can see the request failed due to the inbound APIM policy with a 403 error, and there is no backend URL. The error is also reported under TraceRecords, as configured in the APIM policy. Now copy the access token from Log Analytics and paste it into jwt.io; you can see there is no "roles" claim in the access token, resulting in access being denied by the APIM inbound policy before the request reaches the APIM backend, i.e. the Azure Container App.

Assign the app role to the demo account
Let's assign the "mcp.read" role to the demo account and test whether it can access the tool.
1. Visit the Azure portal, launch Enterprise applications, and select "apim-mcp-backend-api" as in this example.
2. Click "Users and groups".
3. Click "+ Add user/group".
4. Select demo, click "Select", and click "Assign".
Now log in again as demo and make sure a new access token is generated (access token refresh happens after one hour). This time the request is successful after assigning the "mcp.read" app role. Reviewing the Log Analytics entries and the access token in jwt.io, you can see that the roles are now included in the access token.
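If you prefer not to paste tokens into a website, the same claims can be inspected locally. The following is a minimal sketch for debugging only: it base64-decodes the payload and does not validate the signature.

import base64
import json

def decode_jwt_payload(token: str) -> dict:
    """Decode the middle (payload) segment of a JWT without verifying it."""
    payload = token.split(".")[1]
    payload += "=" * (-len(payload) % 4)  # restore base64 padding
    return json.loads(base64.urlsafe_b64decode(payload))

claims = decode_jwt_payload("<access-token-copied-from-log-analytics>")
print("aud:  ", claims.get("aud"))
print("appid:", claims.get("appid"))
print("scp:  ", claims.get("scp"))
print("roles:", claims.get("roles", "no roles claim present"))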
Conclusion
Exposing the MCP server through Azure API Management (APIM) and integrating it with Copilot Studio agents provides a secure and scalable way to extend enterprise capabilities. By implementing OAuth 2.0, you ensure robust authentication and authorization, protecting sensitive data and maintaining compliance with industry standards. Beyond security, APIM adds significant operational value: with APIM policies, you can monitor traffic, enforce rate limits, and apply fine-grained controls to manage access and performance effectively. This combination of security and governance empowers organizations to innovate confidently while maintaining control and visibility over API usage. In today's enterprise landscape, leveraging APIM with OAuth 2.0 for MCP integration is not just best practice; it's a strategic move toward building resilient, secure, and well-governed solutions.

Integrate Custom Azure AI Agents with Copilot Studio and M365 Copilot
Integrating Custom Agents with Copilot Studio and M365 Copilot
In today's fast-paced digital world, integrating custom agents with Copilot Studio and M365 Copilot can significantly enhance your company's digital presence and extend your Copilot platform to your enterprise applications and data. This blog guides you through the steps of bringing your custom Azure AI Agent Service, hosted within an Azure Function App, into a Copilot Studio solution and publishing it to M365 and Teams applications.

When Might This Be Necessary:
Integrating custom agents with Copilot Studio and M365 Copilot is necessary when you want to extend customization to automate tasks, streamline processes, and provide a better experience for your end users. This integration is particularly useful for organizations looking to streamline their AI platform, extend out-of-the-box functionality, and leverage existing enterprise data and applications to optimize their operations. Custom agents built on Azure allow you to achieve greater customization and flexibility than using Copilot Studio agents alone.

What You Will Need:
To get started, you will need the following:
Azure AI Foundry
Azure OpenAI Service
Copilot Studio Developer License
Microsoft Teams Enterprise License
M365 Copilot License

Steps to Integrate Custom Agents:
Create a Project in Azure AI Foundry: Navigate to Azure AI Foundry and create a project. Select 'Agents' from the 'Build and Customize' menu pane on the left side of the screen and click the blue button to create a new agent.
Customize Your Agent: Your agent will automatically be assigned an Agent ID. Give your agent a name and assign the model your agent will use. Customize your agent with instructions and add your knowledge source: you can connect to Azure AI Search, load files directly to your agent, link to Microsoft Fabric, or connect to third-party sources like Tripadvisor. In our example, we are only testing the Copilot integration steps of the AI agent, so we did not build out the additional options of providing grounding knowledge or function calling here.
Test Your Agent: Once you have created your agent, test it in the playground. If you are happy with it, you are ready to call the agent in an Azure Function.
Create and Publish an Azure Function: Use the sample function code from the GitHub repository to call the Azure AI Project and Agent, then publish your Azure Function to make it available for integration: azure-ai-foundry-agent/function_app.py at main · azure-data-ai-hub/azure-ai-foundry-agent
Connect your AI Agent to your Function: Update the "AIProjectConnString" value to include your project connection string from the project overview page in AI Foundry.
Role Based Access Controls: We have to add a role for the function app on the OpenAI service (see Role-based access control for Azure OpenAI - Azure AI services | Microsoft Learn):
Enable Managed Identity on the Function App.
Grant the "Cognitive Services OpenAI Contributor" role to the Function App's system-assigned managed identity on the Azure OpenAI resource.
Grant the "Azure AI Developer" role to the Function App's system-assigned managed identity on the Azure AI Project resource from AI Foundry.
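The linked function_app.py is the reference implementation; the following is only a rough sketch of the shape such a function can take. Method names in the azure-ai-projects preview SDK change between versions, and the AGENT_ID setting is an assumption for illustration, so treat this as indicative rather than a copy of the sample.

import os
import azure.functions as func
from azure.ai.projects import AIProjectClient
from azure.identity import DefaultAzureCredential

app = func.FunctionApp(http_auth_level=func.AuthLevel.FUNCTION)

@app.route(route="askagent", methods=["POST"])
def askagent(req: func.HttpRequest) -> func.HttpResponse:
    prompt = req.get_body().decode("utf-8")

    # Connection string comes from the app setting referenced in the article.
    project = AIProjectClient.from_connection_string(
        conn_str=os.environ["AIProjectConnString"],
        credential=DefaultAzureCredential(),
    )

    # Illustrative agent round-trip: create a thread, post the user prompt,
    # run the agent, and return the messages. Exact call names depend on the
    # preview SDK version used by the sample repo.
    agent = project.agents.get_agent(os.environ["AGENT_ID"])  # assumed app setting
    thread = project.agents.create_thread()
    project.agents.create_message(thread_id=thread.id, role="user", content=prompt)
    project.agents.create_and_process_run(thread_id=thread.id, agent_id=agent.id)
    messages = project.agents.list_messages(thread_id=thread.id)

    return func.HttpResponse(str(messages), mimetype="text/plain")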
Build a Flow in Power Platform: Before you begin, make sure you are working in the same environment you will use to create your Copilot Studio agent. To get started, navigate to the Power Platform (https://make.powerapps.com) to build a flow that connects your Copilot Studio solution to your Azure Function App. When creating a new flow, select 'Build an instant cloud flow' and trigger the flow using 'Run a flow from Copilot'. Add an HTTP action to call the Function using its URL, and pass the message prompt from the end user along with your URL. The output of your function is plain text, so you can pass the response from your Azure AI Agent directly to your Copilot Studio solution.

Create Your Copilot Studio Agent: Navigate to Microsoft Copilot Studio and select 'Agents', then 'New Agent'. Make sure you are in the same environment you used to create your cloud flow, then select the 'Create' button at the top of the screen. From the top menu, navigate to 'Topics' and 'System', and open the 'Conversation boosting' topic. When you first open the Conversation boosting topic, you will see a template of connected nodes; delete all but the initial 'Trigger' node. Now rebuild the conversation boosting topic to call the flow you built in the previous step: select 'Add an Action' and then select the option for an existing Power Automate flow. Pass the response from your custom agent to the end user and end the current topic. When the action menu pops up, you should see the option to run the flow you created; here, mine does not have a very unique name, but you can see my flow 'Run a flow from Copilot' as a Basic actions menu item. If you do not see your cloud flow here, add the flow to the default solution in the environment: go to Solutions > select the All pill > Default Solution > add the cloud flow you created to the solution. Then go back to Copilot Studio and refresh, and the flow will be listed there. Now complete building out the conversation boosting topic.

Make Agent Available in M365 Copilot: Navigate to the 'Channels' menu and select 'Teams + Microsoft 365'. Be sure to select the box to 'Make agent available in M365 Copilot'. Save and re-publish your Copilot agent. It may take up to 24 hours for the agent to appear in the M365 Teams agents list. Once it has loaded, select the 'Get Agents' option from the side menu of Copilot and pin your Copilot Studio agent to your featured agents list. Now you can chat with your custom Azure AI Agent directly from M365 Copilot!

Conclusion: By following these steps, you can successfully integrate custom Azure AI Agents with Copilot Studio and M365 Copilot, enhancing the utility of your existing platform and improving operational efficiency. This integration allows you to automate tasks, streamline processes, and provide a better experience for your end users. Give it a try! Curious how to bring custom models from AI Foundry to your Copilot Studio solutions? Check out this blog.

Step-by-Step Deployment Guide: MCP Tool Call in Copilot Studio agent with OAuth 2.0
Introduction
Modern development workflows increasingly rely on secure integrations between tools and platforms. Copilot Studio, with its ability to extend functionality through MCP (Model Context Protocol) tools, offers developers powerful customization options. However, when these tools need to access sensitive APIs or user-specific data, robust authentication becomes essential. OAuth 2.0, particularly the Authorization Code Flow, is the industry standard for secure delegated access. It enables applications to obtain tokens on behalf of users without exposing credentials, ensuring compliance with enterprise security policies. In this guide, we'll walk through how to configure MCP tools in Copilot Studio using the OAuth 2.0 Authorization Code Flow, covering prerequisites, configuration steps, token handling, and best practices for a seamless and secure setup.

Disclaimer & Context
This article focuses on configuring an MCP tool within a Copilot Studio agent using the OAuth 2.0 Authorization Code Flow. Every environment is unique, so the approach outlined here should be treated as a starting point rather than a one-size-fits-all solution. You can enhance this setup with additional security measures such as app roles, conditional access policies, or by extending the Python code for advanced scenarios. We assume readers have a basic understanding of Python, MCP concepts, OAuth 2.0 flows, and some familiarity with Copilot Studio. For deeper dives into these individual technologies, refer to the official documentation linked throughout this article.
Please note: This solution reflects the state of the technology at the time of writing. Given the fast-paced nature of these platforms, minor adjustments may be required as features evolve. MCP tool integration in Copilot Studio is currently a preview feature, so expect changes and improvements over time.

What is Authorization Code Flow?
The Authorization Code Flow is designed for applications that can securely store a client secret (like server-side apps). It allows the app to obtain an access token on behalf of the user without exposing their credentials. This flow uses an intermediate authorization code that is exchanged for tokens, adding an extra layer of security.

Steps in the Flow
User Authentication: The user is redirected to the Authorization Server (in this case, Azure AD) to log in and grant consent.
Authorization Code Issued: After successful login, the Authorization Server sends an authorization code to the app via the redirect URI.
Token Exchange: The app sends the authorization code (plus client credentials) to the Token Endpoint to get an Access Token (for API calls) and a Refresh Token (to renew access without user interaction).
API Access: The app uses the Access Token to call protected resources.

For details, see Microsoft identity platform and OAuth 2.0 authorization code flow — Microsoft identity platform | Microsoft Learn.

High Level Architecture
The high-level architecture uses the MCP server as a backend server and Copilot Studio as the front-end client.

Develop MCP server in VS Code
Clone the following repository and open it in VS Code.

git clone https://github.com/mafzal786/mcp-server.git

Run the following to execute it locally.

cd mcp-server
uv venv
uv sync
uv run mcpserver.py
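mcpserver.py in the cloned repo is the source of truth; as a rough, hedged sketch, a streamable-HTTP MCP server with a stock-price tool built on the MCP Python SDK can look like the following. Tool names and the placeholder data here are illustrative, not the repo's actual code.

from mcp.server.fastmcp import FastMCP

# Illustrative server; the cloned repo's mcpserver.py may differ in names and tools.
mcp = FastMCP("streamable-mcp-server")

@mcp.tool()
def get_stockprice(symbol: str) -> str:
    """Return a (placeholder) stock price for the given ticker symbol."""
    # A real implementation would call a market-data API here.
    prices = {"TSLA": 251.52, "MSFT": 512.30}
    price = prices.get(symbol.upper())
    return f"{symbol.upper()} is trading at {price}" if price else f"No data for {symbol}"

if __name__ == "__main__":
    # Streamable HTTP transport exposes the server under the /mcp path.
    mcp.run(transport="streamable-http")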
Deploy MCP Server as Azure Container App
Deploy the MCP server as an Azure Container App by running the following command. It can be deployed in various other ways, such as via VS Code or a CI/CD pipeline; the Azure CLI is used here for simplicity.

az containerapp up \
  --resource-group <RESOURCE_GROUP_NAME> \
  --name streamable-mcp-server2 \
  --environment mcp \
  --location <REGION> \
  --source .

Configure Authentication for Azure Container App
Sign in to the Azure portal, visit the Container App, and click "Authentication". For more details, visit Enable authentication and authorization in Azure Container Apps with Microsoft Entra ID | Microsoft Learn. Click Add Identity Provider, select Microsoft from the drop-down, and leave everything else as is. This will create a new app registration for the Container App. As soon as authentication is configured, the Container App becomes inaccessible except via OAuth.

Review App Registration of Container App — Backend
Visit App registrations and click streamable-mcp-server2 (in this case). Click the Authentication tab and verify the Redirect URIs; you should see a redirect URI for the Container App ending with /.auth/login/aad/callback. Now click "Expose an API" and confirm the Application ID URI is configured with a scope; its format is api://<client id>. Verify API permissions and make sure you Grant admin consent for your tenant. More scopes can be created depending on the data-access requirements.

Create App Registration for Client App — Copilot Studio
In these steps, we configure the app registration for the client app; in this case Copilot Studio acts as the client app. This is also shown in the high-level architecture diagram earlier in this article.
1. Launch the Azure portal, visit App registrations, and click New registration.
2. Create a new app registration. Leave the Redirect URL empty for now; we will configure it later, as it is provided by Copilot Studio when configuring the custom MCP connector.
3. Click "API permissions" and click "Add a permission". Click Microsoft Graph, then "Delegated permissions", and select email, openid, and profile. Make sure to Grant admin consent.
4. Create a secret: click "Certificates & secrets" and create a new client secret by clicking "New client secret". Store the value, as it will be masked after some time; if that happens, you can always delete it and create a new secret.
5. Capture the following, as you will need them when configuring the MCP tool in Copilot Studio: the Client ID from the Overview tab of the app registration, and the client secret from the "Certificates & secrets" tab.
6. Configure API permissions for the backend, which is the app registration of the Azure Container App, i.e. streamable-mcp-server2 in this case. Click the "API permissions" tab, click "Add a permission", click the "My APIs" tab, and select streamable-mcp-server2. Select "Delegated permissions", select the permissions already created as a result of configuring Authentication for the Azure Container App earlier, and click "Add permission". You MUST "Grant admin consent" as the final step; it is very important, and without it nothing will work.
7. The end result of this client app registration should look like the figure below.

MCP Tool configuration in Copilot Studio
Launch Copilot Studio at https://copilotstudio.microsoft.com/.
Configuration of the environment and agent is beyond the scope of this article; it is assumed you already have an environment set up and an agent created. The following link explains how to create an agent in Copilot Studio: Quickstart: Create and deploy an agent — Microsoft Copilot Studio | Microsoft Learn
1. Inside the agent configuration, click "Add tool".
2. Click New tool.
3. Select Model Context Protocol.
4. Provide all relevant information for the MCP server. Make sure your server URL ends with your MCP path; in this case, it is the Azure Container App URL with /mcp at the end. Provide a server name and server description, and select the OAuth 2.0 radio button.
5. Provide the following in the OAuth 2.0 section:
Client ID of the client app registration (copilot-studio-client, as configured earlier).
Client secret of the copilot-studio-client app registration.
Authorization URL: https://login.microsoftonline.com/common/oauth2/v2.0/authorize
Token URL template & Refresh URL: https://login.microsoftonline.com/oauth2/v2.0/token
Scopes: openid, profile, email, which we selected earlier for the Microsoft Graph permissions.
6. Click "Create". This provides a Redirect URL; you need to configure this redirect URL in the client app registration (copilot-studio-client in this case).

Configure Redirect URL in Client App Registration
Visit the client app registration, i.e. copilot-studio-client. Click the Authentication tab and provide the Web Redirect URIs.

Modify MCP connector in Power Apps
Now visit https://make.powerapps.com and open the newly created connector. Change the Resource URL to the backend "Application ID URI" from "Expose an API" in the streamable-mcp-server2 app registration, and add .default in the scope. Provide the secret of the client app registration, as the connector cannot be updated without it; this is an extra security measure for updating connectors in Power Apps. Click Update connector.

CORS Configuration
Congratulations on getting this far; we are getting close to making it work. CORS configuration is a must, since our Azure Container App is a remote MCP server with a totally different domain (origin).

Power Apps and CORS for External Domains — Brief Overview
When embedding or integrating Power Apps with external web applications or APIs, Cross-Origin Resource Sharing (CORS) becomes a critical consideration. CORS is a browser security feature that restricts web pages from making requests to a different domain than the one that served the page, unless explicitly allowed.
Key Points:
Power Apps hosted on *.powerapps.com or within Microsoft 365 domains will block calls to external APIs unless those APIs include the proper CORS headers.
The external API must return:
Access-Control-Allow-Origin: https://apps.powerapps.com (or * for all origins, though not recommended for production)
Access-Control-Allow-Methods: GET, POST, OPTIONS (or as needed)
Access-Control-Allow-Headers: Content-Type, Authorization (and any custom headers)
If the API requires authentication (e.g., OAuth 2.0), ensure preflight OPTIONS requests are handled correctly. For scenarios where you cannot modify the external API, consider using Power Automate flows as a proxy, or Azure API Management or Azure Functions to inject CORS headers. Always validate the security implications before enabling wide-open CORS.
If CORS is not set up, you will encounter a "CORS policy" error blocking the Container App in Copilot Studio after pressing F12 (browser developer tools). Azure Container Apps provides a very efficient way of configuring CORS in the Azure portal:
1. Launch the Azure portal and visit the Azure Container App, i.e. streamable-mcp-server2 in this case.
2. Click CORS under the Networking section.
3. Configure the Allowed Origins section. localhost is added to make it work from a local laptop, although it is not required for Copilot Studio.
4. Click the "Allowed Methods" tab and provide the required methods.
5. Provide the wildcard "*" in the "Allowed Headers" tab. This is not recommended for production systems; it is done here for simplicity. Restrict the headers for added security.
6. Click "Apply". This configures CORS for the remote application.

Test the connector
We are in the final stages of configuring the connector; it is time to test whether everything is configured correctly and works.
1. Launch https://make.powerapps.com, click "Custom connectors", select your configured connector, and click the "5. Test" tab. You will see Selected Connection as blank if you are running it for the first time; click "+ New connection".
2. The new connection launches the authorization flow, and a browser dialog pops up to request the authorization code.
3. Click "Create".
4. Complete the login process. This creates a successful connection.
5. Click "Test operation". A 406 response means everything is configured correctly.

Test MCP Tool in Copilot Studio
Launch Copilot Studio, click the agent you created in earlier steps, and click the "Tools" tab. Select your MCP tool and make sure it is "Enabled"; if you have other tools attached to the same agent, disable them for now for testing. Make sure you have the connection available that we created while testing the custom connector in the earlier step. You can also initiate a fresh connection by clicking the drop-down under "Connection". Refreshing the tools shows all the tools available in this MCP server. Provide a prompt such as "Give me the stock price of Tesla"; this triggers the MCP server and calls the respective tool to retrieve the stock price of Tesla. Now try a weather-related question to see more.

Conclusion
Securing MCP tools in Copilot Studio with the OAuth 2.0 Authorization Code Flow is a critical step toward building enterprise-ready integrations. By leveraging this flow, you ensure that user credentials remain protected while enabling delegated access to sensitive APIs and resources. The approach outlined here provides a solid foundation, but it's only the beginning. As environments differ, you should evaluate additional security enhancements such as app roles, conditional access policies, and token lifecycle management to meet organizational compliance standards. Remember, MCP integration in Copilot Studio is still a preview feature, and the ecosystem evolves rapidly. Stay informed, revisit configurations periodically, and adapt to new best practices as they emerge. With a thoughtful implementation, you can unlock the full potential of MCP tools while maintaining robust security and user trust.

The Future of AI: Developing Lacuna - an agent for Revealing Quiet Assumptions in Product Design
The Future of AI: Developing Lacuna - an agent for Revealing Quiet Assumptions in Product Design
A conversational agent named Lacuna is helping product teams uncover hidden assumptions embedded in design decisions. Built with Copilot Studio and powered by Azure AI Foundry, Lacuna analyzes product documents to identify speculative beliefs and assess their risk using design analysis lenses: impact, confidence, and reversibility. By surfacing cognitive biases and prompting reflection, Lacuna encourages teams to validate assumptions through lightweight evidence-gathering methods. This experiment in human-AI collaboration explores how agents can foster epistemic humility and transform static documents into dynamic conversations.
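The article does not describe Lacuna's internals, but the three lenses it names lend themselves to a small illustration. The hypothetical Python sketch below shows one way an assumption could be scored on impact, confidence, and reversibility; the function, fields, weights, and numbers are invented for illustration and are not Lacuna's actual logic.

# Hypothetical illustration of scoring an assumption on the three lenses the
# article mentions (impact, confidence, reversibility). Not Lacuna's actual logic.
from dataclasses import dataclass

@dataclass
class Assumption:
    text: str
    impact: float         # 0..1, how costly it is if the assumption is wrong
    confidence: float     # 0..1, how much evidence currently supports it
    reversibility: float  # 0..1, how easily the decision can be undone

def risk_score(a: Assumption) -> float:
    # High impact, low confidence, and low reversibility all push risk up.
    return a.impact * (1.0 - a.confidence) * (1.0 - a.reversibility)

if __name__ == "__main__":
    a = Assumption("Users will connect their calendar during onboarding",
                   impact=0.8, confidence=0.3, reversibility=0.4)
    print(f"risk={risk_score(a):.2f}")  # 0.34 -> a candidate for lightweight validation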
Integrating Azure AI Foundry with Copilot Studio: A Strategic and Technical Overview
As organizations accelerate their AI adoption, the need for flexible, scalable, and secure platforms becomes paramount. My previous article, Navigating AI Solutions: Microsoft Copilot Studio vs. Azure AI Foundry | Microsoft Community Hub, presented two powerful yet distinct approaches to building AI agents. While Copilot Studio offers a low-code/no-code interface for rapid deployment, targeting any kind of business user, Azure AI Foundry provides a pro-code environment with deep customization and orchestration capabilities, targeting developer audiences. But what if you did not need to choose between one or the other, and could instead integrate both platforms to unlock transformative business value across all teams? This is exactly the question I was asked increasingly while teaching our "Copilot, Copilot Studio and Azure AI Foundry" Instructor-Led Training courses as a Microsoft Technical Trainer. This article starts with the business rationale for integration. From there, I detail how cost and ROI parameters influence decision-making. Last, I guide you through multiple technical integration capabilities available today and show how both platforms can complement each other.
Business Rationale for Integration
Copilot Studio is primarily designed for business users to build conversational agents quickly. It excels in rapid prototyping, using a graphical workflow-style interface identical to Power Automate, and users do not need much development skill to build such agents. Azure AI Foundry, on the other hand, is tailored for developers and data scientists who typically need model orchestration, customized tool integration, and enterprise-grade scalability and governance. Integrating both platforms allows organizations to bridge the gap between business agility and technical depth, enabling those closer to the business to prototype while developers focus on custom features, refinement, and scale. For example, organizations can start with Copilot Studio for customer-facing bots or internal assistants and later transition to Azure AI Foundry for more complex workflows, multi-agent orchestration, or custom model integration. This layered approach supports progressive AI maturity, allowing teams to evolve from simple agents to fully sophisticated AI ecosystems.
Cost and ROI Considerations
Copilot Studio billing vs. Azure AI Foundry consumption-based billing
As users interact with Copilot Studio agents, or as the agents perform tasks on behalf of users, they consume Copilot Studio messages. These messages are the key component influencing the monthly cost of using Copilot Studio. Capabilities are available via the Copilot Studio pay-as-you-go meter (pay per message), the Copilot Studio message pack subscription (25,000 messages monthly) license, or a combination of both. These license options are active at the tenant level. Any user with a Microsoft 365 Copilot license gets access to Copilot Studio with no message-based charge. More details are available in the Microsoft Copilot Studio Licensing Guide.
Azure AI Foundry is part of Azure's consumption-based model: you are not charged for Azure AI Foundry itself, but you pay a consumption cost for the different models your applications use. This charge can be listed as Microsoft (e.g. Azure OpenAI) or charged through the Azure Marketplace (e.g. Cohere).
Image: Azure AI Foundry model cost consumption overview from within Azure Cost Analysis
Depending on the AI solution architecture your application workloads are based on, you should also take other Azure costs into account, such as Azure Storage Accounts, Azure AI Search, Azure App Services, Azure Key Vault, and the like. Since Azure AI Foundry charges are identical to any other Azure resource charges, managing them is no different from your current Azure Cost Analysis approach.
ROI and Budget Alignment
From the previous section, it should be clear that allocating the right budget can become complex, depending on the AI platforms used. By integrating both platforms, organizations can achieve cost optimization by using Copilot Studio for lightweight tasks while scaling via Azure AI Foundry for compute-intensive operations. Given the lower complexity of building applications with Copilot Studio, those applications tend to deliver early ROI through fast deployment, while Azure AI Foundry's robust and extensible infrastructure can drive longer-term ROI optimization.
Technical Integration Capabilities
HTTP Request Trigger
One integration method involves using Copilot Studio's HTTP Request feature to trigger Foundry agents. This allows natural language prompts in Copilot Studio to initiate backend processes in Azure AI Foundry, giving users a seamless flow between the conversational UI and enterprise logic: consulting business data, running data analytics, or retrieving information across different enterprise application backends.
Image: HTTP Request setup within Copilot Studio Topic
MCP Protocol
Azure AI Foundry now supports the Model Context Protocol (MCP), an open standard enabling seamless interaction between large language models (LLMs) and external tools, systems, or data sources. MCP provides a model-agnostic interface for tasks such as reading files, executing functions, and handling contextual prompts. Its primary goal is to simplify the integration of LLMs with third-party systems by addressing the complexity of building custom connectors for each tool or data source. MCP tools can be integrated into your AI solutions using Azure AI Foundry Agent Service, or through common development-language SDKs or the REST API. Check this Microsoft Learn module for more technical details on how to configure this, or check out MCP for Beginners on YouTube: https://aka.ms/MCP-for-beginners. Recently, the Model Context Protocol (MCP) connector also became available as a new tool directly within Copilot Studio.
Image: Model Context Protocol Connector Tool in Copilot Studio
By integrating MCP tools from either Foundry Agent Service or Copilot Studio, organizations benefit from a standardized approach to connecting different enterprise systems, data endpoints, or external applications. Simplifying this complexity and providing a smooth interaction, irrespective of the AI platform used, delivers major benefits to both the business users and the developer teams building these applications.
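To make the MCP concept more concrete, here is a minimal sketch of a standalone MCP server exposing one tool, assuming the official Python MCP SDK (the mcp package). The get_order_status tool and the server name are hypothetical, and the SDK surface is still evolving, so treat this as an illustration rather than a definitive Foundry Agent Service or Copilot Studio integration.

# Minimal, illustrative MCP server with a single hypothetical tool.
# Assumes the official Python MCP SDK ("pip install mcp"); API details may change.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("order-tools")

@mcp.tool()
def get_order_status(order_id: str) -> str:
    """Return the status of an order (stubbed here for illustration)."""
    # In a real integration this would query an ERP or CRM backend.
    return f"Order {order_id} is in transit."

if __name__ == "__main__":
    # A streamable HTTP transport makes the tool reachable by remote MCP clients;
    # adjust the transport name if your SDK version differs.
    mcp.run(transport="streamable-http")

Once such a server is reachable over HTTPS, it could be registered with Foundry Agent Service or surfaced through the Copilot Studio MCP connector described above.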
Azure AI Foundry Models available to Copilot Studio (preview feature)
Azure AI Foundry Models offers more than 11,000 models to choose from, provided by both Microsoft and an extensive range of model providers such as OpenAI, DeepSeek, Black Forest Labs, Meta, and many more. On top of the existing models offered, organizations can also create their own customized models by fine-tuning from within Azure AI Foundry.
For example, imagine an organization building an IT support agent that interacts with end users through a chat interface in natural language. Users might provide screenshots of errors as well as describe technical issues in their own words. Traditional LLMs could struggle to recognize specific screenshot details or the business-specific terminology used by custom in-house applications, as they have not been trained on this kind of information. That is where fine-tuned models can be a solution. At the time of writing, a new preview feature became available to Copilot Studio customers, allowing them to use any Azure AI Foundry model, both catalog and fine-tuned, as the primary model for their Copilot Studio agents. (For all details, follow this link to the Copilot Studio Roadmap and features list.)
Image: Copilot Studio New Feature setting to enable AI Foundry model integration
Conclusion
Integrating Copilot Studio and Azure AI Foundry is not just a technical exercise but a strategic move that aligns business goals, cost efficiency, and adoption readiness. By leveraging the strengths of both platforms, organizations can build AI solutions that are agile, scalable, and secure. Your business can focus on developing (or "making", if not code-based) AI agents without facing bottlenecks, unneeded complexity, or isolation of workloads. Instead of asking which platform to use for building AI applications, organizations should invest in a tight integration between both platforms, quickly enabling both business teams and developers to create AI-infused applications that provide immediate business value, without compromise.
#MicrosoftLearn #SkilledByMTT
Announcing Public Preview for Business Process Solutions
In today's AI-powered enterprises, success hinges on access to reliable, unified business information. Whether you are deploying AI-augmented workflows or fully autonomous agentic solutions, one thing is clear: trusted, consistent data is the fuel that drives intelligent outcomes. Yet in many organizations, data remains fragmented across best-of-breed applications, creating blind spots in cross-functional processes and throwing roadblocks in the path of automation. Microsoft is dedicated to tackling these challenges by delivering a unified data foundation that accelerates AI adoption, simplifies automation, and reduces risk, empowering businesses to unlock the full potential of unified data analytics and agentic intelligence.
Our new solution offers cross-functional insights across previously siloed environments and includes:
Prebuilt data models for enterprise business applications in Microsoft Fabric
Source system data mappings and transformations
Prebuilt dashboards and reports in Power BI
Prebuilt AI agents in Copilot Studio (coming soon)
Integrated security and compliance
By unifying Microsoft's Fabric and AI solutions, we can rapidly accelerate transformation and de-risk AI rollout through repeatable, reliable, prebuilt solutions.
Functional Scope
Our new solution currently supports a set of business applications and functional areas, enabling organizations to break down silos and drive actionable insights across their core processes. The platform covers key domains such as:
Finance: Delivers a comprehensive view of financial performance, integrating data from general ledger, accounts receivable, and accounts payable systems. This enables finance teams to analyze trends, monitor compliance, and optimize cash flow management, all from within Power BI. The associated Copilot agent provides not only access to this data via natural language but will also enable financial postings.
Sales: Provides a complete perspective on customers' opportunity-to-cash journeys, from initial opportunity through invoicing and payment, via Power BI reports and dashboards. The associated Copilot agent can help improve revenue forecasting by connecting structured ERP and CRM data with unstructured data from Microsoft 365, and can also track sales pipeline health and identify bottlenecks.
Procurement: Supports strategic procurement and supplier management, consolidating purchase orders, goods receipts, and vendor invoicing data into a complete spend dashboard. This empowers procurement teams to optimize sourcing strategies, manage supplier risk, and control spend.
Manufacturing (coming soon): Will extend coverage to manufacturing and production processes, enabling organizations to optimize resource allocation and monitor production efficiency.
Each item within Business Process Solutions is delivered as a complete, business-ready offering. These models are thoughtfully designed to ensure that organizations can move seamlessly from raw data to actionable execution. Key features include:
Facts and Dimensions: Each model is structured to capture both transactional details (facts) and contextual information (dimensions), supporting granular analysis and robust reporting across business processes.
Transformations: Built-in transformations automatically prepare data for reporting and analytics, making it compatible with Microsoft Fabric. For example, when a business user needs to compare sales results from Europe, Asia, and North America, the solution's transformations handle currency conversion behind the scenes (see the illustrative sketch after this list). This ensures that results are consistent across regions, making analysis straightforward and reliable, without the need for manual intervention or complex configuration.
Insight to Action: Customers will be able to leverage prebuilt Copilot agents within Business Process Solutions to turn insight into action. These agents are deeply integrated not only with Microsoft Fabric and Microsoft Teams, but also with connected source applications, enabling users to take direct, contextual actions across systems based on real-time insights. By connecting unstructured data sources such as emails, chats, and documents from Microsoft 365 apps, the agents can provide a holistic and contextualized view to support smarter decisions. With embedded triggers and intelligent agents, automated responses could be initiated based on new insights, streamlining decision-making and enabling proactive, data-driven operations. Ultimately, this will empower teams not just to understand what is happening at a holistic level, but also to take faster, smarter actions with greater confidence.
Authorizations: Data models are tailored to respect organizational security and access policies, ensuring that sensitive information is protected and only accessible to authorized users. The same user credential principles apply to the Copilot agents when interacting with or updating the source system in the user context.
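The following is a purely illustrative sketch of the kind of currency-normalizing transformation described in the Transformations item above. The prebuilt Business Process Solutions pipelines handle this for you; the pandas code, exchange rates, and figures here are invented for illustration only.

# Purely illustrative sketch of a currency-normalizing transformation.
# The actual Business Process Solutions pipelines are prebuilt; rates and figures are made up.
import pandas as pd

sales = pd.DataFrame({
    "region":   ["Europe", "Asia", "North America"],
    "amount":   [120_000.0, 9_500_000.0, 150_000.0],
    "currency": ["EUR", "JPY", "USD"],
})

# Example exchange rates into a single reporting currency (USD).
fx_to_usd = {"EUR": 1.08, "JPY": 0.0067, "USD": 1.0}

sales["amount_usd"] = sales["amount"] * sales["currency"].map(fx_to_usd)

# Regional results are now directly comparable in one currency.
print(sales[["region", "amount_usd"]])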
Behind the scenes, the solution automatically provisions the required objects and infrastructure to build the data warehouse, removing the usual complexity of bringing data together. It guarantees consistency and reliability, so organizations can focus on extracting value from their data rather than managing technical details. This reliable data foundation serves as one of the key inputs to the agentic business processes.
Accelerated Insights with Prebuilt Analytics
Building on these robust data models, Business Process Solutions offers a suite of prebuilt Power BI reports tailored to common business processes. These reports provide immediate access to key metrics and trends, such as financial performance, sales effectiveness, and procurement efficiency. Designed for rapid deployment, they allow organizations to:
Start analyzing data from day one, without lengthy setup or customization.
Adapt existing reports to your organization's exact business needs.
Demonstrate best practices for leveraging data models in analytics and decision-making.
This approach accelerates time-to-value and empowers users to explore new analytical scenarios and drive continuous improvement.
Extensibility and Customization
Every organization is unique, and our new solution is designed to support this, allowing you to adapt analytics and data models to fit your specific processes and requirements. You can customize scope items, bring in your own tables and views, integrate new data sources as your business evolves, and combine data across Microsoft Fabric for deeper insights. Similarly, the associated agents will be customizable from Copilot Studio to adapt to your specific enterprise apps configuration. This flexibility ensures that, no matter how your organization operates, Business Process Solutions helps you unlock the full value of your data.
Data Integration
Business Process Solutions uses the same connectivity options as Microsoft Fabric and Copilot Studio but goes further by embedding best practices that make integration simpler and more effective.
We recognize that no single pattern can address the diverse needs of all business applications. We also understand that many businesses have already invested in data extraction tools, which is why our solution supports a wide range of options, from native connectivity to third-party tools that bring specialized capabilities to the table. With Business Process Solutions, we ensure data can be worked with in a reliable, high-performance way, whether you are dealing with massive volumes or complex data structures.
Getting Started
If your organization is ready to unlock the value of unified analytics, getting started is simple. Just send us a request using the form at https://aka.ms/JoinBusAnalyticsPreview. Our team will guide you through the next steps and help you begin your journey.
Microsoft Copilot Studio: Create and Deploy a Chatbot
What is Copilot Studio?
Copilot Studio is a tool that is part of Microsoft 365 and makes it easy to create chatbots. A Copilot agent can converse with users in natural language: it can answer questions, guide users, or even trigger actions such as sending an email or retrieving data.
A no-code tool
With Copilot Studio, no programming knowledge is needed. Its interface is simple: you can write questions, define answers, add buttons, and even connect it to other services such as Power Automate, SharePoint, or Dataverse.
A big advantage: describe to create
With Copilot Studio, you can describe in a few lines what you want your chatbot to do, and Copilot Studio automatically generates a basic chatbot with topics and answers. Example: "You are a virtual assistant for university students. You answer questions about class schedules, library access, registration, and the registrar's contact details."
Copilot vs. a classic chatbot
Unlike a simple keyword-based chatbot, a Copilot agent can:
Understand natural language
Adapt to the different ways a question can be asked
Connect to custom data
Be deployed on several channels such as Microsoft Teams, Power Apps, a website, and more
A concrete example: a student project lead wants to create a virtual assistant to answer frequently asked questions about the project, such as: What are the library hours? How do I submit the final report? Who is the academic coordinator? In a few minutes, you could create a Copilot agent that answers all of these questions. That is the value of Copilot Studio.
Create a Copilot Agent
Access Copilot Studio:
- From your preferred browser (Edge, Chrome, etc.), open the page https://copilotstudio.microsoft.com
- Sign in with your Microsoft account (school or work)
- Once signed in, you arrive at the dashboard
- In the chat window, under "Describe your agent to create it", enter this text: "You are an assistant that helps answer questions about traveling safely within the country and internationally. Please answer politely."
- With this, Copilot Studio creates an assistant that already understands the themes, which you can then customize. At this point the assistant is almost ready; in the conversation, Copilot suggests a name for the assistant, Conseiller Sécurité Voyage (Travel Safety Advisor). You can confirm it if the name suits you.
- Type "Ok" in the chat box and press Enter
- Select the Create button to continue
At this stage, Copilot Studio configures the chatbot and redirects you to the configuration page. On this new page, you can modify certain parts such as the chatbot's name and instructions, and add knowledge, triggers, prompts, and more. Scroll down a little to add knowledge and enable the Web search option. In the Add knowledge window, select Public websites, enter https://europa.eu/youreurope/citizens/ in the Public website link box, click Add, then click Add to agent to close the window.
Test and deploy the Copilot
Now the Copilot is ready. In the Test your agent pane, you can test the chatbot to see whether the added knowledge works correctly.
With the test performed in this demo, the assistant returns a response with links pointing to the site added to the knowledge list. Depending on the answers given by the assistant, you can correct the topics if necessary. This test ensures that the chatbot understands the user and responds in a useful way.
The chatbot is ready, so it can be published:
- Click Publish at the top right.
- Publishing takes a few seconds.
The chatbot is now ready to be used on other platforms or channels. You can add it to Microsoft Teams, a website, SharePoint, and other platforms.
Conclusion
Creating a custom Copilot with Microsoft Copilot Studio is within everyone's reach, even without development experience. In a short amount of time, you can:
- Describe the assistant in a simple sentence
- Let the AI generate the conversation topics
- Add answers and actions
- Test it, publish it, and deploy it to Microsoft Teams, SharePoint, a website, and more
If you are curious to explore the world of artificial intelligence, Copilot Studio is an excellent starting point for learning to create useful, modern interactive experiences. Why not create your first Copilot today? For more information, visit the Microsoft Learn page: https://learn.microsoft.com/fr-fr/microsoft-copilot-studio/
New Microsoft Applied Skill Alert – Create Agents in Microsoft Copilot Studio (APL-7008)
Hi Friends👋 If you've been demoing Copilot Studio in your classes, here's a quick way to validate and showcase those agent-building skills without sitting a full certification exam.
Why grab this micro-credential?
Hands-on, half-day lab: prove you can build, publish & govern generative-AI agents end-to-end.
Instant résumé boost: the digital badge drops into Credly the moment you pass.
Perfect add-on to PL-300/PL-400/PL-700 prep or any Power Platform course you teach.
Lab tasks you'll master
Design the agent persona & generative AI instructions
Build topics, variables & rich dialogues (Adaptive Cards included!)
Call Dataverse data with Power Automate flows
Publish to Microsoft Teams & the web, then secure with content moderation
Prep in three steps
Run the free learning path: Create agents in Microsoft Copilot Studio (9 bite-size modules).
Skim the official study guide checklist (APL-7008).
Spin up a trial tenant for learners and let them practice before the live lab.
Ready? 👉 Copilot Studio Applied Skill
Let's keep empowering our students (and ourselves) to build the next generation of AI agents inside Microsoft 365. If you earn the badge, drop it below; we'd love to celebrate your win! 🏆
#CopilotStudio #AppliedSkills #PowerPlatform #GenerativeAI
Mastering Agent Governance in Microsoft 365
The "Mastering Agent Governance in Microsoft 365" series is based on the Administering and Governing Agents whitepaper published by Microsoft and is designed to educate IT leaders, compliance officers, and decision-makers about the importance of governance for AI agents in Microsoft 365, particularly in highly regulated industries like Healthcare and Life Sciences (HLS). The six-episode series covers the growing role of agents, the risks of unmanaged agents, and the strategic importance of governance frameworks.
Empowering innovation while protecting patient data and ensuring compliance
In the age of AI-powered productivity, agents (automated digital assistants built with tools like Microsoft 365 Copilot, SharePoint, and Copilot Studio) are transforming how work gets done. From streamlining clinical documentation to automating regulatory reporting, agents are becoming indispensable in Healthcare and Life Sciences (HLS). But with great power comes great responsibility.
Why Governance Can't Be an Afterthought
In highly regulated industries like HLS, where data sensitivity and compliance are paramount, the rise of autonomous agents introduces new risks:
Unauthorized data access could expose protected health information (PHI).
Unmonitored agent behavior could lead to regulatory violations.
Lack of lifecycle controls could result in outdated or insecure agents operating in production environments.
Agent governance isn't just an IT concern; it's a business imperative. It ensures that innovation doesn't outpace compliance, and that every agent deployed aligns with organizational policies, security standards, and regulatory frameworks like HIPAA, GDPR, and FDA 21 CFR Part 11.
Understanding the Agent Landscape
Microsoft 365 supports a spectrum of agent creators:
End users using SharePoint or Copilot templates to automate simple tasks.
Makers building more complex agents in Copilot Studio.
Developers crafting sophisticated, enterprise-grade agents with Azure AI and the Teams Toolkit.
Each persona requires a different level of oversight. For example, a clinical researcher using SharePoint to build a data retrieval agent may need minimal governance, while a developer building a patient-facing chatbot must adhere to strict data protection and validation protocols.
Governance in Action
Microsoft provides a layered governance model:
Tool Controls: Define what agent creators can do within tools like Copilot Studio and SharePoint.
Content Controls: Ensure agents only access data they are authorized to use, leveraging Microsoft Purview for sensitivity labeling and DLP.
Agent Management: Monitor usage, enforce lifecycle policies, and block non-compliant agents via the Microsoft 365 Admin Center.
This framework allows organizations to empower innovation while maintaining control, which is critical in environments where patient safety and regulatory compliance are non-negotiable.
The Business Case for Governance
For HLS organizations, agent governance delivers tangible benefits:
Reduced compliance risk through proactive policy enforcement.
Improved operational efficiency by enabling safe automation.
Greater trust from patients, regulators, and internal stakeholders.
In short, governance is the foundation that allows agents to scale safely and sustainably.