Integrate Custom Azure AI Agents with Copilot Studio and M365 Copilot
In today's fast-paced digital world, integrating custom agents with Copilot Studio and M365 Copilot can significantly enhance your company's digital presence and extend your Copilot platform to your enterprise applications and data. This blog guides you through the steps of bringing a custom agent built on Azure AI Agent Service, hosted in an Azure Function App, into a Copilot Studio solution and publishing it to M365 and Teams applications.

When might this be necessary?
Integrating custom agents with Copilot Studio and M365 Copilot is useful when you want to automate tasks, streamline processes, and provide a better experience for your end users. It is particularly valuable for organizations looking to consolidate their AI platform, extend out-of-the-box functionality, and leverage existing enterprise data and applications. Custom agents built on Azure allow greater customization and flexibility than Copilot Studio agents alone.

What you will need:
- Azure AI Foundry
- Azure OpenAI Service
- Copilot Studio Developer License
- Microsoft Teams Enterprise License
- M365 Copilot License

Steps to integrate custom agents:

1. Create a project in Azure AI Foundry: Navigate to Azure AI Foundry and create a project. Select 'Agents' from the 'Build and Customize' menu pane on the left side of the screen and click the blue button to create a new agent.

2. Customize your agent: Your agent is automatically assigned an Agent ID. Give your agent a name, assign the model it will use, and customize it with instructions. Then add your knowledge source: you can connect to Azure AI Search, load files directly into your agent, link to Microsoft Fabric, or connect to third-party sources such as Tripadvisor. In this example we are only testing the Copilot integration steps, so we did not add grounding knowledge or function calling.

3. Test your agent: Once you have created your agent, test it in the playground. If you are happy with it, you are ready to call the agent from an Azure Function.

4. Create and publish an Azure Function: Use the sample function code from the GitHub repository (azure-ai-foundry-agent/function_app.py at main · azure-data-ai-hub/azure-ai-foundry-agent) to call the Azure AI project and agent, then publish your Azure Function to make it available for integration. A minimal sketch of such a function appears after step 6 below.

5. Connect your AI agent to your Function: Update the "AIProjectConnString" value to include your project connection string from the project overview page in AI Foundry.

6. Role-based access controls: Add a role for the Function App on the Azure OpenAI service (see Role-based access control for Azure OpenAI - Azure AI services | Microsoft Learn):
- Enable managed identity on the Function App.
- Grant the "Cognitive Services OpenAI Contributor" role to the Function App's system-assigned managed identity on the Azure OpenAI resource.
- Grant the "Azure AI Developer" role to the Function App's system-assigned managed identity on the Azure AI Project resource in AI Foundry.
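As a reference point for steps 4 and 5, here is a minimal sketch of what such a function might look like, loosely modeled on the sample linked above. It assumes the azure-ai-projects preview SDK and the Azure Functions Python v2 programming model; exact method names (create_thread, create_and_process_run, and so on) and the agent_id parameter vary between preview versions, and AGENT_ID is a hypothetical app setting. Treat the linked repository as the authoritative version.

```python
import os

import azure.functions as func
from azure.ai.projects import AIProjectClient
from azure.identity import DefaultAzureCredential

app = func.FunctionApp(http_auth_level=func.AuthLevel.FUNCTION)


@app.route(route="agent", methods=["POST"])
def call_agent(req: func.HttpRequest) -> func.HttpResponse:
    # The Power Automate flow posts the end user's prompt as the raw request body.
    prompt = req.get_body().decode("utf-8")

    # "AIProjectConnString" is the app setting described in step 5; the Function App's
    # managed identity (step 6) authenticates through DefaultAzureCredential.
    project = AIProjectClient.from_connection_string(
        conn_str=os.environ["AIProjectConnString"],
        credential=DefaultAzureCredential(),
    )

    # Create a thread, add the user's message, and run the existing agent.
    # "AGENT_ID" is a hypothetical app setting holding the Agent ID from AI Foundry;
    # older previews of the SDK call this parameter assistant_id instead of agent_id.
    thread = project.agents.create_thread()
    project.agents.create_message(thread_id=thread.id, role="user", content=prompt)
    project.agents.create_and_process_run(
        thread_id=thread.id, agent_id=os.environ["AGENT_ID"]
    )

    # Return the newest assistant message as plain text. The message and content shape
    # follows the Assistants-style schema (newest first); adjust to your SDK version.
    messages = project.agents.list_messages(thread_id=thread.id)
    for message in messages.data:
        if message.role == "assistant":
            parts = [c.text.value for c in message.content if hasattr(c, "text")]
            return func.HttpResponse("\n".join(parts), mimetype="text/plain")

    return func.HttpResponse("The agent returned no response.", status_code=502)
```

Because the function returns plain text, the flow in the next step only needs to POST the user's prompt to the function URL (including its function key) and pass the response body straight back to Copilot Studio.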
7. Build a flow in Power Platform: Move into Power Platform (https://make.powerapps.com) and build a flow that connects your Copilot Studio solution to your Azure Function App. When creating a new flow, select 'Build an instant cloud flow' and trigger the flow using 'Run a flow from Copilot'. Add an HTTP action that calls the function URL and passes the message prompt from the end user. The output of the function is plain text, so you can pass the response from your Azure AI agent directly to your Copilot Studio solution.

8. Create your Copilot Studio agent: Navigate to Microsoft Copilot Studio, select 'Agents', then 'New Agent', and select the 'Create' button at the top of the screen. From the top menu, navigate to 'Topics' and 'System' and open the 'Conversation boosting' topic. When you first open it, you will see a template of connected nodes; delete all but the initial 'Trigger' node. Now rebuild the conversation boosting topic to call the flow you built in the previous step: select 'Add an Action' and choose the option for an existing Power Automate flow. In the menu that pops up, you should see the option to run the flow you created; in this example the flow keeps its default name and appears as 'Run a flow from Copilot' under the Basic actions menu. Pass the response from your custom agent to the end user and end the current topic to complete the conversation boosting topic.

9. Make the agent available in M365 Copilot: Navigate to the 'Channels' menu and select 'Teams + Microsoft 365'. Be sure to select the box to 'Make agent available in M365 Copilot', then save and re-publish your Copilot agent. It may take up to 24 hours for the agent to appear in the M365 Teams agents list. Once it has loaded, select the 'Get Agents' option from the side menu of Copilot and pin your Copilot Studio agent to your featured agents list. Now you can chat with your custom Azure AI agent directly from M365 Copilot!

Conclusion: By following these steps, you can integrate custom Azure AI agents with Copilot Studio and M365 Copilot, enhancing the utility of your existing platform and improving operational efficiency. This integration allows you to automate tasks, streamline processes, and provide a better experience for your end users. Give it a try! Curious how to bring custom models from AI Foundry into your Copilot Studio solutions? Check out this blog.

Small and Medium Business Track for the M365 Community Conference
The Microsoft 365 Community Conference is the ultimate Microsoft 365 community event—and the perfect place to expand your skills and knowledge in Microsoft 365 and AI. You'll learn directly from over 160 Microsoft product leaders, and gain access to more than 200 sessions, including keynotes, workshops, breakouts, and hands-on demos. This year we're bringing you all-new content and sessions to help you transform the way you work—and maximize your impact with the tools you use every day.

Small and Medium Business Track
Ready to transform business growth with insights from AI and industry experts? Learn practical tips for using Microsoft 365 and Microsoft 365 Copilot to grow your customer base, build your brand, and scale securely. Access tailored product information and demos for small and medium businesses and learn from other business owners using AI to compete with larger organizations.

Speakers
- Brenna Robinson | General Manager, Microsoft SMB
- Jacky Magee | Head of Marketing, AI at Work for SMB | Microsoft
- Alexia Cambon | Senior Director, Research | Microsoft
- Nick Masci | Chief Product Officer and Co-Founder | Industrialized Construction Group (ICG)
- Karim R. Lakhani | Dorothy & Michael Hintze Professor of Business Administration | Harvard Business School

Sessions
- Accelerating Growth with AI-First Strategies for Small and Medium Businesses with Brenna Robinson
- The Future of Work: How AI Will Transform Small and Medium Businesses with Alexia Cambon, Jacky Magee, Nick Masci, and Karim R. Lakhani
- Copilot Adoption and Best Practices for Small and Medium Businesses with Gabe Ho and Greg Otterstrom
- Expert Tips for a Successful Migration to Microsoft 365 for Small and Medium Businesses with Chris Boyd and Meg Garland
- Grow Your Small or Medium Business with New Tools from Microsoft Teams with Bindu Pillai and Angela Chin
- Scale with AI: How Small and Medium Business Can Leverage Copilot Agents to Orchestrate Their Business at Scale with Gabe Ho and Elif Algedik

Learn more: Visit our SMB Track page.

Copilot Studio - Access SharePoint List
Hi there, for testing purposes I want to create an agent with Copilot Studio that accesses a SharePoint list via the SharePoint 'Get items' action. I selected/entered the following in the 'Inputs' section:

Site Address = http:// ....
List Name = Employee List
Top Count = 5000
Filter Query = Dynamically

The list has employee information, around 10 columns and 150 rows. When I submit a question such as "Who is Jasmin?" or a similar question such as "Who is Jasmin Tailor?" (I am asking both about employees that exist in the list and ones that do not), I always get this error message:

Error Message: The connector 'SharePoint' returned an HTTP error with code 400.
Error Code: ConnectorRequestFailure
Conversation Id: 3c0377a0-83c1-412b-9d1a-761f8202203a
Time (UTC): 2025-04-15T16:49:08.414Z

The first time I ran this agent with this action, I was asked to connect, which worked fine, yet I still get this error message. I am the creator of the agent and the owner of this SharePoint list. Any idea what I am doing wrong? Or is there somewhere I can check whether the 'Get items' action or other actions are temporarily not working? Thanks!

Regards, Aykut
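For reference, the Filter Query input of the SharePoint 'Get items' action expects an OData filter expression built from a column's internal name rather than free text, so a literal value such as "Dynamically" is typically rejected by the connector with an HTTP 400. The following is only a rough sketch of the equivalent SharePoint REST call, using a hypothetical site URL and column name, to illustrate the filter syntax the connector expects:

```python
import requests

# Hypothetical values for illustration only.
site_url = "https://contoso.sharepoint.com/sites/hr"
list_title = "Employee List"

# The Filter Query field expects OData syntax like this, referencing the
# column's internal name, not a free-text value such as "Dynamically".
odata_filter = "FirstName eq 'Jasmin'"

response = requests.get(
    f"{site_url}/_api/web/lists/getbytitle('{list_title}')/items",
    params={"$filter": odata_filter, "$top": 5000},
    headers={
        "Accept": "application/json;odata=nometadata",
        # A valid access token for SharePoint is required; acquisition is omitted here.
        "Authorization": "Bearer <access-token>",
    },
    timeout=30,
)
response.raise_for_status()
for item in response.json()["value"]:
    print(item.get("FirstName"), item.get("LastName"))
```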
Microsoft Purview protections for Copilot

Use Microsoft Purview and Microsoft 365 Copilot together to build a secure, enterprise-ready foundation for generative AI. Apply existing data protection and compliance controls, gain visibility into AI usage, and reduce risk from oversharing or insider threats. Classify, restrict, and monitor sensitive data used in Copilot interactions. Investigate risky behavior, enforce dynamic policies, and block inappropriate use — all from within your Microsoft 365 environment. Erica Toelle, Microsoft Purview Senior Product Manager, shares how to implement these controls and proactively manage data risks in Copilot deployments.

- Control what content can be referenced in generated responses: check out Microsoft 365 Copilot security and privacy basics.
- Uncover risky or sensitive interactions: use DSPM for AI to get a unified view of Copilot usage and security posture across your org.
- Block access to sensitive resources: see how to configure Conditional Access using Microsoft Entra.

Watch our video here.

QUICK LINKS:
00:00 — Microsoft Purview controls for Microsoft 365 Copilot
00:32 — Copilot security and privacy basics
01:47 — Built-in activity logging
02:24 — Discover and Prevent Data Loss with DSPM for AI
04:18 — Protect sensitive data in AI interactions
05:08 — Insider Risk Management
05:12 — Monitor and act on inappropriate AI use
07:14 — Wrap up

Link References:
Check out https://aka.ms/M365CopilotwithPurview
Watch our show on oversharing at https://aka.ms/OversharingMechanics

Video Transcript:

-Not all generative AI is created equal. In fact, if data security or privacy-related concerns are holding your organization back, today I'll show you how the combination of Microsoft 365 Copilot and the data security controls in Microsoft Purview provides an enterprise-ready platform for GenAI in your organization. This way, GenAI is seamlessly integrated into your workflow across familiar apps and experiences, all backed by unmatched data security and visibility to minimize data risk and prevent data loss. First, let's level set on a few Copilot security and privacy basics. Whether you're using the free Copilot Chat that's included with Microsoft 365 or have a Microsoft 365 Copilot license, both honor your existing access permissions to work information in SharePoint and OneDrive, your Teams meetings, and your email, meaning generated AI responses can only be based on information that you have access to.

-Importantly, after you submit a prompt, Copilot will retrieve relevant indexed data to generate a response. The data stays within your Microsoft 365 service trust boundary and doesn't move out of it.
-Even when the data is presented to the large language models to generate a response, information is kept separate from the model and is not used to train it. This is in contrast to consumer apps, especially the free ones, which are often designed to collect training data. As users upload files into them or paste content into their prompts, including sensitive data, the data is duplicated and stored in a location outside of your Microsoft 365 service trust boundary, removing any file access controls or classifications you've applied in the process and placing your data at greater risk.

-And beyond being stored there for indexing or reasoning, it can be used to retrain the underlying model. Next, adding to the foundational protections of Microsoft 365 Copilot, Microsoft Purview has activity logging built in and helps you to discover and protect sensitive data: you get visibility into current and potential risks, such as the use of unprotected sensitive data in Copilot interactions; you can classify and secure data, with information protection helping you automatically classify and apply sensitivity labels so data remains protected even when it's used with Copilot; and you can detect and mitigate insider risks, with alerts on employee activities with Copilot that pose a risk to your data, and much more.

-Over the next few minutes, I'll focus on Purview capabilities to get ahead of and prevent data loss and insider risks. We'll start in Data Security Posture Management, or DSPM for AI for short. DSPM for AI is the one place to get a rich, prioritized bird's-eye view of how Copilot is being used inside your organization and discover corresponding risks, along with recommendations to improve your data security posture that you can implement right from the solution. Importantly, this is where you'll find detailed dashboards for Microsoft 365 Copilot usage, including agents.

-Then in Activity Explorer, we make it easy to see recent activities with AI interactions that include sensitive information types, like credit cards, ID numbers, or bank accounts. And you can drill into each activity to see details, as well as the prompt and response text generated. One tip here: if you are seeing a lot of sensitive information exposed, it points to an information oversharing issue where people have access to more information than necessary to do their job. If you find yourself in this situation, I recommend you also check out our recent show on the topic at aka.ms/OversharingMechanics, where I dive into the specific things you should do to assess your Microsoft 365 environment for potential oversharing risks and ensure the right people can access the right information when using Copilot.

-Ultimately, DSPM for AI gives you the visibility you need to establish a data security baseline for Copilot usage in your organization, and helps you put in place preventative measures right away. In fact, without leaving DSPM for AI, on the recommendations page you'll find the policies we advise everyone to use to improve data security, such as this one for detecting potentially risky interactions using insider risk management, and other recommendations, like this one to detect potentially unethical behavior using communication compliance policies, and more. From there, you can dive into Microsoft Purview's best-in-class solutions for more granular insights, and to configure specific policies and protections.

-I'll start with information protection.
You can manage data security controls for Microsoft 365 Copilot using the information protection policies and sensitivity labels you have in use today. In fact, by default, any Copilot response using content with sensitivity labels will automatically inherit the highest-priority label of the referenced content. And using data loss prevention policies, you can prevent Copilot from processing any content that has a specific sensitivity label applied. This way, even if users have access to those files, Copilot will effectively ignore this content as it retrieves relevant information from Microsoft Graph to generate responses. Insider risk management helps you to catch data risk based on trending activities of people on your network, using established user risk indicators and thresholds, and then uses policies to prevent accidental or intentional data misuse as people interact with Copilot. You can easily create policies from quick policy templates, like this one looking for high-risk data leak patterns from insiders.

-By default, this quick policy will scope all users and groups with a defined triggering event of data exfiltration, along with activity indicators including external sharing, bulk downloads, label downgrades, and label removal, in addition to other activities that indicate a high risk of data theft. And it doesn't stop there. As individuals perform more risky activities, those can add up to elevate that user's risk level. Here, instead of manually adjusting data security policies, you can use Adaptive Protection controls to limit Copilot use depending on a user's dynamic risk level, for example when a user exceeds your defined risk condition thresholds and reaches an elevated risk level, as you can see here.

-Using Conditional Access policies in Microsoft Entra, in this case based on authentication context, as well as the condition for insider risk that you set in Microsoft Purview, you can choose to block their permission when attempting to access sites with a specific sensitivity label. That way, even if a user is granted access to a SharePoint site resource by an owner, their access will be blocked by the Conditional Access policy you set. Again, this is important because Copilot honors the user's existing permissions to work with information. This way, Copilot will not return information that they do not have access to.

-Next, Communication Compliance is a related insider risk solution that can act on potentially inappropriate Copilot interactions. In fact, there are specific policy options for Microsoft 365 Copilot interactions in communication compliance where you can flag jailbreak or prompt injection attempts using Prompt Shields classifiers. Communication compliance can be set to alert reviewers of that activity so they can easily discover policy matches and take corresponding actions. For example, if a person tries to use Copilot in an inappropriate way, like trying to get it to work around its instructions to generate content that Copilot shouldn't, it will report on that activity, and you'll also be able to see the response informing the user that their activity was blocked.
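To make the Conditional Access piece above concrete, here is a rough sketch of how such a policy could be created programmatically through the Microsoft Graph API. This is an illustration only, not part of the demo above: the insiderRiskLevels condition and the authentication context reference are taken from the Graph beta schema and should be treated as assumptions, and the placeholder token and context value ("c1") are hypothetical.

```python
import json

import requests

# A hypothetical Conditional Access policy: block access to resources protected by an
# authentication context (for example, one attached to a sensitivity label) for users
# whose Adaptive Protection insider risk level is elevated. Property names follow the
# Microsoft Graph beta schema and may differ in your tenant or API version.
policy = {
    "displayName": "Block elevated-risk users from highly confidential sites",
    "state": "enabledForReportingButNotEnforced",  # report-only while testing
    "conditions": {
        "users": {"includeUsers": ["All"]},
        "applications": {
            # "c1" is a placeholder authentication context class reference.
            "includeAuthenticationContextClassReferences": ["c1"]
        },
        "insiderRiskLevels": "elevated",
    },
    "grantControls": {"operator": "OR", "builtInControls": ["block"]},
}

# A Graph access token with Policy.ReadWrite.ConditionalAccess consent is assumed.
response = requests.post(
    "https://graph.microsoft.com/beta/identity/conditionalAccess/policies",
    headers={
        "Authorization": "Bearer <access-token>",
        "Content-Type": "application/json",
    },
    data=json.dumps(policy),
    timeout=30,
)
response.raise_for_status()
print("Created policy:", response.json().get("id"))
```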
-Once you have the controls you want in place, it's a good idea to keep going back to DSPM for AI so you can see where Copilot usage is matching your data security policies. Sensitive interactions per AI app shows you interactions based on sensitive information types. Top unethical AI interactions surfaces insights based on the communication compliance controls you've defined. Top sensitivity labels referenced in Microsoft 365 Copilot reports on the labels you've created and applied to referenced content. And you can see Copilot interactions mapped to insider risk severity levels. Then digging into these reports shows you a filtered view of activities in Activity Explorer, with time-based trends and details for each. Additionally, because all Copilot interactions are logged, like other Microsoft 365 activities in email, Microsoft Teams, SharePoint, and OneDrive, you can now use the new data security investigation solution. This uses AI to quickly reason over thousands of items, including Copilot Chat interactions, to help you investigate the potential cause of risks for known data leaks and similar incidents.

-So that's how Microsoft 365 Copilot, along with Microsoft Purview, provides comprehensive controls to help protect your data, minimize risk, and quickly identify Copilot interactions that could lead to compromise so you can take corrective actions. No other AI solution has this level of protection and control. To learn more, check out aka.ms/M365CopilotwithPurview. Keep watching Microsoft Mechanics for the latest updates and thanks for watching.

New Copilot Features coming to Mac
If you're a Mac user in healthcare, I've got some exciting news to share with you. Microsoft has introduced some fantastic new Copilot experiences for Mac, specifically within Microsoft Outlook. If you're already a fan of Copilot in Teams, on the web, and other places, you're going to love these updates. Let's dive into the details!

What's New?
The same amazing Copilot experience that you're used to in Teams and other Microsoft applications is now available in Microsoft Outlook for Mac. You can find the Copilot app directly in the left app bar of Outlook, alongside your contacts and calendars. This makes it incredibly convenient to access Copilot's powerful features right from your email client.

Unified Experience Across Applications
With these new features, you can start accessing Copilot directly from the Outlook app, just like you can in other Microsoft applications. This unified experience ensures that you have consistent access to Copilot's capabilities, no matter which app you're using.

Preview and Rollout
These new features are currently in preview and will be rolling out globally this month, January 2025. If you're an admin, make sure to pay attention to your application update rings so you can start testing these features before they roll out to all users.

Install Copilot as an App Through Your Browser
For those who prefer a dedicated app experience, I've got you covered. I'll show you how to install Copilot as an app on your Mac using Edge, Chrome, and Safari. This approach leverages the "install as an app" and "add to dock" features, giving you a dedicated Copilot application right at your fingertips.

How to Install Copilot as an App
- Edge: Go to the URL you want to create an app from, click the menu, and select "Install this site as an app." Give it a name and hit install.
- Chrome: Similar to Edge, navigate to the URL, click the menu, and select "Install this site as an app."
- Safari: Go to the URL, click "File" at the top, and select "Add to Dock." Give it a name and hit add.

The new Microsoft Copilot experience on Mac is designed to make your life easier and more productive and help get things done! I hope you find these new features as exciting as I do.

***The article has been updated to accurately reflect the current status of feature launches. The features mentioned above are in preview, while other features are still under development.***

Partner Blog | What's new for Microsoft partners: April 2025 edition
We value partner feedback and celebrate the range of perspectives within our community as we continue to enhance the Microsoft AI Cloud Partner Program. Our second blog of 2025 provides expert insights, updated learning resources, and recent benefits from the last four months to support your development.

Announcements

Microsoft at 50: the journey and future of the partner ecosystem
As we celebrate Microsoft's 50th anniversary in April, our annual State of the Partner Ecosystem blog was a great opportunity to reflect on the incredible journey we've shared with our partners, employees, and customers. Celebrate with us!
- Watch this video from Judson Althoff, Executive Vice President and Chief Commercial Officer, Microsoft.
- Join the Microsoft AI Skills Fest for 50 days of learning and discovery starting April 8! Gain skills that will empower you and your team to build innovative AI solutions with Microsoft's apps and services.
- Download the Microsoft 50th Anniversary Social Toolkit.
- See the full list of partner quotes on the Microsoft 50th Anniversary celebration site.

Upcoming API changes for Microsoft partners: what you need to know
Security and compliance are vital to maintaining trust and enhancing business efficiency. Our recent blog outlines several significant updates to application programming interfaces (APIs) that Microsoft partners will need to implement over the coming months to ensure compliance and avoid disruptions in business operations. These changes include updates to billing frequency scheduling, Partner of Record (POR) assignment, CSP billing reconciliation APIs, Partner Center pricelist upgrades, and the deprecation of Azure AD Graph tokens. Additionally, the blog emphasizes the importance of multi-factor authentication (MFA) and upcoming changes to Microsoft Customer Agreement (MCA) attestation methods. Continue reading here.