Prevent data loss and insider risks for Microsoft 365 Copilot with Microsoft Purview
Event Ended
Tuesday, Jun 17, 2025, 09:30 AM PDT
Event details
To be truly enterprise-ready, a generative AI tool must do more than generate content—it must actively help prevent data loss and mitigate insider risks. Microsoft Copilot and agents are built with security in mind, integrating seamlessly with Microsoft Purview to deliver robust data protection capabilities.
In this session, you’ll learn how to:
- Receive real-time alerts and detailed reports on risky behaviors and AI usage.
- Safeguard sensitive files and interactions with Copilot and agents.
- Automatically enforce security policies in response to high-risk actions.
Empower your organization to adopt Copilot and agents with confidence—securely, responsibly, and at scale.
How do I participate?
No registration is required. Select "Add to calendar" to save the date, select "Attend" to receive event reminders, then join us live on June 17th ready to learn and ask questions! Feel free to post questions and comments below. You can post during the live session and/or in advance if the timing doesn't work for you.
Access session presentations
Looking for session materials or presentations? Visit the new Copilot Control System page on AMC to download resources and explore more content from the event.
This session is part of the Digital Deep Dive: Copilot Control System (CCS). Add it to your calendar, select "Attend" for event reminders, and check out the other sessions! Each session has its own page where the session livestream and discussion space will be available at the start time. You will also be able to view sessions on demand after the event.
Sarah_Gilbert
Updated Jun 19, 2025
15 Comments
- JeremyChapmanMSFT
Microsoft
Sharing additional resources as promised during the session from Microsoft Mechanics:
- Introducing Microsoft Purview Alert Triage Agents for Data Loss Prevention & Insider Risk Management. This explains how you can use generative AI with these agents to help analyze and triage DLP and IRM alerts if you're overwhelmed by the number of alerts.
- Microsoft Purview: New Data Loss Prevention (DLP) controls for the browser & network. This one explains DLP protections and how they are expanding to the browser, apps communicating over web protocols, and network controls.
- Microsoft Purview protections for Copilot. This is the show Erica presented, explaining the capabilities in Microsoft Purview specific to Microsoft 365 Copilot and Copilot Chat.
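For readers who want to script the kind of DLP controls those shows cover, here is a minimal Security & Compliance PowerShell sketch; the policy and rule names, the Exchange/Teams scope, and the credit-card condition are illustrative assumptions rather than the exact configuration shown in the videos.

```powershell
# Connect to Security & Compliance PowerShell (ExchangeOnlineManagement module)
Connect-IPPSSession

# Create a DLP policy in test mode, scoped to Exchange mail and Teams messages
# (policy name and locations are assumptions for this sketch)
New-DlpCompliancePolicy -Name "Copilot AMA demo DLP policy" `
    -ExchangeLocation All -TeamsLocation All -Mode TestWithoutNotifications

# Add a rule that blocks credit card numbers shared outside the organization
New-DlpComplianceRule -Name "Block credit cards shared externally" `
    -Policy "Copilot AMA demo DLP policy" `
    -ContentContainsSensitiveInformation @{Name = "Credit Card Number"} `
    -AccessScope NotInOrganization `
    -BlockAccess $true
```

Switching -Mode to Enable turns on enforcement once the test results look right.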
- Sarah_Gilbert
Community Manager
Thank you all for joining our AMA today.
This session on preventing data loss and insider risks for Microsoft 365 Copilot with Microsoft Purview is now wrapping up, but comments will remain open for the next couple of days so you can still add follow-up questions or join ongoing threads. If something comes to mind after the session, you’ve still got time to jump in.
You can continue to view all questions and answers right here on this page—feel free to bookmark it for future reference.
Looking for the session deck?
All decks from this series will be posted publicly on a new Copilot Control System page on adoption.microsoft.com. The page is currently in development and is expected to go live Thursday morning. If you’re looking for a copy of today’s deck, please return to this event page on Thursday for a direct link.
Have more questions? Head over to the Microsoft 365 Copilot Discussion Space to continue the conversation.
For regular product updates, don’t forget to subscribe to the What’s New in Copilot blog—published at the end of each month.
Up next in the series:
Understanding web search controls in Microsoft 365 Copilot
We hope to see you there.
- CarlLourenz
Copper Contributor
Where do you ask your questions so you can get live answers?
- Emily_Perina
Community Manager
Hi CarlLourenz! You can add any questions you have here in the comments.
- Gravity_Marc
Occasional Reader
Emily_Perina, do you mean these comments? Like CarlLourenz, I'm not following where to put the Ask me Anything questions.
- mrknowflow
Occasional Reader
Quick question from the recent Purview AI webinar: If Purview captures individual user prompts and responses for monitoring, how does this comply with GDPR?
Seems like a potential conflict between AI governance needs and data privacy requirements - especially regarding consent and data minimization.
Are there specific configurations or best practices to handle this properly under European data protection law?
Thanks!
- JeremyChapmanMSFT
Microsoft
In this case, the user is logged in to a company-managed device. To access company resources, they have also set up a work profile in the browser and are signed in with their Microsoft 365 work account.
- Casey Spiller
Microsoft
Prompts and responses are stored in Teams and in the substrate; DSPM for AI displays that data.
Depending on the need, you can use eDiscovery + Graph to delete this information for data-removal requests, or you can use auto-apply retention to delete information on an ongoing basis based on a specific data classifier.
Search for and delete Copilot data in eDiscovery | Microsoft Learn
Automatically apply a retention label to Microsoft 365 items | Microsoft Learn
- Ben_Summers
Microsoft
Like every Microsoft product, Purview has undergone extensive compliance and privacy reviews to help ensure that we can meet the legal and regulatory commitments we make to customers regarding GDPR, EUDB compliance, etc. Because we don't give legal advice, we'd strongly encourage you to work with your own legal counsel to discuss concerns about potential conflicts between privacy and abuse/misuse monitoring and determine what's right for your organization.
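For anyone exploring the deletion route Casey describes, below is a minimal sketch using classic content search cmdlets in Security & Compliance PowerShell. The search name, mailbox, and especially the ItemClass query are assumptions for illustration only; the "Search for and delete Copilot data in eDiscovery" article linked above documents the supported end-to-end process.

```powershell
# Connect to Security & Compliance PowerShell (ExchangeOnlineManagement module)
Connect-IPPSSession

# Search one user's mailbox (the substrate location) for Copilot interactions.
# NOTE: the ItemClass filter is an assumption; use the query documented in the
# Microsoft Learn article for Copilot prompts and responses.
New-ComplianceSearch -Name "Copilot purge - Adele Vance" `
    -ExchangeLocation "adele.vance@contoso.com" `
    -ContentMatchQuery 'ItemClass:IPM.SkypeTeams.Message.Copilot*'
Start-ComplianceSearch -Identity "Copilot purge - Adele Vance"

# After reviewing the search results, soft-delete the matched items.
New-ComplianceSearchAction -SearchName "Copilot purge - Adele Vance" -Purge -PurgeType SoftDelete
```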
- Hughes818
Iron Contributor
An external user from another organization joined one of our internal meetings both as themselves and with a bot/agent enabled on their own calendar. This bot follows them to join meetings with other organizations as an anonymous user to transcribe their meetings. Our meeting organizer was unaware of this. Microsoft advised us to enable CAPTCHA in the Teams Admin Center to prevent bots from joining meetings. Are there any upcoming improvements for these situations? It would be helpful if the Teams app prompted the meeting organizer when a bot/agent joins, allowing them to approve or deny it.
- Emily_Blundo
Microsoft
You could potentially use the lobby to control entry: in Meeting options, set "Who can bypass the lobby?" to "Only me" or "People I invite".
As far as DSPM for AI or Purview controls for this use case, I have not heard about this being added to our roadmap yet, but I can take this as feedback for our team. Thank you!
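For organizations that want that lobby behavior enforced by policy rather than set per meeting, here is a minimal Teams PowerShell sketch; applying it to the Global policy and the exact parameter values shown are assumptions to verify against current Teams documentation before rolling out.

```powershell
# Requires the MicrosoftTeams PowerShell module
Connect-MicrosoftTeams

# Only invited users bypass the lobby, and anonymous participants (including
# transcription bots joining anonymously) cannot join at all.
Set-CsTeamsMeetingPolicy -Identity Global `
    -AutoAdmittedUsers "InvitedUsers" `
    -AllowAnonymousUsersToJoinMeeting $false
```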
- JM_Lamont_Henderson
Copper Contributor
What PIM roles are required to use and leverage Purview in the governance of M365 Copilot and agents?
- Emily_Blundo
Microsoft
To access DSPM for AI that Erica is showing, you will need the permissions mentioned in this documentation: Permissions for Microsoft Purview Data Security Posture Management for AI | Microsoft Learn
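As a rough illustration of assigning that access by script, here is a minimal Security & Compliance PowerShell sketch; the role group name below is an assumption for illustration only, so check the linked permissions article for the exact roles and role groups DSPM for AI requires.

```powershell
# Connect to Security & Compliance PowerShell (ExchangeOnlineManagement module)
Connect-IPPSSession

# Add a user to a Purview role group that includes DSPM for AI permissions.
# NOTE: "Information Protection Admins" is an assumed example; confirm the
# required role group in the Microsoft Learn permissions article.
Add-RoleGroupMember -Identity "Information Protection Admins" -Member "adele.vance@contoso.com"
```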
- Sarah_Gilbert
Community Manager
Welcome to today’s Microsoft 365 Copilot Ask Microsoft Anything (AMA)
This session is focused on preventing data loss and insider risks for Microsoft 365 Copilot with Microsoft Purview.
To be enterprise-ready, a generative AI solution must do more than generate content—it must also protect it. Microsoft 365 Copilot and agents are designed with security in mind, offering built-in protections that integrate with Microsoft Purview to help you adopt AI responsibly and at scale.
In this AMA, we’ll explore:
- How to receive real-time alerts and reports on risky behaviors and AI usage
- Ways to safeguard sensitive data in interactions with Copilot and agents
- How to automatically enforce security policies in response to high-risk actions
To participate: Please post each question in a new comment thread below. The Microsoft team is here live to answer during the event hour.
Let’s get started—ask us anything.