Generative and agentic AI are transforming how organizations operate, but they can also introduce new risks and amplify existing ones. Tools like Microsoft 365 Copilot generate content based on what the user has permission to access in an environment. That means any gaps in governance, such as over-permissioned sites, inherited access, or missing sensitivity label protections, are amplified.
We recognize this is a growing concern for security leaders. One of the core issues is internal oversharing: employees having access to more information than they need to perform their roles. Because Copilot and agents can summarize, synthesize, and surface data across contexts, enforced access controls are critical.
Most internal oversharing stems from configuration issues rather than malicious user intent. Common patterns include the following (a short detection sketch follows the list):
- Site privacy settings that grant access to everyone in the organization.
- Default sharing options set to “everyone,” bypassing more secure configurations.
- Broken permission inheritance, where site-level permissions don’t align with those at the file or folder level.
- Sharing with the “everyone except external users” domain group.
- Sites and files without sensitivity labels that enforce policies around how data can be accessed and shared.
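Some of these patterns can be spotted programmatically before scaling Copilot. Below is a minimal Python sketch, assuming a Microsoft Entra app registration with application permissions such as Sites.Read.All or Files.Read.All and admin consent; the tenant ID, client secret, and site ID are placeholders. It calls the Microsoft Graph API to flag files in a site’s default document library that are shared through organization-wide or anonymous links, checking only top-level items for brevity. It is a rough illustration, not a substitute for the SAM and Purview reports discussed below.

```python
# Rough sketch: flag org-wide and anonymous sharing links in one site's
# default document library via Microsoft Graph. Placeholder IDs/secrets.
import msal
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
SITE_ID = "<site-id>"  # e.g. "contoso.sharepoint.com,<guid>,<guid>" (placeholder)

app = msal.ConfidentialClientApplication(
    client_id="<app-client-id>",
    authority="https://login.microsoftonline.com/<tenant-id>",
    client_credential="<client-secret>",
)
token = app.acquire_token_for_client(scopes=["https://graph.microsoft.com/.default"])
headers = {"Authorization": f"Bearer {token['access_token']}"}


def paged(url):
    """Yield items from a Graph collection, following @odata.nextLink paging."""
    while url:
        page = requests.get(url, headers=headers).json()
        yield from page.get("value", [])
        url = page.get("@odata.nextLink")


# Look up the site's default document library, then inspect each top-level
# item's permissions for sharing links scoped to the whole org or to anyone.
drive = requests.get(f"{GRAPH}/sites/{SITE_ID}/drive", headers=headers).json()
for item in paged(f"{GRAPH}/drives/{drive['id']}/root/children"):
    for perm in paged(f"{GRAPH}/drives/{drive['id']}/items/{item['id']}/permissions"):
        link = perm.get("link") or {}
        # "organization" = everyone in the tenant; "anonymous" = anyone with the link.
        if link.get("scope") in ("organization", "anonymous"):
            print(f"{item['name']}: {link.get('scope')} {link.get('type')} link")
```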
To address this, organizations must shift from reactive content oversight to proactive governance and control, so the data driving their AI systems is properly managed, labeled, and secured. In this post, we’ll walk through a structured approach to securing Copilot and agents, with a focus on Microsoft Purview and SharePoint Advanced Management.
Governing AI Starts with Governing Access
According to Gartner, by 2027, 60% of businesses will fail to realize the anticipated value of their AI use cases due to incohesive data frameworks. Managing exposure to generative AI tools begins with governing the data environments AI draws from. This is a cumulative operational challenge, shaped primarily by over-permissioned data access paired with under-enforced internal controls.
To address internal oversharing and strengthen AI governance, Microsoft recommends a structured deployment blueprint: Pilot → Deploy → Operate. This phased approach helps organizations assess risk, take action, and build lasting governance practices without slowing innovation. Below is a quick snapshot:
- Pilot (optional): Deploy Copilot to a subset of users and grant access to up to 100 popular, low-risk sites. Pilots are typically used to validate core Copilot use cases before a larger deployment. This phase helps validate permission controls, surface oversharing issues, and build internal confidence before scaling.
- Deploy: Scale Copilot use across the organization while remediating oversharing risks. Use native tools to secure sensitive data, increase site privacy, and enforce labeling policies. By the end of this phase, Copilot is broadly deployed with oversharing guardrails in place.
- Operate: Establish ongoing governance with automated policies, regular monitoring, and continuous improvement to further reduce risk, secure sensitive data, and improve Copilot responses. As collaboration grows, AI access will remain aligned to business and security intent.
SharePoint Advanced Management: Site governance controls
For many organizations, SharePoint is the hub of file sharing and collaboration across Teams, Outlook, Exchange, and OneDrive. What happens within SharePoint directly determines what Copilot, agents, and your employees can access, summarize, and resurface.
SharePoint Advanced Management (SAM), included with M365 Copilot, equips IT and security leaders with tools to assess, clean up, and lock down sprawling SharePoint sites before Copilot scales. Let’s explore some of the ways SAM enables admins to systematically take control of their sites and drive org-wide readiness for broad Copilot adoption:
- Content Management Assessment: Provides a guided evaluation of your SharePoint environment, surfacing misconfigurations, inactivity, permission issues, and lifecycle risks across the thousands of sites in your tenant. It packages that data into a single, actionable dashboard, with prescriptive actions for quick resolution. This assessment becomes your baseline for governance.
- Site Lifecycle Management: Automatically identifies inactive, ownerless, or uncertified sites and enables scalable remediation actions, such as marking them read-only, archiving them, or prompting owners to take action to maintain access to site content. This tool helps reduce risk by cleaning up content before it can be surfaced by Copilot.
- Oversharing Control with Permission State Reports: Provides detailed, customizable reports of site permissions across Microsoft 365 to uncover oversharing risks like broken inheritance, public links, and excessive group access. The report deduplicates group membership to show the true number of users with access, offering clear visibility into who can see what (a sketch of this deduplication idea follows the list). From there, admins can trigger Site Access Reviews to prompt site owners to clean up permissions or apply controls like Restricted Access and Restricted Content Discovery to contain exposure.
- Restricted Access Control: Empowers admins to lock down a site to a specific set of users, ignoring existing permissions and applying a strict allow list. This gives you a rapid-response option to contain risk while implementing risk remediation measures and maintaining business continuity.
- Restricted Content Discovery (RCD) + Delegation: When immediate action is needed, RCD offers a fast and effective way to block overshared SharePoint sites from Copilot and agent access. With a single setting, organizations can limit the ability of end users to search for files from specific SharePoint sites. To scale oversight, admins can also delegate RCD authority to site owners, enabling a distributed model with shared responsibility.
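As a rough analogue of the deduplication idea behind Permission State Reports, the sketch below expands a set of Microsoft Entra groups transitively through Microsoft Graph and counts distinct users. The group IDs and token are placeholders, and it assumes an app-only token with a permission such as GroupMember.Read.All; SAM computes this for SharePoint site permissions directly, so this only approximates the concept.

```python
# Rough analogue of the deduplicated "true user count" idea: expand Entra
# groups transitively and count distinct users. Placeholder IDs/token.
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
ACCESS_TOKEN = "<app-only Graph token>"  # e.g. acquired with msal as in the earlier sketch
GRANTED_GROUP_IDS = ["<group-id-1>", "<group-id-2>"]  # groups known to have site access

headers = {"Authorization": f"Bearer {ACCESS_TOKEN}"}
unique_users = set()

for group_id in GRANTED_GROUP_IDS:
    url = f"{GRAPH}/groups/{group_id}/transitiveMembers?$select=id,userPrincipalName"
    while url:
        page = requests.get(url, headers=headers).json()
        for member in page.get("value", []):
            # transitiveMembers also returns nested groups and devices; keep users only.
            if member.get("@odata.type") == "#microsoft.graph.user":
                unique_users.add(member["id"])
        url = page.get("@odata.nextLink")

print(f"Distinct users with access via these groups: {len(unique_users)}")
```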
Operationalizing Data Security for Copilot and AI Apps
While SAM helps IT admins assess and clean up sprawling SharePoint sites, Microsoft Purview gives organizations the tools to scale proactive and reactive data protections across all Microsoft 365 apps. Purview can enforce protections, monitor and alert on AI-related data exposure, and build long-term governance.
Organizations can start with Microsoft Purview Data Security Posture Management (DSPM) for AI to assess their current environment and identify potential risks.
DSPM for AI gives security teams:
- Visibility into how Copilot and agents are used in the organization
- Data risk assessments to identify, monitor, and remediate oversharing
- Ready-to-use policies that protect data and prevent data loss in Copilot and agent interactions, ensuring sensitive data is stored, handled, and accessed properly
- Controls to govern the use of Copilot and agents, helping organizations meet the requirements of company policies and regulations
Identify and Remediate Oversharing with Data Risk Assessments
Data Risk Assessments in DSPM for AI provide a targeted, scalable way to identify oversharing risks before Copilot and agent use amplifies them. A default data risk assessment runs weekly against your organization’s 100 most active SharePoint sites. It surfaces potential risks such as the following (a rollup sketch follows the list):
- The number of sensitive files in a site and whether they are protected with sensitivity labels
- Overexposed sharing patterns, including links shared with “anyone,” external domains, or everyone in the organization
- Indicators to understand how a site is being used, such as the number of unique users accessing files, and other detailed access metrics
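As a toy illustration of how per-file signals like these roll up into a site-level view, the snippet below aggregates a few invented findings into the kinds of counts an assessment displays. The records and field names are hypothetical, made up for the example; in practice DSPM for AI computes these metrics for you.

```python
# Toy rollup of hypothetical per-file findings into site-level signals
# similar to what a data risk assessment displays. All data is invented.
from collections import Counter

findings = [
    {"file": "q3-forecast.xlsx", "labeled": False, "link_scope": "organization", "unique_viewers": 41},
    {"file": "offer-letter.docx", "labeled": True, "link_scope": "users", "unique_viewers": 3},
    {"file": "all-hands.pptx", "labeled": False, "link_scope": "anonymous", "unique_viewers": 210},
]

unlabeled = sum(1 for f in findings if not f["labeled"])
overexposed = sum(1 for f in findings if f["link_scope"] in ("organization", "anonymous"))
scope_breakdown = Counter(f["link_scope"] for f in findings)
widest_audience = max(f["unique_viewers"] for f in findings)

print(f"Files without a sensitivity label: {unlabeled}")
print(f"Files shared org-wide or anonymously: {overexposed}")
print(f"Sharing-link scope breakdown: {dict(scope_breakdown)}")
print(f"Widest single-file audience: {widest_audience} unique viewers")
```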
You can also run custom assessments to evaluate different users or specific sites. These are especially useful when preparing Copilot pilots or scaling across business units. Each assessment provides a high-level snapshot of risk, with drill-down views into individual sites. Selecting a site opens a four-tab panel: Overview, Identify, Protect, and Monitor, which guides you through a complete remediation journey.
Take Action: Remediate Oversharing at the Source
Once you’ve identified a high-risk site using Data Risk Assessments, you can view suggestions for remediation and long-term guardrails to reduce exposure and restore control. Key actions include:
- Prevent Copilot and Agent access to specific sensitive files: In the Protect tab, you can apply a Microsoft Purview Data Loss Prevention policy that blocks Copilot and agents from processing or referencing a file or email with specific sensitivity labels. For example, some regulations require that organizations block personally identifiable information (PII) from use in AI.
- Prevent Copilot and Agent access to the entire site: Use SharePoint’s Restricted Content Discovery (RCD) feature to block the entire site from Copilot or Agent processing regardless of access permissions. Organizations might use this feature temporarily for high-risk projects or data to allow time for more detailed remediation on files.
- Automatically enforce file protections: When sensitive information is detected in unlabeled files, use Microsoft Purview Information Protection to create auto-labeling policies that automatically apply protections, such as encryption, to control file access. This helps close gaps in governance without relying solely on manual tagging.
- Eliminate stale content with Data Lifecycle Policies: Use Microsoft Purview Data Lifecycle Management to create retention policies that automatically delete files that haven’t been modified recently. Reducing data sprawl shrinks the risk surface and lowers the chance of irrelevant or outdated data being surfaced by Copilot and agents (a preview sketch follows this list).
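Retention policies themselves are configured in Purview, but it can help to preview which files a “delete if not modified in N days” policy would touch. The sketch below lists stale items in one document library via Microsoft Graph, assuming an app-only token with a permission such as Files.Read.All; the drive ID, token, and 365-day cutoff are placeholders, and only top-level items are checked for brevity.

```python
# Rough preview of stale files (not modified within the cutoff) in one
# document library. Placeholder drive ID and token; top level only.
from datetime import datetime, timedelta, timezone
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
DRIVE_ID = "<drive-id>"                  # placeholder
ACCESS_TOKEN = "<app-only Graph token>"  # e.g. acquired with msal

headers = {"Authorization": f"Bearer {ACCESS_TOKEN}"}
cutoff = datetime.now(timezone.utc) - timedelta(days=365)

url = f"{GRAPH}/drives/{DRIVE_ID}/root/children?$select=name,lastModifiedDateTime"
while url:
    page = requests.get(url, headers=headers).json()
    for item in page.get("value", []):
        # Graph returns ISO 8601 timestamps ending in "Z"; normalize for fromisoformat.
        modified = datetime.fromisoformat(item["lastModifiedDateTime"].replace("Z", "+00:00"))
        if modified < cutoff:
            print(f"Stale: {item['name']} (last modified {item['lastModifiedDateTime']})")
    url = page.get("@odata.nextLink")
```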
Read here for an in-depth blog post on deploying Microsoft Purview DSPM for AI to secure your AI apps.
Safeguarding Your Data with a Unified Approach
With the right controls, your data becomes an asset, not a liability. By combining SharePoint Advanced Management’s site governance controls with Microsoft Purview’s proactive protection and enforcement, organizations can take a comprehensive approach to governing Copilot and agent use, enabling safe, secure innovation.
Ready to take control of AI access?
Tune in to our Copilot Control System Digital Deep Dive segment to dive deeper into protecting sensitive data, governing discovery, and enabling secure Copilot and agent use across Microsoft 365: Secure Microsoft 365 Copilot and agents | Digital Deep Dive: Copilot Control System