ediscovery
20 TopicsSearch and Purge using Microsoft Graph eDiscovery API
Welcome back to the series of blogs covering search and purge in Microsoft Purview eDiscovery! If you are new to this series, please first visit the blog post in our series that you can find here: Search and Purge workflow in the new modern eDiscovery experience Also, please ensure you have fully read the Microsoft Learn documentation on this topic as I will not be covering some of the steps in full (permissions, releasing holds, all limitations): Find and delete Microsoft Teams chat messages in eDiscovery | Microsoft Learn So as a reminder, for E5/G5 customers and cases with premium features enabled- you must use the Graph API to execute the purge operation. With the eDiscovery Graph API, you have the option to create the case, create a search, generate statistics, create an item report and issue the purge command all from the Graph API. It is also possible to use the Purview Portal to create the case, create the search, generate statistics/samples and generate the item report. However, the final validation of the items that would be purged by rerunning the statistics operation and issuing the purge command must be run via the Graph API. In this post, we will take a look at two examples, one involving an email message and one involving a Teams message. I will also look to show how to call the graph APIs. Purging email messages via the Graph API In this example, I want to purge the following email incorrectly sent to Debra Berger. I also want to remove it from the sender's mailbox as well. Let’s assume in this example I do not know exactly who sent and received the email, but I do know the subject and date it was sent on. In this example, I am going to use the Modern eDiscovery Purview experience to create a new case where I will undertake some initial searches to locate the item. Once the case is created, I will Create a search and give it a name. In this example, I do not know all the mailboxes where the email is present, so my initial search is going to be a tenant wide search of all Exchange mailboxes, using the subject and date range as conditions to see which locations have hits. Note: For scenarios where you know the location of the items there is no requirement to do a tenant wide search. You can target the search to the know locations instead. I will then select Run Query and trigger a Statistics job to see which locations in the tenant have hits. For our purposes, we do not need to select Include categories, Include query keywords report or Include partially indexed items. This will trigger a Generate statistics job and take you to the Statistics tab of the search. Once the job completes it will display information on the total matches and number of locations with hits. To find out exactly which locations have hits, I can use the improved process reports to review more granular detail on the locations with hits. The report for the Generate statistics job can be found by selecting Process manager and then selecting the job. Once displayed I can download the reports associated with this process by selecting Download report. Once we have downloaded the report for the process, we get a ZIP file containing four different reports, to understand where I had hits I can review the Locations report within the zip file. If I open the locations report and filter on the count column I can see in this instance I have two locations with hits, Admin and DebraB. I will use this to make my original search more targeted. It also gives me an opportunity to check that I am not going to exceed the limits on the number of items I can target for the purge per execution. Returning to our original search I will remove All people and groups from my Data Sources and replace it with the two locations I had hits from. I will re-run my Generate Statistics job to ensure I am still getting the expected results. As the numbers align and remain consistent, I will do a further check and generate samples from the search. This will allow me to review the items to confirm that they are the items I wish to purge. From the search query I select Run query and select Sample. This will trigger a Generate sample job and take you to the Sample tab of the search. Once complete, I can review samples of the items returned by the search to confirm if these items are the items I want to purge. Now that I have confirmed, based on the sampling, that I have the items I want to purge I want to generate a detailed item report of all items that are a match for my search. To do this I need to generate an export report for the search. Note: Sampling alone may not return all the results impacted by the search, it only returns a sample of the items that match the query. To determine the full set of items that will be targeted we need to generate the export report. From the Search I can select Export to perform a direct export without having to add the data to a review set (available when premium features are enabled). Ensure to configure the following options on the export: Indexed items that match your search query Unselect all the options under Messages and related items from mailboxes and Exchange Online Export Item report only If you want to manually review the items that would be impacted by the purge operation you can optionally export the items alongside the items report for further review. You can also add the search to a review set to review the items that you are targeting. The benefit of adding to the review set is that it enables to you review the items whilst still keeping the data within the M365 service boundary. Note: If you add to a review set, a copy of the items will remain in the review set until the case is deleted. I can review the progress of the export job and download the report via the Process Manager. Once I have downloaded the report, I can review the Items.csv file to check the items targeted by the search. It is at this stage I must switch to using the Graph APIs to validate the actions that will be taken by the purge command and to issue the purge command itself. Not undertaking these additional validation steps can result in un-intended purge of data. There are two approaches you can use to interact with the Microsoft Graph eDiscovery APIs: Via Graph Explorer Via the MS.Graph PS module For this example, I will show how to use the Graph Explorer to make the relevant Graph API calls. For the Teams example, I will use the MS.Graph PS Module. We are going to use the APIs to complete the following steps: Trigger a statistics job via the API and review the results Trigger the purge command The Graph Explorer can be accessed via the following link: Graph Explorer | Try Microsoft Graph APIs - Microsoft Graph To start using the Graph Explorer to work with Microsoft Graph eDiscovery APIs you first need to sign in with your admin account. You need to ensure that you consent to the required Microsoft Graph eDiscovery API permissions by selecting Consent to permissions. From the Permissions flyout search for eDiscovery and select Consent for eDiscovery.ReadWrite.All. When prompted to consent to the permissions for the Graph Explorer select Accept. Optionally you can consent on behalf of your organisation to suppress this step for others. Once complete we can start making calls to the APIs via Graph Explorer. To undertake the next steps we need to capture some additional information, specifically the Case ID and the Search ID. We can get the case ID from the Case Settings in the Purview Portal, recording the Id value shown on the Case details pane. If we return to the Graph Explorer we can use this CaseID to see all the searches within an eDiscovery case. The structure of the HTTPS call is as follows: GET https://graph.microsoft.com/v1.0/security/cases/ediscoveryCases/<caseID>/searches List searches - Microsoft Graph v1.0 | Microsoft Learn If we replace <caseID> with the Id we captured from the case settings we can issue the API call to see all the searches within the case to find the required search ID. When you issue the GET request in Graph Explorer you can review the Response preview to find the search ID we are looking for. Now that we have the case ID and the Search ID we can trigger an estimate by using the following Graph API call. POST https://graph.microsoft.com/v1.0/security/cases/ediscoveryCases/{ediscoveryCaseId}/searches/{ediscoverySearchId}/estimateStatistics ediscoverySearch: estimateStatistics - Microsoft Graph v1.0 | Microsoft Learn Once you issue the POST command you will be returned with an Accepted – 202 message. Now I need to use the following REST API call to review the status of the Estimate Statistics job in Graph Explorer. GET https://graph.microsoft.com/v1.0/security/cases/ediscoveryCases/{ediscoveryCaseId}/searches/{ediscoverySearchId}/lastEstimateStatisticsOperation List lastEstimateStatisticsOperation - Microsoft Graph v1.0 | Microsoft Learn If the estimates job is not complete when you run the GET command the Response preview contents will show the status as running. If the estimates job is complete when you run the GET command the Response preview contents will show you the results of the estimates job. CRITICAL: Ensure that the indexedItemCount matches the items returned in the item report generated via the Portal. If this does not match do not proceed to issuing the purge command. Now that I have validated everything, I am ready to issue the purge command via the Graph API. I will use the following Graph API call. POST https://graph.microsoft.com/v1.0/security/cases/ediscoveryCases/{ediscoveryCaseId}/searches/{ediscoverySearchId}/purgeData ediscoverySearch: purgeData - Microsoft Graph v1.0 | Microsoft Learn With this POST command we also need to provide a Request Body to tell the API which areas we want to target (mailboxes or teamsMessages) and the purge type (recoverable, permantlyDelete). As we are targeting email items I will use mailboxes as the PurgeAreas option. As I only want to remove the item from the user’s mailbox view I am going to use recoverable as the PurgeType. { "purgeType": "recoverable", "purgeAreas": "mailboxes" } Once you issue the POST command you will be returned with an Accepted – 202 message. Once the command has been issued it will proceed to purge the items that match the search criteria from the locations targeted. If I go back to my original example, we can now see the item has been removed from the users mailbox. As it has been soft deleted I can review the recoverable items folder from Outlook on the Web where I will see that for the user, it has now been deleted pending clean-up from their mailbox. Purging Teams messages via the Graph API In this example, I want to purge the following Teams conversation between Debra, Adele and the admin (CDX) from all participants Teams client. I am going to reuse the “HK016 – Search and Purge” case to create a new search called “Teams conversation removal”. I add three participants of the chat as Data sources to the search, I am then going to use the KeyQL condition to target the items I want to remove. In this example I am using the following KeyQL. (Participants=AdeleV@M365x00001337.OnMicrosoft.com AND Participants=DebraB@M365x00001337.OnMicrosoft.com AND Participants=admin@M365x00001337.onmicrosoft.com) AND (Kind=im OR Kind=microsoftteams) AND (Date=2025-06-04) This is looking for all Teams messages that contain all three participants sent on the 4 th of June 2025. It is critical when targeting Teams messages that I ensure my query targets exactly the items that I want to purge. With Teams messages (opposed to email items) there are less options available that enable us to granularly target the team items for purging. Note: The use of the new Identifier condition is not supported for purge options. Use of this can lead to unintended data to be removed and should not be used as a condition in the search at this time. If I was to be looking for a very specific phrase, I could further refine the query by using the Keyword condition to look for that specific Teams message. Once I have created my search I am ready to generate both Statistics and Samples to enable me to validate I am targeting the right items for my search. My statistics job has returned 21 items, 7 from each location targeted. This aligns with the number of items within the Teams conversation. However, I am going to also validate that the samples I have generated match the content I want to purge, ensuring that I haven’t inadvertently returned additional items I was not expecting. Now that I have confirmed, based on the sampling, that the sample of items returned look to be correct I want to generate a detailed item report of all items that are a match for my search. To do this I need to generate an export report for the search. From the Search I can select Export to perform a direct export without having to add the data to a review set (available when premium features are enabled). Ensure to configure the following options on the export: Indexed items that match your search query Unselect all the options under Messages and related items from mailboxes and Exchange Online Export Item report only Once I select Export it will create a new export job, I can review the progress of the job and download the report via the Process Manager. Once I have downloaded the report, I can review the Items.csv file to check the items targeted by the search and that would be purged when I issue the purge call. Now that I have confirmed that the search is targeting the items I want to purge it is at this stage I must switch to using the Graph APIs. As discussed, there are two approaches you can use to interact with the Microsoft Graph eDiscovery APIs: Using Graph Explorer Using the MS.Graph PS module For this example, I will show how to use the MS.Graph PS Module to make the relevant Graph API calls. To understand how to use the Graph Explorer to issue the purge command please refer to the previous example for purging email messages. We are going to use the APIs to complete the following steps: Trigger a statistics job via the API and review the results Trigger the purge command To install the MS.Graph PowerShell module please refer to the following article. Install the Microsoft Graph PowerShell SDK | Microsoft Learn To understand more about the MS.Graph PS module and how to get started you can review the following article. Get started with the Microsoft Graph PowerShell SDK | Microsoft Learn Once the PowerShell module is installed you can connect to the eDiscovery Graph APIs by running the following command. connect-mgGraph -Scopes "ediscovery.ReadWrite.All" You will be prompted to authenticate, once complete you will be presented with the following banner. To undertake the next steps we need to capture some additional information, specifically the Case ID and the Search ID. As before we can get the case ID from the Case Settings in the Purview Portal, recording the Id value shown on the Case details pane. Alternatively we can use the following PowerShell command to find a list of cases and their ID. get-MgSecurityCaseEdiscoveryCase | ft displayname,id List ediscoveryCases - Microsoft Graph v1.0 | Microsoft Learn Once we have the ID of the case we want to execute the purge command from, we can run the following command to find the IDs of all the search jobs in the case. Get-MgSecurityCaseEdiscoveryCaseSearch -EdiscoveryCaseId <ediscoveryCaseId> | ft displayname,id,ContentQuery List searches - Microsoft Graph v1.0 | Microsoft Learn Now that we have both the Case ID and the Search ID we can trigger the generate statistics job using the following command. Invoke-MgEstimateSecurityCaseEdiscoveryCaseSearchStatistics -EdiscoveryCaseId <ediscoveryCaseId> -EdiscoverySearchId <ediscoverySearchId> ediscoverySearch: estimateStatistics - Microsoft Graph v1.0 | Microsoft Learn Now I need to use the following command to review the status of the Estimate Statistics job. Get-MgSecurityCaseEdiscoveryCaseSearchLastEstimateStatisticsOperation -EdiscoveryCaseID <ediscoveryCaseId> -EdiscoverySearchId <ediscoverySearchId> List lastEstimateStatisticsOperation - Microsoft Graph v1.0 | Microsoft Learn If the estimates job is not complete when you run the command the status will show as running. If the estimates job is complete when you run the command status will show as succeeded and will also show the number of hits in the IndexItemCount. CRITICAL: Ensure that the indexedItemCount matches the items returned in the item report generated via the Portal. If this does not match do not proceed to issuing the purge command. Now that I have validated everything I am ready to issue the purge command via the Graph API. With this command we need to provide a Request Body to tell the API which areas we want to target (mailboxes or teamsMessages) and the purge type (recoverable, permantlyDelete). As we are targeting teams items I will use teamsMessages as the PurgeAreas option. Note: If you specify mailboxes then only the compliance copy stored in the user mailbox will be purged and not the item from the teams services itself. This will mean the item will remain visible to the user in Teams and can no longer be purged. When purgeType is set to either recoverable or permanentlyDelete and purgeAreas is set to teamsMessages, the Teams messages are permanently deleted. In other words either option will result in the permanent deletion of the items from Teams and they cannot be recovered. $params = @{ purgeType = "recoverable" purgeAreas = "teamsMessages" } Once I have prepared my request body I will issue the following command. Clear-MgSecurityCaseEdiscoveryCaseSearchData -EdiscoveryCaseId $ediscoveryCaseId -EdiscoverySearchId $ediscoverySearchId -BodyParameter $params ediscoverySearch: purgeData - Microsoft Graph v1.0 | Microsoft Learn Once the command has been issued it will proceed to purge the items that match the search criteria from the locations targeted. If I go back to my original example, we can now see the items has been removed from Teams. Congratulations, you have made it to the end of the blog post. Hopefully you found it useful and it assists you to build your own operational processes for using the Graph API to issue search and purge actions.Search and Purge workflow in the new modern eDiscovery experience
With the retirement of Content Search (Classic) and eDiscovery Standard (Classic) in May, and alongside the future retirement of eDiscovery Premium (Classic) in August, organizations may be wondering how this will impact their existing search and purge workflow. The good news is that it will not impact your organizations ability to search for and purge email, Teams and M365 Copilot messages; however there are some additional points to be careful about when working with purge with cmdlet and Graph alongside of the modern eDiscovery experience. We have made some recent updates to our documentation regarding this topic to reflect the changes in the new modern eDiscovery experience. These can be found below and you should ensure that you read them in full as they are packed with important information on the process. Find and delete email messages in eDiscovery | Microsoft Learn Find and delete Microsoft Teams chat messages in eDiscovery | Microsoft Learn Search for and delete Copilot data in eDiscovery | Microsoft Learn The intention of this first blog post in the series is to cover the high-level points including some best practices when it comes to running search and purge operations using Microsoft Purview eDiscovery. Please stay tuned for further blog posts intended to provide more detailed step-by-step of the following search and purge scenarios: Search and Purge email and Teams messages using Microsoft Graph eDiscovery APIs Search and Purge email messages using the Security and Compliance PowerShell cmdlets I will update this blog post with the subsequent links to the follow-on posts in this series. So let’s start by looking at the two methods available to issue a purge command with Microsoft Purview eDiscovery, they are the Microsoft Graph eDiscovery APIs or the Security and Compliance PowerShell cmdlets. What licenses you have dictates which options are available to you and what type of items you can be purge from Microsoft 365 workloads. For E3/G3 customers and cases which have the premium features disabled You can only use the PowerShell cmdlets to issue the purge command You should only purge email items from mailboxes and not Teams messages You are limited to deleting 10 items per location with a purge command For E5/G5 customers and cases which have the premium features enabled You can only use the Graph API to issue the purge command You can purge email items and Teams messages You can delete up to 100 items per location with a purge command To undertake a search and then purge you must have the correct permissions assigned to your account. There are two key Purview Roles that you must be assigned, they are: Compliance Search: This role lets users run the Content Search tool in the Microsoft Purview portal to search mailboxes and public folders, SharePoint Online sites, OneDrive for Business sites, Skype for Business conversations, Microsoft 365 groups, and Microsoft Teams, and Viva Engage groups. This role allows a user to get an estimate of the search results and create export reports, but other roles are needed to initiate content search actions such as previewing, exporting, or deleting search results. Search and Purge: This role lets users perform bulk removal of data matching the criteria of a search. To learn more about permissions in eDiscovery, along with the different eDiscovery Purview Roles, please refer to the following Microsoft Learn article: Assign permissions in eDiscovery | Microsoft Learn By default, eDiscovery Manager and eDiscovery Administrators have the “Compliance Search” role assigned. For search and purge, only the Organization Management Purview Role group has the role assigned by default. However, this is a highly privileged Purview Role group and customers should considering using a custom role group to assign the Search and Purge Purview role to authorised administrators. Details on how to create a custom role group in Purview can be found in the following article. Permissions in the Microsoft Purview portal | Microsoft Learn It is also important to consider the impact of any retention policies or legal holds will have when attempting to purge email items from a mailbox where you want to hard delete the items and remove it completely from the mailbox. When a retention policy or legal hold is applied to a mailbox, email items that are hard deleted via the purge process are moved and retained in the Recoverable Items folder of the mailbox. There purged items will be retained until such time as all holds are lifted and until the retention period defined in the retention policy has expired. It is important to note that items retained in the Recoverable Items folder are not visible to users but are returned in eDiscovery searches. For some search and purge use cases this is not a concern; if the primary goal is to remove the item from the user’s view then additional steps are required. However if the goal is to completely remove the email item from the mailbox in Exchange Online so it doesn't appear in the user’s view and is not returned by future eDiscovery searches then additional steps are required. They are: Disable client access to the mailbox Modify retention settings on the mailbox Disable the Exchange Online Managed Folder Assistant for the mailbox Remove all legal holds and retention policies from the mailbox Perform the search and purge operation Revert the mailbox to its previous state These steps should be carefully followed as any mistake could result in additional data that is being retained being permanently deleted from the service. The full detailed steps can be found in the following article. Delete items in the Recoverable Items folder mailboxes on hold in eDiscovery | Microsoft Learn Now for some best practice when running search and purge operations: Where possible target the specific locations containing the items you wish to purge and avoid tenant wide searches where possible If a tenant wide search is used to initially locate the items, once the locations containing the items are known modify the search to target the specific locations and rerun the steps Always validate the item report against the statistics prior to issuing the purge command to ensure you are only purging items you intend to remove If the item counts do not align then do not proceed with the purge command Ensure admins undertaking search and purge operations are appropriately trained and equipped with up-to-date guidance/process on how to safely execute the purge process The search conditions Identifier, Sensitivity Label and Sensitive Information Type do not support purge operations and if used can cause un-intended results Organizations with E5/G5 licenses should also take this opportunity to review if other Microsoft Purview and Defender offerings can help them achieve the same outcomes. When considering the right approach/tool to meet your desired outcomes you should become familiar with the following additional options for removing email items: Priority Clean-up (link): Use the Priority cleanup feature under Data Lifecycle Management in Microsoft Purview when you need to expedite the permanent deletion of sensitive content from Exchange mailboxes, overriding any existing retention settings or eDiscovery holds. This process might be implemented for security or privacy in response to an incident, or for compliance with regulatory requirements. Threat Explorer (link): Threat Explorer in Microsoft Defender for Office 365 is a powerful tool that enables security teams to investigate and remediate malicious emails in near real-time. It allows users to search for and filter email messages based on various criteria - such as sender, recipient, subject, or threat type - and take direct actions like soft delete, hard delete, or moving messages to junk or deleted folders. For manual remediation, Threat Explorer supports actions on emails delivered within the past 30 days In my next posts I will be delving further into how to use both the Graph APIs and the Security and Compliance PowerShell module to safely execute your purge commands.Getting started with the eDiscovery APIs
The Microsoft Purview APIs for eDiscovery in Microsoft Graph enable organizations to automate repetitive tasks and integrate with their existing eDiscovery tools to build repeatable workflows that industry regulations might require. Before you can make any calls to the Microsoft Purview APIs for eDiscovery you must first register an app in the Microsoft’s Identity Platform, Entra ID. An app can access data in two ways: Delegated Access: an app acting on behalf of a signed-in user App-only access: an app action with its own identity For more information on access scenarios see Authentication and authorization basics. This article will demonstrate how to configure the required pre-requisites to enable access to the Microsoft Purview APIs for eDiscovery. This will based on using app-only access to the APIs, using either a client secret or a self-signed certificate to authenticate the requests. The Microsoft Purview APIs for eDiscovery have two separate APIs, they are: Microsoft Graph: Part of the Microsoft.Graph.Security namespace and used for working with Microsoft Purview eDiscovery Cases. MicrosoftPurviewEDiscovery: Used exclusively to download programmatically the export package created by a Microsoft Purview eDiscovery Export job. Currently, the eDiscovery APIs in Microsoft Graph only work with eDiscovery (Premium) cases. For a list of supported API calls within the Microsoft Graph calls, see Use the Microsoft Purview eDiscovery API. Microsoft Graph API Pre-requisites Implementing app-only access involves registering an app in Azure portal, creating client secret/certificates, assigning API permissions, setting up a service principal, and then using app-only access to call Microsoft Graph APIs. To register an app, create client secret/certificates and assign API permissions the account must be at least a Cloud Application Administrator. For more information on registering an app in the Azure portal, see Register an application with the Microsoft identity platform. Granting tenant-wide admin consent for Microsoft Purview eDiscovery API application permissions requires you to sign in as a user that is authorized to consent on behalf of the organization, see Grant tenant-wide admin consent to an application. Setting up a service principal requires the following pre-requisites: A machine with the ExchangeOnlineManagement module installed An account that has the Role Management role assigned in Microsoft Purview, see Roles and role groups in Microsoft Defender for Office 365 and Microsoft Purview Configuration steps For detailed steps on implementing app-only access for Microsoft Purview eDiscovery, see Set up app-only access for Microsoft Purview eDiscovery. Connecting to Microsoft Graph API using app-only access Use the Connect-MgGraph cmdlet in PowerShell to authenticate and connect to Microsoft Graph using the app-only access method. This cmdlets enables your app to interact with Microsoft Graph securely and enables you to explore the Microsoft Purview eDiscovery APIs. Connecting via client secret To connect using a client secret, update and run the following example PowerShell code. $clientSecret = "<client secret>" ## Update with client secret added to the registered app $appID = "<APP ID>" ## Update with Application ID of registered/Enterprise app $tenantId = "<Tenant ID>" ## Update with tenant ID $ClientSecretPW = ConvertTo-SecureString "$clientSecret" -AsPlainText -Force $clientSecretCred = New-Object System.Management.Automation.PSCredential -ArgumentList ("$appID", $clientSecretPW) Connect-MgGraph -TenantId "$tenantId" -ClientSecretCredential $clientSecretCred Connecting via certificate To connect using a certificate, update and run the following example PowerShell code. $certPath = "Cert:\currentuser\my\<xxxxxxxxxx>" ## Update with the cert thumbnail $appID = "<APP ID>" ## Update with Application ID of registered/Enterprise app $tenantId = "<Tenant ID>" ## Update with tenant ID $ClientCert = Get-ChildItem $certPath Connect-MgGraph -TenantId $TenantId -ClientId $appId -Certificate $ClientCert Invoke Microsoft Graph API calls Once connected you can start making calls to the Microsoft Graph API. For example, lets look at listing the eDiscovery cases within the tenant, see List ediscoveryCases. Within the documentation, for each operation it will list the following information: Permissions required to make the API call HTTP request and method Request header and body information Response Examples (HTTP, C#, CLI, Go, Java, PHP, PowerShell, Python) As we are connected via the Microsoft Graph PowerShell module we can either use the HTTP or the eDiscovery specific cmdlets within the Microsoft Graph PowerShell module. First let’s look at the PowerShell cmdlet example. As you can see it returns a list of all the cases within the tenant. When delving deeper into a case it is important to record the Case ID as you will use this in future calls. Then we can look at the HTTP example, we will use the Invoke-MgGraphRequest cmdlet to make the call via PowerShell. First we need to store the URL in a variable as below. $uri = "https://graph.microsoft.com/v1.0/security/cases/ediscoveryCases" Then we will use the Invoke-MgGraphRequest cmdlet to make the API call. Invoke-MgGraphRequest -Method Get -Uri $uri As you can see from the output below, we need to extract the values from the returned response. This can be done by saving the Value elements of the response to a new variable using the following command. $cases = (Invoke-MgGraphRequest -Method Get -Uri $uri).value This returns a collection of Hashtables; optionally you can run a small bit of PowerShell code to convert the hash tables into PS Objects for easier use with cmdlets such as format-table and format-list. $CasesAsObjects = @() foreach($i in $cases) {$CasesAsObjects += [pscustomobject]$i} MicrosoftPurviewEDiscovery API You can also configure the MicrosoftPurviewEDiscovery API to enable the programmatic download of export packages and the item report from an export job in a Microsoft Purview eDiscovery case. Pre-requisites Prior to executing the configuration steps in this section it is assumed that you have completed and validated the configuration detailed in the Microsoft Graph API section. The previously registered app in Entra ID will be extended to include the required permissions to achieve programmatic download of the export package. This already provides the following pre-requisites: Registered App in Azure portal configured with the appropriate client secret/certificate Service principal in Microsoft Purview assigned the relevant eDiscovery roles Microsoft eDiscovery API permissions configured for the Microsoft Graph To extend the existing registered apps API permissions to enable programmatic download, the following steps must be completed Registering a new Microsoft Application and service principal in the tenant Assign additional API permissions to the previously registered app in the Azure Portal Granting tenant-wide admin consent for Microsoft Purview eDiscovery APIs application permissions requires you to sign in as a user that is authorized to consent on behalf of the organization, see Grant tenant-wide admin consent to an application. Configuration steps Step 1 – Register the MicrosoftPurviewEDiscovery app in Entra ID First validate that the MicrosoftPurviewEDiscovery app is not already registered by logging into the Azure Portal and browsing to Microsoft Entra ID > Enterprise Applications. Change the application type filter to show Microsoft Applications and in the search box enter MicrosoftPurviewEDiscovery. If this returns a result as below, move to step 2. If the search returns no results as per the example below, proceed with registering the app in Entra ID. The Microsoft.Graph PowerShell Module can be used to register the MicrosoftPurviewEDiscovery App in Entra ID, see Install the Microsoft Graph PowerShell SDK. Once installed on a machine, run the following cmdlet to connect to the Microsoft Graph via PowerShell. Connect-MgGraph -scopes "Application.ReadWrite.All" If this is the first time using the Microsoft.Graph PowerShell cmdlets you may be prompted to consent to the following permissions. To register the MicrosoftPurviewEDiscovery app, run the following PowerShell commands. $spId = @{"AppId" = "b26e684c-5068-4120-a679-64a5d2c909d9" } New-MgServicePrincipal -BodyParameter $spId; Step 2 – Assign additional MicrosoftPurviewEDiscovery permissions to the registered app Now that the Service Principal has been added you can update the permissions on your previously registered app created in the Microsoft Graph API section of this document. Log into the Azure Portal and browse to Microsoft Entra ID > App Registrations. Find and select the app you created in the Microsoft Graph API section of this document. Select API Permissions from the navigation menu. Select Add a permission and then APIs my organization uses. Search for MicrosoftPurviewEDiscovery and select it. Then select Application Permissions and select the tick box for eDiscovery.Download.Read before selecting Add Permissions. You will be returned to the API permissions screen, now you must select Grant Admin Consent.. to approve the newly added permissions. User.Read Microsoft Graph API permissions have been added and admin consent granted. It also shows that the eDiscovery.Download.Read MicrosoftPurviewEDiscovery API application permissions have been added but admin consent has not yet been granted. Once admin consent is granted you will see the Status of the newly added permissions update to Granted for... Downloading the export packages and reports Retrieving the case ID and export Job ID To successfully download the export packages and reports of an export job in an eDiscovery case, you must first retrieve the case ID and the operation/job ID for the export job. To gather this information via the Purview Portal you can open the eDiscovery Case, locate the export job and select Copy support information before pasting this information into Notepad. , case ID, job ID, job state, created by, created timestamp, completed timestamp and support information generation time. To access this information programmatically you can make the following Graph API calls to locate the case ID and the job ID you wish to export. First connect to the Microsoft Graph using the steps detailed in the previous section titled "Connecting to Microsoft Graph API using app-only access" Using the eDiscovery Graph PowerShell Cmdlets you can use the following command if you know the case name. Get-MgSecurityCaseEdiscoveryCase | where {$_.displayname -eq "<Name of case>"} Once you have the case ID you can look up the operations in the case to identify the job ID for the export using the following command. Get-MgSecurityCaseEdiscoveryCaseOperation -EdiscoveryCaseId "<case ID>" Export jobs will either be logged under an action of exportResult (direct export) or ContentExport (export from review set). The name of the export jobs are not returned by this API call, to find the name of the export job you must query the specific operation ID. This can be achieved using the following command. Get-MgSecurityCaseEdiscoveryCaseOperation -EdiscoveryCaseId "<case ID>" -CaseOperationId “<operation ID>” The name of the export operation is contained within the property AdditionalProperties. If you wish to make the HTTP API calls directly to list cases in the tenant, see List ediscoveryCases - Microsoft Graph v1.0 | Microsoft Learn. If you wish to make the HTTP API calls directly to list the operations for a case, see List caseOperations - Microsoft Graph v1.0 | Microsoft Learn. You will need to use the Case ID in the API call to indicate which case you wish to list the operations from. For example: https://graph.microsoft.com/v1.0/security/cases/ediscoveryCases/<CaseID>/operations/ The name of the export jobs are not returned with this API call, to find the name of the export job you must query the specific job ID. For example: https://graph.microsoft.com/v1.0/security/cases/ediscoveryCases/<CaseID>/operations/<OperationID> Downloading the Export Package Retrieving the download URLs for export packages The URL required to download the export packages and reports are contained within a property called exportFileMetaData. To retrieve this information we need to know the case ID of the eDiscovery case that the export job was run in, as well as the operation ID for the export job. Using the eDiscovery Graph PowerShell Cmdlets you can retrieve this property use the following commands. $Operation = Get-MgSecurityCaseEdiscoveryCaseOperation -EdiscoveryCaseId "<case ID>" -CaseOperationId “<operation ID>” $Operation.AdditionalProperties.exportFileMetadata If you wish to make the HTTP API calls directly to return the exportFileMetaData for an operation, see List caseOperations - Microsoft Graph v1.0 | Microsoft Learn. For each export package visible in the Microsoft Purview Portal there will be an entry in the exportFileMetaData property. Each entry will list the following: The export package file name The downloadUrl to retrieve the export package The size of the export package Example scripts to download the Export Package As the MicrosoftPurviewEDiscovery API is separate to the Microsoft Graph API, it requires a separate authentication token to authorise the download request. As a result, you must use the MSAL.PS PowerShell Module and the Get-MSALToken cmdlet to acquire a separate token in addition to connecting to the Microsoft Graph APIs via the Connect-MgGraph cmdlet. The following example scripts can be used to as a reference when developing your own scripts to enable the programmatic download of the export packages. Connecting with a client secret If you have configured your app to use a client secret, then you can use the following example script for reference to download the export package and reports programmatically. Copy the contents into notepad and save it as DownloadExportUsingApp.ps1. [CmdletBinding()] param ( [Parameter(Mandatory = $true)] [string]$tenantId, [Parameter(Mandatory = $true)] [string]$appId, [Parameter(Mandatory = $true)] [string]$appSecret, [Parameter(Mandatory = $true)] [string]$caseId, [Parameter(Mandatory = $true)] [string]$exportId, [Parameter(Mandatory = $true)] [string]$path = "D:\Temp", [ValidateSet($null, 'USGov', 'USGovDoD')] [string]$environment = $null ) if (-not(Get-Module -Name Microsoft.Graph -ListAvailable)) { Write-Host "Installing Microsoft.Graph module" Install-Module Microsoft.Graph -Scope CurrentUser } if (-not(Get-Module -Name MSAL.PS -ListAvailable)) { Write-Host "Installing MSAL.PS module" Install-Module MSAL.PS -Scope CurrentUser } $password = ConvertTo-SecureString $appSecret -AsPlainText -Force $clientSecretCred = New-Object System.Management.Automation.PSCredential -ArgumentList ($appId, $password) if (-not(Get-MgContext)) { Write-Host "Connect with credentials of a ediscovery admin (token for graph)" if (-not($environment)) { Connect-MgGraph -TenantId $TenantId -ClientSecretCredential $clientSecretCred } else { Connect-MgGraph -TenantId $TenantId -ClientSecretCredential $clientSecretCred -Environment $environment } } Write-Host "Connect with credentials of a ediscovery admin (token for export)" $exportToken = Get-MsalToken -ClientId $appId -Scopes "b26e684c-5068-4120-a679-64a5d2c909d9/.default" -TenantId $tenantId -RedirectUri "http://localhost" -ClientSecret $password $uri = "/v1.0/security/cases/ediscoveryCases/$($caseId)/operations/$($exportId)" $export = Invoke-MgGraphRequest -Uri $uri; if (-not($export)){ Write-Host "Export not found" exit } else{ $export.exportFileMetadata | % { Write-Host "Downloading $($_.fileName)" Invoke-WebRequest -Uri $_.downloadUrl -OutFile "$($path)\$($_.fileName)" -Headers @{"Authorization" = "Bearer $($exportToken.AccessToken)"; "X-AllowWithAADToken" = "true" } } } Once saved, open a new PowerShell windows which has the following PowerShell Modules installed: Microsoft.Graph MSAL.PS Browse to the directory you have saved the script and issue the following command. .\DownloadExportUsingApp.ps1 -tenantId “<tenant ID>” -appId “<App ID>” -appSecret “<Client Secret>” -caseId “<CaseID>” -exportId “<ExportID>” -path “<Output Path>” Review the folder which you have specified as the Path to view the downloaded files. Connecting with a certificate If you have configured your app to use a certificate then you can use the following example script for reference to download the export package and reports programmatically. Copy the contents into notepad and save it as DownloadExportUsingAppCert.ps1. [CmdletBinding()] param ( [Parameter(Mandatory = $true)] [string]$tenantId, [Parameter(Mandatory = $true)] [string]$appId, [Parameter(Mandatory = $true)] [String]$certPath, [Parameter(Mandatory = $true)] [string]$caseId, [Parameter(Mandatory = $true)] [string]$exportId, [Parameter(Mandatory = $true)] [string]$path = "D:\Temp", [ValidateSet($null, 'USGov', 'USGovDoD')] [string]$environment = $null ) if (-not(Get-Module -Name Microsoft.Graph -ListAvailable)) { Write-Host "Installing Microsoft.Graph module" Install-Module Microsoft.Graph -Scope CurrentUser } if (-not(Get-Module -Name MSAL.PS -ListAvailable)) { Write-Host "Installing MSAL.PS module" Install-Module MSAL.PS -Scope CurrentUser } ##$password = ConvertTo-SecureString $appSecret -AsPlainText -Force ##$clientSecretCred = New-Object System.Management.Automation.PSCredential -ArgumentList ($appId, $password) $ClientCert = Get-ChildItem $certPath if (-not(Get-MgContext)) { Write-Host "Connect with credentials of a ediscovery admin (token for graph)" if (-not($environment)) { Connect-MgGraph -TenantId $TenantId -ClientId $appId -Certificate $ClientCert } else { Connect-MgGraph -TenantId $TenantId -ClientId $appId -Certificate $ClientCert -Environment $environment } } Write-Host "Connect with credentials of a ediscovery admin (token for export)" $connectionDetails = @{ 'TenantId' = $tenantId 'ClientId' = $appID 'ClientCertificate' = $ClientCert 'Scope' = "b26e684c-5068-4120-a679-64a5d2c909d9/.default" } $exportToken = Get-MsalToken @connectionDetails $uri = "/v1.0/security/cases/ediscoveryCases/$($caseId)/operations/$($exportId)" $export = Invoke-MgGraphRequest -Uri $uri; if (-not($export)){ Write-Host "Export not found" exit } else{ $export.exportFileMetadata | % { Write-Host "Downloading $($_.fileName)" Invoke-WebRequest -Uri $_.downloadUrl -OutFile "$($path)\$($_.fileName)" -Headers @{"Authorization" = "Bearer $($exportToken.AccessToken)"; "X-AllowWithAADToken" = "true" } } } Once saved open a new PowerShell windows which has the following PowerShell Modules installed: Microsoft.Graph MSAL.PS Browse to the directory you have saved the script and issue the following command. .\DownloadExportUsingAppCert.ps1 -tenantId “<tenant ID>” -appId “<App ID>” -certPath “<Certificate Path>” -caseId “<CaseID>” -exportId “<ExportID>” -path “<Output Path>” Review the folder which you have specified as the Path to view the downloaded files. Conclusion Congratulations you have now configured your environment to enable access to the eDiscovery APIs! It is a great opportunity to further explore the available Microsoft Purview eDiscovery REST API calls using the Microsoft.Graph PowerShell module. For a full list of API calls available, see Use the Microsoft Purview eDiscovery API. Stay tuned for future blog posts covering other aspects of the eDiscovery APIs and examples on how it can be used to automate existing eDiscovery workflows.Upcoming changes to Microsoft Purview eDiscovery
Today, we are announcing three significant updates to the Microsoft Purview eDiscovery products and services. These updates reinforce our commitment to meeting and exceeding the data security, privacy, and compliance requirements of our customers. To improve security and help protect customers and their data, we have accelerated the timeline for the below changes, which will be enforced by default on May 26. The following features will be retired from the Microsoft Purview portal: Content Search will transition to the new unified Purview eDiscovery experience. The eDiscovery (Standard) classic experience will transition to the new unified Purview eDiscovery experience. The eDiscovery export PowerShell cmdlet parameters will be retired. These updates aim to unify and simplify the eDiscovery user experience in the new Microsoft Purview Portal, while preserving the accessibility and integrity of existing eDiscovery cases. Content Search transition to the new unified Purview eDiscovery experience The classic eDiscovery Content Search solution will be streamlined into the new unified Purview eDiscovery experience. Effective May 26 th , the Content Search solution will no longer be available in the classic Purview portal. Content Search provides administrators with the ability to create compliance searches to investigate data located in Microsoft 365. We hear from customers that the Content Seach tool is used to investigate data privacy concerns, perform legal or incident investigations, validate data classifications, etc. Currently, each compliance search created in the Content Search tool is created outside of the boundaries of a Purview eDiscovery (Standard) case. This means that administrators in Purview Role Groups containing the Compliance Search role can view all Content Searches in their tenant. While the Content Search solution does not enable any additional search permission access, the view of all Content Searches in a customer tenant is not an ideal architecture. Alternatively, when using a Purview eDiscovery case, these administrators only have access to cases in which they are assigned. Customers can now create their new compliance searches within an eDiscovery case using the new unified Purview eDiscovery experience. All content searches in a tenant created prior to May 26, 2025 are now accessible in the new unified Purview eDiscovery experience within a case titled “Content Search”. Although the permissions remain consistent, eDiscovery managers and those with custom permissions will now only be able to view searches from within the eDiscovery cases in which they are assigned, including the “Content Search” case. eDiscovery Standard transition to the new unified Purview eDiscovery experience The classic Purview eDiscovery (Standard) solution experience has transitioned into the new unified Purview eDiscovery experience. Effective May 26 th , the classic Purview eDiscovery (Standard) solution will no longer be available to customers within the classic Purview portal. All existing eDiscovery cases created in the classic purview experience are now available within the new unified Purview eDiscovery experience. Retirement of eDiscovery Export PowerShell Cmdlet parameters The Export parameter within the ComplianceSearchAction eDiscovery PowerShell cmdlets will be retired on May 26, 2025: New-ComplianceSearchAction -Export parameter (and parameters dependent on export such as Report, Retentionreport …) Get-ComplianceSearchAction -Export parameter Set-ComplianceSearchAction -ChangeExportKey parameter We recognize that the removal of the Export parameter may require adjustments to your current workflow process when using Purview eDiscovery (Standard). The remaining Purview eDiscovery PowerShell cmdlets will continue to be supported after May 26 th , 2025: Create and update Compliance Cases New-ComplianceCase, Set-ComplianceCase Create and update Case Holds New-CaseHoldPolicy, Set-CaseHoldPolicy, New-CaseHoldRule, Set-CaseHoldRule Create, update and start Compliance Searches New-ComplianceSearch,Set-ComplianceSearch, Start-ComplianceSearch, Apply Purge action to a Compliance Search New-ComplianceSearchAction -Purge Additionally, if you have a Microsoft 365 E5 license and use eDiscovery (Premium), your organization can script all eDiscovery operations, including export, using the Microsoft Graph eDiscovery APIs. Purview eDiscovery Premium On May 26 th , there will be no changes to the classic Purview eDiscovery (Premium) solution in the classic Purview portal. Cases that were created using the Purview eDiscovery (Premium) classic case experience can also now be accessed in the new unified Purview eDiscovery experience. We recognize that these changes may impact your current processes, and we appreciate your support as we implement these updates. Microsoft runs on trust and protecting your data is our utmost priority. We believe these improvements will provide a more secure and reliable eDiscovery experience. To learn more about the Microsoft Purview eDiscovery solution and become an eDiscovery Ninja, please check out our eDiscovery Ninja Guide at https://aka.ms/eDiscoNinja!Getting Started with the New Purview eDiscovery (E3)
“I heard that classic eDiscovery (Standard) will be retired on May 26 th . How can I get started in the new Purview eDiscovery?” Welcome to the new era of Purview eDiscovery! As we transition from the classic eDiscovery (Standard) to the new Purview eDiscovery, you'll find a more intuitive and user-friendly experience designed to streamline your workflow. This enhanced platform offers additional capabilities such as improved data sources for easier identification of search locations, an upgraded condition builder, better support for modern collaboration, and a more efficient export process. There are a few important notes before we get started with the new Purview eDiscovery user experience: The new Purview eDiscovery is a unified user experience. No longer will there be separate E3 or E5 products for eDiscovery; both E3 and E5 users will enjoy the same new interface. However, Purview eDiscovery users with E5 licenses or advanced SKU license holders will have access to new Premium features, while E3 Purview eDiscovery users will also benefit from new enhancements. Rest assured, you will not need to migrate any of your existing classic cases or content searches. All your current cases and content searches are seamlessly integrated into the new user experience. There are also no changes required for your existing permissions or compliance boundaries. The new Purview eDiscovery respects your existing settings, ensuring a smooth transition. You will see a new case under Purview eDiscovery called “Content Search.” You will find all your existing content searches within this case. You will also be able to access your content search by using the new Purview Content Search shortcut (Learn more about getting started with the new Purview Content Search by going to the following article: https://aka.ms/newcontentsearch). "Where do I get started in the new Purview eDiscovery?" You will be able to access the new Purview eDiscovery by going to the Microsoft Purview portal and signing in using the credentials for a user account assigned eDiscovery permissions. Select the eDiscovery solution card under the Purview portal and then select Cases in the left nav. This will take you to the new Purview eDiscovery. From there, you will be able to select Create case. “Now that I have created my case, what’s next?” Now that you’ve created your case, let’s talk about the new case settings. Click on the Case settings button in the new Purview eDiscovery case view. These are the relevant settings for E3 eDiscovery: The Case details settings are where you can go to disable or enable the eDiscovery (Premium) features (E5) using the eDiscovery (Premium) toggle. ings" page for a Purview eDiscovery case, where users can input the case name, number, and description. This image also shows where you can enable or disable the Premium features for this case. It also provides access to manage permissions, data sources, search & analytics, and review sets for comprehensive case control. You will also be able to close or delete the case using the Actions button under Case details. Permissions settings in eDiscovery allow you to add or remove users to a case and manage role group membership for a case. This is where you will go to give other eDiscovery managers/users access to your case. You can also add a role group to give all members of that role group access to your case. t access is limited to individual users at this stage. The new Data sources section is where you can make changes to the locations you wish to include in tenant-wide searches. NOTE: adding more data sources might cause searches to take longer than normal. The Search & analytics and Review set settings sections are for E5 features. Now that you have managed your Purview eDiscovery settings, the next step is to either create a search or create a hold policy to manage your eDiscovery holds. First, let’s start with the new Purview eDiscovery search experience! Make sure that you are under the Searches tab in your case and click Create a search. Create a search name and search description and select the Create button to create a new search in the new Purview eDiscovery experience. This will take you to the new Purview eDiscovery search experience. Under the Query tab in your new search, you will see the enhanced Data sources on the left side. The new Purview eDiscovery’s enhanced data sources will make it a lot easier for you to set the locations that you would like to search. You can use the enhanced data sources to search for M365 content such as email, documents, and instant messaging conversations in your organization. Use search to find content in these cloud-based Microsoft 365 data sources: Exchange Online mailboxes SharePoint sites OneDrive accounts Microsoft Teams Microsoft 365 Groups Viva Engage In this example, we will be searching Nestor’s mailbox and OneDrive site for an email sent in March 2025 that contains the keyword string “Project 9” Click Add sources under Data sources to add your locations (you can also search all your mailboxes or sites by selecting Add tenant-wide sources if needed) Type in the name of the user or their email address to find the user’s locations that you are wanting to search and then select them. Next, add a group like a Microsoft Team that you would like to search. Click the Manage button to see the locations associated with this user and Team. The enhanced data source experience will automatically identify a user’s mailbox and OneDrive site if they have one enabled. Select Save to continue. Optional: you can exclude either their Mailbox or OneDrive site by unchecking them under the Manage sources view. Now that you have identified the locations that we want to search. The next step is to create a query to define what we are wanting to search for within the locations. Under the Keywords condition, make sure that Equal is selected, and type in Project 9 and hit enter. & Project Team are listed for targeted investigation This will let you specify that you are looking for any chat, email, or document that contains the phrase “Project 9” Next, click on the + Add conditions button to add the date range condition. Select Date from the list and select Apply. Switch the Date operator from Before to Between and select March 1, 2025 through March 31, 2025 as the date range. Click the Run query button to generate the search estimate. Then click Run Query after selecting any additional options that you may want. After the search has run, the Statistics tab will help you verify whether the relevant content was found. You can also generate a sample of the results by going under the Sample tab and selecting the Generate sample results button. a single SharePoint source. Visual charts highlight search hit trends and top location types, while sections for sensitive information types and top users currently show no data. You can export the results of your search after you have verified that the relevant content has been returned by your search by selecting the Export button. Give your export a name and description. In the Export type section, choose one of the following options: Export items report only: Only the summary and item report are created. The various options for organizing data, folder and path structure, condensing paths, and other structures are hidden. Export items with items report: Items are exported with the item report. Other export format options are available with this option in the Export format section. In the Export format section, choose one of the following options: Create PSTs for messages: This option creates .pst files for messages. Create .msg files for messages: This option creates .msg files for messages Select one or more of the following output package options: Organize data from different locations into separate folders or PSTs: This option organizes data into separate folders for each data location. Include folder and path of the source: This option includes the original folder and folder path structure for items. Condense paths to fit within 256 characters: This option condenses the folder path for each item to 259 characters or less. Give each item a friendly name: This option creates a friendly name for each item. After you have selected the options for your export, select the Export button. Click the Export button to go to the Export tab. Select your export once the status shows as “Complete” Select the export packages that you wish to download and hit the Download button. Clicking the Download button will kick off a browser download. The new Content Search does not use classic Content Search and eDiscovery (Standard)’s .NET eDiscovery Export Tool application. NOTE: You may have to disable popup blocking depending on your browser settings. The download report relating to the export is named Reports-caseName-EntityName-ProcessName-timestamp.zip. With EntityName being the user given name to the export. This will include several .CSV files including items.csv which provide details of all items exported, including information such as item ID, location of the item, subject/title of the item, item class/type, and success/error status. The .PST files exported will be included in an export package called PSTs.00x.zip Files exported (e.g. files stored in OneDrive and SharePoint) will be included in an export package called Items.00x.zip “How do I place a hold using the new Purview eDiscovery?” You can create holds in the new Purview eDiscovery to preserve content in mailboxes and sites. This includes mailboxes and sites that are associated with Microsoft Teams, Microsoft 365 groups, and Viva Engage Groups. When you place locations on hold, content is preserved until you remove the hold from the locations or delete/release the hold policy. Like classic eDiscovery (Standard), you will first visit the Hold policies tab. In the hold policies tab, please click New policy to create a new hold policy for your case. Please give your hold policy a unique policy name and policy description. Next, you will add the locations that you would like to place on hold. Please click Add sources under Data sources to start adding locations to your hold. Note: you must select at least one data source to create the hold policy. Put in the name of the custodian that you would like to place on hold. Like the search experience, you will automatically identify the user’s mailbox and OneDrive site when you search by their name. Next, you can enter a group by putting in the name of the group. In this example, I have added a Team called the “Mark 8 Project”. & Project" and received results including two Teams and one Private Shared Channel. The interface allows filtering by scope and type, and each result has a checkbox for selection. Action buttons at the bottom like "Manage," "Save and close," and "Cancel" enable users to finalize or adjust their selections. Please select Manage or Save and close to save your results. If you leave the query blank under the Condition builder section, all the data in the specified locations will be placed on hold. You can also create a query-based hold to put data that matches your query on hold. Note: For the best results when dealing with encrypted or partially indexed items, we recommend limiting conditions to Date, Participants, and Type in query-based holds. Queries aren't effective on other conditions within encrypted or partially indexed items and holds might not be applied to these items. Select Apply hold to enable your hold policy. After creating a hold, check that the hold is applied successfully by navigating to the Details tab for the hold policy. You can check the statuses of all the locations within your hold policy within the Details tab. This is a great way to verify that your hold was successfully deployed. You can also delete the policy, retry the policy, and turn off the policy by selecting Policy actions. This screenshot displays the dashboard for the hold policy titled "H001a - Custodian and Teams Hold," summarizing its application across 6 locations and 2 data sources. A detailed table lists each location along with its hold status, team group, location type, and associated site. Users can filter results, customize columns, and access policy actions such as delete policy, retry policy, or turn it off. You can select a location under the Details tab to learn additional information regarding the held location. You can also select Download Report to get a downloaded report of the hold details. Other important information for creating holds After you create an eDiscovery hold, it might take up to 24 hours for the hold to take effect. For long term data retention not related to eDiscovery investigations, we advise that you use retention policies and retention labels. For more information, see Learn about retention policies and retention labels. When you select a distribution list to be placed on hold, the distribution list expands into the members of the distribution list. Users can choose to place all members' mailboxes and sites on hold or a subset/mix of these data sources on hold. Subsequent changes in distribution list membership don't change or update holds or the policy. Users must add the distribution list to data source again to ensure the latest membership is reflected and expanded. The Recycle Bin in SharePoint sites isn't indexed and therefore unavailable for searching. As a result, eDiscovery searches can't find any Recycle Bin content to place holds. When you create a query-based hold, all content from selected locations is initially placed on hold. Later, any content that doesn't match the specified query is cleared from the hold every seven to 14 days. However, a query-based hold doesn't clear content if more than five holds of any type are applied to a content location, or if any item has indexing issues. The URL for a user's OneDrive account includes their user principal name (UPN) (for example, https://alpinehouse-my.sharepoint.com/personal/sarad_alpinehouse_onmicrosoft_com). In the rare case that a person's UPN is changed, their OneDrive URL will also change to incorporate the new UPN. If a user's OneDrive account is part of an eDiscovery hold, and their UPN is changed, you need to update the hold by adding the user's new OneDrive URL and removing the old one. If the URL for the OneDrive site changes, previously placed holds on the site remain effective and content is preserved. For more information, see How UPN changes affect the OneDrive URL.4.6KViews0likes4CommentsRethinking Data Security and Governance in the Era of AI
The era of AI is reshaping industries, enabling unprecedented innovations, and presenting new opportunities for organizations worldwide. But as organizations accelerate AI adoption, many are focused on a growing concern: their current data security and governance practices are not effectively built for the fast-paced AI innovation and ever-evolving regulatory landscape. At Microsoft, we recognize the critical need for an integrated approach to address these risks. In our latest findings, Top 3 Challenges in Securing and Governing Data for the Era of AI, we uncovered critical gaps in how organizations manage data risk. The findings exemplify the current challenges: 91% of leaders are not prepared to manage risks posed by AI 1 and 85% feel unprepared to comply with AI regulations 2 . These gaps not only increase non-compliance but also put innovation at risk. Microsoft Purview has the tools to tackle these challenges head on, helping organizations move to an approach that protects data, meets compliance regulations, and enables trusted AI transformation. We invite you to take this opportunity to evaluate your current practices, platforms, and responsibilities, and to understand how to best secure and govern your organization for growing data risks in the era of AI. Platform fragmentation continues to weaken security outcomes Organizations often rely on fragmented tools across security, compliance, and data teams, leading to a lack of unified visibility and insufficient data hygiene. Our findings reveal the effects of fragmented platforms, leading to duplicated data, inconsistent classification, redundant alerts, and siloed investigations, which ultimately is causing data exposure incidents related to AI to be on the rise 3 . Microsoft Purview offers centralized visibility across your organization’s data estate. This allows teams to break down silos, streamline workflows, and mitigate data leakage and oversharing. With Microsoft Purview, capabilities like data health management and data security posture management are designed to enhance collaboration and deliver enriched insights across your organization to help further protect your data and mitigate risks faster. Microsoft Purview offers the following: Unified insights across your data estate, breaking down silos between security, compliance, and data teams. Microsoft Purview Data Security Posture Management (DSPM) for AI helps organizations gain unified visibility into GenAI usage across users, data, and apps to address the heightened risk of sensitive data exposure from AI. Built-in capabilities like classification, labeling, data loss prevention, and insider risk insights in one platform. In addition, newly launched solutions like Microsoft Purview Data Security Investigations accelerate investigations with AI-powered deep content analysis, which helps data security teams quickly identify and mitigate sensitive data and security risks within impacted data. Organizations like Kern County historically relied on many fragmented systems but adopted Microsoft Purview to unify their organization’s approach to data protection in preparation for increasing risks associated with deploying GenAI. “We have reduced risk exposure, [Microsoft] Purview helped us go from reaction to readiness. We are catching issues proactively instead of retroactively scrambling to contain them.” – Aaron Nance, Deputy Chief Information Security Officer, Kern County Evolving regulations require continuous compliance AI-driven innovation is creating a surge in regulations, resulting in over 200 daily updates across more than 900 regulatory agencies 4 , as highlighted in our research. Compliance has become increasingly difficult, with organizations struggling to avoid fines and comply with varying requirements across regions. To navigate these challenges effectively, security leaders’ responsibilities are expanding to include oversight across governance and compliance, including oversight of traditional data catalog and governance solutions led by the central data office. Leaders also cite the need for regulation and audit readiness. Microsoft Purview enables compliance and governance by: Streamlining compliance with Microsoft Purview Compliance Manager templates, step-by-step guidance, and insights for region and industry-specific regulations, including GDPR, HIPAA, and AI-specific regulation like the EU AI Act. Supporting legal matters such as forensic and internal investigations with audit trail records in Microsoft Purview eDiscovery and Audit. Activating and governing data for trustworthy analytics and AI with Microsoft Purview Unified Catalog, which enables visibility across your data estate and data confidence via data quality, data lineage, and curation capabilities for federated governance. Microsoft Purview’s suite of capabilities provides visibility and accountability, enabling security leaders to meet stringent compliance demands while advancing AI initiatives with confidence. Organizations need a unified approach to secure and govern data Organizations are calling for an integrated platform to address data security, governance, and compliance collectively. Our research shows that 95% of leaders agree that unifying teams and tools is a top priority 5 and 90% plan to adopt a unified solution to mitigate data related risks and maximize impact 6 . Integration isn't just about convenience, it’s about enabling innovation with trusted data protection. Microsoft Purview enables a shared responsibility model, allowing individual business units to own their data while giving central teams oversight and policy control. As organizations adopt a unified platform approach, our findings reveal the upside potential not only being reduced risk but also cost savings. With AI-powered copilots such as Security Copilot in Microsoft Purview, data protection tasks are simplified with natural-language guidance, especially for under resourced teams. Accelerating AI transformation with Microsoft Purview Microsoft Purview helps security, compliance, and governance teams navigate the complexities of AI innovation while implementing effective data protection and governance strategies. Microsoft partner EY highlights the results they are seeing: “We are seeing 25%–30% time savings when we build secure features using [Microsoft] Purview SDK. What was once fragmented is now centralized. With [Microsoft] Purview, everything comes together on one platform, giving a unified foundation to innovate and move forward with confidence.” – Prashant Garg, Partner of Data and AI, EY We invite you to explore how you can propel your organization toward a more secure future by reading the full research paper at https://aka.ms/SecureAndGovernPaper. Visit our website to learn more about Microsoft Purview. 1 Forbes, Only 9% Of Surveyed Companies Are Ready To Manage Risks Posed By AI, 2023 2 SAP LeanIX, AI Survey Results, 2024 3 Microsoft, Data Security Index Report, 2024 4 Forbes, Cost of Compliance, Thomson Reuters, 2021 5 Microsoft, Audience Research, 2024 6 Microsoft, Customer Requirements Research, 2024Enterprise-grade controls for AI apps and agents built with Azure AI Foundry and Copilot Studio
AI innovation is moving faster than ever, and more AI projects are moving beyond experimentation into deployment, to drive tangible business impact. As organizations accelerate innovation with custom AI applications and agents, new risks emerge across the software development lifecycle and AI stack related to data oversharing and leaks, new vulnerabilities and threats, and non-compliance with stringent regulatory requirements Through 2025, poisoning of software supply chains and infrastructure technology stacks will constitute more than 70% of malicious attacks against AI used in the enterprise 1 , highlighting potential threats that originate early in development. Today, the average cost of a data breach is $4.88 million, but when security issues are caught early in the development process, that number drops dramatically to just $80 per incident 2 . The message is very clear; security can’t be an afterthought anymore. It must be a team sport across the organization, embedded from the start and throughout the development lifecycle. That's why developers and security teams should align on processes and tools that bring security into every stage of the AI development lifecycle and give security practitioners visibility into and the ability to mitigate risks. To address these growing challenges and help customers secure and govern their AI workloads across development and security teams, we are: Enabling Azure AI Foundry and Microsoft Copilot Studio to provide best-in-class foundational capabilities to secure and govern AI workloads Deeply integrating and embedding industry-leading capabilities from Microsoft Purview, Microsoft Defender, and Microsoft Entra into Azure AI Foundry and Microsoft Copilot Studio This week, 3,000 developers are gathering in Seattle for the annual Microsoft Build conference, with many more tuning in online, to learn practical skills for accelerating their AI apps and agents' innovation. To support their AI innovation journey, today we are excited to announce several new capabilities to help developers and organizations secure and govern AI apps and agents. New Azure AI Foundry foundational capabilities to secure and govern AI workloads Azure AI Foundry enhancements for AI security and safety With 70,000 customers, 100 trillion tokens processed this quarter, and 2 billion enterprise search queries each day, Azure AI Foundry has grown beyond just an application layer—it's now a comprehensive platform for building agents that can plan, take action, and continuously learn to drive real business outcomes. To help organizations build and deploy AI with confidence, we’re introducing new security and safety capabilities and insights for developers in Azure AI Foundry Introducing Spotlighting to detect and block prompt injection attacks in real time As AI systems increasingly rely on external data sources, a new class of threats has emerged. Indirect prompt injection attacks embed hidden instructions in documents, emails, and web content, tricking models into taking unauthorized actions without any direct user input. These attacks are difficult to detect and hard to prevent using traditional filters alone. To address this, Azure AI Content Safety is introducing Spotlighting, now available in preview. Spotlighting strengthens the Prompt Shields guardrail by improving its ability to detect and handle potential indirect prompt injections, where hidden adversarial instructions are embedded in external content. This new capability helps prevent the model from inadvertently acting on malicious prompts that are not directly visible to the user. Enable Spotlighting in Azure AI Content Safety to detect potential indirect prompt injection attacks New capabilities for task adherence evaluation and task adherence mitigation to ensure agents remain within scope As developers build more capable agents, organizations face growing pressure to help confirm those agents act within defined instructions and policy boundaries. Even small deviations can lead to tool misuse, broken workflows, or risks like unintended exposure of sensitive data. To solve this, Azure AI Foundry now includes task adherence for agents, now in preview and powered by two components: a real-time evaluation and a new control within Azure AI Content Safety. At the core is a real-time task adherence evaluation API, part of Azure AI Content Safety. This API assesses whether an agent’s behavior is aligned with its assigned task by analyzing the user’s query, system instructions, planned tool calls, and the agent’s response. The evaluation framework is built on Microsoft’s Agent Evaluators, which measure intent resolution, tool selection accuracy, completeness of response, and overall alignment to the original request. Developers can run this scoring logic locally using the Task Adherence Evaluator in the Azure AI Evaluation SDK, with a five-point scale that ranges from fully nonadherent to fully adherent. This gives teams a flexible and transparent way to inspect task-level behavior before it causes downstream issues. Task adherence is enforced through a new control in Azure AI Content Safety. If an agent goes off-task, the control can block tool use, pause execution, or trigger human review. In Azure AI Agent Service, it is available as an opt-in feature and runs automatically. Combined with real-time evaluation, this control helps to ensure that agents stay on task, follow instructions, and operate according to enterprise policies. Learn more about Prompt Shields in Azure AI Content Safety. Azure AI Foundry continuous evaluation and monitoring of agentic systems Maintaining high performance and compliance for AI agents after deployment is a growing challenge. Without ongoing oversight, issues like performance degradation, safety risks, or unintentional misuse of resources can slip through unnoticed. To address this, Azure AI Foundry introduces continuous evaluation and monitoring of agentic systems, now in preview, provides a single pane of glass dashboard to track key metrics such as performance, quality, safety, and resource usage in real time. Continuous evaluation runs quality and safety evaluations at a sampled rate of production usage with results made available in the Azure AI Foundry Monitoring dashboard and published to Application Insights. Developers can set alerts to detect drift or regressions and use Azure Monitor to gain full-stack visibility into their AI systems. For example, an organization using an AI agent to assist with customer-facing tasks can monitor groundedness and detect a decline in quality when the agent begins referencing irrelevant information, helping teams to act before it potentially negatively affects trust of users. Azure AI Foundry evaluation integrations with Microsoft Purview Compliance Manager, Credo AI, and Saidot for streamlined compliance AI regulations and standards introduce new requirements for transparency, documentation, and risk management for high-risk AI systems. As developers build AI applications and agents, they may need guidance and tools to help them evaluate risks based on these requirements and seamlessly share control and evaluation insights with compliance and risk teams. Today, we are announcing previews for Azure AI Foundry evaluation tool’s integration with a compliance management solution, Microsoft Purview Compliance Manager, and AI governance solutions, Credo AI and Saidot. These integrations help define risk parameters, run suggested compliance evaluations, and collect evidence for control testing and auditing. For example, for a developer who’s building an AI agent in Europe may be required by their compliance team to complete a Data Protection Impact Assets (DPIA) and Algorithmic Impact Assessment (AIA) to meet internal risk management and technical documentation requirements aligned with emerging AI governance standards and best practices. Based on Purview Compliance Manager’s step-by-step guidance on controls implementation and testing, the compliance teams can evaluate risks such as potential bias, cybersecurity vulnerabilities, or lack of transparency in model behavior. Once the evaluation is conducted in Azure AI Foundry, the developer can obtain a report with documented risk, mitigation, and residual risk for compliance teams to upload to Compliance Manager to support audits and provide evidence to regulators or external stakeholders. Assess controls for Azure AI Foundry against emerging AI governance standards Learn more about Purview Compliance Manager. Learn more about the integration with Credo AI and Saidot in this blogpost. Leading Microsoft Entra, Defender and Purview value extended to Azure AI Foundry and Microsoft Copilot Studio Introducing Microsoft Entra Agent ID to help address agent sprawl and manage agent identity Organizations are rapidly building their own AI agents, leading to agent sprawl and a lack of centralized visibility and management. Security teams often struggle to keep up, unable to see which agents exist and whether they introduce security or compliance risks. Without proper oversight, agent sprawl increases the attack surface and makes it harder to manage these non-human identities. To address this challenge, we’re announcing the public preview of Microsoft Entra Agent ID, a new capability in the Microsoft Entra admin center that gives security admins visibility and control over AI agents built with Copilot Studio and Azure AI Foundry. With Microsoft Entra Agent ID, an agent created through Copilot Studio or Azure AI Foundry is automatically assigned an identity with no additional work required from the developers building them. This is the first step in a broader initiative to manage and protect non-human identities as organizations continue to build AI agents. : Security and identity admins can gain visibility into AI agents built in Copilot Studio and Azure AI Foundry in the Microsoft Entra Admin Center This new capability lays the foundation for more advanced capabilities coming soon to Microsoft Entra. We also know that no one can do it alone. Security has always been a team sport, and that’s especially true as we enter this new era of protecting AI agents and their identities. We’re energized by the momentum across the industry; two weeks ago, we announced support for the Agent-to-Agent (A2A) protocol and began collaborating with partners to shape the future of AI identity workflows. Today, we’re also excited to announce new partnerships with ServiceNow and Workday. As part of this, we’ll integrate Microsoft Entra Agent ID with the ServiceNow AI Platform and the Workday Agent System of Record. This will allow for automated provisioning of identities for future digital employees. Learn more about Microsoft Entra Agent ID. Microsoft Defender security alerts and recommendations now available in Azure AI Foundry As more AI applications are deployed to production, organizations need to predict and prevent potential AI threats with natively integrated security controls backed by industry-leading Gen AI and threat intelligence for AI deployments. Developers need critical signals from security teams to effectively mitigate security risks related to their AI deployments. When these critical signals live in separate systems outside the developer experience, this can create delays in mitigation, leaving opportunities for AI apps and agents to become liabilities and exposing organizations to various threats and compliance violations. Now in preview, Microsoft Defender for Cloud integrates AI security posture management recommendations and runtime threat protection alerts directly into the Azure AI Foundry portal. These capabilities, previously announced as part of the broader Microsoft Defender for Cloud solution, are extended natively into Azure AI Foundry enabling developers to access alerts and recommendations without leaving their workflows. This provides real-time visibility into security risks, misconfigurations, and active threats targeting their AI applications on specific Azure AI projects, without needing to switch tools or wait on security teams to provide details. Security insights from Microsoft Defender for Cloud help developers identify and respond to threats like jailbreak attacks, sensitive data leakage, and misuse of system resources. These insights include: AI security posture recommendations that identify misconfigurations and vulnerabilities in AI services and provide best practices to reduce risk Threat protection alerts for AI services that notify developers of active threats and provide guidance for mitigation, across more than 15 detection types For example, a developer building an AI-powered agent can receive security recommendations suggesting the use of Azure Private Link for Azure AI Services resources. This reduces the risk of data leakage by handling the connectivity between consumers and services over the Azure backbone network. Each recommendation includes actionable remediation steps, helping teams identify and mitigate risks in both pre- and post-deployment phases. This helps to reduce risks without slowing down innovation. : Developers can view security alerts on the Risks + alerts page in Azure AI Foundry : Developers can view recommendations on the Guardrails + controls page in Azure AI Foundry This integration is currently in preview and will be generally available in June 2025 in Azure AI Foundry. Learn more about protecting AI services with Microsoft Defender for Cloud. Microsoft Purview capabilities extended to secure and govern data in custom-built AI apps and agents Data oversharing and leakage are among the top concerns for AI adoption, and central to many regulatory requirements. For organizations to confidently deploy AI applications and agents, both low code and pro code developers need a seamless way to embed security and compliance controls into their AI creations. Without simple, developer-friendly solutions, security gaps can quickly become blockers, delaying deployment and increasing risks as applications move from development to production. Today, Purview is extending its enterprise-grade data security and compliance capabilities, making it easier for both low code and pro code developers to integrate data security and compliance into their AI applications and agents, regardless of which tools or platforms they use. For example, with this update, Microsoft Purview DSPM for AI becomes the one place data security teams can see all the data risk insights across Microsoft Copilots, agents built in Agent Builder and Copilot Studio, and custom AI apps and agents built in Azure AI Foundry and other platforms. Admins can easily drill into security and compliance insights for specific AI apps or agents, making it easier to investigate and take action on potential risks. : Data security admins can now find data security and compliance insights across Microsoft Copilots, agents built with Agent Builder and Copilot Studio, and custom AI apps and agents in Microsoft Purview DSPM for AI In the following sections, we will provide more details about the updates to Purview capabilities in various AI workloads. 1. Microsoft Purview data security and compliance controls can be extended to any custom-built AI application and agent via the new Purview SDK or the native Purview integration with Azure AI Foundry. The new capabilities make it easy and effortless for security teams to bring the same enterprise-grade data security compliance controls available today for Microsoft 365 Copilot to custom AI applications and agents, so organizations can: Discover data security risks, such as sensitive data in user prompts, and data compliance risks, such as harmful content, and get recommended actions to mitigate risks proactively in Microsoft Purview Data Security Posture Management (DSPM) for AI. Protect sensitive data against data leakage and insider risks with Microsoft Purview data security policies. Govern AI interactions with Audit, Data Lifecycle Management, eDiscovery, and Communication Compliance. Microsoft Purview SDK Microsoft Purview now offers Purview SDK, a set of REST APIs, documentation, and code samples, currently in preview, enabling developers to integrate Purview's data security and compliance capabilities into AI applications or agents within any integrated development environment (IDE). : By embedding Purview APIs into the IDE, developers help enable their AI apps to be secured and governed at runtime For example, a developer building an AI agent using an AWS model can use the Purview SDK to enable their AI app to automatically identify and block sensitive data entered by users before it’s exposed to the model, while also providing security teams with valuable signals that support compliance. With Purview SDK, startups, ISVs, and partners can now embed Purview industry-leading capabilities directly into their AI software solutions, making these solutions Purview aware and easier for their customers to secure and govern data in their AI solutions. For example, Infosys Vice President and Delivery Head of Cyber Security Practice, Ashish Adhvaryu indicates, “Infosys Cyber Next platform integrates Microsoft Purview to provide enhanced AI security capabilities. Our solution, the Cyber Next AI assistant (Cyber Advisor) for the SOC analyst, leverages Purview SDK to drive proactive threat mitigation with real-time monitoring and auditing capabilities. This integration provides holistic AI-assisted protection, enhancing cybersecurity posture." Microsoft partner EY (previously known as Ernst and Young) has also leveraged the new Purview SDK to embed Purview value into their GenAI initiatives. “We’re not just building AI tools, we are creating Agentic solutions where trust, security, and transparency are present from the start, supported by the policy controls provided through the Purview SDK. We’re seeing 25 to 30 percent time savings when we build secure features using the Purview SDK,” noted Sumanta Kar, Partner, Innovation and Emerging Tech at EY. Learn more about the Purview SDK. Microsoft Purview integrates natively with Azure AI Foundry Organizations are developing an average of 14 custom AI applications. The rapid pace of AI innovation may leave security teams unaware of potential data security and compliance risks within their environments. With the update announced today, Azure AI Foundry signals are now directly integrated with Purview Data Security Posture Management for AI, Insider Risk Management, and data compliance controls, minimizing the need for additional development work. For example, for AI applications and agents built with Azure AI Foundry models, data security teams can gain visibility into AI usage and data risks in Purview DSPM for AI, with no additional work from developers. Data security teams can also detect, investigate, and respond to both malicious and inadvertent user activities, such as a departing employee leveraging an AI agent to retrieve an anomalous amount of sensitive data, with Microsoft Purview Insider Risk Management (IRM) policies. Lastly, user prompts and AI responses in Azure AI apps and agents can now be ingested into Purview compliance tools as mentioned above. Learn more about Microsoft Purview for Azure AI Foundry. 2. Purview data protections extended to Copilot Studio agents grounded in Microsoft Dataverse data Coming to preview in June, Purview Information Protection extends auto-labeling and label inheritance coverage to Dataverse to help prevent oversharing and data leaks. Information Protection makes it easier for organizations to automatically classify and protect sensitive data at scale. A common challenge is that sensitive data often lands in Dataverse from various sources without consistent labeling or protection. The rapid adoption of agents built using Copilot Studio and grounding data from Dataverse increases the risk of data oversharing and leakage if data is not properly protected. With auto-labeling, data stored in Dataverse tables can be automatically labeled based on policies set in Microsoft Purview, regardless of its source. This reduces the need for manual labeling effort and protects sensitive information from the moment it enters Dataverse. With label inheritance, AI agent responses grounded in Dataverse data will automatically carry and honor the source data’s sensitivity label. If a response pulls from multiple tables with different labels, the most restrictive label is applied to ensure consistent protection. For example, a financial advisor building an agent in Copilot Studio might connect multiple Dataverse tables, some labeled as “General” and others as “Highly Confidential.” If a response pulls from both, it will inherit the most restrictive label, in this case, "Highly Confidential,” to prevent unauthorized access and ensure appropriate protections are applied across both maker and users of the agent. Together, auto-labeling and label inheritance in Dataverse support a more secure, automated foundation for AI. : Sensitivity labels will be automatically applied to data in Dataverse : AI-generated responses will inherit and honor the source data’s sensitivity labels Learn more about protecting Dataverse data with Microsoft Purview. 3. Purview DSPM for AI can now provide visibility into unauthenticated interactions with Copilot Studio agents As organizations increasingly use Microsoft Copilot Studio to deploy AI agents for frontline customer interactions, gaining visibility into unauthenticated user interactions and proactively mitigating risks becomes increasingly critical. Building on existing Purview and Copilot Studio integrations, we’ve extended DSPM for AI and Audit in Copilot Studio to provide visibility into unauthenticated interactions, now in preview. This gives organizations a more comprehensive view of AI-related data security risks across authenticated and unauthenticated users. For example, a healthcare provider hosting an external, customer-facing agent assistant must be able to detect and respond to attempts by unauthenticated users to access sensitive patient data. With these new capabilities in DSPM for AI, data security teams can now identify these interactions, assess potential exposure of sensitive data, and act accordingly. Additionally, integration with Purview Audit provides teams with seamless access to information needed for audit requirements. : Gain visibility into all AI interactions, including those from unauthenticated users Learn more about Purview for Copilot Studio. 4. Purview Data Loss Prevention extended to more Microsoft 365 agent scenarios To help organizations prevent data oversharing through AI, at Ignite 2024, we announced that data security admins could prevent Microsoft 365 Copilot from using certain labeled documents as grounding data to generate summaries or responses. Now in preview, this control also extends to agents published in Microsoft 365 Copilot that are grounded by Microsoft 365 data, including pre-built Microsoft 365 agents, agents built with the Agent Builder, and agents built with Copilot Studio. This helps ensure that files containing sensitive content are used appropriately by AI agents. For example, confidential legal documents with highly specific language that could lead to improper guidance if summarized by an AI agent, or "Internal only” documents that shouldn’t be used to generate content that can be shared outside of the organization. : Extend data loss prevention (DLP) policies to Microsoft 365 Copilot agents to protect sensitive data Learn more about Data Loss Prevention for Microsoft 365 Copilot and agents. The data protection capabilities we are extending to agents in Agent Builder and Copilot Studio demonstrate our continued investment in strengthening the Security and Governance pillar of the Copilot Control System (CSS). CCS provides integrated controls to help IT and security teams secure, manage, and monitor Copilot and agents across Microsoft 365, spanning governance, management, and reporting. Learn more here. Explore additional resources As developers and security teams continue to secure AI throughout its lifecycle, it’s important to stay ahead of emerging risks and ensure protection. Microsoft Security provides a range of tools and resources to help you proactively secure AI models, apps, and agents from code to runtime. Explore the following resources to deepen your understanding and strengthen your approach to AI security: Learn more about Security for AI solutions on our webpage Learn more about Microsoft Purview SDK Get started with Azure AI Foundry Get started with Microsoft Entra Get started with Microsoft Purview Get started with Microsoft Defender for Cloud Get started with Microsoft 365 Copilot Get started with Copilot Studio Sign up for a free Microsoft 365 E5 Security Trial and Microsoft Purview Trial 1 Predicts 2025: Navigating Imminent AI Turbulence for Cybersecurity, Jeremy D'Hoinne, Akif Khan, Manuel Acosta, Avivah Litan, Deepak Seth, Bart Willemsen, 10 February 2025 2 IBM. "Cost of a Data Breach 2024: Financial Industry." IBM Think, 13 Aug. 2024, https://www.ibm.com/think/insights/cost-of-a-data-breach-2024-financial-industry; Cser, Tamas. "The Cost of Finding Bugs Later in the SDLC." Functionize, 5 Jan. 2023, https://www.functionize.com/blog/the-cost-of-finding-bugs-later-in-the-sdlcGetting started with the new Purview Content Search
“I’m looking to get started with the new Content Search experience in Purview. Where do I get started?” Welcome to the exciting new world of Content Search! This revamped experience is designed to be more intuitive, making it easier for you to navigate and find what you need. The modern Content Search experience offers additional capabilities like enhanced data sources to make it easier to identify the locations that you want to search, an improved condition builder, and a streamlined export experience. Also, you will now be able to take advantage of Premier features if you have E5 licensing, further elevating your search experience. Privacy is a key focus in this update, allowing you to restrict access to your content searches and ensure that sensitive information remains secure. Additionally, the ability to configure Role-Based Access Control (RBAC) permissions means you can customize Content Search functionality to suit your needs, granting or limiting access as necessary. There are two different ways of accessing Content Search. You can access content search by clicking on the eDiscovery solution card under the Purview portal and select Content Search on the left nav. ation pane within the eDiscovery section. The "Content Search" option is highlighted, indicating its selection for searching emails, documents, and other content across Microsoft 365. This is a shortcut that will take you to the Content Search case in the new unified Purview eDiscovery. You will see all of your existing content searches here. “What do I need to do first?” First, let’s talk about permissions and privacy. The first step in using the new content search is to make sure that you have access to the new Content Search. eDiscovery managers and administrators will automatically have access to new content search. However, if you are not a member of either of these built-in role groups or in a custom role group, you may need to have either an eDiscovery manager or an eDiscovery administrator grant you access to the new content search. You will need to take the following steps if you receive this message when attempting to access the new content search: Figure 2: A screenshot of a web application displaying a 'Permission Error' message in a pop-up window, indicating that the user does not have access to the requested page. Here are the steps for assigning a custom RBAC group or individual user to the Content Search: 1) NOTE: You will need to have someone with eDiscovery manager or eDiscovery admin permissions to assign these permissions. This is done through the Case settings button under Content Search: & Eliza Gallager Incident" is listed with details such as description, query text, created by, created date, modified by, and modified date. 2) This will take you to the case settings page. You will need to click Permissions. After you select Permissions, you will have the options to add an individual user (Users) or all members of a built-in or custom role group (Role groups) You can see where I have added a custom role group named “Content Search” in this example. 3) Once you have added either the user or the role group, they will then be able to access the new Content Search! “Thanks! I can now access the new Content Search, but it looks like I now have access to holds. My team should not have the ability to place holds. What can we do?” Have no fear! The new Content Search will not provide admins the permission to apply holds. This is tightly controlled via the Purview roles assigned to you by an authorized administrator. If the holds tab is present in the new Content Search case, it is because you already have the Hold Purview role assigned to you. You can learn more about the different roles that eDiscovery and Content Search use in this article: Assign permissions in eDiscovery. You can customize what content search activities a user can perform by using Purview custom role groups. Let’s say that you want to restrict the ability to create and manage holds with Microsoft Purview. We are going to do that by creating a new custom role group named Content Search. Here are the steps for creating a custom role group. 1) The Microsoft Purview portal supports directly managing permissions for users who perform tasks within Microsoft Purview including eDiscovery and Content Search. Using the Roles and scopes area in Settings in the Purview portal, you can manage permissions for your users. IMPORTANT: To view Role groups in the Roles and scopes area in the Microsoft Purview portal, users need to be a global administrator or need to be assigned the Role Management role (a role is assigned only to the Organization Management role group). The Role Management role allows users to view, create, and modify role groups. 2) Next, click the +Create role group button to create a new role group in Purview. You can learn more about the different roles that eDiscovery and Content Search use in this article: Assign permissions in eDiscovery. After reviewing the different Content Search-related roles, select the ones applicable to your Content Search users. Here are the roles that we selected for our Content Search users: 3) Microsoft always recommends that you use roles with the fewest permissions. When planning your access control strategy, it's a best practice to manage access for the least privilege for your eDiscovery and Content Search users. Least privilege means you grant your administrators exactly the permission they need to do their job. 4) Please refer to this article if you need any other assistance creating custom role groups in Purview: Permissions in the Microsoft Purview portal. “Excellent! I can’t see the holds tab anymore. However, I’m noticing that I have access to E5 features like review sets. We only have E3 licenses. What can we do to disable the Premium features?” Depending on your tenant configuration, the new Content Search may have eDiscovery (Premium) features enabled (these features include review sets, advanced indexing, cloud attachment support, and many others). The eDiscovery (Premium) features can be disabled via the Content Search case settings. This can be done by clicking on the Case settings button from the new Content Search. Within the Case details page there is a toggle to enable or disable the eDiscovery (Premium) features. & analytics, and Review sets. The Case details section shows information such as the license type (eDiscovery Premium), premium features toggle, case name ('Content Search'), case number, and a description field. The status of the case is marked as active with a creation date and time. “Thanks! It looks like I have the correct permissions and settings. Where do I get started?” 1) Let’s start with creating a new search. Under the new Content Search, you’re going to click the Create a search button. ry text, created by, last modified date, and status. 2) Give your new search a unique name and description. 3) Under the Query tab in your new search, you will see Data sources on the left side. The new Content Search’s enhanced data sources will make it a lot easier for you to set the locations that you would like to search. You can use Content Search to search for M365 content such as email, documents, and instant messaging conversations in your organization. Use search to find content in these cloud-based Microsoft 365 data sources: Exchange Online mailboxes SharePoint sites OneDrive accounts Microsoft Teams Microsoft 365 Groups Viva Engage In this example, we will be searching a Nestor’s mailbox and OneDrive site for an email sent in March 2025 that contains the keyword string “Project 9” 4) Click Add sources under Data sources to add your locations (you can also search all your mailboxes or sites by selecting Add tenant-wide sources if needed) 5) Type in the name of the user or their email address to find the user that you’re wanting to search and then select them. reenshot shows the 'Search for sources' interface in Microsoft 365 compliance center, where users can add people, groups, SharePoint sites, OneDrive accounts, and Microsoft Teams as sources. The search results display one item matching the query 'Nestor Wilke,' with an option to select or deselect it. 6) Click the Manage button to see the locations associated with this user. The enhanced data source experience will automatically identify a user’s mailbox and OneDrive site if they have one enabled. 7) Select Save to continue. Optional: you can exclude either their Mailbox or OneDrive site by unchecking them under the Manage sources view. 8) Now that we have identified the locations that we want to search. The next step is to create a query to define what we are wanting to search for within the locations. 9) Under the Keywords condition, make sure that Equal is selected, and type in Project 9 and hit enter. This will let you specify that you are looking for any chat, email, or document that contains the phrase “Project 9” 10) Next, click on the + Add conditions button to add the date range condition. Select Date from the list and hit Apply. 11) Switch the Date operator from Before to Between and select March 1, 2025 through March 31, 2025 as the date range. 12) Click the Run query button to generate the search estimate. Then click Run Query after selecting any additional options that you may want. 13) After the search has run, the Statistics tab will help you verify whether the relevant content was found. You can also generate a sample of the results by going under the Sample tab and hitting the Generate sample results button. 14) You can export the results of your search after you have verified that the relevant content has been returned by your search by selecting the Export button. Please give your export a name and description. 15) You can choose what format you want the results to be exported in by scrolling down. enshot displays the "Export" settings window from a software application, detailing options for exporting data. Users can choose to include Teams and Viva Engage conversations, organize conversations into an HTML transcript, and collect items linked to SharePoint or OneDrive. Additional settings allow users to select the export type, format the export into PSTs or .msg files, organize data into separate folders, condense paths to fit within 259 characters, and give items a friendly name. In the Export type section, choose one of the following options: Export items report only: Only the summary and item report are created. The various options for organizing data, folder and path structure, condensing paths, and other structures are hidden. Export items with items report: Items are exported with the item report. Other export format options are available with this option in the Export format section. In the Export formatsection, choose one of the following options: Create PSTs for messages: This option creates .pst files for messages. Create .msg files for messages: This option creates .msg files for messages Select one or more of the following output package options: Organize data from different locations into separate folders or PSTs: This option organizes data into separate folders for each data location. Include folder and path of the source: This option includes the original folder and folder path structure for items. Condense paths to fit within 256 characters: This option condenses the folder path for each item to 259 characters or less. Give each item a friendly name: This option creates a friendly name for each item. 16) After you have selected the options for your export, select the Export button. 17) Click the Export button to go to the Export tab. 18) Select your export once the status shows as “Complete” 19) Select the export packages that you wish to download and hit the Download button. Clicking the Download button will kick off a browser download. The new Content Search does not use classic Content Search and eDiscovery (Standard)’s .NET eDiscovery Export Tool application. NOTE: You may have to disable popup blocking depending on your browser settings. The download report relating to the export is named Reports-caseName-EntityName-ProcessName-timestamp.zip. With EntityName being the user given name to the export. This will include several .CSV files including items.csv which provides details of all items exported, including information such as item ID, location of the item, subject/title of the item, item class/type, and success/error status. The .PST files exported will be included in an export package called "PSTs.00x.zip" 20) Files exported (e.g. files stored in OneDrive and SharePoint) will be included in an export package called Items.00x.zip To learn more about the Microsoft Purview eDiscovery and Content Search solutions and become an eDiscovery Ninja, please check out our eDiscovery Ninja Guide at https://aka.ms/eDiscoNinja!5.2KViews0likes0Comments