Advanced Incident Management for Office and Endpoint DLP using Azure Sentinel

Published 10-26-2020 06:43 AM

A common question we get from organizations that use Microsoft Information Protection is: how can we get a single pane of glass across DLP and other information protection events, and correlate them with the entire IT estate? How can we effectively use the richness of this data for incident management and reporting?


In this post we will focus on how this can be achieved with Azure Sentinel, using a custom Azure Function for ingestion. Let's start with a few teasers.


Below is a sample where an Office DLP incident is connected with other incidents, as well as with the Microsoft Defender for Endpoint alerts from the device. Over time this native Azure Sentinel feature will evolve to support more entities for automated correlation.



This is a Workbook sample reporting DLP incidents across departments and geographies.



In this graph sample using a Workbook, we have selected a document node (Darkness), which expands a table with SharePoint DLP alerts as well as SharePoint activity for that document, letting us instantly go deeper in the investigation.



The code and instructions for ingestion of Endpoint and Office DLP events can be found here. (Although the naming is endpoint, it includes both Office and Endpoint data.) Please note that the code for endpoint will change as soon as the endpoint DLP events are included in DLP.All.
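Under the hood, the ingestion function pulls events from the Office 365 Management Activity API, which requires an active subscription per content type (DLP.All for the detailed DLP events). As a minimal sketch of that part of the flow, with illustrative names (the deployed function also handles authentication and content paging), the subscription-start endpoint can be built like this:

```python
from urllib.parse import quote

MANAGE_BASE = "https://manage.office.com/api/v1.0"

def subscription_start_url(tenant_guid: str, content_type: str = "DLP.All") -> str:
    """Build the Office 365 Management Activity API URL that starts a
    subscription for one content type (e.g. DLP.All, Audit.SharePoint).
    Starting the subscription is a POST to this URL with a Bearer token
    issued for https://manage.office.com."""
    return (f"{MANAGE_BASE}/{quote(tenant_guid)}/activity/feed/"
            f"subscriptions/start?contentType={quote(content_type)}")
```

Once a subscription is active, the same API lists content blobs per time window, which the function downloads and forwards to Sentinel.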


  1. Register a new application in Azure AD 
    • Microsoft GRAPH (Application permissions)
      • Group.Read.All
      • User.Read.All
    • Office 365 Management APIs (Application permissions)
      • ActivityFeed.Read
      • ActivityFeed.ReadDlp (needed for detailed DLP events)
  2. Collect the identity and secret for the new app created in step 1. For production, store the secret in Azure Key Vault: generate the keys now, then delegate access to the function in step 7.
    • clientID
    • clientSecret
    • TenantGuid
    • exuser (User account to allow for mapping to sensitive info types, it should only have the permissions to run Get-DlpSensitiveInformationType)
    • Azure Sentinel Workspace Name
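The clientID, clientSecret and TenantGuid collected here are what the function uses to request app-only tokens via the OAuth 2.0 client credentials flow. A minimal sketch of that token request, with illustrative placeholder names (the deployed function does this internally):

```python
from urllib.parse import urlencode

def build_token_request(tenant_guid: str, client_id: str, client_secret: str):
    """Return the Azure AD v2.0 token endpoint URL and the form body for an
    app-only (client credentials) token against the Office 365 Management APIs."""
    url = f"https://login.microsoftonline.com/{tenant_guid}/oauth2/v2.0/token"
    body = urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        # .default requests the application permissions consented in step 1
        "scope": "https://manage.office.com/.default",
    })
    return url, body
```

POSTing that body (as application/x-www-form-urlencoded) returns a JSON payload whose access_token is then sent as a Bearer token to the Management Activity API.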
  3. Get the Workspace ID and Workspace Key for your Sentinel workspace.
    • Select the workspace from the Log Analytics workspaces menu in the Azure portal, then select Agents management in the Settings section.
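The Workspace ID and key are what the function uses to authorize writes to Log Analytics through the HTTP Data Collector API. Its SharedKey authorization header is an HMAC-SHA256 over a canonical string; a minimal sketch of that documented scheme (function and variable names here are illustrative):

```python
import base64
import hashlib
import hmac

def build_signature(workspace_id: str, shared_key: str, content_length: int,
                    date_rfc1123: str, method: str = "POST",
                    content_type: str = "application/json",
                    resource: str = "/api/logs") -> str:
    """Build the SharedKey authorization header value for the Azure Log
    Analytics HTTP Data Collector API (used to write custom log tables)."""
    # Canonical string defined by the Data Collector API docs
    string_to_hash = (f"{method}\n{content_length}\n{content_type}\n"
                      f"x-ms-date:{date_rfc1123}\n{resource}")
    decoded_key = base64.b64decode(shared_key)  # the workspace key is base64
    digest = hmac.new(decoded_key, string_to_hash.encode("utf-8"),
                      hashlib.sha256).digest()
    return f"SharedKey {workspace_id}:{base64.b64encode(digest).decode()}"
```

The resulting header accompanies each POST to `https://<WorkspaceID>.ods.opinsights.azure.com/api/logs?api-version=2016-04-01`, together with an x-ms-date header carrying the same RFC 1123 timestamp.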
  4. Click Deploy to Azure in the repo.
  5. Provide the parameters needed for the function.
    • SPUS is only used if you are going to deploy ingestion of emails to SharePoint, to be able to retrieve a full copy of the emails from the incidents.
    • Click Review and create
    • Click Create if all parameters passed validation
    • The deployment will start
    • On completion it is likely that several of the functions will have an error. The actual function code is deployed in the next step, so the errors are expected.
  6. Go to the resource group where you deployed the app; you will see the core services deployed. Click the Function App; we will come back here in a moment.
  7. To enable the app to automatically sync DLP policies to Sentinel, run the following commands; they will allow the app to fully manage Sentinel. You need to define the resource group where the Log Analytics database for Sentinel is hosted.
    • Start PowerShell and ensure that the Az module is installed.
    • $id = (Get-AzADServicePrincipal -DisplayNameBeginsWith YourAPP).id
    • New-AzRoleAssignment -ResourceGroupName YOURRGWITHSENTINEL -RoleDefinitionName "Azure Sentinel Contributor" -ObjectId $id 
    • You can also use the UI under Identity for the function; the same process can be used to grant access to your Key Vault once the setup of the function is complete.
  8. Deploy the code used for the functions.
    • Download the deployment zip ( )
    • Start PowerShell and ensure that the Az module is installed.
    • Connect-AzAccount
  9. Run Publish-AzWebApp -ResourceGroupName REPLACEWITHYOURRG -Name REPLACEWITHYOURAPPNAME -ArchivePath C:\path\enpointdlpservice.zip
  10. To initialize the variables in the app:
    • Navigate to the Enablement function in your Function App
    • Open the function under Functions, open "Code + Test", click Test/Run, then click Run
  11. If any errors are generated in this run, you will see them in the logging window. If there is a typo or similar in your configuration, go back to the main window for the app and click Configuration to update the parameter.
  12. Note, the Analytics Rules functions will not be successful until you have ingested both SharePoint and Exchange events, and, in the case of Endpoint, Endpoint events as well.
    • The API actively refuses queries that are invalid.
  13. If the analytics rules that correspond to DLP policies aren't created after data ingestion, run the Enablement function again; it will reset the time scope of the functions.
  14. If you want to ingest original email content to SharePoint please see
  15. To setup the reporting please follow
  16. If you want to try out ingestion of documents from endpoints look at this 
  17. In the repo, see the "Important Additional Customization" section.


This is just a starting point to get DLP incident data into Azure Sentinel. There is enrichment code to add details from Microsoft Graph that can be customized. You can customize the code to send events to different Azure Sentinel workspaces based on geography and other details. In Azure Sentinel you can start to create automated actions using Playbooks, and you can create your own Kusto queries to gain new insights. More on that in a later post. And yes, we are investigating the option to provide native integration with Azure Sentinel as well.





Quite powerful and detailed explanations.

Thanks @Jon Nordström for that

Hopefully Microsoft will introduce a more integrated experience, notably for SMBs (and eventually large organizations) which cannot deploy such solutions, or are afraid to for multiple reasons.


How does this overlap and interact with eDiscovery? Also, since the emails are, as I read the guide, being copied to a SharePoint library, how do we lock that SPO library down?


Might a better approach, although I can see the potential issues with speed of investigation and data access, be to leverage the eDiscovery APIs and have the mails in question copied to another mailbox or a PST file that only the Sentinel investigator has access to?


It raises several interesting questions around role separation and data access governance in the age of GDPR etc... 




Hi Peter, I love your question. The benefit of placing it in SPO is that the content is fully audited. You can quickly run a query on OfficeActivity in Sentinel to return access down to item level. In the case of DSRs, you know where the DLP data is stored for AeD, which may be harder with a PST. Optionally you can also apply sensitivity labels or retention labels, or extend the content with additional metadata. The Logic App can also be used to set permissions on items being stored. Generally, ensure that all external sharing capabilities are disabled for the site collection, restrict permissions to the site collection, require MFA, etc.
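As a sketch of such an OfficeActivity query, assuming the standard OfficeActivity schema in Azure Sentinel (the helper function and the exact operation list are illustrative, not part of the solution):

```python
def item_access_query(item_url: str, lookback_days: int = 30) -> str:
    """Build a Kusto query over the OfficeActivity table that lists who
    accessed a given SharePoint item, down to item level."""
    return f"""OfficeActivity
| where TimeGenerated > ago({lookback_days}d)
| where OfficeWorkload == "SharePoint"
| where Operation in ("FileAccessed", "FileDownloaded", "FileModified")
| where OfficeObjectId has "{item_url}"
| project TimeGenerated, UserId, Operation, ClientIP"""
```

Running the resulting query in Sentinel's Logs blade returns the per-item access trail described above.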

Most privacy regulations come down to transparency and proportionality when it comes to DLP (I know, oversimplified). This approach provides transparency; the decision whether to collect content on detection is more on the proportionality side and is fully controlled by your team. A really good and important subject that we are taking into consideration for our product planning.


Thank you

Version history
Last update: May 11 2021 02:03 PM