How To Export Data from Defender for Endpoint to Azure Data Explorer
Published Mar 19 2024

In the lush grass of my backyard, Raven, my miniature Schnauzer, finds her bliss. She lies there, soaking in the sun’s rays, much like the data we export from Defender for Endpoint into Azure Storage. It sits there, in the cloud’s protective realm, unassuming yet essential. Raven, ever alert for critters that might encroach on her territory, is like the data that lies in wait—seemingly still but always ready to serve our needs.

This blog will lead you through the steps of moving crucial data to a secure spot, just as Raven seeks out her sunny haven. We’ll explore the technical aspects of data exportation, ensuring it’s accessible for future use, similar to Raven’s watchful relaxation in her favorite sunny spot. Join me in learning how our digital guardians keep our valuable information safe, offering a sense of security as dependable as Raven’s attentive repose in her cherished sunny patch.


Microsoft Defender XDR is a comprehensive security solution that includes several component services:

  • Microsoft Defender for Endpoint (MDE): Provides robust endpoint protection and response capabilities. It collects and analyzes security data from various sources, including device events, alerts, incidents, and vulnerabilities. By default, this data is retained for 30 days for advanced hunting queries, and remains visible across the portal for up to 180 days.
  • Microsoft Defender for Identity: Protects against advanced targeted attacks by automatically analyzing, learning, and identifying normal and abnormal entity behavior. The data retention period for Microsoft Defender for Identity is 90 days for the audit trail. However, Defender for Identity is gradually rolling out extended data retention for identity details beyond 30 days.
  • Microsoft Defender for Office 365: Protects your organization against malicious threats posed by email messages, links (URLs), and collaboration tools. By default, data across different features in Microsoft Defender for Office 365 is retained for a maximum of 30 days.
  • Microsoft Defender for Cloud Apps: Provides visibility into your cloud apps and services, provides sophisticated analytics to identify and combat cyber threats, and enables you to control how your data travels. Defender for Cloud Apps retains data as follows: Activity log: 180 days, Discovery data: 90 days, Alerts: 180 days, Governance log: 120 days.

These components work together to provide a comprehensive view of data across your organization, allowing for robust threat detection, investigation, and response. However, as data ages out of Microsoft Defender XDR, the ability to conduct long-term historical analysis can be lost.

The data ingested into the Log Analytics Workspace used by Microsoft Sentinel from Microsoft Defender XDR includes all incidents, along with their alerts, entities, and other relevant information. This data also includes alerts from Microsoft Defender XDR’s component services such as Microsoft Defender for Endpoint, Microsoft Defender for Identity, Microsoft Defender for Office 365, and Microsoft Defender for Cloud Apps. The connector also lets you stream advanced hunting events from all of the above Defender components into the Log Analytics Workspace. This allows you to copy those components’ advanced hunting queries into Microsoft Sentinel, enrich Sentinel alerts with their raw event data to provide additional insights, and store the logs with increased retention in Log Analytics.

If you don’t use Sentinel or you would like to have the ability for further processing, reporting, or investigation, you may want to export the security data from MDE to an external storage or analytics platform. One such platform is Azure Data Explorer (ADX), a fast and scalable data exploration service that allows you to query and visualize the security data.

From a cost perspective, archiving data can lead to substantial savings. Storing large volumes of data in the Log Analytics Workspace used by Sentinel can be expensive, especially over the long term. By archiving aged data to Azure Storage Accounts, you can move it to a more cost-effective storage solution while still retaining the ability to access and analyze it as needed. This approach allows you to balance between the need for immediate access to recent data in the Log Analytics Workspace, and the cost-effective, long-term storage of older data in Azure Storage Accounts. Azure Storage can lead to cost savings with its various storage tiers (Hot, Cool, and Archive), effectively managing storage costs based on access needs and budget.

Once you ingest data into the Log Analytics Workspace used by Sentinel, you can retain that data for 90 days at no additional cost. You can also purchase interactive tier retention for over 90 days up to two full years. Therefore, depending on your specific needs and budget, Microsoft Sentinel could provide more flexibility in terms of data retention.

Therefore, archiving data from Microsoft Defender XDR to Azure Storage Accounts before it ages out is a strategic move that enhances your security posture and saves your organization money. It allows for in-depth, long-term analysis and helps optimize costs, making it a valuable practice for any organization committed to robust and cost-effective cybersecurity. Once data is archived to Azure Storage Accounts, Azure Data Explorer (ADX) can be used to interrogate that data. ADX is a fast, fully managed data analytics service for real-time analysis on large volumes of data streaming from applications, websites, IoT devices, and more. It can be used to run complex queries and deliver insights on both real-time and historical data, making it a powerful tool for analyzing the archived data.

In this document, you will be guided through the process of exporting data from MDE to an Azure Storage Account and then connecting the exported logs to ADX. This strategy is worth considering for organizations looking to maximize their cybersecurity investments. 

Please ensure you have the following prerequisites to complete this task:

  • An Azure subscription
  • A Defender for Endpoint subscription and administrator role

Create an Azure Storage Account

Before data can be sent from Defender for Endpoint, a storage account needs to be set up.

  • Sign in to Azure, browse to the Azure Storage Account blade, and select “+ Create”.



Complete the required fields needed to build the storage account. Once complete:

  • Select “Review + Create”



The new storage account should now be available.



The container list for the new storage account is empty at this point.



  • On the “Overview” blade, click on “JSON View”



  • Copy the “Resource ID” and paste it to a temp location. This will be needed for the MDE export step later.
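The Resource ID copied here follows the standard Azure Resource Manager format for storage accounts. As an illustration, the sketch below builds one from its parts; the subscription GUID, resource group, and account name are placeholders, not values from this walkthrough.

```python
# Sketch: build the ARM Resource ID for a storage account.
# The subscription GUID, resource group, and account name below are
# placeholders -- substitute your own values from the JSON View.

def storage_resource_id(subscription_id: str, resource_group: str, account: str) -> str:
    """Return the ARM Resource ID for a storage account."""
    return (
        f"/subscriptions/{subscription_id}"
        f"/resourceGroups/{resource_group}"
        f"/providers/Microsoft.Storage/storageAccounts/{account}"
    )

rid = storage_resource_id(
    "00000000-0000-0000-0000-000000000000",  # placeholder subscription ID
    "rg-mde-export",                         # placeholder resource group
    "mdeexportstorage",                      # placeholder account name
)
print(rid)
```

If the value you paste into the MDE export step later doesn’t match this shape, you likely copied the account name or URL rather than the Resource ID.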



Configure Data Export Settings in Defender for Endpoint

Next, configure the data export settings in Defender for Endpoint. You can choose which data types you want to export and specify the destination Azure Storage Account where the data will be stored.

To configure the data export settings, follow these steps:

  • Go to Settings > Microsoft Defender XDR > Streaming API.
    • Select “+ Add”



  • Enter the following:
    • Name
    • Select “Forward events to Azure Storage” (Container).
      • Paste the “Resource Id” from the Storage Account copied earlier.
    • Select the tables that should be sent to the Storage Account.
  • Select “Submit”



The Streaming API definition should now appear.



Create an Azure Data Explorer Cluster to Provide Access to the Exported Logs

Note: At the time this document was created, there was a minor issue with connecting the exported logs to Azure Data Explorer (ADX) when the Azure portal was used to connect to ADX. Therefore, please use the standalone Azure Data Explorer web UI instead.

An ADX cluster needs to be created:

  • Select the “My Cluster” blade.
    • Click “Create cluster and database”.



  • Enter the ADX Cluster information.
    • Cluster display name.
    • Database name
    • Cluster location
    • Agree to terms and conditions
  • Click on “Create”



The new cluster should now be built.



Add all the tables that are now being streamed to the Azure Storage account.

  • Click on “Get data” > “Azure Storage”



The options to configure the data source are now displayed.

Connect the Exported Logs to Azure Data Explorer

To connect the exported logs to Azure Data Explorer, use the Azure Data Explorer data connector for Azure Storage, which enables you to ingest data from Azure Storage blobs into Azure Data Explorer tables.

In the following example, I first want to know which tables are currently being populated within the containers of the Azure Storage Account. In the screen grab of my Storage account, it appears that only 16 of the 21 tables being exported have data in them at this time.
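If you prefer to script this check rather than eyeball the portal, the core logic is simple. In the sketch below, the per-container blob counts are hypothetical examples (in practice you would obtain them with the azure-storage-blob SDK or the az CLI), and the `insights-logs-advancedhunting-` container prefix is an assumption about how the streaming export names its containers; verify it against your own account.

```python
# Sketch: given per-container blob counts (obtained elsewhere, e.g. via the
# azure-storage-blob SDK), report which exported containers have data so far.
# Container names and counts below are hypothetical examples.

def populated_tables(blob_counts: dict) -> list:
    """Return the container names that hold at least one blob, sorted."""
    return sorted(name for name, count in blob_counts.items() if count > 0)

counts = {
    "insights-logs-advancedhunting-alertevidence": 42,
    "insights-logs-advancedhunting-alertinfo": 17,
    "insights-logs-advancedhunting-devicefilecertificateinfo": 0,  # not populated yet
}
print(populated_tables(counts))
```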





Using the table names from MDE, which are appended to the end of each container name (alertevidence, alertinfo, etc.), I will build a connector for each container in the Storage account.
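Because the table name is appended to the container name, the mapping from table to container is mechanical. The sketch below derives the expected container name for a few tables; the `insights-logs-advancedhunting-` prefix is an assumption based on how the streaming export typically names containers, so confirm it against your own storage account before relying on it.

```python
# Sketch: derive the expected storage container name for each MDE advanced
# hunting table.  The prefix is an assumption -- verify it against the
# containers actually created in your storage account.

PREFIX = "insights-logs-advancedhunting-"

def container_for(table: str, prefix: str = PREFIX) -> str:
    """Return the expected container name for an MDE table."""
    return f"{prefix}{table.lower()}"

for table in ["alertevidence", "alertinfo", "deviceevents"]:
    print(table, "->", container_for(table))
```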


  • Click on “+New table”.



  • Complete the entries on the screen:
    • Enter the first MDE table to connect to
      • In this example it is “alertevidence”.
    • Select source – Select “Container”.
    • Subscription – Select the subscription where the Storage account resides.
    • Storage account – Select the storage account that hosts the data.
    • Container – Select the associated container.
      • In this example it is the container with “alertevidence” appended on the end.
    • Click on “Next”



  • The next screen displays a preview so you can ensure the expected data is present.
    • Verify and select “Finish”



The window should now return to the ADX home screen with the “alertevidence” table now connected.



Before continuing on to map the other containers, it is advisable to ensure that data is populating within the container.

  • Double-click on the table and its name will auto-populate within the KQL query editor.
    • Remove the “|” (pipe) if it was appended.
  • Click on “Run”.
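If you would rather script this sanity check across every connected table, generating the KQL row-count statements is trivial. The sketch below is illustrative only; the table names are examples, and you would paste or submit the generated queries yourself.

```python
# Sketch: generate a simple row-count KQL statement for each connected ADX
# table, to confirm data is flowing.  Table names are examples.

def count_query(table: str) -> str:
    """Return a KQL query that counts the rows in the given table."""
    return f"{table} | count"

for table in ["alertevidence", "alertinfo", "deviceevents"]:
    print(count_query(table))
```

Any table returning a count of zero has probably not received data from the export yet, which matches the earlier observation that not every container is populated immediately.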



Once you are satisfied that data is returned, walk through each additional container that needs to be connected.

In my demo lab, the connectors to the containers appear as below.




Version history
Last update: Apr 02 2024 09:05 PM