Microsoft Defender for Cloud export to Azure Data Explorer
Published May 26 2022 09:16 AM

Are you interested in sending your Microsoft Defender for Cloud logs to Azure Data Explorer? The usual pattern for this would be to use the continuous export functionality in Azure Log Analytics, but the Defender for Cloud logs aren’t currently supported there, so let’s take a look at another option. Did you know that Microsoft Defender for Cloud has Continuous Export built into the product?


Continuously export Microsoft Defender for Cloud data



So let’s walk through the steps to use this feature to export the data to Event Hub and then ingest it into Azure Data Explorer.


  1. In the Azure Portal, open the Microsoft Defender for Cloud blade.
  2. Under Management, select “Environment settings”.
  3. Find the subscription that you want to enable this for and select it.
  4. Select the “Continuous export” tab and configure your export to an existing Event Hub. I’m exporting the “Security recommendations”, “Secure score”, “Security alerts”, and “Regulatory compliance” data types.
  5. Click “Save” and your export will be set up.


Azure Data Explorer Configuration

Now we need to set up our ingestion from Event Hub to Azure Data Explorer. At a very high level, the process is to:

  • Utilize the OneClick UI to set up the ingestion of data from Event Hub into a raw table.
  • Create tables for the different log types (“Security recommendations”, “Secure score”, “Security alerts”, and “Regulatory compliance”).
  • Use update policies to transform the data from the raw table into the appropriate destination tables.
  • Modify the retention period on the raw table down to a few hours.


OneClick to Configure Ingestion from Event Hub

If you haven’t used OneClick Ingestion, I would suggest watching the following video.


Azure Data Explorer One Click Ingestion for Azure Event Hub - YouTube


  1. Go to the Azure Data Explorer web UI (https://dataexplorer.azure.com) and connect to your ADX cluster.
  2. Right-click on the database where you wish to ingest the Microsoft Defender for Cloud logs and select “Ingest new data”.
  3. In the “Destination” tab we’ll create a new table. I called mine defenderraw, but you can use whatever name you wish.
  4. In the “Source” tab, select “Event Hub” for the type and then choose the Event Hub you’re exporting the logs to.
  5. Once it pulls some messages from the Event Hub, modify your data type from string to dynamic.
  6. Click “Next: Start Ingesting” to create the table, ingestion mapping, and the data connection to Event Hub!
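
Before moving on, it’s worth confirming that events are actually arriving. Assuming the raw table is named defenderraw as above, a quick query shows the incoming events and which event types the export is sending:

```kusto
// Peek at the most recent raw events
defenderraw
| take 10

// See which event types the export is sending
defenderraw
| summarize count() by Type = tostring(data.Type)
```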


Update Policies

Now that you have the raw data landing in ADX you probably want to fork that out to individual tables via update policies based on the type of event. Below we’ll walk through the steps for one of those event types, subAssessments.


Here are the high-level steps we’ll take:

  1. Create the KQL query to filter and parse the event type
  2. Create the destination table based on this query
  3. Create a function to use during the update policy
  4. Create the update policy on our newly created table


KQL Query

So let’s start with the KQL query. The first step is to filter down to only the event types that we want to see. We can use the data.Type property within the payload to do this:



defenderraw
| where data.Type == 'Microsoft.Security/assessments/subAssessments'



Now that we have the events filtered, we want to turn the properties in the data column (dynamic JSON) into individual columns. You could create individual columns using data.<field>, but there is an easier way: KQL has bag_unpack, which will do all the hard work for us.



defenderraw
| where data.Type == 'Microsoft.Security/assessments/subAssessments'
| evaluate bag_unpack(data)



This results in the JSON being flattened one level without you having to reference each property explicitly.
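
If you’d like to see how bag_unpack behaves without waiting for live data, here’s a small self-contained sketch using an inline datatable with a made-up record:

```kusto
// Hypothetical sample record; bag_unpack turns the top-level properties into columns
datatable(data: dynamic) [
    dynamic({"Type": "Microsoft.Security/assessments/subAssessments", "Id": "1", "Name": "demo"})
]
| evaluate bag_unpack(data)
// The single dynamic column becomes three typed columns: Type, Id, and Name
```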


Before creating a new table based on the query, let’s check the data types of the columns and determine which columns we want.



defenderraw
| where data.Type == 'Microsoft.Security/assessments/subAssessments'
| evaluate bag_unpack(data)
| getschema






The final step is to select only the columns that we want:



defenderraw
| where data.Type == 'Microsoft.Security/assessments/subAssessments'
| evaluate bag_unpack(data)
| project Type, Id, Name, Properties, SecurityEventDataEnrichment, SubAssessmentEventDataEnrichment, TenantId



Create Table

Once you have the query that will be used in your update policy, there is a really simple method to create the table: take 0 rows from the query (just return the schema) and use that in a .set command.



.set subAssessments <|
defenderraw
| where data.Type == 'Microsoft.Security/assessments/subAssessments'
| evaluate bag_unpack(data)
| project Type, Id, Name, Properties, SecurityEventDataEnrichment, SubAssessmentEventDataEnrichment, TenantId
| take 0



Running this will create a table called subAssessments with the correct schema.
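
You can confirm the new table’s schema matches what the query produced:

```kusto
// Inspect the schema of the newly created (empty) table
subAssessments
| getschema
```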


Create the Function

Now we’ll take this query, minus the take 0, and create a function.



.create function updateSubAssessments() {
    defenderraw
    | where data.Type == 'Microsoft.Security/assessments/subAssessments'
    | evaluate bag_unpack(data)
    | project Type, Id, Name, Properties, SecurityEventDataEnrichment, SubAssessmentEventDataEnrichment, TenantId
}



Now we have a function called “updateSubAssessments” that can be used in our update policy.
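
Before attaching it to an update policy, you can invoke the function directly to verify it parses the data as expected (run this way, it queries the full defenderraw table rather than just newly ingested records):

```kusto
// Run the function manually to sanity-check the parsing
updateSubAssessments()
| take 10
```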


Create the Update Policy

The last step is to create an update policy that will monitor the raw table, grab any subAssessments events, run the function, and place the results in our subAssessments table.



.alter table subAssessments policy update @'[{"IsEnabled": true, "Source": "defenderraw", "Query": "updateSubAssessments()", "IsTransactional": true, "PropagateIngestionProperties": false}]'



Once the update policy is in place any new “subAssessments” event type that lands in “defenderraw” will automatically be parsed and placed in the “subAssessments” table.
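
You can verify the policy was attached with:

```kusto
// Show the update policy on the destination table
.show table subAssessments policy update
```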

Follow these steps for any additional event types that you wish to parse from the “defenderraw” table.


Verify Results

You need to wait until at least one event of type “subAssessments” lands in the “defenderraw” table. Once events have landed in the raw table you should be able to query the “subAssessments” table to verify that parsed events are landing correctly.
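
For example, comparing the number of subAssessments events in the raw table against the parsed table is a quick sanity check (the counts may drift apart once the raw table’s shortened retention kicks in):

```kusto
// Events of this type currently in the raw table
defenderraw
| where data.Type == 'Microsoft.Security/assessments/subAssessments'
| count

// Parsed events in the destination table
subAssessments
| count
```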




Modify Retention Settings

Once you have the data forked to individual tables, you’ll want to adjust the retention setting on the raw table. You can go all the way down to 0 retention if you wish, and the update policies will still work, but my suggestion would be to keep a few hours so that you can troubleshoot or look at the raw data coming in if needed. The command below sets the raw table to 2 hours of retention:

.alter table defenderraw policy retention "{\"SoftDeletePeriod\": \"00.02:00:00\", \"Recoverability\": \"Enabled\"}"
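
To confirm the change took effect:

```kusto
// Show the retention policy now in force on the raw table
.show table defenderraw policy retention
```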


We’ve walked through the process of sending Microsoft Defender for Cloud logs to Azure Data Explorer using continuous export to Azure Event Hub. At a high level you need to configure:

  1. Continuous Export of data from Microsoft Defender for Cloud to Event Hub
  2. Data ingestion from Azure Event Hub to Azure Data Explorer
  3. An Azure Data Explorer update policy to fork the data into the appropriate tables

For the purposes of this article we walked through the manual steps, but all of this can be automated using scripts or a Bicep template.


Even the tables, functions, update policies, etc. can be created in the template using configuration KQL scripts.
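
As a sketch of what such a script could contain, here are the objects from this walkthrough expressed as idempotent commands. The subAssessments column types shown are assumptions for illustration; derive the real ones from the getschema output as shown earlier.

```kusto
// Raw landing table for the Event Hub data connection
.create-merge table defenderraw (data: dynamic)

// Destination table (column types are illustrative; verify against getschema)
.create-merge table subAssessments (Type: string, Id: string, Name: string, Properties: dynamic, SecurityEventDataEnrichment: dynamic, SubAssessmentEventDataEnrichment: dynamic, TenantId: string)

// Parsing function used by the update policy
.create-or-alter function updateSubAssessments() {
    defenderraw
    | where data.Type == 'Microsoft.Security/assessments/subAssessments'
    | evaluate bag_unpack(data)
    | project Type, Id, Name, Properties, SecurityEventDataEnrichment, SubAssessmentEventDataEnrichment, TenantId
}

// Attach the update policy and shorten raw-table retention
.alter table subAssessments policy update @'[{"IsEnabled": true, "Source": "defenderraw", "Query": "updateSubAssessments()", "IsTransactional": true}]'
.alter table defenderraw policy retention "{\"SoftDeletePeriod\": \"00.02:00:00\", \"Recoverability\": \"Enabled\"}"
```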


Version history
Last update: May 17 2022 02:32 PM