In the lush grass of my backyard, Raven, my miniature Schnauzer, finds her bliss. She lies there, soaking in the sun’s rays, much like the data we export from Defender for Endpoint into Azure Storage. It sits there, in the cloud’s protective realm, unassuming yet essential. Raven, ever alert for critters that might encroach on her territory, is like the data that lies in wait—seemingly still but always ready to serve our needs.
This blog will lead you through the steps of moving crucial data to a secure spot, just as Raven seeks out her sunny haven. We’ll explore the technical aspects of data exportation, ensuring it’s accessible for future use, similar to Raven’s watchful relaxation in her favorite sunny spot. Join me in learning how our digital guardians keep our valuable information safe, offering a sense of security as dependable as Raven’s attentive repose in her cherished sunny patch.
Microsoft XDR is a comprehensive security solution that includes several component services, such as Microsoft Defender for Endpoint, Microsoft Defender for Identity, Microsoft Defender for Office 365, and Microsoft Defender for Cloud Apps.
These components work together to provide a comprehensive view of data across your organization, allowing for robust threat detection, investigation, and response. However, as data ages out in Microsoft XDR, the ability to conduct long-term historical analysis can be lost.
The data ingested from Microsoft XDR into the Log Analytics Workspace used by Microsoft Sentinel includes all incidents, along with their alerts, entities, and other relevant information. This data also includes alerts from Microsoft XDR’s component services such as Microsoft Defender for Endpoint, Microsoft Defender for Identity, Microsoft Defender for Office 365, and Microsoft Defender for Cloud Apps. The connector also lets you stream advanced hunting events from all of the above Defender components into the Log Analytics Workspace. This allows you to copy those components’ advanced hunting queries into Microsoft Sentinel, enrich Sentinel alerts with their raw event data for additional insight, and store the logs with increased retention in Log Analytics.
If you don’t use Sentinel, or you would like the ability to perform further processing, reporting, or investigation, you may want to export the security data from Microsoft Defender for Endpoint (MDE) to an external storage or analytics platform. One such platform is Azure Data Explorer (ADX), a fast and scalable data exploration service that lets you query and visualize the security data.
From a cost perspective, archiving data can lead to substantial savings. Storing large volumes of data in the Log Analytics Workspace used by Sentinel can be expensive, especially over the long term. By archiving aged data to ADX, you can move it to a more cost-effective storage solution while still retaining the ability to access and analyze it as needed. This approach allows you to balance the need for immediate access to recent data in the Log Analytics Workspace against the cost-effective, long-term storage of older data in an ADX cluster.
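As a sketch of how long-term retention can be controlled on the ADX side, each table can carry its own retention policy set via a management command. The table name `pbbergsXDR` matches the example table used later in this article, and the one-year window is purely illustrative:

```kql
// Keep archived XDR records queryable for one year (365d is an example value;
// pick a window that matches your compliance and cost requirements)
.alter-merge table pbbergsXDR policy retention softdelete = 365d
```

Setting this per table lets you keep hot, frequently queried tables on a shorter window while giving the archive table a much longer one.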
Once you ingest data into the Log Analytics Workspace used by Sentinel, you can retain that data for 90 days at no additional cost. You can also purchase interactive-tier retention beyond 90 days, for up to two full years. Therefore, depending on your specific needs and budget, Microsoft Sentinel could provide more flexibility in terms of data retention.
Therefore, archiving data from Microsoft XDR to ADX before it ages out is a strategic move that enhances your security posture and saves your organization money. It allows for in-depth, long-term analysis and helps optimize costs, making it a valuable practice for any organization committed to robust and cost-effective cybersecurity. Once data is archived, ADX can be used to interrogate it. ADX is a fast, fully managed data analytics service for real-time analysis on large volumes of data streaming from applications, websites, IoT devices, and more. It can be used to run complex queries and deliver insights on both real-time and historical data, making it a powerful tool for analyzing the archived data.
In this document, you will be guided through the process of exporting data from XDR to an Azure Event Hub and then connecting the logs to ADX. This strategy is worth considering for organizations looking to maximize their cybersecurity investments.
Before starting, please ensure you have the necessary prerequisites in place, including permissions to create the Event Hub and ADX resources described below and to configure the export settings in Defender for Endpoint.
Before data can be configured to be sent from Defender for Endpoint, an Event Hub needs to be set up.
Next, configure the data export settings in Defender for Endpoint. You can choose which data types to export and specify the destination to which the data will be shipped.
To configure the data export settings, follow these steps:
The Streaming API definition should now appear.
Connect to https://dataexplorer.azure.com (This is the ADX cluster portal).
An ADX cluster needs to be created so the data in transit has a storage location.
The new cluster should now be built.
Connect the previously created Event Hub to the ADX Cluster/Database.
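As a minimal sketch of the table setup behind that connection: the target database needs a table (and, for JSON data, an ingestion mapping) to receive the events. The table and mapping names below (`pbbergsXDR`, `XDRRecordsMapping`) are illustrative, but the single dynamic `records` column matches what the schema inspection later in this article shows:

```kql
// Destination table with a single dynamic column to hold each exported record
.create table pbbergsXDR (records: dynamic)

// JSON ingestion mapping: route the "records" property of each incoming event
// into that column (the streaming export wraps events in a "records" array)
.create table pbbergsXDR ingestion json mapping 'XDRRecordsMapping'
    '[{"column":"records","path":"$.records","datatype":"dynamic"}]'
```

When creating the data connection in the portal, you would then reference this table and mapping so ADX knows where and how to land the Event Hub payloads.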
Let’s take a look at the ADX cluster and its configuration.
The “Databases” defined for this cluster are now displayed.
If you click on the “Data connections” tab, it will display the connection to the Event Hub that was just completed.
If you click on the name of the Event Hub, it will display the connection configuration.
Click on the “Query” icon in the upper-left corner.
```kql
.show database ADXDatabase schema
```
You will notice there is only 1 column, and it is named “records”.
If you would like to expand the single “records” column, you can parse and expand the data. For example, to get a summary of all categories:
```kql
pbbergsXDR
| extend records = parse_json(records)
| mv-expand records
| extend category = tostring(records.category)
| summarize count() by category
```
To take a look at ALL the “Alerts”, the following is some example code. The script below is designed to process and analyze data from a table named ‘pbbergsXDR’. The data in this table comes from the XDR export and has only a single column, named ‘records’, which contains JSON-formatted data. The script begins by parsing this JSON data into a more manageable format using the parse_json() function.
The mv-expand function is then used to transform each item in the ‘records’ column into a separate row. This is useful when the ‘records’ column contains arrays of data, as it allows each item in the array to be analyzed individually.
Next, the script extracts various fields from each record, such as ‘category’, ‘time’, ‘tenantId’, ‘operationName’, and several properties. These fields are converted to strings for easier processing and analysis. Notably, the ‘time’ field is converted to a datetime format, adjusted from UTC to Central Standard Time (CST), and then formatted as ‘MM-dd-yyyy hh:mm:ss tt’.
The script then filters the records to include only those where the ‘category’ is ‘AdvancedHunting-AlertInfo’.
Next, the script assigns a numeric value to each severity level using the case() function. This is done to facilitate sorting of the records based on severity level. The records are then sorted first by severity (from high to low) and then by time (from newest to oldest).
Finally, the script reorders the columns to have ‘time’, ‘Title’, ‘Category’, ‘Severity’, ‘ServiceSource’, ‘DetectionSource’, and ‘AttackTechniques’ at the beginning. It also removes the ‘SeverityOrder’, ‘records’, and ‘tenantId’ columns from the output.
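Putting the steps above together, a reconstruction of that query might look like the following. The property paths under `records.properties`, the helper column names, and the fixed UTC-6 offset used for CST are assumptions based on the description; adjust them to your actual schema and time zone:

```kql
pbbergsXDR
| extend records = parse_json(records)
| mv-expand records
| extend category = tostring(records.category),
         tenantId = tostring(records.tenantId),
         operationName = tostring(records.operationName)
// Keep only the alert-info records
| where category == "AdvancedHunting-AlertInfo"
// Convert to datetime and shift UTC to CST (fixed -6h offset; ignores daylight saving)
| extend EventTime = todatetime(records.time) - 6h
| extend time = format_datetime(EventTime, 'MM-dd-yyyy hh:mm:ss tt')
| extend Title = tostring(records.properties.Title),
         Category = tostring(records.properties.Category),
         Severity = tostring(records.properties.Severity),
         ServiceSource = tostring(records.properties.ServiceSource),
         DetectionSource = tostring(records.properties.DetectionSource),
         AttackTechniques = tostring(records.properties.AttackTechniques)
// Numeric severity so the sort runs High > Medium > Low > anything else
| extend SeverityOrder = case(Severity == "High", 3,
                              Severity == "Medium", 2,
                              Severity == "Low", 1, 0)
| sort by SeverityOrder desc, EventTime desc
| project-reorder time, Title, Category, Severity, ServiceSource, DetectionSource, AttackTechniques
| project-away SeverityOrder, records, tenantId, EventTime
```

Sorting is done on the `EventTime` datetime (before it is dropped) rather than on the formatted `time` string, since a ‘MM-dd-yyyy’ string would not sort chronologically.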
The result of running this script is a table of data where each row represents a record from the ‘pbbergsXDR’ table that meets the specified criteria. The data is sorted by severity and time, and the columns are ordered in a specific way. This makes it easier to analyze the data and draw meaningful conclusions. For example, you can easily see which alerts are the most severe and when they occurred. You can also see the title, category, service source, detection source, and attack techniques for each alert, which can provide valuable insights into the nature of the alerts.
Note: Special thanks to Seyed Amirhossein Nouraie for help with this article.