Learn how to seamlessly ingest custom application logs in Text/JSON format into Microsoft Sentinel for enhanced security monitoring and analysis.
This blog post is a continuation of my previous post, Demystifying Log Ingestion API, where I discussed ingesting custom log files into Microsoft Sentinel via the Log Ingestion API. In this post, I will delve into ingesting custom application logs in Text/JSON format into Microsoft Sentinel.
Note: For demo purposes, I will use a log in JSON format.
First, let's start with WHY this is important.
Many applications and services log information to Text/JSON files instead of standard logging services such as the Windows Event Log or Syslog. In many scenarios, monitoring these custom application logs is mandatory, which makes this integration a crucial part of SOC monitoring.
How to implement this integration?
Custom application logs in Text/JSON format can be collected with the Azure Monitor Agent and stored in a Log Analytics workspace alongside data collected from other sources. There are two ways to do it:
- Create a DCR-based custom table and link it with a Data Collection Rule (DCR) and a Data Collection Endpoint (DCE).
- Leverage the Custom logs via AMA content hub solution.
I will discuss both approaches in this blog.
Let’s see it in action now.
Leveraging DCR-based custom table to ingest custom application logs
Using this approach, we will first create a DCR-based custom table and link it with a DCR and a DCE.
Prerequisites for this approach:
- A Log Analytics workspace where you have at least contributor rights.
- A data collection endpoint (DCE) in the same region as the Log Analytics workspace. See How to set up data collection endpoints based on your deployment for details; a scripted sketch follows this list.
- Either a new or existing DCR described in Collect data with Azure Monitor Agent.
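If you prefer scripting the prerequisites, the DCE can be created with the Az.Monitor module. A minimal sketch, assuming hypothetical resource names and region:

```powershell
# Sketch: create a data collection endpoint in the same region as the workspace
# (resource group, name, and location are assumptions for illustration)
New-AzDataCollectionEndpoint -ResourceGroupName 'rg-sentinel' `
    -Name 'dce-custom-logs' `
    -Location 'eastus'
```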
Basic Operations:
The following diagram shows the basic operation of collecting log data from a JSON file; a sketch for generating such a file follows the list.
- The agent watches for any log files that match a specified name pattern on the local disk.
- Each entry in the log is collected and sent to Azure Monitor. The incoming stream defined by the user is used to parse the log data into columns.
- A default transformation is used if the schema of the incoming stream matches the schema of the target table.
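Before wiring anything up, it helps to see what the agent expects on disk. Here is a minimal sketch that appends test entries to the log file (the path and field names are assumptions; the 'Time' field matches the transformation used later in this post). Note that, per the JSON collection article in the references, each JSON entry must sit on a single line:

```powershell
# Test sketch: append single-line JSON entries to the watched log file
# (path and field names are assumptions; 'Time' feeds the transformation below)
$logPath = 'C:\Custom Application\v.1.0.json'
$entry = [ordered]@{
    Time    = (Get-Date).ToUniversalTime().ToString('o')  # ISO 8601 timestamp
    Message = 'Sample application event'
} | ConvertTo-Json -Compress   # -Compress keeps each entry on one line
Add-Content -Path $logPath -Value $entry
```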
Detailed steps are as follows:
- Browse to Log Analytics Workspace > Settings > Tables > New custom log (DCR-based)
- Enter the table name; note that the suffix _CL will be added automatically. Use an existing DCR or create a new one, and link a DCE.
- Upload a sample log file in JSON format to create the table schema.
- In my use case, I’ve created a few columns, such as TimeGenerated, FilePath, and Computer, using the transformation query below:
```kusto
source
| extend TimeGenerated = todatetime(Time),
         FilePath = tostring('C:\\Custom Application\\v.1.*.json'),
         Computer = tostring('DC-WinSrv22')
```
- Review and create the table.
- Go to the Data Collection Rule > Resources, add the Application Server, and link it with the DCE (this can also be scripted, as sketched below).
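A minimal sketch of that association, assuming placeholder resource IDs (note that parameter names vary across Az.Monitor versions; newer versions use -ResourceUri and -DataCollectionRuleId instead):

```powershell
# Sketch: link the DCR to the application server VM (IDs are placeholders)
$vmId  = '/subscriptions/{subscription-id}/resourceGroups/rg-sentinel/providers/Microsoft.Compute/virtualMachines/DC-WinSrv22'
$dcrId = '/subscriptions/{subscription-id}/resourceGroups/rg-sentinel/providers/Microsoft.Insights/dataCollectionRules/dcr-custom-json'
New-AzDataCollectionRuleAssociation -TargetResourceId $vmId -AssociationName 'dcr-association' -RuleId $dcrId
```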
If all configurations are correct, the data should populate in the custom table within a few minutes, as shown below:
Note: Ensure that the Application Server is reporting to the correct Log Analytics workspace and that the DCR and DCE are linked to the server.
The DCRs associated with a VM can be retrieved with the following PowerShell command:
```powershell
Get-AzDataCollectionRuleAssociation -TargetResourceId {ResourceID}
```
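For example, the target resource ID can be pulled straight from the VM object (the resource group and VM name are assumptions):

```powershell
# Example: look up the application server's resource ID and list its DCR associations
$vm = Get-AzVM -ResourceGroupName 'rg-sentinel' -Name 'DC-WinSrv22'
Get-AzDataCollectionRuleAssociation -TargetResourceId $vm.Id
```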
Please note that the 'Custom JSON log' data source configuration is currently unavailable through the portal; use the Azure CLI or an ARM template instead. The 'Custom Text Logs' data source, however, can be configured from the Azure portal (DCR > Data Sources).
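For reference, here is a trimmed sketch of the 'logFiles' data source as it appears in a DCR ARM template, held in a PowerShell here-string (the name, stream, and file pattern are assumptions; see the JSON collection article in the references for the full template):

```powershell
# Sketch: DCR ARM template fragment for a JSON log file data source
# (name, stream, and file pattern are assumptions; embed this in a full DCR
#  template and deploy it, e.g. with New-AzResourceGroupDeployment)
$jsonDataSource = @'
"dataSources": {
    "logFiles": [
        {
            "name": "CustomJsonLogs",
            "format": "json",
            "streams": [ "Custom-CustomAppLogs_CL" ],
            "filePatterns": [ "C:\\Custom Application\\v.1.*.json" ]
        }
    ]
}
'@
```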
Leveraging Custom logs via AMA Data Connector
We’ve recently released a content hub solution for ingesting custom logs via AMA. This approach is straightforward, as the required columns, such as TimeGenerated and RawData, are created automatically.
Detailed steps are as follows:
- Browse to Microsoft Sentinel > Content Hub > Custom Logs AMA and install the solution.
- Go to Manage > Open the connector page > Create Data Collection Rule
- Enter the rule name and target VM, and specify whether you wish to create a new table; if so, provide a table name. You’ll also need to provide the file pattern (wildcards are supported) along with transformation logic, if applicable. In my use case, I am not using any transformation.
- Once the DCR is created, wait a few minutes and validate that logs are streaming.
If all the configurations are correct, you’ll see the logs in the table as shown below:
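You can also validate from PowerShell with a quick query against the new table. A minimal sketch, assuming a hypothetical table name and your workspace GUID:

```powershell
# Quick check: pull a few recent rows from the custom table
# ('CustomAppLogs_CL' and the workspace GUID are assumptions for illustration)
$result = Invoke-AzOperationalInsightsQuery -WorkspaceId '{workspace-guid}' `
    -Query 'CustomAppLogs_CL | take 10'
$result.Results | Format-Table
```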
Please note that since we have used DCR-based custom tables, we can switch the table plan to Basic if needed.
Additionally, DCR-based custom tables support transformations, so irrelevant data can be dropped or incoming data can be split across multiple tables, as sketched below.
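As an illustration, a dataFlow inside the DCR can carry a transformKql that drops noisy rows before ingestion. A minimal sketch, assuming a hypothetical Level column and stream name:

```powershell
# Sketch: a DCR dataFlow fragment whose transformKql drops debug-level rows
# ('Level', the stream name, and the destination are assumptions for illustration)
$dataFlow = @'
"dataFlows": [
    {
        "streams": [ "Custom-CustomAppLogs_CL" ],
        "destinations": [ "myWorkspace" ],
        "transformKql": "source | where Level != 'Debug'",
        "outputStream": "Custom-CustomAppLogs_CL"
    }
]
'@
```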
References:
Collect logs from a JSON file with Azure Monitor Agent - Azure Monitor | Microsoft Learn
Demystifying Log Ingestion API | Microsoft Community Hub
Workspace & DCR Transformation Simplified | Microsoft Community Hub