Data can be sent to Log Analytics in several ways, depending on the type of data you want to ingest. For example, you can send custom logs or system information from your computers, or any other data relevant to your business.
Ways of Ingesting Data into Log Analytics
Azure Log Analytics supports several data ingestion methods, including:
1. Agent-Based Ingestion:
- Azure Monitor Agent (AMA) or Log Analytics Agent for system logs and metrics.
2. Diagnostic Settings:
- Directly stream Azure resource diagnostic logs to Log Analytics.
3. Data Collection Rules (DCR):
- Configure rules to filter, transform, and send data to Log Analytics.
4. Custom Logs:
- Upload custom log files for ingestion into specific Log Analytics tables.
5. Azure Monitor Data Sources:
- Integration with Azure services like App Insights or Azure Security Center.
6. Logs Ingestion API:
- Send custom data programmatically into Log Analytics via APIs.
In the example below, I demonstrate how to ingest Entra application details into Azure Log Analytics, including key information such as application name, application ID, and creation date, using the Logs Ingestion API with PowerShell.
Example: Ingesting Entra Application Data into Log Analytics Custom Table Using APIs
When you are not using a dedicated Data Collection Endpoint, you must use the DCR's own logs ingestion endpoint to connect via the API. Note that this endpoint only appears on the DCR when you create it through an ARM template rather than the Azure Portal's table wizard. Below is a step-by-step guide for ingesting Entra application data into a Log Analytics table.
Step 1: Create an Entra Application with the Application.Read.All API Permission
Step 2: Define and Create a Custom Table in Log Analytics
First, create a table in Log Analytics and define the required columns. For this example, we aim to ingest the following details:
- Application Name
- Application ID
- Application Created Date
- Publisher Domain
- Identifier URIs
PowerShell Script to Create the Table:
<#
Disclaimer
Test this thoroughly before using it in production. The sample code is for illustration only and provided "as is," without warranties or support.
#>
$tableParams = @'
{
    "properties": {
        "schema": {
            "name": "EntraApplications_CL",
            "columns": [
                { "name": "TimeGenerated", "type": "datetime", "description": "The time at which the data was generated" },
                { "name": "AppName", "type": "string", "description": "Entra Application Name" },
                { "name": "AppId", "type": "string", "description": "Entra Application ID" },
                { "name": "AppCreatedDate", "type": "datetime", "description": "Application Creation Date" },
                { "name": "PublisherDomain", "type": "string", "description": "Publisher Domain Details" },
                { "name": "IdentifierUris", "type": "string", "description": "Entra Application Identifier URIs" }
            ]
        }
    }
}
'@
Invoke-AzRestMethod -Path "/subscriptions/<subscription-id>/resourcegroups/<resource-group>/providers/microsoft.operationalinsights/workspaces/<workspace-name>/tables/EntraApplications_CL?api-version=2022-10-01" -Method PUT -Payload $tableParams
This script creates the EntraApplications_CL table with the specified schema.
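To confirm the table was created with the expected schema, you can read it back with a GET call before moving on (a quick sanity check; the placeholders are the same as above):

```powershell
# Retrieve the custom table definition and list its columns
$response = Invoke-AzRestMethod -Path "/subscriptions/<subscription-id>/resourcegroups/<resource-group>/providers/microsoft.operationalinsights/workspaces/<workspace-name>/tables/EntraApplications_CL?api-version=2022-10-01" -Method GET
($response.Content | ConvertFrom-Json).properties.schema.columns | Select-Object name, type
```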
Step 3: Create a Data Collection Rule (DCR)
ARM Template for Data Collection Rule:
Update the defaultValue entries in the template below (location, DCR name, and workspace resource ID) to match your environment:
{
    "$schema": "https://schema.management.azure.com/schemas/2019-08-01/deploymentTemplate.json#",
    "contentVersion": "1.0.0.0",
    "parameters": {
        "dataCollectionRuleName": { "type": "string", "defaultValue": "AADDCR" },
        "location": { "type": "string", "defaultValue": "Central India" },
        "workspaceResourceId": {
            "type": "string",
            "defaultValue": "/subscriptions/<subscription-id>/resourcegroups/<resource-group>/providers/microsoft.operationalinsights/workspaces/<workspace-name>"
        }
    },
    "resources": [
        {
            "type": "Microsoft.Insights/dataCollectionRules",
            "name": "[parameters('dataCollectionRuleName')]",
            "location": "[parameters('location')]",
            "apiVersion": "2023-03-11",
            "properties": {
                "streamDeclarations": {
                    "Custom-EntraApplicationsRAW": {
                        "columns": [
                            { "name": "TimeGenerated", "type": "datetime" },
                            { "name": "AppName", "type": "string" },
                            { "name": "AppId", "type": "string" },
                            { "name": "AppCreatedDate", "type": "datetime" },
                            { "name": "PublisherDomain", "type": "string" },
                            { "name": "IdentifierUris", "type": "string" }
                        ]
                    }
                },
                "destinations": {
                    "logAnalytics": [
                        {
                            "workspaceResourceId": "[parameters('workspaceResourceId')]",
                            "name": "signinlogs-la"
                        }
                    ]
                },
                "dataFlows": [
                    {
                        "streams": [ "Custom-EntraApplicationsRAW" ],
                        "destinations": [ "signinlogs-la" ],
                        "transformKql": "source | project TimeGenerated, AppName, AppId, AppCreatedDate, PublisherDomain, IdentifierUris",
                        "outputStream": "Custom-EntraApplications_CL"
                    }
                ]
            }
        }
    ]
}
Deploy the template using the Azure Portal's Custom Template Deployment.
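If you prefer the command line, the same template can be deployed with Az PowerShell. This sketch assumes you have saved the template locally as dcr-template.json (a hypothetical file name):

```powershell
# Deploy the DCR ARM template into the resource group that will host the rule
New-AzResourceGroupDeployment `
    -ResourceGroupName "<resource-group>" `
    -TemplateFile ".\dcr-template.json" `
    -dataCollectionRuleName "AADDCR"
```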
Step 4: Review DCR
Open the JSON view of the DCR to find the logs ingestion endpoint and the immutableId. We will use both in the script that ingests the data.
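Instead of copying these values from the JSON view by hand, you can read them programmatically (same placeholders as before; the DCR name AADDCR matches the template default):

```powershell
# Read the DCR to capture the logs ingestion endpoint and the immutable ID
$dcr = (Invoke-AzRestMethod -Path "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Insights/dataCollectionRules/AADDCR?api-version=2023-03-11" -Method GET).Content | ConvertFrom-Json
$dcr.properties.endpoints.logsIngestion   # the DCR's logs ingestion endpoint URI
$dcr.properties.immutableId               # the immutable ID used in the ingestion URI
```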
Step 5: Assign Permissions to Your Application
Ensure the Entra application has been assigned the Monitoring Metrics Publisher role on the DCR so it can send data.
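The role can be assigned on the DCR scope with Az PowerShell; <app-object-id> below stands in for your application's service principal object ID (replace it with your own value):

```powershell
# Grant the app's service principal the Monitoring Metrics Publisher role on the DCR
New-AzRoleAssignment `
    -ObjectId "<app-object-id>" `
    -RoleDefinitionName "Monitoring Metrics Publisher" `
    -Scope "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Insights/dataCollectionRules/AADDCR"
```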
At this point, the table contains no data.
Step 6: Use the Script to Ingest Data
To ingest data, you’ll need two tokens:
- Graph API Token: For retrieving application data.
- Logs Ingestion API Token: For sending data to Log Analytics.
PowerShell Script for Data Ingestion:
<#
Disclaimer
Test this thoroughly before using it in production. The sample code is for illustration only and provided "as is," without warranties or support.
#>
$DcrURI = "https://<dcr-endpoint>.ingest.monitor.azure.com" # DCR Endpoint, highlighted in DCR review section
$tenantId = "<tenant-id>"
$appId = "<app-id>"
$appSecret = "<app-secret>"
$Table = "Custom-EntraApplicationsRAW"
$DcrImmutableId = "<dcr-immutable-id>" # As highlighted in the DCR review section
# Function to Retrieve Access Token (for Microsoft Graph)
function Get-AccessToken {
    Write-Host "Obtaining Azure AD bearer token for Microsoft Graph..." -ForegroundColor Cyan
    $tokenUri = "https://login.microsoftonline.com/$tenantId/oauth2/v2.0/token"
    $body = @{
        client_id     = $appId
        client_secret = $appSecret
        scope         = "https://graph.microsoft.com/.default"
        grant_type    = "client_credentials"
    }
    $headers = @{ "Content-Type" = "application/x-www-form-urlencoded" }
    try {
        $response = Invoke-RestMethod -Uri $tokenUri -Method Post -Body $body -Headers $headers
        return $response.access_token
    }
    catch {
        Write-Host "Failed to retrieve bearer token for Microsoft Graph: $_" -ForegroundColor Red
        exit
    }
}
function Get-AllApplications {
    $applications = @()
    $url = "https://graph.microsoft.com/v1.0/applications"
    $headers = @{
        "Authorization" = "Bearer $graphAccessToken"
        "Content-Type"  = "application/json"
    }
    do {
        try {
            # Fetch the current page of applications
            $response = Invoke-RestMethod -Uri $url -Method Get -Headers $headers
            if ($response.value) {
                # Add this page's applications to the collection
                $applications += $response.value
            }
            # Follow the next page link if one exists
            $url = $response.'@odata.nextLink'
        } catch {
            Write-Host "Failed to fetch applications: $_" -ForegroundColor Red
            exit
        }
    } while ($url)
    # Return the complete list of applications
    return $applications
}
# Retrieve the Graph token and the application list
$graphAccessToken = Get-AccessToken
$applications = Get-AllApplications
Write-Host "Total applications retrieved: $($applications.Count)" -ForegroundColor Green
# Transform Applications Data for Logs Ingestion API
Write-Host "Transforming application data for ingestion..." -ForegroundColor Cyan
$payload = @()
foreach ($app in $applications) {
    $payload += @{
        TimeGenerated   = ([datetime]::UtcNow).ToString("o")
        AppName         = $app.displayName
        AppId           = $app.appId
        AppCreatedDate  = $app.createdDateTime
        PublisherDomain = $app.publisherDomain
        IdentifierUris  = ($app.identifierUris -join ",")
    }
}
# Exit if there is no data to ingest
if ($payload.Count -eq 0) {
    Write-Host "No applications found. Exiting..." -ForegroundColor Yellow
    exit
}
Add-Type -AssemblyName System.Web # required for [System.Web.HttpUtility]
$logIngestionScope = [System.Web.HttpUtility]::UrlEncode("https://monitor.azure.com//.default")
$logIngestionBody = "client_id=$appId&scope=$logIngestionScope&client_secret=$appSecret&grant_type=client_credentials"
$logIngestionHeaders = @{ "Content-Type" = "application/x-www-form-urlencoded" }
$tokenUri = "https://login.microsoftonline.com/$tenantId/oauth2/v2.0/token"
try {
    $logIngestionToken = (Invoke-RestMethod -Uri $tokenUri -Method Post -Body $logIngestionBody -Headers $logIngestionHeaders).access_token
    Write-Host "Bearer token for Logs Ingestion API retrieved successfully." -ForegroundColor Green
}
catch {
    Write-Host "Failed to retrieve bearer token for Logs Ingestion API: $_" -ForegroundColor Red
    exit
}
# Send Data to Log Analytics (using the correct token and body)
Write-Host "Sending data to Log Analytics via Logs Ingestion API..." -ForegroundColor Cyan
$ingestionBody = ConvertTo-Json -InputObject $payload -Depth 10 # -InputObject keeps the JSON an array even when only one app is returned
$ingestionHeaders = @{
    "Authorization" = "Bearer $logIngestionToken"
    "Content-Type"  = "application/json"
}
$ingestionUri = "$DcrURI/dataCollectionRules/$DcrImmutableId/streams/${Table}?api-version=2023-01-01"
try {
    $uploadResponse = Invoke-RestMethod -Uri $ingestionUri -Method Post -Body $ingestionBody -Headers $ingestionHeaders
    Write-Host "Data successfully pushed to Log Analytics custom table: $Table" -ForegroundColor Green
    Write-Output $uploadResponse
}
catch {
    Write-Host "Failed to push data to Log Analytics: $_" -ForegroundColor Red
}
You can now view the data in your Log Analytics workspace.
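Once ingestion succeeds, you can confirm the rows arrived from PowerShell as well (ingestion into a new custom table can take a few minutes to appear):

```powershell
# Query the custom table for the most recent rows
$ws = Get-AzOperationalInsightsWorkspace -ResourceGroupName "<resource-group>" -Name "<workspace-name>"
$result = Invoke-AzOperationalInsightsQuery -Workspace $ws -Query "EntraApplications_CL | order by TimeGenerated desc | take 10"
$result.Results | Format-Table AppName, AppId, AppCreatedDate
```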
Conclusion
This process helps you easily send custom data to Azure Log Analytics. By using Data Collection Rules and the Logs Ingestion API, you can manage and monitor your data efficiently.
Updated Apr 01, 2025 · Version 1.0
harjsing, Microsoft
Core Infrastructure and Security Blog