Splitting single-tenant Microsoft Defender XDR Sentinel logs in multiple company scenarios
This article describes a simple yet effective solution to the problem of segregating Microsoft Defender XDR and Entra ID log ingestion into Microsoft Sentinel in a single-tenant, multi-company scenario, leveraging Log Analytics workspace transformations and a few simple KQL statements.

Automating Export of Microsoft Sentinel Analytic Rules Mapped to MITRE Tactics & Techniques
1) Introduction

In modern Security Operations Centers (SOCs), mapping detections to the MITRE ATT&CK framework is critical. MITRE ATT&CK provides a structured, globally recognized model of adversary behavior, categorized into Tactics (goals) and Techniques (methods). Microsoft Sentinel analytic rules frequently include MITRE mappings, but viewing or exporting these at scale isn’t straightforward within the portal. Security teams often need:

• A centralized view of existing detections mapped to MITRE ATT&CK
• A CSV export for reporting, audits, and threat coverage assessments
• Insights for SOC maturity, gap analysis, and threat-informed defense

Having an automated way to extract this information ensures accuracy, consistency, and faster operational insights — all essential for high-performing SOCs.

2) Why This Script Is Required

While Sentinel analytic rules individually display their MITRE mappings, organizations typically need a workspace-wide export for:

Detection Coverage & Gap Analysis
• Identify which Tactics & Techniques are covered
• Highlight missing ATT&CK areas
• Support threat modelling and purple team exercises

Security Operations Reporting
• Governance and oversight meetings
• Compliance documentation
• SOC KPI reporting and dashboards

Detection Engineering Lifecycle
• Maintaining a detection catalogue
• Versioning and documentation
• Supporting change management and audits

Exporting rules into a CSV using automation avoids manual errors, saves analyst time, and ensures accurate, up-to-date data.

3) Script to Export Microsoft Sentinel Analytic Rules with MITRE Tactics & Techniques (TO BE RUN FROM AZURE CLI)

Important: This Bash script must be executed from Azure CLI, either in Azure Cloud Shell or on a workstation/server with Azure CLI installed (Linux, macOS, or WSL on Windows). The script uses az rest and jq to pull analytic rules and generate a CSV containing MITRE Tactics, Techniques, Severity, Enabled state, and the KQL query.
Bash Script (Run in Azure CLI)
------------------------------------------------------------------------------------------------------------------------
#!/bin/bash

# Variables
export SUB="99005f96-e572-4035-b476-836fd9d83d64"
export RG="CyberSOC"
export WS="CyberSOC"
API="2024-03-01"

# Step 1: Set subscription
az account set --subscription "$SUB"

# Step 2: Fetch alert rules
echo "Fetching alert rules..."
az rest \
  --method GET \
  --uri "https://management.azure.com/subscriptions/$SUB/resourceGroups/$RG/providers/Microsoft.OperationalInsights/workspaces/$WS/providers/Microsoft.SecurityInsights/alertRules?api-version=$API" \
  --output json > rules.json

# Step 3: Validate file
if [ ! -s rules.json ]; then
  echo "Error: rules.json is empty or missing."
  exit 1
fi
echo "Total rules found:"
jq '.value | length' rules.json

# Step 4: Generate CSV with MITRE mapping, severity, enabled
echo "Generating CSV..."
jq -r '
  # Map Sentinel tactic names to MITRE ATT&CK tactic IDs (unknown names pass through unchanged)
  {"Reconnaissance":"TA0043","ResourceDevelopment":"TA0042","InitialAccess":"TA0001",
   "Execution":"TA0002","Persistence":"TA0003","PrivilegeEscalation":"TA0004",
   "DefenseEvasion":"TA0005","CredentialAccess":"TA0006","Discovery":"TA0007",
   "LateralMovement":"TA0008","Collection":"TA0009","Exfiltration":"TA0010",
   "CommandAndControl":"TA0011","Impact":"TA0040"} as $taMap |
  (["RuleName","Tactics","Techniques","MITRE_Map","Severity","Enabled","Query"] | @csv),
  ( .value[]
    | select(.kind == "Scheduled")
    | . as $r
    | ($r.properties // {}) as $p
    | [
        ($p.displayName // $r.name // "N/A"),
        (($p.tactics // $p.attackTactics // []) | join(";")),
        (($p.techniques // $p.attackTechniqueIds // []) | join(";")),
        ( (($p.tactics // []) | map($taMap[.] // .)) as $tacs
          | ($p.techniques // []) as $techs
          | ([$tacs[], $techs[]] | join(";")) ),
        ($p.severity // "N/A"),
        ($p.enabled | tostring),
        (($p.query // "") | gsub("\\r?\\n"; " "))
      ]
    | @csv
  )
' rules.json > Scheduled_Rules_TTP_Query.csv
------------------------------------------------------------------------------------------------------------------------

4) Summary

Mapping detections to MITRE ATT&CK is a cornerstone of threat-informed defense. This script simplifies the process of exporting Microsoft Sentinel analytic rules with their MITRE mappings into a CSV — enabling SOC teams to:

• Analyze coverage across ATT&CK
• Identify detection gaps
• Strengthen red–blue team collaboration
• Build dashboards and ATT&CK heatmaps
• Enhance SOC governance & reporting

Automating this export ensures faster insights, reduces manual workload, and supports a mature detection engineering program.
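Once the CSV exists, the coverage and gap analysis described above takes only a few lines of scripting. Below is a minimal Python sketch (the sample rows are hypothetical, but follow the column layout the export script emits, with tactics semicolon-separated in one cell) that tallies enabled rules per tactic:

```python
import csv
from collections import Counter
from io import StringIO

# Hypothetical sample data matching the exported CSV layout.
sample = '''"RuleName","Tactics","Techniques","MITRE_Map","Severity","Enabled","Query"
"Rule A","InitialAccess;Execution","T1078;T1059","TA0001;TA0002;T1078;T1059","High","true","SigninLogs | take 1"
"Rule B","Execution","T1059","TA0002;T1059","Medium","true","SecurityEvent | take 1"
"Rule C","","","","Low","false","Heartbeat | take 1"
'''

def tactic_coverage(csv_text):
    """Count enabled rules per MITRE tactic; rules without a mapping count as 'Unmapped'."""
    counts = Counter()
    for row in csv.DictReader(StringIO(csv_text)):
        if row["Enabled"].lower() != "true":
            continue  # only active detections contribute to coverage
        tactics = [t for t in row["Tactics"].split(";") if t] or ["Unmapped"]
        for t in tactics:
            counts[t] += 1
    return counts

print(tactic_coverage(sample))
```

Tactics with a count of zero (or absent entirely) are candidates for the gap analysis and purple team exercises mentioned earlier; swapping `Tactics` for `Techniques` gives the same view at technique granularity.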
Automating Sentinel Triage with Microsoft Security Copilot 🤖🧠🛡️

We’re diving deep into the transformative world of AI-driven automation in cybersecurity. This session will explore how Microsoft Security Copilot, integrated with Logic Apps, can supercharge the triage process in Microsoft Sentinel.

💡 What you’ll take away:
✔️ Practical applications of AI in triage and incident response
✔️ How to reduce manual effort and operational costs
✔️ Innovative strategies to elevate efficiency in your SOC

Join us as we explore how cutting-edge AI reshapes security operations and empowers teams to focus on what matters most.

🗓️ Date: 29 September 2025
⏰ Time: 17:00 (AEST)
🎙️ Speaker: Anthony Porter
📌 Topic: Automating Sentinel Triage with Microsoft Security Copilot

Optimizing Microsoft Sentinel: Resolving AMA-Induced Syslog & CEF Duplicates
2) Recommended Solutions

When collecting both Syslog and CEF logs from the same Linux collector using the Azure Monitor Agent (AMA) in Microsoft Sentinel, duplicate log entries can occur. These duplicates arise because the same event may be ingested through both the Syslog and CEF pipelines, leading to redundancy in the Log Analytics Workspace (LAW). The following solutions aim to eliminate or reduce duplicate log ingestion, ensuring that:

- CEF events are parsed correctly and only once.
- Syslog data remains clean and non-redundant.
- Storage and analytics efficiency is improved.
- Alerting and incident investigation are not skewed by duplicate entries.

Each option provides a different strategy based on your environment’s flexibility and configuration capabilities—from facility-level separation, to ingestion-time filtering, to daemon-side log routing.

Option 1: Facility Separation (Preferred)

Configure devices to emit CEF logs on a dedicated facility (for example, 'local4'), and adjust the Data Collection Rules (DCRs) so that the CEF stream includes only that facility, while the Syslog stream excludes it. This ensures CEF events are parsed once into 'CommonSecurityLog' and never land in 'Syslog'.
CEF via AMA DCR (include only the CEF facility):

{
  "properties": {
    "dataSources": {
      "syslog": [
        {
          "streams": ["Microsoft-CommonSecurityLog"],
          "facilityNames": ["local4"],
          "logLevels": ["*"],
          "name": "cefDataSource"
        }
      ]
    },
    "dataFlows": [
      {
        "streams": ["Microsoft-CommonSecurityLog"],
        "destinations": ["laDest"]
      }
    ]
  }
}

Syslog via AMA DCR (exclude the CEF facility):

{
  "properties": {
    "dataSources": {
      "syslog": [
        {
          "streams": ["Microsoft-Syslog"],
          "facilityNames": [
            "auth","authpriv","cron","daemon","kern","mail",
            "syslog","user","local0","local1","local2","local3",
            "local5","local6","local7"
          ],
          "logLevels": ["*"],
          "name": "syslogDataSource"
        }
      ]
    },
    "dataFlows": [
      {
        "streams": ["Microsoft-Syslog"],
        "destinations": ["laDest"]
      }
    ]
  }
}

Option 2: Ingest-time Transform (Drop CEF from Syslog)

If facility separation is not feasible, apply a transformation to the Syslog stream in the DCR so that any CEF-formatted messages are dropped during ingestion.

Syslog stream transformKql:

{
  "properties": {
    "dataFlows": [
      {
        "streams": ["Microsoft-Syslog"],
        "transformKql": "source | where not(SyslogMessage startswith 'CEF:')",
        "destinations": ["laDest"]
      }
    ]
  }
}

Option 3: Daemon-side Filtering/Rewriting (rsyslog/syslog-ng)

Filter or rewrite CEF messages before AMA sees them.
For example, route CEF messages to a dedicated facility using syslog-ng and stop further processing:

# Match CEF
filter f_cef { message("^CEF:"); };

# Send CEF to local5 and stop further processing
log {
  source(s_src);
  filter(f_cef);
  rewrite { set_facility(local5); };
  destination(d_azure_mdsd);
  flags(final);
};

3) Verification Steps with KQL Queries

Detect CEF messages that leaked into Syslog:

Syslog
| where TimeGenerated > ago(1d)
| where SyslogMessage startswith "CEF:"
| summarize count() by Computer
| order by count_ desc

Estimate the duplicate count across Syslog and CommonSecurityLog:

let sys = Syslog
| where TimeGenerated > ago(1d)
| where SyslogMessage startswith "CEF:"
| extend key = hash_sha256(SyslogMessage);
let cef = CommonSecurityLog
| where TimeGenerated > ago(1d)
| extend key = hash_sha256(RawEvent);
cef
| join kind=innerunique (sys) on key
| summarize duplicates = count()

Note: You should identify which RawEvent values are causing the duplicates.

3.1) Duplicate Detection Query Explained

This query helps quantify duplicate ingestion when both the Syslog and CEF connectors ingest the same events. It works as follows:

- Build the Syslog set (sys): filter the 'Syslog' table for the last day and keep only messages that start with 'CEF:'. Compute a SHA-256 hash of the entire message as a stable join key ("key").
- Build the CEF set (cef): filter the 'CommonSecurityLog' table for the last day and compute a SHA-256 hash of the 'RawEvent' field as the same-style join key.
- Join on the key: use 'join kind=innerunique' to find messages that exist in both sets (i.e., duplicates).
- Summarize: count the number of matching rows to get the duplicate total.

4) Common Pitfalls

- Multiple DCRs applied to the same collector VM with overlapping facilities/severities.
- CEF and Syslog using the same facility on sources, leading to ingestion on both streams.
- rsyslog/syslog-ng filters placed after AMA’s own configuration include (ensure your custom rules run before '10-azuremonitoragent.conf').

5) References

- Microsoft Learn: Ingest syslog and CEF messages to Microsoft Sentinel with AMA (https://learn.microsoft.com/en-us/azure/sentinel/connect-cef-syslog-ama)
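As a closing aside, the hash-join technique behind the verification query in section 3 can be illustrated offline. The Python sketch below (sample messages are hypothetical) mirrors the KQL logic: hash each message with SHA-256 to build a stable join key, then find CEF events whose key also appears in the Syslog set:

```python
import hashlib

# Hypothetical raw messages: one CEF event that was ingested via both
# pipelines, plus a plain syslog line that should not match.
syslog_msgs = [
    "CEF:0|Vendor|Product|1.0|100|Test|5|src=10.0.0.1",
    "sshd[123]: Accepted password for user",
]
cef_raw_events = [
    "CEF:0|Vendor|Product|1.0|100|Test|5|src=10.0.0.1",
]

def key(msg):
    # Same idea as hash_sha256() in the KQL: a stable per-message join key.
    return hashlib.sha256(msg.encode()).hexdigest()

# Mirror the innerunique join: keys from CEF-looking Syslog rows...
syslog_keys = {key(m) for m in syslog_msgs if m.startswith("CEF:")}
# ...matched against the keys of CommonSecurityLog raw events.
duplicates = [m for m in cef_raw_events if key(m) in syslog_keys]
print(len(duplicates))  # expected: 1
```

If the facility separation or ingest-time transform from section 2 is working, the duplicate count from this style of comparison should drop to zero.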