KQL
297 Topics

Need to create monitoring queries to track the health status of data connectors
I'm working with Microsoft Sentinel and need to create monitoring queries to track the health status of data connectors. Specifically, I want to:
- Identify unhealthy or disconnected data connectors
- Determine when a data connector last lost connection
- Get historical connection status information

What I'm looking for:
- A KQL query that can be run in the Sentinel workspace to check connector status, OR a PowerShell script/command that can retrieve this information
- Ideally, something that can be automated for regular monitoring

What I've tried so far:
- Looking at the SentinelHealth table, but I'm unsure about the exact schema, connector names, etc.
- Checking if there are specific tables that track connector status changes
- Using Azure Resource Graph or management APIs

I've tried multiple approaches (KQL, PowerShell, Resource Graph), however I somehow cannot get the information I'm looking to obtain. For example, I see this Microsoft docs page, https://learn.microsoft.com/en-us/azure/sentinel/monitor-data-connector-health#supported-data-connectors, however I would like my query to show data such as:
- Last ingestion time of tables
- How much data has been ingested by specific tables and connectors
- Which connectors are currently connected
- The health of my connectors

Please help.
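A minimal sketch of the kind of query being asked about, assuming the SentinelHealth table has been enabled in the workspace's health and auditing settings (column names follow the documented schema, but verify them in your own workspace):

// Latest reported health status per data connector
SentinelHealth
| where TimeGenerated > ago(7d)
| where SentinelResourceType == "Data connector"
| summarize arg_max(TimeGenerated, Status, Description) by SentinelResourceName
| project SentinelResourceName, LastReportTime = TimeGenerated, Status, Description

// Approximate last ingestion time and billable volume per table, from the Usage table
Usage
| where TimeGenerated > ago(7d)
| summarize LastIngestion = max(TimeGenerated), VolumeMB = sum(Quantity) by DataType
| order by LastIngestion desc

The first query could be scheduled (for example from a Logic App or an analytics rule) for regular monitoring; the second only approximates per-table ingestion and does not map volumes back to individual connectors.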
ASIM built-in functions in Sentinel, are they updated automatically?

Are the ASIM built-in functions in Sentinel automatically updated? For example, the built-in parsers such as those for DNS, NetworkSession, and WebSession. Do the built-in ones receive automatic updates, or will the workspace-deployed versions of these parsers be the most up to date? And if so, would it be recommended to use the workspace-deployed version of parsers that already come built in?
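For context, a small illustration of the naming convention from the ASIM documentation, assuming built-in unifying parsers are prefixed with _Im_ and workspace-deployed ones with im:

// Built-in unifying DNS parser, shipped with the workspace and maintained by Microsoft
_Im_Dns(starttime=ago(1h))
// Workspace-deployed equivalent, only present if you deployed the ASIM parsers from GitHub
// imDns(starttime=ago(1h))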
Device tables are not ingesting data for an org's workspace

Device tables are not ingesting data for an org's workspace. I can confirm that all devices are enrolled and onboarded to MDE (Microsoft Defender for Endpoint). I placed an EICAR file on one of the machines, which brought an alert through to Sentinel, however this did not populate any of the device-related tables.

[Screenshots: the workspace I am targeting, and a workspace from another org with the tables enabled and ingesting data]

The Microsoft Defender XDR connector shows as connected, however the tables do not seem to be ingesting data. I run the following:

DeviceEvents | where TimeGenerated > ago(15m) | top 20 by TimeGenerated
DeviceProcessEvents | where TimeGenerated > ago(15m) | top 20 by TimeGenerated

I receive no results, only the message "No results found from the specified time range. Try selecting another time range."

Please assist, as I cannot think where this is failing.
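A quick check that could help narrow down where this is failing, assuming the standard Defender XDR table names (isfuzzy avoids errors for tables that do not yet exist in the workspace):

// Last record and row count per Defender device table over the past 7 days
union isfuzzy=true DeviceEvents, DeviceProcessEvents, DeviceNetworkEvents, DeviceLogonEvents
| where TimeGenerated > ago(7d)
| summarize LastRecord = max(TimeGenerated), RecordCount = count() by Type

If this returns nothing over a longer window as well, the gap is more likely in the connector's table selection or licensing than in the 15-minute query itself.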
Fetching alerts from Sentinel using Logic Apps

Hello everyone, I have a requirement to archive alerts from Sentinel. To do that I need to do the following:
- Retrieve the alerts from Sentinel
- Send the data to an external file share

As a solution, I decided to proceed with Logic Apps, where I will be running a script to automate this process. My questions are the following:
-> Which API endpoints in Sentinel are relevant to retrieve alerts or to run KQL queries to get the needed data?
-> I know that I will need some sort of permissions to interact with the API endpoint. What type of service account inside Azure should I create, and what permissions should I provision to it?
-> Are there any existing examples of Logic Apps interacting with Microsoft Sentinel? That would be helpful for me as I am new to Azure.

Any help is much appreciated!
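One hedged option for the retrieval step: rather than calling the management API directly, a Logic App can use the Azure Monitor Logs connector's "Run query and list results" action against the workspace and archive the output. A minimal sketch of the KQL such an action could run (the SecurityAlert table holds the alerts Sentinel generates; adjust the window to match the Logic App's recurrence):

SecurityAlert
| where TimeGenerated > ago(24h)
| project TimeGenerated, SystemAlertId, AlertName, AlertSeverity, Status, ProviderName, Entities

The identity the Logic App runs as (for example its managed identity) would need at least Log Analytics Reader on the workspace for this approach.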
Cannot access aka.ms/lademo

Hello team, I am Nikolas. I am learning KQL for Microsoft Sentinel. As far as I know, we can access aka.ms/lademo for demo data. However, I cannot access the demo. I tried using a VPN and accessing the page from several other devices with different IP addresses and different accounts, but it does not work. Can you help confirm whether this link is still accessible? I could access the resource last week, but not this week. I am looking forward to hearing from you.
Need some KQL for DNS

I need a few KQL queries for the use cases below, using the _Im_Dns and ASimDnsActivityLogs tables (sketches for two of these follow the list):

- Monitor DNS for Brand Abuse - This search looks for DNS requests for faux domains similar to the domains that you want to have monitored for abuse.
- DNS Query Length with high standard deviation - This analytic identifies DNS queries with unusually large lengths by computing the standard deviation of query lengths and filtering those exceeding twice the standard deviation. It leverages DNS query data from the Network_Resolution data model, focusing on the length of the domain names being resolved. This activity is significant as unusually long DNS queries can indicate data exfiltration or command-and-control communication attempts. If confirmed malicious, this activity could allow attackers to stealthily transfer data or maintain persistent communication channels within the network.
- Detect Long DNS TXT Record Response - This search is used to detect attempts to use DNS tunneling, by calculating the length of responses to DNS TXT queries. Endpoints using DNS as a method of transmission for data exfiltration, command and control, or evasion of security controls can often be detected by noting unusually large volumes of DNS traffic. Deprecated because this detection should focus on DNS queries instead of DNS responses.
- Large Volume of DNS ANY Queries - This analytic identifies a large volume of DNS ANY queries, which may indicate a DNS amplification attack. It leverages the Network_Resolution data model to count DNS queries of type "ANY" directed to specific destinations. This activity is significant because DNS amplification attacks can overwhelm network resources, leading to Denial of Service (DoS) conditions. If confirmed malicious, this activity could disrupt services, degrade network performance, and potentially be part of a larger Distributed Denial of Service (DDoS) attack, impacting the availability of critical infrastructure.
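Minimal sketches for the second and fourth use cases, assuming the _Im_Dns unifying parser and its ASIM field names (DnsQuery, DnsQueryTypeName, SrcIpAddr); the thresholds and lookback are placeholders to tune:

// DNS query length exceeding twice the standard deviation above the mean
let lookback = 1d;
let AvgLen = toscalar(_Im_Dns(starttime=ago(lookback)) | summarize avg(strlen(DnsQuery)));
let StdLen = toscalar(_Im_Dns(starttime=ago(lookback)) | summarize stdev(strlen(DnsQuery)));
_Im_Dns(starttime=ago(lookback))
| extend QueryLength = strlen(DnsQuery)
| where QueryLength > AvgLen + 2 * StdLen
| project TimeGenerated, SrcIpAddr, DnsQuery, QueryLength

// Large volume of DNS "ANY" queries per source, a possible amplification pattern
_Im_Dns(starttime=ago(1d))
| where DnsQueryTypeName == "ANY"
| summarize QueryCount = count() by SrcIpAddr, bin(TimeGenerated, 1h)
| where QueryCount > 200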
Query 2 workspaces from Power BI

I have a KQL query that is a union of 2 workspaces. This runs and I get the expected data. When I export the query to M language and run it from a Power BI dataflow, I get an authentication error against the second workspace. The credentials I'm using in Power BI are for the first workspace. Can this even be done, or is there a more robust method?
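For reference, a minimal sketch of the cross-workspace pattern this depends on (the table and WorkspaceB are placeholders; whichever identity Power BI authenticates with needs read access to both workspaces for the union to resolve):

union SecurityAlert, workspace("WorkspaceB").SecurityAlert
| summarize Alerts = count() by TenantId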
Ingest CEF logs into CommonSecurityLog with Logstash

Hello,

We are migrating to Sentinel from Splunk. For log ingestion we are using native data connectors where we can, and Logstash with the microsoft-sentinel-log-analytics-logstash-output-plugin for the rest. The reason behind the Logstash choice is that AMA only has a 10 GB buffer size, which is too small for our needs in the case of a connection drop.

I am working on getting logs in CEF format to Logstash and then into the CommonSecurityLog table. I have been following the instructions from this page to ingest logs in Syslog format with Logstash: https://learn.microsoft.com/en-us/azure/sentinel/connect-logstash-data-connection-rules

I was able to ingest logs into a custom table that way, but I now want to ingest the data into the CommonSecurityLog table. I have changed the DCR accordingly, but I only see the entries without the data or parsing.

The modified DCR is:

{
  "properties": {
    "immutableId": "dcr-1efc2494a966f2fc95f730e22",
    "dataCollectionEndpointId": "/subscriptions/xxxxxxxxxxxx/resourceGroups/rg_cybersecurity_sentinel_prod/providers/Microsoft.Insights/dataCollectionEndpoints/xxxxxxxx",
    "streamDeclarations": {
      "Custom-test_table_to_delete_CL": {
        "columns": [
          { "name": "message", "type": "string" },
          { "name": "event", "type": "dynamic" },
          { "name": "ls_timestamp", "type": "datetime" },
          { "name": "ls_version", "type": "string" }
        ]
      }
    },
    "dataSources": {},
    "destinations": {
      "logAnalytics": [
        {
          "workspaceResourceId": "/subscriptions/xxxxxxx/resourcegroups/xxxxxxxxx/providers/microsoft.operationalinsights/workspaces/xxxxxxxxxxx",
          "workspaceId": "558b9a62-adf5-4a0b-957e-e04d82719877",
          "name": "558b9a62adf54a0b957ee04d82719877"
        }
      ]
    },
    "dataFlows": [
      {
        "streams": [ "Custom-test_table_to_delete_CL" ],
        "destinations": [ "558b9a62adf54a0b957ee04d82719877" ],
        "transformKql": "source | project-away event, ls_timestamp, ls_version | project-rename CEF=message | extend TimeGenerated = todatetime(now())",
        "outputStream": "Microsoft-CommonSecurityLog"
      }
    ],
    "provisioningState": "Succeeded"
  },
  "location": "westus2",
  "id": "/subscriptions/xxxxxxxx/resourceGroups/xxxxxx/providers/Microsoft.Insights/dataCollectionRules/DCR_logs_ingestion",
  "name": "DCR_logs_ingestion",
  "type": "Microsoft.Insights/dataCollectionRules",
  "etag": "\"d1075091-0000-0800-0000-655509870000\"",
  "systemData": {
    "createdBy": "xxxxxxx",
    "createdByType": "User",
    "createdAt": "2023-11-08T20:46:49.8781378Z",
    "lastModifiedBy": "xxxxxx",
    "lastModifiedByType": "User",
    "lastModifiedAt": "2023-11-15T18:10:14.0501104Z"
  }
}

Am I missing something? Is this even possible? Thank you for your help.
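For comparison, a rough and untested sketch of a transformKql that splits the CEF header into the columns CommonSecurityLog expects, instead of renaming the whole raw message into a single CEF column. The field positions assume a standard "CEF:0|Vendor|Product|Version|EventClassID|Name|Severity|extensions" header, and you should verify the functions used here against the list of KQL operators the DCR transformation engine supports:

source
| project-away event, ls_timestamp, ls_version
| extend p = split(message, "|")
| extend DeviceVendor = tostring(p[1]), DeviceProduct = tostring(p[2]), DeviceVersion = tostring(p[3]),
         DeviceEventClassID = tostring(p[4]), Activity = tostring(p[5]), LogSeverity = tostring(p[6]),
         AdditionalExtensions = tostring(p[7])
| project-away message, p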
Analytic Rule Auditing with KQL

I'm looking to audit enabled analytic rules and perform transformations on the data using KQL, and I'm wondering if this is possible. I know that the API can be used to list the enabled analytic rules in the Log Analytics workspace:

GET https://management.azure.com/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.OperationalInsights/workspaces/{workspaceName}/providers/Microsoft.SecurityInsights/alertRules?api-version=2023-02-01

Then, when using a workbook, we can use an Azure Resource Manager query to issue this same GET request and ingest the data into a workbook parameter. From there, the ingested parameter can be used within the workbook to filter the data with KQL, which is then presented within the workbook.

I'm wondering if there's a way to do something like this outside of the workbooks functionality. Is there a way to pull the information using the API, but then still use KQL to do what I want with the data? A use case example for this would be if I wanted to take the data, transform it, then export the output to another external application.

Thanks
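One pattern that would keep KQL in the picture outside of workbooks, offered only as a hedged sketch: have an automation (a Logic App or a script) call the alertRules API on a schedule and push the JSON response into a custom Log Analytics table, then transform and export from there with KQL. The table name SentinelAnalyticsRules_CL and the properties_s column below are hypothetical placeholders for whatever the ingestion pipeline actually creates:

// Hypothetical custom table populated from the alertRules API response
SentinelAnalyticsRules_CL
| where TimeGenerated > ago(1d)
| extend Rule = parse_json(properties_s)
| where tobool(Rule.enabled)
| project RuleName = tostring(Rule.displayName),
          Severity = tostring(Rule.severity),
          Tactics = tostring(Rule.tactics)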