
Microsoft Sentinel Blog

Running KQL queries on Microsoft Sentinel data lake using API

Apr 14, 2026

Unlock automation, agents, and scalable insights

Co-Authors: Zeinab Mokhtarian Koorabbasloo and Matthew Lowe

As security data lakes become the backbone of modern analytics platforms, organizations need new ways to operationalize their data. While interactive tools and portals support data exploration, many real-world workflows increasingly require flexible programmatic access that enables automation, scale, and seamless integration.

By running KQL (Kusto Query Language) queries on Microsoft Sentinel data lake through APIs, you can embed analytics directly into automation workflows, background services, and intelligent agents, without relying on manual query execution.

In this post, we explore API-based KQL query execution, review the scenarios where it delivers the most value, and outline what you need to get started.

Why run KQL queries on Sentinel data lake via API?

Traditional query experiences, such as dashboards and query editors, are optimized for human interaction. APIs, on the other hand, are optimized for systems.

Running KQL through an API enables:

  • Automation-first analytics
  • Repeatable and scheduled insights
  • Integration with external systems and agents
  • Consistent query execution at scale

Instead of asking “How do I run this query?”, our customers are asking “How do I embed analytics into my workflow?”

Scenarios where API-based KQL queries add value

  1. Automated monitoring and alerting

SOC teams often want to continuously analyze data in their lake to detect anomalies, trends, or policy violations.

With API-based KQL execution, they can:

  • Run queries as part of automated workflows and playbooks
  • Evaluate query results programmatically
  • Trigger downstream actions such as alerts, tickets, or notifications

This turns KQL into a signal engine, not just an exploration tool.
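The "signal engine" pattern above can be sketched in a few lines. This is a minimal illustration, not code from the post: `send_alert` is a hypothetical stand-in for your alerting integration (a ticketing webhook, a Teams message, and so on), and the rows are stubbed.

```python
# Minimal sketch: evaluate KQL query results programmatically and trigger a
# downstream action when the detection logic matches.

def handle_query_results(rows, send_alert, threshold=0):
    """Alert when the query surfaced more rows than the threshold."""
    if len(rows) > threshold:
        send_alert(f"KQL detection matched {len(rows)} row(s)")
        return True
    return False

# Example with stubbed results standing in for an API response:
alerts = []
handle_query_results(
    [{"DeviceName": "host01", "RemoteIP": "203.0.113.7"}],
    alerts.append,
)
```

In a real playbook or service, `rows` would come from the API response and `send_alert` would create the ticket or notification.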

  2. Powering intelligent agents

AI agents require programmatic access to data lakes to retrieve timely, relevant context for decision making. Using KQL over an API allows agents to:

  • Dynamically query the data lake based on user intent or system context
  • Retrieve aggregated or filtered results on demand
  • Combine analytical results with reasoning and decision logic

In this model, KQL acts as the analytical retrieval layer, while the agent focuses on orchestration, reasoning, and action.

  3. Embedding analytics into business workflows

Many organizations want analytics embedded directly into CI/CD and operational pipelines. Instead of exporting data or duplicating logic, they can:

  • Run KQL queries inline via API
  • Use results as inputs to other systems
  • Keep analytics logic centralized and consistent

This reduces drift between “analytics code” and “application code.”

High-level flow: What happens when you run KQL via API

At a conceptual level, the flow looks like this:

  1. A client authenticates to Microsoft Sentinel data lake platform.
  2. The client submits a KQL query via an API.
  3. The query executes against data stored in the data lake.
  4. Results are returned in a structured, machine-readable format.
  5. The client processes or acts on the results.
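The five steps above can be sketched compactly. This is an illustrative skeleton, not the full example shown later in the post: `get_token` and `post` are injected stand-ins (for example, an MSAL token call and `requests.post`), while the endpoint URL is the one used throughout this post.

```python
# Compact sketch mapping the five conceptual steps to code.

def run_kql_via_api(get_token, post, query, db):
    token = get_token()                                # 1. authenticate
    payload = {"csl": query, "db": db}                 # 2. submit a KQL query
    response = post(                                   # 3. query executes in the lake
        "https://api.securityplatform.microsoft.com/lake/kql/v2/rest/query",
        json=payload,
        headers={"Authorization": f"Bearer {token}"},
    )
    return response                                    # 4. structured results returned
                                                       # 5. caller processes the results
```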

Prerequisites

To run KQL queries against the Sentinel data lake using APIs, you will need:

  • A user token or a service principal
  • Appropriate permissions to execute queries on the Sentinel data lake. Azure RBAC roles such as Log Analytics Reader or Log Analytics Contributor on the workspace are required.
  • Familiarity with KQL and API based query execution patterns

Scenario 1: Execute a KQL query via API within a Playbook

The following Microsoft Sentinel SOAR playbook example demonstrates how data in the Sentinel data lake can be used in automation. It uses a service principal to query the DeviceNetworkEvents logs in the Sentinel data lake to enrich an incident involving a device before taking action on it.

Within this playbook, the entities involved in the incident are retrieved, and queries are then executed against the Sentinel data lake to gain insights on each host involved. In this example, the API call retrieves events from the DeviceNetworkEvents table to surface network connections from the host where the remote IP originated outside the United States.

As this action does not have a gallery artifact within Azure Logic Apps, it must be built using the HTTP action offered in Logic Apps. This action requires the details for the API call as well as the authentication details used to run it. The step that executes the query calls the Sentinel data lake API as follows: POST https://api.securityplatform.microsoft.com/lake/kql/v2/rest/query. The service principal being used has read permissions on the Sentinel data lake that contains the relevant details and authenticates via Entra ID OAuth when making the API call.

NOTE: When using API calls to query Sentinel data lake, use 4500ebfb-89b6-4b14-a480-7f749797bfcd/.default as the scope/audience when retrieving a token for the service principal. This GUID is associated with the query service for Sentinel data lake.

The body of the query is the following:

{
    "csl": "DeviceNetworkEvents | where TimeGenerated >= ago(30d) | where DeviceName has '' | where ActionType in (\"ConnectionSuccess\", \"ConnectionAttempted\", \"InboundConnectionAccepted\") | extend GeoInfo = geo_info_from_ip_address(RemoteIP) | extend Country = tostring(GeoInfo.country), State = tostring(GeoInfo.state), City = tostring(GeoInfo.city) | where Country != 'United States' and RemoteIP !has '127.0.0.1' | project TimeGenerated, DeviceName, ActionType, RemoteIP, RemotePort, RemoteUrl, City, State, Country, InitiatingProcessFileName | order by TimeGenerated desc | top 2 by DeviceName",
    "db": "WORKSPACENAMEHERE-WORKSPACEIDHERE"
}

Within this body, the query and workspace are defined. "csl" is the query to run against the Sentinel data lake, and "db" identifies the Sentinel workspace/lake. The "db" value is formed by joining the workspace name and workspace ID; both values can be found on the workspace overview blade within Azure.
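As a small illustration (the workspace name and ID below are placeholders, following the format of the sample payloads in this post), the "db" value can be assembled like this:

```python
# Build the "db" value from the workspace name and workspace ID, both taken
# from the workspace overview blade in Azure. Placeholder values shown.
workspace_name = "workspace1"
workspace_id = "12345678-abcd-abcd-1234-1234567890ab"

body = {
    "csl": "SigninLogs | take 10",
    "db": f"{workspace_name}-{workspace_id}",
}
```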

NOTE: The query must be a single line in the JSON body. A raw multi-line string is not valid JSON.
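One way to satisfy the single-line requirement is to author the query as a readable multi-line string in your tooling and collapse it before placing it in the body. This is a generic sketch, not part of the playbook itself:

```python
# Author the KQL as a readable multi-line string, then collapse it so the
# resulting JSON value contains no raw newlines.
multi_line_query = """
DeviceNetworkEvents
| where TimeGenerated >= ago(30d)
| where ActionType in ("ConnectionSuccess", "ConnectionAttempted")
| top 2 by DeviceName
"""

one_line_query = " ".join(
    line.strip() for line in multi_line_query.splitlines() if line.strip()
)
```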

With this, initial investigative querying of the Sentinel data lake happens the moment the incident is triggered, allowing the responding SOC analyst to expedite their investigation and validate that the automated action of disabling the account was justified. For this playbook, the results gathered from the Sentinel data lake were placed into a comment and added to the incident within Defender, so SOC analysts can quickly review relevant details when beginning their work.

Scenario 2: Execute a KQL query via API in code

The following Python example demonstrates how to use a service principal to execute a KQL query on the Sentinel data lake via API. This example is provided for illustration; you can also call the API directly from common API tools. Within the code, the query and workspace are defined: "csl" represents the query to run against the Sentinel data lake, and "db" represents the Sentinel workspace/lake, formed by joining the workspace name and workspace ID. Both values can be found on the workspace overview blade within Azure.

You will also need a user token or a service principal to authenticate.

import requests
import msal

# ====== SPN / Entra app settings ======
TENANT_ID = ""
CLIENT_ID = ""
CLIENT_SECRET = ""

# Token authority
AUTHORITY = f"https://login.microsoftonline.com/{TENANT_ID}"

# ---- IMPORTANT ----
# Most APIs use the resource + "/.default" pattern for client-credentials.
# Try this first:
SCOPE = ["4500ebfb-89b6-4b14-a480-7f749797bfcd/.default"]

# ====== KQL query payload ======
KQL_QUERY = {
    "csl": "SigninLogs | take 10",
    "db": "workspace1-12345678-abcd-abcd-1234-1234567890ab",
    "properties": {
        "Options": {
            "servertimeout": "00:04:00",
            "queryconsistency": "strongconsistency",
            "query_language": "kql",
            "request_readonly": False,
            "request_readonly_hardline": False
        }
    }
}
# ====== Acquire token using client credentials ======
app = msal.ConfidentialClientApplication(
    client_id=CLIENT_ID,
    authority=AUTHORITY,
    client_credential=CLIENT_SECRET
)

result = app.acquire_token_for_client(scopes=SCOPE)

if "access_token" not in result:
    raise RuntimeError(
        f"Token acquisition failed: {result.get('error')} - {result.get('error_description')}"
    )

access_token = result["access_token"]

# ====== Call the KQL API ======
headers = {
    "Authorization": f"Bearer {access_token}",
    "Content-Type": "application/json"
}

url = "https://api.securityplatform.microsoft.com/lake/kql/v2/rest/query"  # same endpoint
response = requests.post(url, headers=headers, json=KQL_QUERY)

if response.status_code == 200:
    print("Query Results:")
    print(response.json())
else:
    print(f"Error {response.status_code}: {response.text}")
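Once the call succeeds, you will usually want rows rather than the raw JSON. The sketch below assumes the v2 endpoint returns Kusto-style frames (a JSON array containing a DataTable frame with "Columns" and "Rows"); verify the actual shape against a live response before relying on it. The sample data here is stubbed for illustration.

```python
# Assumed response shape: a list of frames, with the primary results in a
# DataTable frame carrying "Columns" and "Rows".

def rows_as_dicts(frames):
    """Yield each row of the first primary result table as a dict."""
    for frame in frames:
        if frame.get("FrameType") == "DataTable" and frame.get("TableKind") == "PrimaryResult":
            names = [c["ColumnName"] for c in frame["Columns"]]
            for row in frame["Rows"]:
                yield dict(zip(names, row))

# Example with a stubbed response:
sample = [
    {
        "FrameType": "DataTable",
        "TableKind": "PrimaryResult",
        "Columns": [{"ColumnName": "DeviceName"}, {"ColumnName": "RemoteIP"}],
        "Rows": [["host01", "203.0.113.7"]],
    }
]
parsed = list(rows_as_dicts(sample))
```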

In summary, you need the following parameters in your API call:

Request URI: https://api.securityplatform.microsoft.com/lake/kql/v2/rest/query

Method: POST

Sample payload:

{
    "csl": "SigninLogs | take 10",
    "db": "workspace1-12345678-abcd-abcd-1234-1234567890ab"
}

Limitations and considerations

Keep the following considerations in mind when planning to execute KQL queries on a data lake:

  • Service principal permissions

When using a service principal, Azure RBAC roles can be assigned at the Sentinel workspace level. Entra ID roles and XDR unified RBAC roles are not supported for this scenario. Alternatively, user tokens with Entra ID roles can be used.

  • Result size limits

Queries are subject to limits on execution time and response size. Review the Microsoft Sentinel data lake query service limits when designing your workflows.
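Because long-running or oversized queries can fail, a client-side pattern of bounded retries with a request timeout is worth considering. This is a generic sketch rather than lake-specific guidance: `post` stands in for `requests.post`, and the 240-second timeout mirrors the `servertimeout` of 00:04:00 used in the earlier payload.

```python
# Retry transient failures a bounded number of times, capping each attempt
# with a client-side timeout; re-raise the last error if all attempts fail.

def query_with_retry(post, url, payload, headers, attempts=3):
    last_error = None
    for _ in range(attempts):
        try:
            return post(url, json=payload, headers=headers, timeout=240)
        except Exception as exc:  # narrow to transport errors in real code
            last_error = exc
    raise last_error
```

Pair this with narrowing operators in the query itself (`take`, `project`, tighter time ranges) so responses stay within the size limits.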

Summary

Running KQL queries on Sentinel data lake via APIs unlocks a new class of scenarios, from intelligent agents to fully automated analytics pipelines. By decoupling query execution from user interfaces, customers gain flexibility, scalability, and control over how insights are generated and consumed.

If you're already using KQL for interactive analysis, API access is the natural next step toward production-grade analytics.

Happy hunting!


Updated Apr 11, 2026
Version 1.0