<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>Microsoft Sentinel topics</title>
    <link>https://techcommunity.microsoft.com/t5/microsoft-sentinel/bd-p/MicrosoftSentinel</link>
    <description>Microsoft Sentinel topics</description>
    <pubDate>Wed, 11 Mar 2026 23:26:04 GMT</pubDate>
    <dc:creator>MicrosoftSentinel</dc:creator>
    <dc:date>2026-03-11T23:26:04Z</dc:date>
    <item>
      <title>Clarification on UEBA Behaviors Layer Support for Zscaler and Fortinet Logs</title>
      <link>https://techcommunity.microsoft.com/t5/microsoft-sentinel/clarification-on-ueba-behaviors-layer-support-for-zscaler-and/m-p/4496720#M12879</link>
<description>&lt;P&gt;I would like to confirm whether the new UEBA Behaviors Layer in Microsoft Sentinel currently supports generating behavior insights for Zscaler and Fortinet log sources.&amp;nbsp;&lt;/P&gt;&lt;P&gt;Based on the documentation, the preview version of the Behaviors Layer only supports specific vendors under CommonSecurityLog (CyberArk Vault and Palo Alto Threats), AWS CloudTrail services, and GCP Audit Logs. Since Zscaler and Fortinet are not listed among the supported vendors, I want to verify:&lt;/P&gt;&lt;P&gt;Does the UEBA Behaviors Layer generate behavior records for Zscaler and Fortinet logs, or are these vendors currently unsupported for behavior generation? Logs from Zscaler and Fortinet are also ingested only into the CommonSecurityLog table.&lt;/P&gt;</description>
      <pubDate>Tue, 24 Feb 2026 16:54:36 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/microsoft-sentinel/clarification-on-ueba-behaviors-layer-support-for-zscaler-and/m-p/4496720#M12879</guid>
      <dc:creator>RohitN026</dc:creator>
      <dc:date>2026-02-24T16:54:36Z</dc:date>
    </item>
    <item>
      <title>McasShadowItReporting / Cloud Discovery in Azure Sentinel</title>
      <link>https://techcommunity.microsoft.com/t5/microsoft-sentinel/mcasshadowitreporting-cloud-discovery-in-azure-sentinel/m-p/4495351#M12871</link>
<description>&lt;P&gt;Hi!&lt;BR /&gt;&lt;BR /&gt;I'm trying to query the&amp;nbsp;McasShadowItReporting table for Cloud App discoveries.&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;The table is empty at the moment, and the connector warns me that the workspace is onboarded to the Unified Security Operations Platform,&lt;BR /&gt;so I can't activate it here.&lt;/P&gt;&lt;img /&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I can't manage it via &lt;A class="lia-external-url" href="https://security.microsoft.com/," target="_blank"&gt;https://security.microsoft.com/,&lt;/A&gt; either.&amp;nbsp;&lt;BR /&gt;&lt;BR /&gt;The documentation ( &lt;A class="lia-external-url" href="https://learn.microsoft.com/en-us/defender-cloud-apps/siem-sentinel#integrating-with-microsoft-sentinel" target="_blank"&gt;https://learn.microsoft.com/en-us/defender-cloud-apps/siem-sentinel#integrating-with-microsoft-sentinel&lt;/A&gt; )&amp;nbsp;&lt;/P&gt;&lt;P&gt;leads me to the SIEM integration, which has been configured for a while.&amp;nbsp;&lt;/P&gt;&lt;img /&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;img /&gt;&lt;P&gt;&lt;BR /&gt;I wonder if something is misconfigured here, why there is no log ingress, and how I can query these logs.&amp;nbsp;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Tue, 17 Feb 2026 09:48:47 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/microsoft-sentinel/mcasshadowitreporting-cloud-discovery-in-azure-sentinel/m-p/4495351#M12871</guid>
      <dc:creator>Felix87</dc:creator>
      <dc:date>2026-02-17T09:48:47Z</dc:date>
    </item>
    <item>
      <title>CrowdStrike API Data Connector (via Codeless Connector Framework) (Preview)</title>
      <link>https://techcommunity.microsoft.com/t5/microsoft-sentinel/crowdstrike-api-data-connector-via-codeless-connector-framework/m-p/4494852#M12870</link>
<description>&lt;P&gt;API scopes created and added to the connector; however, the only streams observed are from Alerts and Hosts.&amp;nbsp;&lt;/P&gt;&lt;P&gt;Detections are not logging. Is anyone experiencing this issue? GitHub has a post about it that appears to have been escalated as a feature request.&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;CrowdStrikeDetections not ingested&lt;/STRONG&gt;&lt;/P&gt;&lt;P&gt;Does anyone have this set up and working?&lt;/P&gt;</description>
      <pubDate>Fri, 13 Feb 2026 17:04:32 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/microsoft-sentinel/crowdstrike-api-data-connector-via-codeless-connector-framework/m-p/4494852#M12870</guid>
      <dc:creator>logger2115</dc:creator>
      <dc:date>2026-02-13T17:04:32Z</dc:date>
    </item>
    <item>
      <title>Salesforce Service Cloud (via Codeless Connector Framework)</title>
      <link>https://techcommunity.microsoft.com/t5/microsoft-sentinel/salesforce-service-cloud-via-codeless-connector-framework/m-p/4494850#M12869</link>
<description>&lt;P&gt;We have 3 environments; does this connector support multiple tenants, or is it limited to only one FQDN?&lt;/P&gt;</description>
      <pubDate>Fri, 13 Feb 2026 16:53:06 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/microsoft-sentinel/salesforce-service-cloud-via-codeless-connector-framework/m-p/4494850#M12869</guid>
      <dc:creator>logger2115</dc:creator>
      <dc:date>2026-02-13T16:53:06Z</dc:date>
    </item>
    <item>
      <title>Dedicated cluster for Sentinels in different tenants</title>
      <link>https://techcommunity.microsoft.com/t5/microsoft-sentinel/dedicated-cluster-for-sentinels-in-different-tenants/m-p/4494529#M12868</link>
      <description>&lt;P&gt;Hello&lt;BR /&gt;&lt;BR /&gt;I see that there is a possibility to use a dedicated cluster for a workspace in the same Azure region. What about workspaces that reside in different tenants but are in the same Azure region? Is that possible?&lt;/P&gt;&lt;P&gt;We are utilizing multiple tenants, and we want to keep this operational model. However, there is a central SOC, and we wonder if there is a possibility to utilize a dedicated cluster for cost optimization.&lt;/P&gt;</description>
      <pubDate>Thu, 12 Feb 2026 08:47:30 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/microsoft-sentinel/dedicated-cluster-for-sentinels-in-different-tenants/m-p/4494529#M12868</guid>
      <dc:creator>de3no2</dc:creator>
      <dc:date>2026-02-12T08:47:30Z</dc:date>
    </item>
    <item>
      <title>How Should a Fresher Learn Microsoft Sentinel Properly?</title>
      <link>https://techcommunity.microsoft.com/t5/microsoft-sentinel/how-should-a-fresher-learn-microsoft-sentinel-properly/m-p/4494249#M12866</link>
      <description>&lt;P&gt;Hello everyone,&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I am a fresher interested in learning Microsoft Sentinel and preparing for SOC roles.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Since Sentinel is a cloud-native enterprise tool and usually used inside organizations, I am unsure how individuals without company access are expected to gain real hands-on experience.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I would like to hear from professionals who actively use Sentinel:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;- How do freshers typically learn and practice Sentinel?&lt;/P&gt;&lt;P&gt;- What learning resources or environments are commonly used by beginners?&lt;/P&gt;&lt;P&gt;- What level of hands-on experience is realistically expected at entry level?&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I am looking for guidance based on real industry practice.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Thank you for your time.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Wed, 11 Feb 2026 02:38:02 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/microsoft-sentinel/how-should-a-fresher-learn-microsoft-sentinel-properly/m-p/4494249#M12866</guid>
      <dc:creator>Arjun34</dc:creator>
      <dc:date>2026-02-11T02:38:02Z</dc:date>
    </item>
    <item>
      <title>How do I import Purview Unified Audit Log data related to the use of the Audit Log into Sentinel?</title>
      <link>https://techcommunity.microsoft.com/t5/microsoft-sentinel/how-do-i-import-purview-unified-audit-log-data-related-to-the/m-p/4488430#M12863</link>
<description>&lt;P&gt;Dear Community, I would like to implement the following scenario in an environment with Microsoft 365 E5 licenses:&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;Scenario&lt;/STRONG&gt;: I want to import audit activities into an Azure Log Analytics workspace linked to Sentinel to generate alerts/incidents as soon as a search is performed in the Microsoft 365 Purview Unified Audit Log (primarily for IRM purposes).&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;Challenge&lt;/STRONG&gt;: Neither the "Microsoft 365" connector nor the "Defender XDR" or "Purview" connectors (the latter appears to cover Azure Purview exclusively) imports the necessary data.&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;Question&lt;/STRONG&gt;: Which connector do I have to use in order to obtain Purview Unified Audit Log activities about the use of the Purview Unified Audit Log so that I can identify...&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;EM&gt;...which&lt;/EM&gt;&amp;nbsp;user conducted an audit log search,&amp;nbsp;&lt;EM&gt;when&lt;/EM&gt;, and with&amp;nbsp;&lt;EM&gt;what&lt;/EM&gt; kind of search query.&lt;/P&gt;&lt;P&gt;Thank you!&lt;/P&gt;</description>
      <pubDate>Thu, 22 Jan 2026 09:29:43 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/microsoft-sentinel/how-do-i-import-purview-unified-audit-log-data-related-to-the/m-p/4488430#M12863</guid>
      <dc:creator>BM-HV</dc:creator>
      <dc:date>2026-01-22T09:29:43Z</dc:date>
    </item>
    <item>
      <title>Issue connecting Azure Sentinel GitHub app to Sentinel Instance when IP allow list is enabled</title>
      <link>https://techcommunity.microsoft.com/t5/microsoft-sentinel/issue-connecting-azure-sentinel-github-app-to-sentinel-instance/m-p/4486172#M12862</link>
      <description>&lt;P&gt;Hi everyone,&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I’m running into an issue connecting the Azure Sentinel GitHub app to my Sentinel workspace in order to create our CI/CD pipelines for our detection rules, and I’m hoping someone can point me in the right direction.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Symptoms:&lt;/P&gt;&lt;P&gt;When configuring the GitHub connection in Sentinel, the repository dropdown does not populate.&lt;/P&gt;&lt;P&gt;There are no explicit errors, but the connection clearly isn’t completing.&lt;/P&gt;&lt;img /&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;If I disable my organization’s IP allow list, everything works as expected and the repos appear immediately.&lt;/P&gt;&lt;img /&gt;&lt;P&gt;I’ve seen that some GitHub Apps automatically add the IP ranges they require to an organization’s allow list. However, from what I can tell, the Azure Sentinel GitHub app does not seem to have this capability, and requires manual allow listing instead.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;What I’ve tried / researched:&lt;/P&gt;&lt;P&gt;Reviewed Microsoft documentation for Sentinel ↔ GitHub integrations&lt;/P&gt;&lt;P&gt;Looked through Azure IP range and Service Tag documentation&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I’ve seen recommendations to allow list the IP ranges published at //api.github.com/meta, as many GitHub apps rely on these ranges&lt;/P&gt;&lt;P&gt;I’ve already tried allow listing multiple ranges from the GitHub meta endpoint, but the issue persists&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;My questions:&lt;/P&gt;&lt;P&gt;Does anyone know which IP ranges are used by the Azure Sentinel GitHub app specifically?&lt;/P&gt;&lt;P&gt;Is there an official or recommended approach for using this integration in environments with strict IP allow lists?&lt;/P&gt;&lt;P&gt;Has anyone successfully configured this integration without fully disabling IP 
restrictions?&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Any insight, references, or firsthand experience would be greatly appreciated. Thanks in advance!&lt;/P&gt;</description>
      <pubDate>Fri, 16 Jan 2026 04:33:45 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/microsoft-sentinel/issue-connecting-azure-sentinel-github-app-to-sentinel-instance/m-p/4486172#M12862</guid>
      <dc:creator>JingleDingle</dc:creator>
      <dc:date>2026-01-16T04:33:45Z</dc:date>
    </item>
    <item>
      <title>How to Prevent Workspace Details from Appearing in LAQueryLogs During Cross-Workspace Queries</title>
      <link>https://techcommunity.microsoft.com/t5/microsoft-sentinel/how-to-prevent-workspace-details-from-appearing-in-laquerylogs/m-p/4483079#M12860</link>
      <description>&lt;P&gt;I’ve onboarded multiple workspaces using Azure Lighthouse, and I’m running cross-workspace KQL queries using the workspace() function.&lt;BR /&gt;However, I’ve noticed that LAQueryLogs records the query in every referenced workspace, and the RequestContext field includes details about all other workspaces involved in the query.&lt;BR /&gt;Is there any way to run cross-workspace queries without having all workspace details logged in LAQueryLogs for each referenced workspace?&lt;/P&gt;</description>
      <pubDate>Mon, 05 Jan 2026 14:09:29 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/microsoft-sentinel/how-to-prevent-workspace-details-from-appearing-in-laquerylogs/m-p/4483079#M12860</guid>
      <dc:creator>ParthPatel50</dc:creator>
      <dc:date>2026-01-05T14:09:29Z</dc:date>
    </item>
    <item>
      <title>I'm stuck!</title>
      <link>https://techcommunity.microsoft.com/t5/microsoft-sentinel/i-m-stuck/m-p/4476042#M12854</link>
<description>&lt;P&gt;Logically, I'm not sure how/if I can do this.&lt;/P&gt;&lt;P&gt;I want to monitor for EntraID Group additions - I can get this to work for a single entry using this:&lt;/P&gt;&lt;P&gt;AuditLogs&lt;BR /&gt;| where TimeGenerated &amp;gt; ago(7d)&lt;BR /&gt;| where OperationName == "Add member to group"&lt;BR /&gt;| where TargetResources[0].type == "User"&lt;BR /&gt;| extend GroupName = tostring(parse_json(tostring(parse_json(tostring(TargetResources[0].modifiedProperties))[1].newValue)))&lt;BR /&gt;| where GroupName == "NameOfGroup" &amp;lt;-- This returns the single entry&lt;BR /&gt;| extend User = tostring(TargetResources[0].userPrincipalName)&lt;BR /&gt;| summarize ['Count of Users Added']=dcount(User), ['List of Users Added']=make_set(User) by GroupName&lt;BR /&gt;| sort by GroupName asc&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;However, I have a list of 20 Priv groups that I need to monitor.&amp;nbsp; I can do this using:&lt;/P&gt;&lt;P&gt;let PrivGroups = dynamic(['name1','name2','name3']);&lt;/P&gt;&lt;P&gt;and then call that like this:&lt;/P&gt;&lt;P&gt;blahblah&lt;/P&gt;&lt;P&gt;| where TargetResources[0].type == "User"&lt;BR /&gt;| extend GroupName = tostring(parse_json(tostring(parse_json(tostring(TargetResources[0].modifiedProperties))[1].newValue)))&lt;BR /&gt;| where GroupName has_any (PrivGroups)&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;But that's a bit dirty to update - I wanted to call a watchlist.&amp;nbsp; I've tried defining with:&lt;/P&gt;&lt;P&gt;let PrivGroup = (_GetWatchlist('TestList'));&lt;/P&gt;&lt;P&gt;and tried calling like:&lt;/P&gt;&lt;P&gt;blahblah&lt;/P&gt;&lt;P&gt;| where TargetResources[0].type == "User"&lt;BR /&gt;| extend GroupName = tostring(parse_json(tostring(parse_json(tostring(TargetResources[0].modifiedProperties))[1].newValue)))&lt;BR /&gt;| where GroupName has_any ('PrivGroup')&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I've tried dropping the let and attempted to look up 
the watchlist directly:&lt;/P&gt;&lt;P&gt;| where GroupName has_any (_GetWatchlist('TestList'))&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;The query runs but doesn't return any results (obviously, I know the result exists). How do I look up that extracted value in a Watchlist?&lt;/P&gt;&lt;P&gt;Any ideas or pointers on why I'm wrong would be appreciated!&lt;/P&gt;&lt;P&gt;Many thanks&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Mon, 08 Dec 2025 14:06:14 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/microsoft-sentinel/i-m-stuck/m-p/4476042#M12854</guid>
      <dc:creator>MrD</dc:creator>
      <dc:date>2025-12-08T14:06:14Z</dc:date>
    </item>
    <item>
      <title>Webinar Rescheduled: AI-Powered Entity Analysis in Sentinel's MCP Server</title>
      <link>https://techcommunity.microsoft.com/t5/microsoft-sentinel/webinar-rescheduled-ai-powered-entity-analysis-in-sentinel-s-mcp/m-p/4475369#M12853</link>
      <description>&lt;P&gt;Hi folks!&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;The webinar: &lt;STRONG&gt;AI-Powered Entity Analysis in Sentinel's MCP Server&lt;/STRONG&gt; which was previously scheduled for: &lt;SPAN class="lia-text-color-8"&gt;January 13th, 2026&lt;/SPAN&gt;, has been rescheduled to: &lt;STRONG&gt;&lt;SPAN class="lia-text-color-11"&gt;January 27th, 2026, at 9:00 AM PT.&lt;/SPAN&gt;&lt;/STRONG&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Please delete the old invite from your calendar and find the new one at &lt;A class="lia-internal-link lia-internal-url lia-internal-url-content-type-blog" href="https://techcommunity.microsoft.com/blog/microsoft-security-blog/welcome-to-the-microsoft-security-community/4471927" data-lia-auto-title="aka.ms/securitycommunity." data-lia-auto-title-active="0" target="_blank"&gt;aka.ms/securitycommunity.&lt;/A&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;We apologize for the inconvenience and hope to see you there!&lt;/P&gt;</description>
      <pubDate>Fri, 05 Dec 2025 00:50:30 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/microsoft-sentinel/webinar-rescheduled-ai-powered-entity-analysis-in-sentinel-s-mcp/m-p/4475369#M12853</guid>
      <dc:creator>emilyfalla</dc:creator>
      <dc:date>2025-12-05T00:50:30Z</dc:date>
    </item>
    <item>
      <title>Understand New Sentinel Pricing Model with Sentinel Data Lake Tier</title>
      <link>https://techcommunity.microsoft.com/t5/microsoft-sentinel/understand-new-sentinel-pricing-model-with-sentinel-data-lake/m-p/4473020#M12852</link>
<description>&lt;H1&gt;&lt;STRONG&gt;&lt;SPAN class="lia-text-color-10"&gt;Introduction to Sentinel and its New Pricing Model&lt;/SPAN&gt;&lt;/STRONG&gt;&lt;/H1&gt;
&lt;P&gt;Microsoft Sentinel is a cloud-native Security Information and Event Management (SIEM) and Security Orchestration, Automation, and Response (SOAR) platform that collects, analyzes, and correlates security data from across your environment to detect threats and automate response. Traditionally, Sentinel stored all ingested data in the &lt;STRONG data-start="335" data-end="379"&gt;Analytics tier (Log Analytics workspace)&lt;/STRONG&gt;, which is powerful but expensive for high-volume logs. To reduce cost and enable customers to retain all security data without compromise, Microsoft introduced a &lt;STRONG data-start="542" data-end="573"&gt;new dual-tier pricing model&lt;/STRONG&gt; consisting of the &lt;STRONG data-start="592" data-end="610"&gt;Analytics tier&lt;/STRONG&gt; and the &lt;STRONG data-start="619" data-end="637"&gt;Data Lake tier&lt;/STRONG&gt;. The Analytics tier continues to support fast, real-time querying and analytics for core security scenarios, while the new Data Lake tier provides &lt;STRONG data-start="785" data-end="810"&gt;very low-cost storage&lt;/STRONG&gt; for long-term retention and high-volume datasets. Customers can now choose where each data type lands—analytics for high-value detections and investigations, and data lake for large or archival types—allowing organizations to significantly lower cost while still retaining all their security data for analytics, compliance, and hunting.&lt;/P&gt;
&lt;P&gt;The flow diagram below depicts the new Sentinel pricing model:&lt;/P&gt;
&lt;img /&gt;
&lt;P&gt;Now let's understand this new pricing model with the scenarios below:&lt;/P&gt;
&lt;OL&gt;
&lt;LI&gt;
&lt;P&gt;&lt;STRONG&gt;Scenario 1A (PAY GO)&lt;/STRONG&gt;&lt;/P&gt;
&lt;/LI&gt;
&lt;LI&gt;
&lt;P&gt;&lt;STRONG&gt;Scenario 1B (Usage Commitment)&lt;/STRONG&gt;&lt;/P&gt;
&lt;/LI&gt;
&lt;LI&gt;
&lt;P&gt;&lt;STRONG&gt;Scenario 2 (Data Lake Tier Only)&lt;/STRONG&gt;&lt;/P&gt;
&lt;/LI&gt;
&lt;/OL&gt;
&lt;H1&gt;&lt;STRONG&gt;&lt;SPAN class="lia-text-color-10"&gt;Scenario 1A (PAY GO)&lt;/SPAN&gt;&lt;/STRONG&gt;&lt;/H1&gt;
&lt;H3 data-start="129" data-end="150"&gt;&lt;STRONG data-start="133" data-end="148"&gt;Requirement&lt;/STRONG&gt;&lt;/H3&gt;
&lt;P data-start="151" data-end="350"&gt;Suppose you need to ingest &lt;STRONG data-start="178" data-end="203"&gt;10 GB of data per day&lt;/STRONG&gt;, and you must retain that data for &lt;STRONG data-start="239" data-end="250"&gt;2 years&lt;/STRONG&gt;. However, you will only &lt;STRONG data-start="275" data-end="313"&gt;frequently use, query, and analyze&lt;/STRONG&gt; the data for the &lt;STRONG data-start="331" data-end="349"&gt;first 6 months&lt;/STRONG&gt;.&lt;/P&gt;
&lt;H3 data-start="352" data-end="370"&gt;&lt;STRONG data-start="356" data-end="368"&gt;Solution&lt;/STRONG&gt;&lt;/H3&gt;
&lt;P data-start="371" data-end="705"&gt;To optimize cost, you can ingest the data into the &lt;STRONG data-start="422" data-end="440"&gt;Analytics tier&lt;/STRONG&gt; and retain it there for the first &lt;STRONG data-start="475" data-end="487"&gt;6 months&lt;/STRONG&gt;, where active querying and investigation happen. After that period, the remaining &lt;STRONG data-start="570" data-end="596"&gt;18 months of retention&lt;/STRONG&gt; can be shifted to the &lt;STRONG data-start="619" data-end="637"&gt;Data Lake tier&lt;/STRONG&gt;, which provides low-cost storage for compliance and auditing needs. But you will be charged separately for data lake tier querying and analytics which depicted as &lt;STRONG&gt;Compute (D)&lt;/STRONG&gt; in pricing flow diagram.&lt;/P&gt;
&lt;H3 data-start="707" data-end="735"&gt;&lt;STRONG data-start="711" data-end="735"&gt;Pricing Flow / Notes&lt;/STRONG&gt;&lt;/H3&gt;
&lt;UL data-start="737" data-end="1231"&gt;
&lt;LI data-start="737" data-end="852"&gt;&lt;STRONG data-start="739" data-end="762"&gt;The first 10 GB/day&lt;/STRONG&gt; ingested into the Analytics tier is &lt;STRONG data-start="799" data-end="819"&gt;free for 31 days&lt;/STRONG&gt; under the Analytics logs plan.&lt;/LI&gt;
&lt;LI data-start="853" data-end="1000"&gt;&lt;STRONG data-start="855" data-end="926"&gt;All data ingested into the Analytics tier is automatically mirrored&lt;/STRONG&gt; to the Data Lake tier &lt;STRONG data-start="949" data-end="997"&gt;at no additional ingestion or retention cost&lt;/STRONG&gt;.&lt;/LI&gt;
&lt;LI data-start="1001" data-end="1122"&gt;&lt;STRONG data-start="1003" data-end="1029"&gt;For the first 6 months&lt;/STRONG&gt;, you pay only for &lt;STRONG data-start="1048" data-end="1090"&gt;Analytics tier ingestion and retention&lt;/STRONG&gt;, excluding any free capacity.&lt;/LI&gt;
&lt;LI data-start="1123" data-end="1231"&gt;&lt;STRONG data-start="1125" data-end="1151"&gt;For the next 18 months&lt;/STRONG&gt;, you pay only for &lt;STRONG data-start="1170" data-end="1198"&gt;Data Lake tier retention&lt;/STRONG&gt;, which is significantly cheaper.&lt;/LI&gt;
&lt;/UL&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;img /&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;H1&gt;&lt;STRONG&gt;Azure Pricing Calculator Equivalent&amp;nbsp;&lt;/STRONG&gt;&lt;/H1&gt;
&lt;P data-start="107" data-end="199"&gt;Assuming no data is queried or analyzed during the 18-month Data Lake tier retention period:&lt;/P&gt;
&lt;P data-start="201" data-end="429"&gt;Although the Analytics tier retention is set to &lt;STRONG data-start="249" data-end="261"&gt;6 months&lt;/STRONG&gt;, the &lt;STRONG data-start="267" data-end="334"&gt;first 3 months of retention fall under the free retention limit&lt;/STRONG&gt;, so retention charges apply only for the remaining 3 months of the analytics retention window. Azure pricing calculator will adjust accordingly.&lt;/P&gt;
&lt;img /&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;H1&gt;&lt;STRONG&gt;&lt;SPAN class="lia-text-color-10"&gt;Scenario 1B (Usage Commitment)&lt;/SPAN&gt;&lt;/STRONG&gt;&lt;/H1&gt;
&lt;P data-start="112" data-end="296"&gt;Now, suppose you are ingesting &lt;STRONG data-start="143" data-end="161"&gt;100 GB per day&lt;/STRONG&gt;. If you follow the same pay-as-you-go pricing model described above, your estimated cost would be approximately &lt;STRONG data-start="274" data-end="295"&gt;$15,204 per month&lt;/STRONG&gt;.&lt;/P&gt;
&lt;P data-start="298" data-end="588"&gt;However, you can reduce this cost by choosing a &lt;STRONG data-start="346" data-end="365"&gt;Commitment Tier&lt;/STRONG&gt;, where Analytics tier &lt;STRONG data-start="388" data-end="401"&gt;ingestion&lt;/STRONG&gt; is billed at a &lt;STRONG data-start="417" data-end="436"&gt;discounted rate&lt;/STRONG&gt;. Note that the discount applies &lt;STRONG data-start="469" data-end="490"&gt;only to Analytics tier ingestion&lt;/STRONG&gt;—it does &lt;STRONG data-start="499" data-end="506"&gt;not&lt;/STRONG&gt; apply to Analytics tier retention costs or to any Data Lake tier–related charges.&lt;/P&gt;
&lt;P data-start="590" data-end="681"&gt;Please refer to the pricing flow and the equivalent pricing calculator results shown below.&lt;/P&gt;
&lt;P data-start="683" data-end="751"&gt;&lt;STRONG data-start="683" data-end="708"&gt;Monthly cost savings:&lt;/STRONG&gt;&lt;BR data-start="708" data-end="711" /&gt;&lt;STRONG data-start="711" data-end="751"&gt;$15,204 – $11,184 = $4,020 per month&lt;/STRONG&gt;&lt;/P&gt;
&lt;P data-start="91" data-end="229"&gt;Now the question is: &lt;EM data-start="112" data-end="164"&gt;What happens if your usage reaches 150 GB per day?&lt;/EM&gt;&lt;BR data-start="164" data-end="167" /&gt;Will the additional 50 GB be billed at the Pay-As-You-Go rate?&lt;/P&gt;
&lt;P data-start="231" data-end="367"&gt;&lt;STRONG data-start="231" data-end="238"&gt;No.&lt;/STRONG&gt; The entire 150 GB/day will still be billed at the &lt;STRONG data-start="289" data-end="308"&gt;discounted rate&lt;/STRONG&gt; associated with the &lt;STRONG data-start="329" data-end="366"&gt;100 GB/day commitment tier bucket&lt;/STRONG&gt;.&lt;/P&gt;
&lt;img /&gt;
&lt;H1&gt;&lt;STRONG&gt;Azure Pricing Calculator Equivalent (100 GB/ Day)&lt;/STRONG&gt;&lt;/H1&gt;
&lt;img /&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;H1&gt;&lt;STRONG&gt;Azure Pricing Calculator Equivalent (150 GB/ Day)&lt;/STRONG&gt;&lt;/H1&gt;
&lt;img /&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;H1&gt;&lt;SPAN class="lia-text-color-10"&gt;&lt;STRONG&gt;Scenario 2 (Data Lake Tier Only)&lt;/STRONG&gt;&lt;/SPAN&gt;&lt;/H1&gt;
&lt;H3 data-start="90" data-end="111"&gt;&lt;STRONG data-start="94" data-end="109"&gt;Requirement&lt;/STRONG&gt;&lt;/H3&gt;
&lt;P data-start="112" data-end="394"&gt;Suppose you need to store certain &lt;STRONG data-start="146" data-end="174"&gt;audit or compliance logs&lt;/STRONG&gt; amounting to &lt;STRONG data-start="188" data-end="205"&gt;10 GB per day&lt;/STRONG&gt;. These logs are &lt;STRONG data-start="222" data-end="277"&gt;not used for querying, analytics, or investigations&lt;/STRONG&gt; on a regular basis, but must be retained for &lt;STRONG data-start="323" data-end="334"&gt;2 years&lt;/STRONG&gt; as per your organization’s compliance or forensic policies.&lt;/P&gt;
&lt;H3 data-start="396" data-end="414"&gt;&lt;STRONG data-start="400" data-end="412"&gt;Solution&lt;/STRONG&gt;&lt;/H3&gt;
&lt;P data-start="415" data-end="732"&gt;Since these logs are not actively analyzed, you should &lt;STRONG data-start="470" data-end="518"&gt;avoid ingesting them into the Analytics tier&lt;/STRONG&gt;, which is more expensive and optimized for active querying.&lt;BR data-start="578" data-end="581" /&gt;Instead, send them &lt;STRONG data-start="600" data-end="634"&gt;directly to the Data Lake tier&lt;/STRONG&gt;, where they can be retained cost-effectively for future &lt;STRONG data-start="691" data-end="725"&gt;audit, compliance, or forensic&lt;/STRONG&gt; needs.&lt;/P&gt;
&lt;H3 data-start="734" data-end="754"&gt;&lt;STRONG data-start="738" data-end="754"&gt;Pricing Flow&lt;/STRONG&gt;&lt;/H3&gt;
&lt;UL data-start="756" data-end="1215"&gt;
&lt;LI data-start="756" data-end="913"&gt;Because the data is ingested &lt;STRONG data-start="787" data-end="823"&gt;directly into the Data Lake tier&lt;/STRONG&gt;, you pay &lt;STRONG data-start="833" data-end="865"&gt;both ingestion and retention&lt;/STRONG&gt; costs there for the &lt;STRONG data-start="886" data-end="910"&gt;entire 2-year period&lt;/STRONG&gt;.&lt;/LI&gt;
&lt;LI data-start="914" data-end="1084"&gt;If, at any point in the future, you need to perform &lt;STRONG data-start="968" data-end="1011"&gt;advanced analytics, querying, or search&lt;/STRONG&gt;, you will incur &lt;STRONG data-start="1028" data-end="1058"&gt;additional compute charges&lt;/STRONG&gt;, based on actual usage.&lt;/LI&gt;
&lt;LI data-start="1085" data-end="1215"&gt;Even with occasional compute charges, the cost remains &lt;STRONG data-start="1142" data-end="1165"&gt;significantly lower&lt;/STRONG&gt; than storing the same data in the Analytics tier.&lt;/LI&gt;
&lt;/UL&gt;
&lt;img /&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;H3 data-start="1222" data-end="1246"&gt;&lt;STRONG data-start="1226" data-end="1246"&gt;Realized Savings&lt;/STRONG&gt;&lt;/H3&gt;
&lt;DIV class="styles_lia-table-wrapper__h6Xo9 styles_table-responsive__MW0lN"&gt;&lt;table border="1" style="border-width: 1px;"&gt;&lt;thead&gt;&lt;tr&gt;&lt;th&gt;Scenario&lt;/th&gt;&lt;th&gt;Cost per Month&lt;/th&gt;&lt;/tr&gt;&lt;/thead&gt;&lt;tbody&gt;&lt;tr&gt;&lt;td&gt;&lt;STRONG data-start="1309" data-end="1324"&gt;Scenario 1:&lt;/STRONG&gt; 10 GB/day in Analytics tier&lt;/td&gt;&lt;td&gt;&lt;STRONG data-start="1355" data-end="1368"&gt;$1,520.40&lt;/STRONG&gt;&lt;/td&gt;&lt;/tr&gt;&lt;tr&gt;&lt;td&gt;&lt;STRONG data-start="1373" data-end="1388"&gt;Scenario 2:&lt;/STRONG&gt; 10 GB/day directly into Data Lake tier&lt;/td&gt;&lt;td&gt;
&lt;P&gt;&lt;STRONG data-start="1430" data-end="1441"&gt;$202.20 (without compute)&lt;/STRONG&gt;&lt;/P&gt;
&lt;P&gt;&lt;STRONG data-start="1430" data-end="1441"&gt;&lt;STRONG data-start="1597" data-end="1624"&gt;$257.20 (with sample compute price)&lt;/STRONG&gt;&lt;/STRONG&gt;&lt;/P&gt;
&lt;/td&gt;&lt;/tr&gt;&lt;/tbody&gt;&lt;/table&gt;&lt;/DIV&gt;
&lt;UL data-start="1445" data-end="1624"&gt;
&lt;LI data-start="1445" data-end="1534"&gt;&lt;STRONG data-start="1447" data-end="1484"&gt;Savings with no compute activity:&lt;/STRONG&gt;&lt;BR data-start="1484" data-end="1487" /&gt;&lt;STRONG data-start="1489" data-end="1534"&gt;$1,520.40 – $202.20 = $1,318.20 per month&lt;/STRONG&gt;&lt;/LI&gt;
&lt;LI data-start="1536" data-end="1624"&gt;&lt;STRONG data-start="1538" data-end="1592"&gt;Savings with some compute activity (sample value):&lt;/STRONG&gt;&lt;BR data-start="1592" data-end="1595" /&gt;&lt;STRONG data-start="1489" data-end="1534"&gt;$1,520.40 – $257.20 = $1,263.20 per month&lt;/STRONG&gt;&lt;/LI&gt;
&lt;/UL&gt;
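The savings figures in the bullets above can be reproduced with a short calculation, using only the sample monthly costs quoted in this post:

```python
# Savings arithmetic for Scenario 2, using the post's sample figures.
analytics_cost = 1520.40          # 10 GB/day in the Analytics tier, per month
lake_cost = 202.20                # 10 GB/day directly into the Data Lake tier
lake_cost_with_compute = 257.20   # Data Lake tier plus a sample compute charge

savings_no_compute = round(analytics_cost - lake_cost, 2)
savings_with_compute = round(analytics_cost - lake_cost_with_compute, 2)

print(savings_no_compute)    # 1318.2 per month
print(savings_with_compute)  # 1263.2 per month
```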
&lt;H1&gt;&lt;STRONG&gt;Azure calculator equivalent without compute&lt;/STRONG&gt;&lt;/H1&gt;
&lt;img /&gt;
&lt;H1&gt;&lt;STRONG&gt;Azure calculator equivalent with Sample Compute&lt;/STRONG&gt;&lt;/H1&gt;
&lt;img /&gt;
&lt;H1&gt;&lt;SPAN class="lia-text-color-10"&gt;&lt;STRONG&gt;Conclusion&lt;/STRONG&gt;&lt;/SPAN&gt;&lt;/H1&gt;
&lt;P&gt;The combination of the &lt;STRONG data-start="157" data-end="175"&gt;Analytics tier&lt;/STRONG&gt; and the &lt;STRONG data-start="184" data-end="202"&gt;Data Lake tier&lt;/STRONG&gt; in Microsoft Sentinel enables organizations to optimize cost based on how their security data is used. High-value logs that require frequent querying, real-time analytics, and investigation can be stored in the &lt;STRONG data-start="414" data-end="432"&gt;Analytics tier&lt;/STRONG&gt;, which provides powerful search performance and built-in detection capabilities. At the same time, large-volume or infrequently accessed logs—such as audit, compliance, or long-term retention data—can be directed to the &lt;STRONG data-start="653" data-end="671"&gt;Data Lake tier&lt;/STRONG&gt;, which offers dramatically lower storage and ingestion costs. Because all Analytics tier data is automatically mirrored to the Data Lake tier at no extra cost, customers can use the Analytics tier only for the period they actively query data, and rely on the Data Lake tier for the remaining retention. This tiered model allows different scenarios—active investigation, archival storage, compliance retention, or large-scale telemetry ingestion—to be handled at the most cost-effective layer, ultimately delivering substantial savings without sacrificing visibility, retention, or future analytical capabilities.&lt;/P&gt;</description>
      <pubDate>Wed, 26 Nov 2025 12:43:46 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/microsoft-sentinel/understand-new-sentinel-pricing-model-with-sentinel-data-lake/m-p/4473020#M12852</guid>
      <dc:creator>Aaida_Aboobakkar</dc:creator>
      <dc:date>2025-11-26T12:43:46Z</dc:date>
    </item>
    <item>
      <title>Sentinel to Defender | Two-part webinar series: Dec 9 &amp; 16</title>
      <link>https://techcommunity.microsoft.com/t5/microsoft-sentinel/sentinel-to-defender-two-part-webinar-series-dec-9-16/m-p/4471795#M12848</link>
      <description>&lt;img /&gt;
&lt;H1&gt;&lt;A class="lia-external-url" href="https://aka.ms/SentinelToDefender" target="_blank"&gt;Register here&lt;/A&gt;&amp;nbsp;&lt;/H1&gt;</description>
      <pubDate>Thu, 20 Nov 2025 16:05:41 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/microsoft-sentinel/sentinel-to-defender-two-part-webinar-series-dec-9-16/m-p/4471795#M12848</guid>
      <dc:creator>RenWoods</dc:creator>
      <dc:date>2025-11-20T16:05:41Z</dc:date>
    </item>
    <item>
      <title>Defender Entity Page w/ Sentinel Events Tab</title>
      <link>https://techcommunity.microsoft.com/t5/microsoft-sentinel/defender-entity-page-w-sentinel-events-tab/m-p/4471756#M12847</link>
      <description>&lt;P&gt;One device is displaying the Sentinel Events Tab, while the other is not. The only difference observed is that one device is Azure AD (AAD) joined and the other is Domain Joined.&amp;nbsp; Could this difference account for the missing Sentinel events data?&lt;/P&gt;&lt;P&gt;Any insight would be appreciated!&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;img /&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Thu, 20 Nov 2025 14:45:36 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/microsoft-sentinel/defender-entity-page-w-sentinel-events-tab/m-p/4471756#M12847</guid>
      <dc:creator>HeyNiko</dc:creator>
      <dc:date>2025-11-20T14:45:36Z</dc:date>
    </item>
    <item>
      <title>Tenant-based Microsoft Defender for Cloud Connector</title>
      <link>https://techcommunity.microsoft.com/t5/microsoft-sentinel/tenant-based-microsoft-defender-for-cloud-connector/m-p/4469782#M12842</link>
      <description>&lt;P&gt;As the title states, the connector is connected but no alerts show in Sentinel. Alerts appear in Defender for Cloud, but they do not show in Sentinel. The data connector is connected, and Log Analytics workspace (LAW) exports are configured to send to the LAW resource group. What's missing?&lt;/P&gt;</description>
      <pubDate>Thu, 13 Nov 2025 22:04:01 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/microsoft-sentinel/tenant-based-microsoft-defender-for-cloud-connector/m-p/4469782#M12842</guid>
      <dc:creator>logger2115</dc:creator>
      <dc:date>2025-11-13T22:04:01Z</dc:date>
    </item>
    <item>
      <title>Asia Pacific and Japan- Get Insider Access to Microsoft Security Products!</title>
      <link>https://techcommunity.microsoft.com/t5/microsoft-sentinel/asia-pacific-and-japan-get-insider-access-to-microsoft-security/m-p/4469421#M12841</link>
      <description>&lt;img /&gt;
&lt;H2&gt;&lt;A href="https://aka.ms/JoinAPJCommunity" target="_blank"&gt;https://aka.ms/JoinAPJCommunity&lt;/A&gt;&lt;/H2&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;SPAN data-teams="true"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;</description>
      <pubDate>Wed, 12 Nov 2025 21:13:14 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/microsoft-sentinel/asia-pacific-and-japan-get-insider-access-to-microsoft-security/m-p/4469421#M12841</guid>
      <dc:creator>RenWoods</dc:creator>
      <dc:date>2025-11-12T21:13:14Z</dc:date>
    </item>
    <item>
      <title>Sentinel to Defender webinar series CANCELLED, will be rescheduled at a later date.</title>
      <link>https://techcommunity.microsoft.com/t5/microsoft-sentinel/sentinel-to-defender-webinar-series-cancelled-will-be/m-p/4467028#M12837</link>
      <description>&lt;img /&gt;
&lt;H2&gt;The Sentinel to Defender webinar series has been cancelled. Please visit aka.ms/securitycommunity to sign up for upcoming Microsoft Security webinars and to join the mailing list to be notified of future sessions. We apologize for any inconvenience.&lt;/H2&gt;</description>
      <pubDate>Tue, 04 Nov 2025 15:50:32 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/microsoft-sentinel/sentinel-to-defender-webinar-series-cancelled-will-be/m-p/4467028#M12837</guid>
      <dc:creator>RenWoods</dc:creator>
      <dc:date>2025-11-04T15:50:32Z</dc:date>
    </item>
    <item>
      <title>Microsoft Application Protection Incidents</title>
      <link>https://techcommunity.microsoft.com/t5/microsoft-sentinel/microsoft-application-protection-incidents/m-p/4464724#M12835</link>
      <description>&lt;P&gt;Hi,&lt;/P&gt;&lt;P&gt;I'm seeing a small number of incidents within Sentinel with the alert product name of 'Microsoft Application Protection'.&lt;/P&gt;&lt;P&gt;I can view them in Sentinel, but when I click the hyperlink to open them in the Defender portal, I can't access them.&lt;/P&gt;&lt;P&gt;Two questions: which Defender suite are these alerts coming from, and which roles/permissions are required to view them within the Defender/unified portal?&lt;/P&gt;</description>
      <pubDate>Tue, 28 Oct 2025 11:08:28 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/microsoft-sentinel/microsoft-application-protection-incidents/m-p/4464724#M12835</guid>
      <dc:creator>manntj</dc:creator>
      <dc:date>2025-10-28T11:08:28Z</dc:date>
    </item>
    <item>
      <title>Ingest IOC from Google Threat Intelligence into Sentinel</title>
      <link>https://techcommunity.microsoft.com/t5/microsoft-sentinel/ingest-ioc-from-google-threat-intelligence-into-sentinel/m-p/4464024#M12831</link>
      <description>&lt;P&gt;Hi all,&lt;/P&gt;&lt;P&gt;I'm trying to ingest IOCs from Google Threat Intelligence into Sentinel.&lt;/P&gt;&lt;P&gt;I followed the guide at gtidocs.virutotal.com/docs/gti4sentinel-guide&lt;/P&gt;&lt;P&gt;The API key is correct.&lt;/P&gt;&lt;P&gt;PS: I'm using the standard free public API (created in VirusTotal).&lt;/P&gt;&lt;P&gt;The Managed Identity has been configured with the correct role.&lt;/P&gt;&lt;P&gt;When I run the Logic App, I receive an HTTP 403 error:&lt;/P&gt;&lt;P&gt;"code": "ForbiddenError",&lt;/P&gt;&lt;P&gt;"message": "You are not authorized to perform the requested operation"&lt;/P&gt;&lt;P&gt;What's the problem?&lt;/P&gt;&lt;P&gt;Regards,&lt;/P&gt;&lt;P&gt;HA&lt;/P&gt;</description>
      <pubDate>Fri, 24 Oct 2025 12:00:25 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/microsoft-sentinel/ingest-ioc-from-google-threat-intelligence-into-sentinel/m-p/4464024#M12831</guid>
      <dc:creator>HA13029</dc:creator>
      <dc:date>2025-10-24T12:00:25Z</dc:date>
    </item>
    <item>
      <title>Issue when ingesting Defender XDR table in Sentinel</title>
      <link>https://techcommunity.microsoft.com/t5/microsoft-sentinel/issue-when-ingesting-defender-xdr-table-in-sentinel/m-p/4463990#M12830</link>
      <description>&lt;P&gt;Hello,&lt;/P&gt;&lt;P&gt;We are migrating our on-premises SIEM solution to Microsoft Sentinel since we have E5 licences for all our users. The integration between Defender XDR and Sentinel convinced us to make the move.&lt;/P&gt;&lt;P&gt;We have a limited budget for Sentinel, and we found that the Auxiliary/Data Lake feature is sufficient for verbose log sources such as network logs.&lt;/P&gt;&lt;P&gt;We would like to retain Defender XDR data for more than 30 days (the default retention period). We implemented the solution described in this blog post:&amp;nbsp;&lt;A href="https://jeffreyappel.nl/how-to-store-defender-xdr-data-for-years-in-sentinel-data-lake-without-expensive-ingestion-cost/" target="_blank"&gt;https://jeffreyappel.nl/how-to-store-defender-xdr-data-for-years-in-sentinel-data-lake-without-expensive-ingestion-cost/&lt;/A&gt;&lt;/P&gt;&lt;P&gt;However, we are facing an issue with 2 tables: DeviceImageLoadEvents and DeviceFileCertificateInfo. The tables forwarded by Defender to Sentinel are empty, like this row:&lt;/P&gt;&lt;img /&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;We created a support ticket, but so far we haven't received a solution. If anyone has experienced this issue, we would appreciate your feedback.&lt;/P&gt;&lt;P&gt;Lucas&lt;/P&gt;</description>
      <pubDate>Fri, 24 Oct 2025 08:52:54 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/microsoft-sentinel/issue-when-ingesting-defender-xdr-table-in-sentinel/m-p/4463990#M12830</guid>
      <dc:creator>lsoumille</dc:creator>
      <dc:date>2025-10-24T08:52:54Z</dc:date>
    </item>
  </channel>
</rss>

