Latest Discussions
Webinar Rescheduled: AI-Powered Entity Analysis in Sentinel's MCP Server
Hi folks! The webinar "AI-Powered Entity Analysis in Sentinel's MCP Server," previously scheduled for January 13th, 2026, has been rescheduled to January 27th, 2026, at 9:00 AM PT. Please delete the old invite from your calendar and find the new one at aka.ms/securitycommunity. We apologize for the inconvenience and hope to see you there!

emilyfalla · Microsoft · Dec 05, 2025

Understand New Sentinel Pricing Model with Sentinel Data Lake Tier
Introduction on Sentinel and its New Pricing Model

Microsoft Sentinel is a cloud-native Security Information and Event Management (SIEM) and Security Orchestration, Automation, and Response (SOAR) platform that collects, analyzes, and correlates security data from across your environment to detect threats and automate response. Traditionally, Sentinel stored all ingested data in the Analytics tier (Log Analytics workspace), which is powerful but expensive for high-volume logs. To reduce cost and let customers retain all security data without compromise, Microsoft introduced a new dual-tier pricing model consisting of the Analytics tier and the Data Lake tier. The Analytics tier continues to support fast, real-time querying and analytics for core security scenarios, while the new Data Lake tier provides very low-cost storage for long-term retention and high-volume datasets. Customers can now choose where each data type lands: the Analytics tier for high-value detections and investigations, and the Data Lake tier for large or archival types. This lets organizations significantly lower cost while still retaining all their security data for analytics, compliance, and hunting.

The following flow diagram depicts the new Sentinel pricing model:

Now let's walk through the new pricing model with three scenarios:
- Scenario 1A (Pay-As-You-Go)
- Scenario 1B (Usage Commitment)
- Scenario 2 (Data Lake Tier Only)

Scenario 1A (Pay-As-You-Go)

Requirement
Suppose you need to ingest 10 GB of data per day and must retain that data for 2 years, but you will only frequently use, query, and analyze it during the first 6 months.

Solution
To optimize cost, ingest the data into the Analytics tier and retain it there for the first 6 months, where active querying and investigation happen. The remaining 18 months of retention can then be shifted to the Data Lake tier, which provides low-cost storage for compliance and auditing needs. Note that querying and analytics against the Data Lake tier are charged separately, depicted as Compute (D) in the pricing flow diagram.

Pricing Flow / Notes
- The first 10 GB/day ingested into the Analytics tier is free for 31 days under the Analytics logs plan.
- All data ingested into the Analytics tier is automatically mirrored to the Data Lake tier at no additional ingestion or retention cost.
- For the first 6 months, you pay only for Analytics tier ingestion and retention, excluding any free capacity.
- For the next 18 months, you pay only for Data Lake tier retention, which is significantly cheaper.

Azure Pricing Calculator Equivalent
Assuming no data is queried or analyzed during the 18-month Data Lake tier retention period: although the Analytics tier retention is set to 6 months, the first 3 months fall under the free retention limit, so retention charges apply only to the remaining 3 months of the analytics retention window. The Azure pricing calculator adjusts accordingly.

Scenario 1B (Usage Commitment)

Now suppose you are ingesting 100 GB per day. Under the pay-as-you-go model described above, your estimated cost would be approximately $15,204 per month. You can reduce this cost by choosing a Commitment Tier, where Analytics tier ingestion is billed at a discounted rate. Note that the discount applies only to Analytics tier ingestion; it does not apply to Analytics tier retention costs or to any Data Lake tier charges. Please refer to the pricing flow and the equivalent pricing calculator results shown below.

Monthly cost savings: $15,204 – $11,184 = $4,020 per month

Now the question is: what happens if your usage reaches 150 GB per day? Will the additional 50 GB be billed at the pay-as-you-go rate? No. The entire 150 GB/day will still be billed at the discounted rate associated with the 100 GB/day commitment tier bucket.

Azure Pricing Calculator Equivalent (100 GB/Day)
Azure Pricing Calculator Equivalent (150 GB/Day)
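As a quick sanity check, the Scenario 1B figures can be reduced to effective per-GB rates. The KQL `print` sketch below is runnable in any Log Analytics workspace and uses only the monthly totals quoted above; the 30.4 days/month factor is an assumed average for illustration, not an official billing convention.

```kusto
// Back-of-envelope check of Scenario 1B, using only the monthly totals quoted above.
// The 30.4 days/month factor is an assumed average, not an official billing convention.
let PayGoMonthly = 15204.0;     // 100 GB/day, pay-as-you-go (from this post)
let CommitMonthly = 11184.0;    // 100 GB/day, 100 GB commitment tier (from this post)
let GBPerMonth = 100.0 * 30.4;  // assumed average month length
print
    EffectivePayGoPerGB = round(PayGoMonthly / GBPerMonth, 2),   // ~ $5.00/GB
    EffectiveCommitPerGB = round(CommitMonthly / GBPerMonth, 2), // ~ $3.68/GB
    MonthlySavings = PayGoMonthly - CommitMonthly                // $4,020
```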
Scenario 2 (Data Lake Tier Only)

Requirement
Suppose you need to store certain audit or compliance logs amounting to 10 GB per day. These logs are not used for querying, analytics, or investigations on a regular basis, but must be retained for 2 years under your organization's compliance or forensic policies.

Solution
Since these logs are not actively analyzed, avoid ingesting them into the Analytics tier, which is more expensive and optimized for active querying. Instead, send them directly to the Data Lake tier, where they can be retained cost-effectively for future audit, compliance, or forensic needs.

Pricing Flow
Because the data is ingested directly into the Data Lake tier, you pay both ingestion and retention costs there for the entire 2-year period. If at any point in the future you need to perform advanced analytics, querying, or search, you will incur additional compute charges based on actual usage. Even with occasional compute charges, the cost remains significantly lower than storing the same data in the Analytics tier.

Realized Savings
- Scenario 1 (10 GB/day in the Analytics tier): $1,520.40 per month
- Scenario 2 (10 GB/day directly into the Data Lake tier): $202.20 per month without compute; $257.20 per month with a sample compute price

Savings with no compute activity: $1,520.40 – $202.20 = $1,318.20 per month
Savings with some compute activity (sample value): $1,520.40 – $257.20 = $1,263.20 per month

Azure calculator equivalent without compute
Azure calculator equivalent with sample compute
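The same per-GB normalization makes the Scenario 2 gap concrete. The sketch below uses only the monthly figures from the savings list above; the 30.4 days/month factor is again an assumption used purely for normalization.

```kusto
// Per-GB comparison for Scenario 2, derived from the monthly figures quoted above.
// 30.4 days/month is an assumed average used only for normalization.
let AnalyticsMonthly = 1520.40; // 10 GB/day kept in the Analytics tier
let LakeMonthly = 202.20;       // 10 GB/day sent straight to the Data Lake tier (no compute)
let GBPerMonth = 10.0 * 30.4;
print
    AnalyticsPerGB = round(AnalyticsMonthly / GBPerMonth, 2), // ~ $5.00/GB
    LakePerGB = round(LakeMonthly / GBPerMonth, 2),           // ~ $0.67/GB
    SavingsNoCompute = AnalyticsMonthly - LakeMonthly,        // $1,318.20
    SavingsWithCompute = AnalyticsMonthly - 257.20            // $1,263.20 (sample compute case)
```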
Conclusion

The combination of the Analytics tier and the Data Lake tier in Microsoft Sentinel enables organizations to optimize cost based on how their security data is used. High-value logs that require frequent querying, real-time analytics, and investigation can be stored in the Analytics tier, which provides powerful search performance and built-in detection capabilities. At the same time, large-volume or infrequently accessed logs, such as audit, compliance, or long-term retention data, can be directed to the Data Lake tier, which offers dramatically lower storage and ingestion costs. Because all Analytics tier data is automatically mirrored to the Data Lake tier at no extra cost, customers can use the Analytics tier only for the period they actively query data and rely on the Data Lake tier for the remaining retention. This tiered model allows different scenarios (active investigation, archival storage, compliance retention, or large-scale telemetry ingestion) to be handled at the most cost-effective layer, ultimately delivering substantial savings without sacrificing visibility, retention, or future analytical capabilities.

Sentinel to Defender webinar series CANCELLED, will be rescheduled at a later date.

The Sentinel to Defender webinar series has been cancelled. Please visit aka.ms/securitycommunity to sign up for upcoming Microsoft Security webinars and to join the mailing list to be notified of future sessions. We apologize for any inconvenience.

RenWoods · Microsoft · Nov 04, 2025

Ingest IOC from Google Threat Intelligence into Sentinel
Hi all, I'm trying to ingest IOCs from Google Threat Intelligence into Sentinel. I followed the guide at gtidocs.virutotal.com/docs/gti4sentinel-guide. The API key is correct. PS: I'm using the standard free public API (created in VirusTotal). The managed identity has been configured with the correct role. When I run the Logic App, I receive an HTTP 403 error:

```
"code": "ForbiddenError",
"message": "You are not authorized to perform the requested operation"
```

What's the problem? Regards,

HAHA13029 · Brass Contributor · Oct 24, 2025
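One quick way to confirm whether any indicators are landing at all is a check like the sketch below; it assumes the connector writes to the standard ThreatIntelligenceIndicator table, which may differ depending on the connector version.

```kusto
// Sketch: check whether any indicators have arrived recently, and from which source.
// Assumes the connector writes to the standard ThreatIntelligenceIndicator table;
// newer connector versions may use a different table.
ThreatIntelligenceIndicator
| where TimeGenerated > ago(7d)
| summarize Indicators = count(), Newest = max(TimeGenerated) by SourceSystem
```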
Issue when ingesting Defender XDR table in Sentinel

Hello, We are migrating our on-premises SIEM solution to Microsoft Sentinel since we have E5 licences for all our users. The integration between Defender XDR and Sentinel convinced us to make the move. We have a limited budget for Sentinel, and we found that the Auxiliary/Data Lake feature is sufficient for verbose log sources such as network logs. We would like to retain Defender XDR data for more than 30 days (the default retention period). We implemented the solution described in this blog post: https://jeffreyappel.nl/how-to-store-defender-xdr-data-for-years-in-sentinel-data-lake-without-expensive-ingestion-cost/ However, we are facing an issue with two tables: DeviceImageLoadEvents and DeviceFileCertificateInfo. The rows forwarded by Defender to Sentinel are empty, like this one: We created a support ticket, but so far we haven't received any solution. If anyone has experienced this issue, we would appreciate your feedback. Lucas

lsoumille · Copper Contributor · Oct 24, 2025
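A query along these lines can quantify how many forwarded rows actually carry data. This is a sketch only: FileName is one representative column of DeviceImageLoadEvents, and the one-day lookback window is arbitrary.

```kusto
// Sketch: measure how many forwarded rows are effectively empty.
// FileName is one representative column; the 1-day window is arbitrary.
DeviceImageLoadEvents
| where TimeGenerated > ago(1d)
| summarize Total = count(), WithFileName = countif(isnotempty(FileName))
```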
Modernize security operations to secure agentic AI—Microsoft Sentinel at Ignite 2025

Security is a core focus at Microsoft Ignite this year, with the Security Forum on November 17, deep-dive technical sessions, theater talks, and hands-on labs designed for security leaders and practitioners. Join us in San Francisco, November 17–21, or online, November 18–20, to learn what's new and what's next across SecOps, data, cloud, and AI, and how to get more from the Microsoft capabilities you already use. This year, Microsoft Sentinel takes center stage with sessions and labs designed to help you unify data, automate response, and leverage AI-powered insights for faster, more effective threat detection.

Featured sessions:
- BRK235: Power agentic defense with Microsoft Sentinel. Explore Microsoft Sentinel's platform architecture, graph intelligence, and agentic workflows to automate, investigate, and respond with speed and precision.
- BRK246: Blueprint for building the SOC of the future. Learn how to architect a modern SOC that anticipates and prevents threats using predictive shielding, agentic AI, and graph-powered reasoning.
- LAB543: Perform threat hunting in Microsoft Sentinel. Dive deep into advanced threat hunting, KQL queries, and proactive investigation workflows to sharpen your security operations.

Explore and filter the full security catalog by topic, format, and role: aka.ms/Ignite/SecuritySessions.

Why attend: Ignite is your opportunity to see the latest innovations in Microsoft Sentinel, connect with experts, and gain hands-on experience. Sessions will also touch on future directions for agentic AI and unified SOC operations, as outlined in Microsoft's broader security roadmap.

Security Forum (November 17): Kick off with an immersive, in-person pre-day focused on strategic security discussions and real-world guidance from Microsoft leaders and industry experts. Select Security Forum during registration.

Connect with peers and security leaders through these signature security experiences:
- Security Leaders Dinner: CISOs and VPs connect with Microsoft leaders.
- CISO Roundtable: Gain practical insights on secure AI adoption.
- Secure the Night Party: Network in a relaxed, fun setting.

Register for Microsoft Ignite >

MSdellis · Microsoft · Oct 22, 2025

Data Connectors Storage Account and Function App
Several data connectors downloaded via the Content Hub ship with ARM deployment templates, which is the default out-of-the-box (OOB) experience. We could customize them if needed, but I wanted to ask the community: how do you go about addressing the infrastructure issues where these connectors deploy Storage Accounts and Function Apps with insecure configurations, e.g. no infrastructure encryption key requirement, no VNet integration, no customer-managed keys (CMK), no Front Door, etc.? The default configuration basically provisions all the services required to get streams going, but the resulting posture seems to dismiss security standards around hardening these services.

logger2115 · Brass Contributor · Sep 30, 2025
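One way to at least find the connector-deployed storage accounts that need hardening is an Azure Resource Graph query (also KQL). This is a sketch under the assumption that public network access is the setting of interest; CMK, infrastructure encryption, and network rules would each need their own checks.

```kusto
// Azure Resource Graph sketch: storage accounts that allow public network access.
// publicNetworkAccess is one example setting; CMK, infrastructure encryption, and
// network rules would each need their own checks against the resource properties.
resources
| where type == "microsoft.storage/storageaccounts"
| extend publicAccess = tostring(properties.publicNetworkAccess)
| where publicAccess =~ "Enabled" or isempty(publicAccess)
| project name, resourceGroup, subscriptionId, publicAccess
```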
Single Rule for No Logs Received (Global + Per-Device Thresholds)

Hi everyone, I currently maintain one analytics rule per table to detect when logs stop coming in. Some tables receive data from multiple sources, each with a different expected interval (for example, some sources send every 10 minutes, others every 30 minutes). In other SIEM platforms there is usually:
- A global threshold (e.g., 60 minutes) for all sources.
- Optional per-device/per-table thresholds that override the global value.

Is there a recommended way to implement one global rule that uses a default threshold but allows per-source overrides when a particular device or log table has a different expected frequency? Also, if there are other approaches you use to manage "logs not received" detection, I'd love to hear your suggestions as well. This is a sample of my current rule:

```kusto
let threshold = 1h;
AzureActivity
| summarize LastHeartBeat = max(TimeGenerated)
| where LastHeartBeat < ago(threshold)
```

Akila2 · Copper Contributor · Sep 15, 2025
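One possible shape for such a rule is to keep the overrides in a lookup table and fall back to a global default. A minimal sketch, with illustrative table names and intervals (in production the overrides could live in a watchlist read via _GetWatchlist instead of a datatable):

```kusto
// Sketch: one rule, global default threshold plus per-table overrides.
// Table names and intervals are illustrative; in production the overrides
// could live in a watchlist (_GetWatchlist) instead of a datatable.
let GlobalThreshold = 60m;
let Overrides = datatable(TableName: string, Threshold: timespan) [
    "AzureActivity", 30m,
    "SecurityEvent", 10m
];
union withsource=TableName AzureActivity, SecurityEvent, Syslog
| summarize LastHeartBeat = max(TimeGenerated) by TableName
| lookup kind=leftouter Overrides on TableName
| extend EffectiveThreshold = coalesce(Threshold, GlobalThreshold)
| where LastHeartBeat < now() - EffectiveThreshold
```

The union list is the main maintenance cost of this approach: every monitored table has to appear in it. An alternative is to drive the same logic from the Usage table (summarizing max(TimeGenerated) by DataType), which avoids listing tables explicitly but reports ingestion at coarser, per-table granularity rather than per device.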