Understand New Sentinel Pricing Model with Sentinel Data Lake Tier
Introduction to Sentinel and Its New Pricing Model

Microsoft Sentinel is a cloud-native Security Information and Event Management (SIEM) and Security Orchestration, Automation, and Response (SOAR) platform that collects, analyzes, and correlates security data from across your environment to detect threats and automate response. Traditionally, Sentinel stored all ingested data in the Analytics tier (Log Analytics workspace), which is powerful but expensive for high-volume logs. To reduce cost and enable customers to retain all security data without compromise, Microsoft introduced a new dual-tier pricing model consisting of the Analytics tier and the Data Lake tier. The Analytics tier continues to support fast, real-time querying and analytics for core security scenarios, while the new Data Lake tier provides very low-cost storage for long-term retention and high-volume datasets. Customers can now choose where each data type lands (Analytics for high-value detections and investigations, Data Lake for large or archival types), allowing organizations to significantly lower cost while still retaining all their security data for analytics, compliance, and hunting.

The flow diagram below depicts the new Sentinel pricing model.

Now let's walk through the new pricing model with three scenarios:

- Scenario 1A (Pay-As-You-Go)
- Scenario 1B (Usage Commitment)
- Scenario 2 (Data Lake Tier Only)

Scenario 1A (Pay-As-You-Go)

Requirement
Suppose you need to ingest 10 GB of data per day and must retain that data for 2 years, but you will only frequently use, query, and analyze the data for the first 6 months.

Solution
To optimize cost, ingest the data into the Analytics tier and retain it there for the first 6 months, where active querying and investigation happen. The remaining 18 months of retention can then be shifted to the Data Lake tier, which provides low-cost storage for compliance and auditing needs. Note that you will be charged separately for Data Lake tier querying and analytics, depicted as Compute (D) in the pricing flow diagram.

Pricing Flow / Notes
- The first 10 GB/day ingested into the Analytics tier is free for 31 days under the Analytics logs plan.
- All data ingested into the Analytics tier is automatically mirrored to the Data Lake tier at no additional ingestion or retention cost.
- For the first 6 months, you pay only for Analytics tier ingestion and retention, excluding any free capacity.
- For the next 18 months, you pay only for Data Lake tier retention, which is significantly cheaper.

Azure Pricing Calculator Equivalent
Assuming no data is queried or analyzed during the 18-month Data Lake tier retention period: although the Analytics tier retention is set to 6 months, the first 3 months fall under the free retention limit, so retention charges apply only to the remaining 3 months of the analytics retention window. The Azure pricing calculator adjusts accordingly.
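To make the Scenario 1A arithmetic concrete, here is a minimal PowerShell sketch of the steady-state monthly cost. All per-GB unit prices are placeholder assumptions (real rates vary by region and currency), and the model ignores the initial ramp-up while data accumulates; the Azure pricing calculator remains the authoritative source.

```powershell
# Rough steady-state monthly cost for Scenario 1A.
# All unit prices are placeholder assumptions - check the Azure pricing
# calculator for your region's actual rates.
$gbPerDay          = 10
$daysPerMonth      = 30.4
$ingestPricePerGB  = 4.30    # assumed: Analytics tier PAYG ingestion, $/GB
$analyticsRetPrice = 0.10    # assumed: Analytics retention beyond the free 90 days, $/GB/month
$lakeRetPrice      = 0.026   # assumed: Data Lake tier retention, $/GB/month

$monthlyGB = $gbPerDay * $daysPerMonth

# Recurring Analytics ingestion charge (the first 31 days may be free under the trial)
$ingest = $monthlyGB * $ingestPricePerGB

# Of the 6-month Analytics window, ~3 months fall inside free retention,
# so roughly 3 months' worth of data is billed for retention at steady state
$analyticsRetention = $monthlyGB * 3 * $analyticsRetPrice

# Data Lake retention covers the remaining 18 months of the 2-year policy
$lakeRetention = $monthlyGB * 18 * $lakeRetPrice

[pscustomobject]@{
    AnalyticsIngestion = $ingest
    AnalyticsRetention = $analyticsRetention
    DataLakeRetention  = $lakeRetention
    TotalPerMonth      = $ingest + $analyticsRetention + $lakeRetention
}
```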
Scenario 1B (Usage Commitment)

Now suppose you are ingesting 100 GB per day. Under the pay-as-you-go model described above, your estimated cost would be approximately $15,204 per month. You can reduce this cost by choosing a Commitment Tier, where Analytics tier ingestion is billed at a discounted rate. Note that the discount applies only to Analytics tier ingestion; it does not apply to Analytics tier retention costs or to any charges related to the Data Lake tier. Please refer to the pricing flow and the equivalent pricing calculator results shown below.

Monthly cost savings: $15,204 – $11,184 = $4,020 per month

Now the question is: what happens if your usage reaches 150 GB per day? Will the additional 50 GB be billed at the pay-as-you-go rate? No. The entire 150 GB/day will still be billed at the discounted rate associated with the 100 GB/day commitment tier.

Azure Pricing Calculator Equivalent (100 GB/day)
Azure Pricing Calculator Equivalent (150 GB/day)

Scenario 2 (Data Lake Tier Only)

Requirement
Suppose you need to store certain audit or compliance logs amounting to 10 GB per day. These logs are not used for querying, analytics, or investigations on a regular basis, but must be retained for 2 years under your organization's compliance or forensic policies.

Solution
Since these logs are not actively analyzed, avoid ingesting them into the Analytics tier, which is more expensive and optimized for active querying. Instead, send them directly to the Data Lake tier, where they can be retained cost-effectively for future audit, compliance, or forensic needs.

Pricing Flow
Because the data is ingested directly into the Data Lake tier, you pay both ingestion and retention costs there for the entire 2-year period. If at any point in the future you need to perform advanced analytics, querying, or search, you will incur additional compute charges based on actual usage. Even with occasional compute charges, the cost remains significantly lower than storing the same data in the Analytics tier.

Realized Savings

| Scenario | Cost per Month |
| --- | --- |
| Scenario 1: 10 GB/day in Analytics tier | $1,520.40 |
| Scenario 2: 10 GB/day directly into Data Lake tier | $202.20 (without compute); $257.20 (with sample compute price) |

Savings with no compute activity: $1,520.40 – $202.20 = $1,318.20 per month
Savings with some compute activity (sample value): $1,520.40 – $257.20 = $1,263.20 per month

Azure calculator equivalent without compute
Azure calculator equivalent with sample compute

Conclusion

The combination of the Analytics tier and the Data Lake tier in Microsoft Sentinel enables organizations to optimize cost based on how their security data is used. High-value logs that require frequent querying, real-time analytics, and investigation can be stored in the Analytics tier, which provides powerful search performance and built-in detection capabilities. Large-volume or infrequently accessed logs, such as audit, compliance, or long-term retention data, can be directed to the Data Lake tier, which offers dramatically lower storage and ingestion costs. Because all Analytics tier data is automatically mirrored to the Data Lake tier at no extra cost, customers can use the Analytics tier only for the period they actively query data and rely on the Data Lake tier for the remaining retention. This tiered model allows different scenarios (active investigation, archival storage, compliance retention, or large-scale telemetry ingestion) to be handled at the most cost-effective layer, ultimately delivering substantial savings without sacrificing visibility, retention, or future analytical capabilities.
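One closing sketch on Scenario 1B, since the overage behavior surprises people: usage above a commitment tier is billed at that tier's effective per-GB rate, not at pay-as-you-go. The tier price below is an assumed placeholder, not a published rate.

```powershell
# Commitment-tier overage billing (Scenario 1B).
# The tier price is an assumed placeholder; real commitment prices vary by region.
$tierGBPerDay    = 100
$tierPricePerDay = 296.00                            # assumed $/day for the 100 GB/day tier
$effectiveRate   = $tierPricePerDay / $tierGBPerDay  # effective $/GB at this tier

# Usage above the commitment is billed at the same effective rate,
# not at pay-as-you-go, so 150 GB/day stays on the discounted rate
$usageGBPerDay = 150
$dailyCost     = $tierPricePerDay + (($usageGBPerDay - $tierGBPerDay) * $effectiveRate)

"{0} GB/day costs {1:N2}/day (~{2:N2}/month)" -f $usageGBPerDay, $dailyCost, ($dailyCost * 30.4)
```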
Unified SecOps XDR

Hi, I am reaching out to the community to seek understanding regarding the unified SecOps XDR portal for multi-tenant, multi-workspace scenarios. Our organization already has an Azure Lighthouse setup. My question is: is an M365 Lighthouse license also required for multi-tenant, multi-workspace in the unified SecOps XDR portal?
[DevOps] dps.sentinel.azure.com no longer responds

Hello, I've been using repository connections in Sentinel to a central DevOps for almost two years now. Today I got my first automated email reporting an error for a webhook related to my last commit from the central repo to my Sentinel instances. It's a webhook that is automatically created in connections made within the last year (the ones from two years ago don't have this webhook automatically created). The hook is found in DevOps -> Service hooks -> Webhooks, "run state change", for each connected Sentinel. However, after today's run (which was successful; all content deployed) this hook generates alerts. It says it can't reach (EU in my case): eu.prod.dps.sentinel.azure.com, full URL: https://eu.prod.dps.sentinel.azure.com/webhooks/ado/workspaces/[REDACTED]/sourceControls/[REDACTED]

So, what happened to this domain? Why is it no longer responding, and when did it go offline? I THINK this is the hook that sets the status under Sentinel -> Repositories in the GUI. The success status in the screenshot is from 2025/02/06; no new success has been registered in the receiving Sentinel instance. For the Sentinel that is two years old and doesn't have a hook in my DevOps, the last deployment status says "Unknown", so I'm fairly sure that's what the webhook is doing. So a second question would be: how can I set up a new webhook? It wants the ID and password of the "Azure Sentinel Content Deployment App", and I will never know that password, so I can't add one manually either (if the URL ever comes back online, or if a new one exists). Please let me know.
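A quick way to confirm whether the endpoint is truly gone (rather than just rejecting the webhook's calls) is to probe it directly. A minimal sketch, not an official troubleshooting step:

```powershell
# Probe the webhook endpoint directly (hostname taken from the post)
$endpoint = "eu.prod.dps.sentinel.azure.com"

# Does the name still resolve in DNS?
Resolve-DnsName $endpoint -ErrorAction SilentlyContinue

# Does anything answer on HTTPS? Even a 4xx error surfacing in the catch
# block proves a listener is alive; a name-resolution or timeout failure
# suggests the endpoint itself has been retired.
try {
    Invoke-WebRequest -Uri "https://$endpoint" -Method Head -TimeoutSec 10 | Out-Null
    "Host responded"
} catch {
    "Request failed: $($_.Exception.Message)"
}
```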
Backing up Sentinel and the Security subscription

A lot of people ask how Security Operations can effectively back up all of the Sentinel-related objects. One option is to use GitHub or Azure DevOps pipelines to take a daily backup. I've been doing this for a very long time, and this seems like a good forum to share that code. The trick behind it has been to use PowerShell to derive the current API versions for Azure objects. Once you do that, you can recursively download the whole subscription to a repo, and scripts can then generate reports using Markdown and YAML. I've been backing up my subscription reliably since 2021. The default project creates reports for all the Sentinel-related elements. Markdown lets the object reports be drilled down into... and KQL is presented as YAML for readability. It's actually easy to deploy all the backed-up JSON files through REST if needed, but for most of us, being able to have readable KQL and a Git history of changes in files is probably all we need. This project is completely written in PowerShell with no compiled modules, and anyone is freely welcome to it. I've written more about it here: https://www.laurierhodes.info/node/168 ... and the source code and install documentation can be found here: https://github.com/LaurieRhodes/PUBLIC-Subscription-Backup I hope this is of use to the community! 🙂 Best Regards Laurie
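For readers curious about the API-version trick Laurie describes, here is a minimal sketch of the idea: ask ARM which API versions a resource type supports instead of hard-coding them. The function name is illustrative; the linked project's actual implementation lives in the repo.

```powershell
# Ask ARM for the newest stable API version of a resource type instead of
# hard-coding it. Requires Az.Resources (run Connect-AzAccount first).
function Get-LatestApiVersion {
    param(
        [string]$ProviderNamespace,  # e.g. "Microsoft.OperationalInsights"
        [string]$ResourceType        # e.g. "workspaces"
    )
    $provider = Get-AzResourceProvider -ProviderNamespace $ProviderNamespace
    $type = $provider.ResourceTypes | Where-Object { $_.ResourceTypeName -eq $ResourceType }
    # ApiVersions is typically ordered newest-first; skip previews for stable exports
    ($type.ApiVersions | Where-Object { $_ -notmatch 'preview' }) | Select-Object -First 1
}

Get-LatestApiVersion -ProviderNamespace "Microsoft.OperationalInsights" -ResourceType "workspaces"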
Essential solutions

There are currently 10 different solutions with "Essentials" in their titles, and many of them have very similar names, e.g. DNS, Network Session, and Network Threat Protection. Do other firms typically recommend/install all of these? Does Microsoft consider this to be the recommended "starter set"?
ASIM built-in functions in Sentinel, are they updated automatically?

Are the ASIM built-in functions in Sentinel automatically updated? For example, the built-in parsers such as those for DNS, NetworkSession, and WebSession. Do the built-in ones receive automatic updates, or will the workspace-deployed versions of these parsers be the most up to date? And if the latter, would it be recommended to use workspace-deployed versions of parsers that already come built in?
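Whichever version turns out to be authoritative in your workspace, you can exercise a built-in parser directly to see what it returns today. A minimal sketch using the Log Analytics query API; the workspace ID is a placeholder:

```powershell
# Run the built-in ASIM DNS parser via the Log Analytics query API.
# Requires Az.OperationalInsights; replace the workspace ID placeholder.
$workspaceId = "<your-workspace-guid>"
$query       = "_Im_Dns(starttime=ago(1h)) | take 10"

$result = Invoke-AzOperationalInsightsQuery -WorkspaceId $workspaceId -Query $query
$result.Results | Format-Table TimeGenerated, SrcIpAddr, DnsQuery
```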
Fetching alerts from Sentinel using logic apps

Hello everyone, I have a requirement to archive alerts from Sentinel. To do that I need to do the following:

1. Retrieve the alerts from Sentinel
2. Send the data to an external file share

As a solution, I decided to proceed with logic apps, where I will be running a script to automate this process. My questions are the following:

- Which API endpoints in Sentinel are relevant for retrieving alerts or running KQL queries to get the needed data?
- I know that I will need some sort of permissions to interact with the API endpoint. What type of service account should I create inside Azure, and what permissions should I provision for it?
- Are there any existing examples of logic apps interacting with Microsoft Sentinel? That would be helpful for me as I am new to Azure.

Any help is much appreciated!
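As a starting point for the first question: incidents (and through them, alerts) are exposed by the Microsoft.SecurityInsights ARM provider, while raw alert rows live in the SecurityAlert table behind the Log Analytics query API; Logic Apps also ships an official Microsoft Sentinel connector that wraps much of this. Below is a minimal PowerShell sketch of the incidents endpoint. Resource names are placeholders, and the api-version shown should be verified against current docs; in a logic app, the equivalent is an HTTP action using a managed identity granted a reader role on the workspace.

```powershell
# Pull recent Sentinel incidents through the Microsoft.SecurityInsights API.
# Placeholders throughout; verify the api-version against current docs.
$sub = "<subscription-id>"; $rg = "<resource-group>"; $ws = "<workspace-name>"

$token = (Get-AzAccessToken -ResourceUrl "https://management.azure.com").Token
$uri = "https://management.azure.com/subscriptions/$sub/resourceGroups/$rg" +
       "/providers/Microsoft.OperationalInsights/workspaces/$ws" +
       "/providers/Microsoft.SecurityInsights/incidents?api-version=2024-03-01"

$incidents = Invoke-RestMethod -Uri $uri -Headers @{ Authorization = "Bearer $token" }
$incidents.value | Select-Object -First 5 |
    ForEach-Object { $_.properties } |
    Format-Table title, severity, createdTimeUtc
```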
Can we deploy Bicep through Sentinel repo

Hi there, I'm new here, but 😅.... The problem statement is "deploying and managing Sentinel infrastructure through a git repository." I have looked into the Sentinel Repositories feature, which is still in preview, with the added limitations of not being able to deploy watchlists or custom Log Analytics functions (custom parsers). There is also the limitation of deploying only ARM content. My guess would be that the product folks at msft are working on this 😋

My hypothesized options (just started the R&D as of writing this) would be to:

1. Fully go above and beyond with Bicep: create Bicep deployment files for both the rules and their dependencies, like Log Analytics functions, watchlists, and the whole nine yards. This needs pipelines written for the deployment, and the CI/CD would also need extra work to implement.
2. Hit that sweet spot: deploy the currently supported resources using the Sentinel repo and write a pipeline to deploy the watchlists using Bicep (see the sketch below). But I'm not sure this will be relevant for client solutions, when the whole shtick is that we are updating now so we don't have to later.
3. Go back to the dark ages: stick to the currently supported Sentinel content through ARM and the repo, and deploy the watchlists and dependencies using the GUI 🙃

I will soon confirm the first two methods, but it may take some time. As you know, I may or may not be new to Sentinel... or DevOps... But I wanted to kick off the conversation to see how close to being utterly wrong I am. 😎 Thanks, mal_sec
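For option 2, the extra pipeline step can be as small as a single deployment cmdlet. A minimal sketch, assuming a hypothetical watchlists Bicep template and Az PowerShell plus the Bicep CLI on the build agent:

```powershell
# Pipeline step: deploy watchlists from a Bicep template after the Sentinel
# repo connection has deployed its supported ARM content.
# File, resource group, and workspace names are hypothetical.
New-AzResourceGroupDeployment `
    -ResourceGroupName "rg-sentinel" `
    -TemplateFile "./watchlists/main.bicep" `
    -TemplateParameterObject @{ workspaceName = "law-sentinel" }
```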
GITHUB - AI Sentinel attack simulation

The recent support for Model Context Protocol (MCP) with Claude Desktop has opened the door for some really useful testing capability with Sentinel and emerging threats. I'm happy to share with the community a GitHub project that demonstrates the use of MCP against current exploits to generate simulated attack data that can be used for testing migrated ASIM alert rules. MCP allows for up-to-date exploits to be queried... ... and with AI prompting, simulated attack events can be created against our Sentinel test environments. This results in a simulated attack based on the exploit being referenced, which is really useful for testing the migration of our Sentinel alert rules to ASIM! The full code and details about the project are available here: https://laurierhodes.info/node/175
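The project's actual mechanics are in the repo above; as a general sketch of how simulated events like these can be landed in a Sentinel test workspace, one common route is the Azure Monitor Logs Ingestion API. The DCE URI, DCR immutable ID, stream name, and field names below are all placeholders:

```powershell
# Land a simulated event in a test workspace via the Logs Ingestion API.
# DCE URI, DCR immutable ID, and stream name are placeholders; the DCR must
# map the stream to a custom table that your test rules or parsers read.
$dce    = "https://<my-dce>.westeurope-1.ingest.monitor.azure.com"
$dcrId  = "dcr-00000000000000000000000000000000"
$stream = "Custom-SimulatedAttack_CL"

$token = (Get-AzAccessToken -ResourceUrl "https://monitor.azure.com").Token
$body  = ConvertTo-Json -InputObject @(
    @{
        TimeGenerated = (Get-Date).ToUniversalTime().ToString("o")
        Activity      = "Simulated exploit attempt (CVE placeholder)"
    }
)

Invoke-RestMethod -Method Post `
    -Uri "$dce/dataCollectionRules/$dcrId/streams/${stream}?api-version=2023-01-01" `
    -Headers @{ Authorization = "Bearer $token" } `
    -ContentType "application/json" `
    -Body $body
```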