Microsoft Defender for Cloud Apps
13 Topics

Understand New Sentinel Pricing Model with Sentinel Data Lake Tier
Introduction to Sentinel and Its New Pricing Model

Microsoft Sentinel is a cloud-native Security Information and Event Management (SIEM) and Security Orchestration, Automation, and Response (SOAR) platform that collects, analyzes, and correlates security data from across your environment to detect threats and automate response. Traditionally, Sentinel stored all ingested data in the Analytics tier (Log Analytics workspace), which is powerful but expensive for high-volume logs. To reduce cost and let customers retain all security data without compromise, Microsoft introduced a new dual-tier pricing model consisting of the Analytics tier and the Data Lake tier. The Analytics tier continues to support fast, real-time querying and analytics for core security scenarios, while the new Data Lake tier provides very low-cost storage for long-term retention and high-volume datasets. Customers can now choose where each data type lands: the Analytics tier for high-value detections and investigations, and the Data Lake tier for large or archival data types. This lets organizations significantly lower cost while still retaining all their security data for analytics, compliance, and hunting.

The flow diagram below depicts the new Sentinel pricing model.

Let's walk through the new pricing model with the following scenarios:

- Scenario 1A (Pay-As-You-Go)
- Scenario 1B (Usage Commitment)
- Scenario 2 (Data Lake Tier Only)

Scenario 1A (Pay-As-You-Go)

Requirement: Suppose you need to ingest 10 GB of data per day and must retain that data for 2 years, but you will only frequently query and analyze it for the first 6 months.

Solution: To optimize cost, ingest the data into the Analytics tier and retain it there for the first 6 months, where active querying and investigation happen. The remaining 18 months of retention can then be shifted to the Data Lake tier, which provides low-cost storage for compliance and auditing needs.
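To make the retention split concrete, here is a minimal sketch in Python of how the 24-month retention window divides between the tiers in this scenario. The month counts come from the scenario itself; the 3-month free retention allowance is the figure quoted later with the pricing calculator notes, and no prices are assumed here since actual rates vary by region.

```python
# Scenario 1A retention split: 6 months hot (Analytics tier),
# remaining 18 months in the Data Lake tier.
TOTAL_RETENTION_MONTHS = 24   # 2-year compliance requirement
ANALYTICS_MONTHS = 6          # actively queried window
FREE_ANALYTICS_RETENTION = 3  # months covered by the free retention limit

# Analytics-tier retention is only billed beyond the free allowance.
analytics_billed = ANALYTICS_MONTHS - FREE_ANALYTICS_RETENTION
# Everything past the Analytics window is served from the Data Lake tier.
data_lake_months = TOTAL_RETENTION_MONTHS - ANALYTICS_MONTHS

print(f"Billed Analytics-tier retention: {analytics_billed} months")
print(f"Data Lake-tier retention: {data_lake_months} months")
```

Running this confirms the split: 3 billable months of Analytics retention and 18 months served from the Data Lake tier.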
Note, however, that you will be charged separately for Data Lake tier querying and analytics, depicted as Compute (D) in the pricing flow diagram.

Pricing Flow / Notes

- The first 10 GB/day ingested into the Analytics tier is free for 31 days under the Analytics Logs plan.
- All data ingested into the Analytics tier is automatically mirrored to the Data Lake tier at no additional ingestion or retention cost.
- For the first 6 months, you pay only for Analytics tier ingestion and retention, excluding any free capacity.
- For the next 18 months, you pay only for Data Lake tier retention, which is significantly cheaper.

Azure Pricing Calculator Equivalent

This assumes no data is queried or analyzed during the 18-month Data Lake tier retention period. Although the Analytics tier retention is set to 6 months, the first 3 months fall under the free retention limit, so retention charges apply only to the remaining 3 months of the analytics retention window. The Azure pricing calculator adjusts accordingly.

Scenario 1B (Usage Commitment)

Now suppose you are ingesting 100 GB per day. Under the pay-as-you-go pricing model described above, your estimated cost would be approximately $15,204 per month. You can reduce this cost by choosing a Commitment Tier, where Analytics tier ingestion is billed at a discounted rate. Note that the discount applies only to Analytics tier ingestion; it does not apply to Analytics tier retention costs or to any Data Lake tier charges. Please refer to the pricing flow and the equivalent pricing calculator results shown below.

Monthly cost savings: $15,204 - $11,184 = $4,020 per month

Now the question is: what happens if your usage reaches 150 GB per day? Will the additional 50 GB be billed at the pay-as-you-go rate? No. The entire 150 GB/day will still be billed at the discounted rate associated with the 100 GB/day commitment tier bucket.
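As a quick sanity check, the commitment-tier saving can be reproduced in a few lines of Python. The dollar amounts are the sample pricing calculator outputs quoted in this scenario, not official rates.

```python
# Monthly cost comparison for the 100 GB/day example.
# Figures are the sample Azure pricing calculator outputs quoted above.
payg_monthly = 15_204        # pay-as-you-go estimate, USD/month
commitment_monthly = 11_184  # 100 GB/day commitment tier estimate, USD/month

savings = payg_monthly - commitment_monthly
print(f"Monthly savings with the 100 GB/day commitment tier: ${savings:,}")

# If ingestion later grows to 150 GB/day, the whole 150 GB/day is still
# billed at the 100 GB/day commitment tier's discounted rate, not at
# pay-as-you-go rates for the overage.
```

This reproduces the $4,020 per month figure above.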
Azure Pricing Calculator Equivalent (100 GB/day)

Azure Pricing Calculator Equivalent (150 GB/day)

Scenario 2 (Data Lake Tier Only)

Requirement: Suppose you need to store certain audit or compliance logs amounting to 10 GB per day. These logs are not used for querying, analytics, or investigations on a regular basis, but must be retained for 2 years under your organization's compliance or forensic policies.

Solution: Since these logs are not actively analyzed, avoid ingesting them into the Analytics tier, which is more expensive and optimized for active querying. Instead, send them directly to the Data Lake tier, where they can be retained cost-effectively for future audit, compliance, or forensic needs.

Pricing Flow: Because the data is ingested directly into the Data Lake tier, you pay both ingestion and retention costs there for the entire 2-year period. If, at any point in the future, you need to perform advanced analytics, querying, or search, you will incur additional compute charges based on actual usage. Even with occasional compute charges, the cost remains significantly lower than storing the same data in the Analytics tier.

Realized Savings

- Scenario 1 (10 GB/day in the Analytics tier): $1,520.40 per month
- Scenario 2 (10 GB/day directly into the Data Lake tier): $202.20 per month without compute; $257.20 per month with a sample compute price

Savings with no compute activity: $1,520.40 - $202.20 = $1,318.20 per month
Savings with some compute activity (sample value): $1,520.40 - $257.20 = $1,263.20 per month

Azure calculator equivalent without compute

Azure calculator equivalent with sample compute

Conclusion

The combination of the Analytics tier and the Data Lake tier in Microsoft Sentinel enables organizations to optimize cost based on how their security data is used.
High-value logs that require frequent querying, real-time analytics, and investigation can be stored in the Analytics tier, which provides powerful search performance and built-in detection capabilities. At the same time, large-volume or infrequently accessed logs (such as audit, compliance, or long-term retention data) can be directed to the Data Lake tier, which offers dramatically lower storage and ingestion costs. Because all Analytics tier data is automatically mirrored to the Data Lake tier at no extra cost, customers can use the Analytics tier only for the period they actively query data and rely on the Data Lake tier for the remaining retention. This tiered model allows different scenarios (active investigation, archival storage, compliance retention, or large-scale telemetry ingestion) to be handled at the most cost-effective layer, ultimately delivering substantial savings without sacrificing visibility, retention, or future analytical capabilities.

Microsoft Defender XDR / Defender for Endpoint data connectors inconsistent failures
Hello,

We are deploying our SOC (Sentinel) environments via Bicep. The Defender XDR (MicrosoftThreatProtection) and Defender for Endpoint (MicrosoftDefenderAdvancedThreatProtection) data connectors are failing to deploy inconsistently. It seems to be a known issue, based on the following posts:

- https://github.com/Azure/SimuLand/issues/23
- https://techcommunity.microsoft.com/t5/microsoft-sentinel/quot-missing-consent-invalid-license-quot-defender-for-endpoint/m-p/3027212
- https://github.com/Azure/Azure-Sentinel/issues/5007

Beyond this issue, I see almost no development on the data connectors API. Is there any news on how to enable data connectors through automation in the future, since they seem to be moving to the Content Hub? It is hard to find any docs on how to deploy this, for example via Bicep.

I also have a question regarding the 'Tenant-based Microsoft Defender for Cloud (Preview)' data connector. We currently deploy it via the GenericUI data connector kind, but that offers no option to enable it through automation. As with the previous paragraph, how would this be made possible?

Microsoft 365 Defender data connector and error ('AdvancedHunting-CloudAppEvents are not supported')
Hello,

I have a client who has set up the Microsoft 365 Defender data connector. On selecting 'connect events' for Microsoft Defender for Cloud Apps and saving the configuration, the following error is generated: 'AdvancedHunting-CloudAppEvents are not supported'. I have not yet checked the configuration in the Microsoft 365 Defender portal under Cloud Apps, but has anyone come across this error, and is it likely to be related to a configuration issue?

Run Query and List Results operation
I am using the Run Query and List Results operation within Logic Apps to get an incident name. The issue is that it seems to duplicate the results in the list, i.e. the incident name appears twice. Is there a setting I'm missing, or is there a concise way to strip the second value away?

Ticket Sync between Sentinel and Defender for Cloud Apps
Hello,

I have Defender for Cloud Apps syncing with Sentinel to open incidents, but when I close incidents in Sentinel it doesn't close the corresponding Defender for Cloud Apps alerts. Is there a Microsoft solution for this? I've already checked the official Microsoft links and so far haven't found anything related to what I want.

Best regards

Feature Request: Entity Annotation
So I was investigating an incident where a user had signed in from a TOR exit node on an AAD-joined device. After investigating, I found that they had a commercial VPN, and its endpoints also served as exit nodes. So they weren't actually using TOR, but their traffic was coming from an exit node. The device is part of a group with more lax controls, so this is absolutely allowed (I can't really explain more; I would love to go to town with this stuff and remove it, but that isn't my call). So I was in a situation where I couldn't tune, because I would need Defender device logs to see if it's the VPN (too high ingestion), and I can't just allow the IPs, as they are TOR exit nodes.

Which gave me the idea of having annotations on the entities in UEBA. In this case, I could say "known to use a VPN whose endpoints also act as TOR exit nodes; check source IP" or something similar. It saves having to create a separate knowledge base and keep it up to date with data from all security products.

It would also be useful for users. I have a user who frequently mass-deletes files at a certain time on a certain day, which triggers DLP rules. I could add the conditions of that behaviour as an annotation, rather than having to write a crazy analytics rule which has to check the day and time, user, and SharePoint site, plus other exclusions. Something like the comments thread on incidents would suffice.

Defender Sentinel Sync
The status of an incident in Sentinel does not sync with Microsoft 365 Defender (alert product name: Microsoft Cloud App Security) when the incident is closed. Has anyone else encountered this issue? I expected Microsoft 365 Defender and Sentinel to sync incidents bi-directionally on status, owner, and closing reason.

Thanks

Azure Sentinel Side by Side with QRadar
Hi, quick question: in the Event Filter on QRadar we add vendorInformation/provider eq 'Azure Sentinel' to get Sentinel events, but is it possible to include other Azure sources such as Cloud App, Identity, etc.? I mean, something like: provider eq 'Azure Sentinel, MCAS, IPS'.

Thank you

False positive alert of defense evasion behavior was blocked on one endpoint
I am receiving a lot of alerts from Defender saying defense evasion was blocked on one endpoint. Normally, when outlook.exe interacts with a .JPG file, followed by rundll32.exe loading PhotoViewer.dll, it triggers this alert. Has anyone experienced something similar?
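If this process chain turns out to be benign in a given environment, one option while waiting on tuning is to triage the alerts programmatically before escalating them. Below is a hypothetical Python sketch of matching the pattern described above; the field names and the alert dictionary are illustrative only, not an actual Defender schema or API.

```python
# Hypothetical triage helper: flag alerts matching the pattern described
# above (outlook.exe opens a .jpg, then rundll32.exe loads PhotoViewer.dll).
# Field names are illustrative, not a real Defender alert schema.
def is_known_photoviewer_pattern(alert: dict) -> bool:
    return (
        alert.get("initiating_process", "").lower() == "outlook.exe"
        and alert.get("file_name", "").lower().endswith(".jpg")
        and alert.get("process", "").lower() == "rundll32.exe"
        and alert.get("loaded_module", "").lower() == "photoviewer.dll"
    )

# Sample alert record (illustrative values).
alert = {
    "initiating_process": "OUTLOOK.EXE",
    "file_name": "team-photo.JPG",
    "process": "rundll32.exe",
    "loaded_module": "PhotoViewer.dll",
}
print(is_known_photoviewer_pattern(alert))  # matches the benign pattern
```

A matcher like this only lowers triage effort; any real suppression decision should still be confirmed against the device logs, since rundll32.exe abuse is a genuine defense evasion technique.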