Microsoft Defender Threat Intelligence
Advanced threat hunting and multi-layered defense within the Microsoft Cloud ecosystem!
Dear Microsoft Cloud Friends, I think we can all agree that cloud services are indispensable today. Whether they come from Microsoft, Amazon AWS, Google Cloud, or elsewhere, the integration of such services is widespread. As great and supportive as these cloud functionalities are, they also bring a big challenge: SECURITY! And this is exactly where the challenge starts. Where do protective measures need to be taken? Where is the first place to start? Where is it most important to set up protective mechanisms? Honestly, there is no standard solution that can be applied everywhere. It is extremely situational and depends on the cloud services used. This article is about giving you a jump start. The following information and measures are neither exhaustive nor complete, but they are intended to support you so that you can continue to develop.

Identities are an incredibly important element in cloud services (I'm not just talking about user accounts, but also managed identities, service principals, app registrations, etc.). This is exactly why we start with this first topic. What can support us here? The MITRE ATT&CK framework, for example, and of course the Microsoft documentation.

Identities: Cloud Matrix https://attack.mitre.org/matrices/enterprise/cloud/
Valid Accounts: Cloud Accounts https://attack.mitre.org/techniques/T1078/004/
Modify Authentication Process: Hybrid Identity https://attack.mitre.org/techniques/T1556/007/
Account Manipulation: Additional Cloud Credentials https://attack.mitre.org/techniques/T1098/001/
Azure AD Matrix https://attack.mitre.org/matrices/enterprise/cloud/azuread/

The tactics above represent the "why" of an ATT&CK technique or sub-technique. The following articles describe several best practices on how to protect identities.
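One of the techniques listed above, Account Manipulation: Additional Cloud Credentials (T1098.001), can be hunted directly in the Azure AD audit logs. Here is a minimal KQL sketch that looks for new credentials being added to applications or service principals; the exact operation names can vary between tenants and schema versions, so treat them as assumptions to verify against your own AuditLogs table:

```kusto
// Hunt for new credentials added to applications or service principals (T1098.001)
AuditLogs
| where OperationName has_any ("Add service principal credentials", "Certificates and secrets management")
| where Result == "success"
| extend Actor = tostring(InitiatedBy.user.userPrincipalName)
| project TimeGenerated, OperationName, Actor, TargetResources
| sort by TimeGenerated desc
```

A sudden credential addition by an unexpected actor is a classic persistence signal and worth a closer look.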
Azure Identity Management and access control security best practices https://learn.microsoft.com/en-us/azure/security/fundamentals/identity-management-best-practices
Best practices for Azure AD roles https://learn.microsoft.com/en-us/azure/active-directory/roles/best-practices
Microsoft identity platform best practices and recommendations https://learn.microsoft.com/en-us/azure/active-directory/develop/identity-platform-integration-checklist
Best practices for all isolation architectures https://learn.microsoft.com/en-us/azure/active-directory/fundamentals/secure-with-azure-ad-best-practices
Securing identity with Zero Trust https://learn.microsoft.com/en-us/security/zero-trust/deploy/identity

All of these safeguards are great, but logging should definitely not be forgotten. For example, if you have log collection set up for Azure Active Directory, you can use KQL (Kusto Query Language) to examine the logs. With the following example, you can investigate why a person was given the Global Administrator role.

AuditLogs
| where Category == "RoleManagement"
| where Result == "success"
| where OperationName == "Add member to role"
| where TargetResources has "Company" or TargetResources has "Tenant" or TargetResources has "Global"
| project TargetUser = tostring(TargetResources[0].userPrincipalName)

With Microsoft Defender for Identity there is a cloud service to monitor Active Directory. Attacks like those shown in the following pictures can be detected this way (sorry, the screenshots are in German).

Email and data information storage: Working with email services and storing data and information in the various cloud environments has become indispensable today. It is self-explanatory that this situation offers a large attack surface.
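Attackers who compromise a mailbox often add a hidden auto-forwarding rule to exfiltrate mail (Email Collection: Email Forwarding Rule, T1114.003). If the Office 365 connector feeds your Log Analytics workspace, a sketch like the following can surface newly created forwarding rules; the OfficeActivity table and parameter names are assumptions based on the standard connector schema, so verify them in your workspace:

```kusto
// Hunt for inbox rules that forward or redirect mail (T1114.003)
OfficeActivity
| where Operation in ("New-InboxRule", "Set-InboxRule")
| where Parameters has "ForwardTo" or Parameters has "RedirectTo" or Parameters has "ForwardAsAttachmentTo"
| project TimeGenerated, UserId, ClientIP, Operation, Parameters
```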
Email Collection https://attack.mitre.org/techniques/T1114/
Compromise Accounts: Email Accounts https://attack.mitre.org/techniques/T1586/002/
Phishing https://attack.mitre.org/techniques/T1566/
Establish Accounts: Email Accounts https://attack.mitre.org/techniques/T1585/002/
Email Collection: Email Forwarding Rule https://attack.mitre.org/techniques/T1114/003/
Email Collection: Remote Email Collection https://attack.mitre.org/techniques/T1114/002/
Email Collection: Local Email Collection https://attack.mitre.org/techniques/T1114/001/
Data from Information Repositories: Sharepoint https://attack.mitre.org/techniques/T1213/002/
Office 365 Matrix https://attack.mitre.org/matrices/enterprise/cloud/office365/
Data from Information Repositories https://attack.mitre.org/techniques/T1213/

Let's take a look together at what Microsoft has to offer in terms of security features on these topics.

Secure your data with Microsoft 365 for business https://learn.microsoft.com/en-us/microsoft-365/business-premium/secure-your-business-data
Policy recommendations for securing email https://learn.microsoft.com/en-us/microsoft-365/security/office-365-security/secure-email-recommended-policies
Recommended settings for EOP and Microsoft Defender for Office 365 security https://learn.microsoft.com/en-us/microsoft-365/security/office-365-security/recommended-settings-for-eop-and-office365
Managing SharePoint Online Security: A Team Effort https://learn.microsoft.com/en-us/microsoft-365/community/sharepoint-security-a-team-effort
Policy recommendations for securing SharePoint sites and files https://learn.microsoft.com/en-us/microsoft-365/security/office-365-security/sharepoint-file-access-policies
What is Microsoft 365 Defender?
https://learn.microsoft.com/en-us/microsoft-365/security/defender/microsoft-365-defender
Check last login to a mailbox https://github.com/tomwechsler/Threat_Hunting_with_PowerShell/blob/main/Hunting_Exchange_Online/Exchange_Mailbox_LastLogin.ps1
Check a SharePoint Online library for specific file extensions https://github.com/tomwechsler/Threat_Hunting_with_PowerShell/blob/main/Hunting_SharePoint_Online/SharePoint_Online_specific_files.ps1

Virtual machines: Virtual machines are not only used in on-premises infrastructures, but also in the cloud. In many cases, there are even hybrid infrastructures.

Hide Artifacts: Run Virtual Instance https://attack.mitre.org/techniques/T1564/006/
Virtualization/Sandbox Evasion https://attack.mitre.org/techniques/T1497/
Virtualization/Sandbox Evasion: System Checks https://attack.mitre.org/techniques/T1497/001/
Compromise Infrastructure: Virtual Private Server https://attack.mitre.org/techniques/T1584/003/
Modify Cloud Compute Infrastructure: Create Cloud Instance https://attack.mitre.org/techniques/T1578/002/
Modify Cloud Compute Infrastructure: Delete Cloud Instance https://attack.mitre.org/techniques/T1578/003/
Acquire Infrastructure: Virtual Private Server https://attack.mitre.org/techniques/T1583/003/
Instance https://attack.mitre.org/datasources/DS0030/
Cloud Administration Command https://attack.mitre.org/techniques/T1651/

In such an infrastructure (IaaS, Infrastructure-as-a-Service) there is an incredible number of different threats, and making them all visible is a real challenge.
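Two of the techniques above, Create Cloud Instance (T1578.002) and Delete Cloud Instance (T1578.003), leave traces in the Azure Activity log. A minimal KQL sketch to summarize who created or deleted virtual machines (the operation names and the legacy AzureActivity columns are assumptions; newer workspaces expose OperationNameValue and ActivityStatusValue instead):

```kusto
// Hunt for VM creation and deletion in the Azure Activity log (T1578.002 / T1578.003)
AzureActivity
| where OperationName in ("Create or Update Virtual Machine", "Delete Virtual Machine")
| where ActivityStatus == "Succeeded"
| summarize Operations = count(), FirstSeen = min(TimeGenerated), LastSeen = max(TimeGenerated)
    by Caller, CallerIpAddress, OperationName
| order by Operations desc
```

An unfamiliar caller spinning up or tearing down instances outside change windows is a good pivot point for a deeper investigation.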
Azure Virtual Desktop security best practices https://learn.microsoft.com/en-us/azure/virtual-desktop/security-guide
Security best practices for IaaS workloads in Azure https://learn.microsoft.com/en-us/azure/security/fundamentals/iaas
Best practices for defending Azure Virtual Machines https://www.microsoft.com/en-us/security/blog/2020/10/07/best-practices-for-defending-azure-virtual-machines/
Security recommendations for virtual machines in Azure https://learn.microsoft.com/en-us/azure/virtual-machines/security-recommendations
Azure Virtual Machines security overview https://learn.microsoft.com/en-us/azure/security/fundamentals/virtual-machines-overview
Security considerations for SQL Server on Azure Virtual Machines https://learn.microsoft.com/en-us/azure/azure-sql/virtual-machines/windows/security-considerations-best-practices
Azure best practices for network security https://learn.microsoft.com/en-us/azure/security/fundamentals/network-best-practices
Azure security baseline for Windows Virtual Machines https://learn.microsoft.com/en-us/security/benchmark/azure/baselines/virtual-machines-windows-security-baseline
Plan your Defender for Servers deployment https://learn.microsoft.com/en-us/azure/defender-for-cloud/plan-defender-for-servers

With Microsoft Defender for Servers, you can not only monitor systems in Azure; you also get support for Amazon AWS and Google Cloud.

Networking: Systems, but also cloud services, want to communicate. Networks are therefore at high risk and require special attention.

Network Service Discovery https://attack.mitre.org/techniques/T1046/
Network Segmentation https://attack.mitre.org/mitigations/M0930/
Network Sniffing https://attack.mitre.org/techniques/T1040/
Network Traffic https://attack.mitre.org/datasources/DS0029/
Network Allowlists https://attack.mitre.org/mitigations/M0807/
Data from Network Shared Drive https://attack.mitre.org/techniques/T1039/

Let's look together at how we can make our networks more secure.
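On the detection side, Network Service Discovery (T1046) often shows up as a single host failing connections against many different ports. If you have Microsoft Defender for Endpoint data available in advanced hunting, a sketch like this can highlight possible port scans (the ConnectionFailed action type comes from the DeviceNetworkEvents schema; the threshold of 100 distinct ports is an arbitrary assumption to tune for your environment):

```kusto
// Possible network service discovery (T1046): one device failing connections to many ports
DeviceNetworkEvents
| where Timestamp > ago(1d)
| where ActionType == "ConnectionFailed"
| summarize DistinctPorts = dcount(RemotePort), DistinctTargets = dcount(RemoteIP)
    by DeviceId, DeviceName
| where DistinctPorts > 100
| order by DistinctPorts desc
```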
Azure best practices for network security https://learn.microsoft.com/en-us/azure/security/fundamentals/network-best-practices
Azure Virtual Network concepts and best practices https://learn.microsoft.com/en-us/azure/virtual-network/concepts-and-best-practices
Azure security baseline for Virtual Network https://learn.microsoft.com/en-us/security/benchmark/azure/baselines/virtual-network-security-baseline
Azure security best practices and patterns https://learn.microsoft.com/en-us/azure/security/fundamentals/best-practices-and-patterns
Network security https://learn.microsoft.com/en-us/azure/well-architected/security/design-network
Azure network security overview https://learn.microsoft.com/en-us/azure/security/fundamentals/network-overview
Best practices to set up networking for workloads migrated to Azure https://learn.microsoft.com/en-us/azure/cloud-adoption-framework/migrate/azure-best-practices/migrate-best-practices-networking
Security best practices for IaaS workloads in Azure https://learn.microsoft.com/en-us/azure/security/fundamentals/iaas
Azure security best practices https://learn.microsoft.com/en-us/azure/cloud-adoption-framework/secure/security-top-10
What is Zero Trust? https://learn.microsoft.com/en-us/security/zero-trust/zero-trust-overview

Advanced Hunting: These were a few examples, and they make directly visible how enormous the whole environment is. Installing protection mechanisms, setting up logging, monitoring the systems, and examining log data is one thing, but how do you "master" this flood of information? One tool that can help us do this is Microsoft Sentinel. Microsoft Sentinel is a cloud-native security information and event management (SIEM) and security orchestration, automation, and response (SOAR) solution that runs in the Azure cloud. Not only can cloud environments be monitored; local infrastructures can also be integrated.
In addition, other cloud providers and third-party products can be integrated; there are more than 100 connectors available. With Sentinel you get a tool with which you can search for threats in a targeted, fast, and efficient way.

Hunt for threats with Microsoft Sentinel https://learn.microsoft.com/en-us/azure/sentinel/hunting
Use Hunts to conduct end-to-end proactive threat hunting in Microsoft Sentinel https://learn.microsoft.com/en-us/azure/sentinel/hunts
Keep track of data during hunting with Microsoft Sentinel https://learn.microsoft.com/en-us/azure/sentinel/bookmarks

Once you have connected your sources to Sentinel, you can create queries with KQL where only the sky is the limit (if at all). Here are a few examples:

// Failed sign-in reasons
// The query lists the main reasons for sign-in failures.
SigninLogs
| where ResultType != 0
| summarize Count=count() by ResultDescription, ResultType
| sort by Count desc nulls last

// Failed MFA challenge
// Highlights sign-in failures caused by a failed MFA challenge.
SigninLogs
| where ResultType == 50074
| project UserDisplayName, Identity, UserPrincipalName, ResultDescription, AppDisplayName, AppId, ResourceDisplayName
| summarize FailureCount=count(), FailedResources=dcount(ResourceDisplayName), ResultDescription=any(ResultDescription) by UserDisplayName

// All SigninLogs events
// All Azure sign-in events.
SigninLogs
| project UserDisplayName, Identity, UserPrincipalName, AppDisplayName, AppId, ResourceDisplayName

// Successful key enumeration
// Lists users who performed key enumeration, and their location.
AzureActivity
| where OperationName == "List Storage Account Keys"
| where ActivityStatus == "Succeeded"
| project TimeGenerated, Caller, CallerIpAddress, OperationName

// PowerShell executions per computer and user over the last two days
let lookback = 2d;
SecurityEvent
| where TimeGenerated >= ago(lookback)
| where EventID == 4688 and Process =~ "powershell.exe"
| extend PwshParam = trim(@"[^/\\]*powershell(.exe)+", CommandLine)
| project TimeGenerated, Computer, SubjectUserName, PwshParam
| summarize min(TimeGenerated), count() by Computer, SubjectUserName, PwshParam
| order by count_ desc nulls last

// Devices with more than five antivirus detections in the last day
DeviceEvents
| where ingestion_time() > ago(1d)
| where ActionType == "AntivirusDetection"
| summarize (Timestamp, ReportId)=arg_max(Timestamp, ReportId), count() by DeviceId
| where count_ > 5

I hope that this information is helpful to you and that you have received a good "little" foundation. This is certainly not an exhaustive list, but I still hope it helps you. Thank you for taking the time to read the article.

Happy Hunting, Tom Wechsler

P.S. All scripts (#PowerShell, Azure CLI, #Terraform, #ARM) that I use can be found on GitHub! https://github.com/tomwechsler

New blog post | Seeking Dead and Dying Servers with the MDEASM APIs
This post follows the Seeking Dead and Dying Servers blog and introduces the Defender EASM REST APIs (https://learn.microsoft.com/en-us/rest/api/defenderforeasm/). You should start with the previous post if you haven't already done so or are brand new to Defender EASM. The Defender EASM APIs provide much more capability than the UI (user interface) alone, enabling users to work with large numbers of assets in one action or piece of code. The advantage of APIs is that they provide an unencumbered interface between the application and the code or app interacting with it, enabling exciting capabilities. However, leveraging an API usually involves significant coding work, even for experienced users. Luckily, I've written sample Jupyter Notebooks in Python and PowerShell you can download and use regardless of your experience level. Seeking Dead and Dying Servers with the MDEASM APIs - Microsoft Community Hub

Microsoft Defender Threat Intelligence and Sentinel integration deep dive
See how quick detection and response are vital to navigating today's fast-moving cyberattacks. We'll break down a cyberattack and show how Microsoft Defender Threat Intelligence, combined with Microsoft's SIEM and XDR solutions, constructs a multi-stage incident giving visibility into the attack timeline and all related events. We'll then investigate the attacker and automate mitigations to contain the damage. This session is part of the Microsoft Secure Tech Accelerator. RSVP for event reminders, add it to your calendar, and post your questions and comments below! This session will also be recorded and available on demand shortly after the conclusion of the live event.

New Blog Post | The New Microsoft Security Customer Connection Program (CCP)
Read the full blog post: The New Microsoft Security Customer Connection Program (CCP) - Microsoft Community Hub

The security community is constantly growing, changing, and learning from each other in order to better position the world against cybersecurity threats. For years, Microsoft has driven a customer-obsessed development process by hosting two private communities for end users of Microsoft security products: the Microsoft Cloud Security Private Community and the Microsoft 365 Defender Customer Connection Program. Under a strict confidentiality framework, our engineering teams get direct community feedback and insights for our roadmap plans, new user experience designs, private preview features, and more. Today, we are happy to announce that these two communities have now come together under one team – the Microsoft Security Customer Connection Program.

For enterprise APIs, is Zero-Copy Integration the David to big data's Goliath?
In Rodgers and Hammerstein's "The King and I," the King explains to "I" that the bee always flies from flower to flower; the flower never flies from bee to bee. That justification for philandering didn't fly with Mrs. Anna, but it does make sense when applied to the relationship between applications and data: Should data fly from application to application, or should the data stay put like a flower and let applications approach it on its terms? A new framework, formulated as an open standard that has just received the imprimatur of the Canadian government, is keeping data firmly rooted.

What is Zero-Copy Integration? Zero-Copy Integration is an initiative championed by the Canadian collaborative data company Cinchy. It aims to overturn the enterprise software API (https://technologyadvice.com/blog/information-technology/how-to-use-an-api/) paradigm with a totally new model, which the company calls dataware, that keeps data effectively rooted while removing complexity and data redundancy from the enterprise software integration process.

Benefits of Zero-Copy Integration Proponents of zero-copy integration and dataware say the framework will lower data storage costs, improve the performance of IT teams, improve the privacy and security of data, and drive innovation in systems for public health, social research, open banking and sustainability through innovations in: Application development and enrichment. Predictive analytics. Digital twins. Customer 360 technology. Artificial intelligence and machine learning. Workflow automation. Legacy system modernization.

On Tuesday, Canada's Digital Governance Council and the not-for-profit Data Collaboration Alliance, created by Cinchy, announced CAN/CIOSC 100-9, Data governance – Part 9: Zero-Copy Integration, a national standard approved by the Standards Council of Canada, to be published as an open standard.
Zero-Copy Integration seeks to eliminate API-driven data silos. The basic idea, according to Dan DeMers, Cinchy's CEO, is that the framework aims to remove application data silos by using access-based data collaboration instead of standard API-based data integration, which involves copying data and branding it with complex app-specific coding. This would be done by access controls set in the data layer. It would also involve: Data governance via data products and data stewardship (https://www.techrepublic.com/article/data-stewardship-vs-data-governance/), not centralized teams. Prioritization of "data-centricity" and active metadata over complex code. Prioritization of solution modularity over monolithic design.

The initiative said viable projects for Zero-Copy Integration include the development of new applications, predictive analytics, digital twins (https://www.techrepublic.com/article/digital-twins-are-moving-into-the-mainstream/), customer 360 views, AI/ML operationalization and workflow automations, as well as legacy system modernization and SaaS application enrichment.

DeMers, who is also a technical committee member for the standard, promises a revolution in data. "At some point in a world of increasing complexity, you fall off a cliff, so we believe we're at the beginning of the simplification revolution," he said. "The fact is that data is becoming increasingly central, and the way that we share it is with APIs and ETL (https://www.techrepublic.com/article/what-is-etl/), which involves creating copies and vastly increases complexity and cost. It amounts to half the IT capacity of every complex organization on the planet, and every year it gets more expensive."

He said even more concerning is that every time a copy is generated, a degree of control is lost. "If I run a bank, and I have a thousand applications, and they all need to interact with some representation of my customer, and by doing that are copying that representation, I now have a thousand copies of that customer," DeMers said.
"How do I protect that?"

Security through Zero-Copy frameworks Laws describing ownership of data limit how organizations or governments can use that data, but they are laws, not systematic controls, noted DeMers. A key point of the Zero-Copy Integration argument, and Canada's adoption of a framework in principle, is that it makes data security easier by limiting access and control. "Zero Copy is a paradigm shift because it allows you to embed controls in the data itself," DeMers said. "Because it's access based, not copy based, access can be granted and it can be revoked, whereas copies are forever and you can quickly lose control over who has them, and any attempt to limit what organizations do when they obtain a copy is hard."

Cinchy is aiming for a "data fabric architecture" to transform data warehouses, lakes and/or lakehouses (https://www.techrepublic.com/article/top-5-things-to-know-about-data-lakehouses/) into repositories that can actualize both analytics and operational software. This is so apps can come to it, not carry copies of data back to the application walled garden. DeMers argued that the creation and storage of copies costs money, both because of storage and data pipelines and the time IT has to spend managing the iterations of data generated by hundreds or thousands of apps an enterprise may host. "Copies of data require storage; the creation of the copy and synchronizing it not only uses storage, but also uses computation," he said. "If you imagine most of the processes running on servers in the bank right now, they're moving and reconciling copies of data, which constitutes energy use."

He added that copying and moving data creates opportunities to introduce errors. If two systems connected by a data pipeline desync, then data can be lost or corrupted, reducing data quality. With one copy of the data used collectively by all systems, there's no chance of records appearing differently in different contexts.

Is Zero-Copy Integration an L.A. subway dream?
Matt McLarty, chief technology officer of Salesforce's MuleSoft, agrees that data replication is a perennial issue. "Not even data replication, but the existence of semantically equivalent data in different places," he said. He sees it as a bit like Los Angeles and subways: a great idea in principle, but nobody is going to tear Los Angeles down and rebuild it around mass transit. "It's both a huge issue but also an unavoidable reality," he said. "From a problem statement, yes, but I would say there are multiple categories of software in the space, including Salesforce Genie, all about how you harness all of the customer data widely dispersed across the ecosystem."

Operational elephants and analytical zebras drinking from the same data lake Most enterprises, explained McLarty, have two massive areas of data that, while not at cross purposes, need to live separately: operational data and analytical data (https://www.techrepublic.com/article/data-analytics-growth-in-down-market/). Operational data is employed by user-facing applications such as mobile banking; analytical data takes data out of the flow of operational activities and uses it for business analytics and intelligence. "They have historically lived separately because of the processing differences," he said. "Operationally, there's high-speed, high-scale processing and analytically, small internal groups crunching big numbers."

DeMers explained that what dataware does, among other things, is incorporate an "operational data fabric." This, he said, shifts "last time" integration from external data sources to an architecture based on a "network of datasets" that's capable of powering unlimited business models. "Once created, these models can be readily operationalized as metadata-based experiences or exposed as APIs to power low-code and pro-code UX designs," he said, adding that it eliminates the need to stand up new databases, perform point-to-point data integration or set app-specific data protections.
"Another core concept associated with dataware technology is 'collaborative intelligence,' which is created as a result of users and connected systems simultaneously enriching the information within the dataset network," he said. DeMers said users granted access to a dataset by its owners get an interface called a "data browser" offering a "self-serve experience." "In principle, this works a bit like Google Docs, where multiple colleagues collaborate on a white paper or business proposal while the software automatically offers grammatical suggestions and manages roles, permissions, versioning and backup," he said. DeMers added that the end result is super-enriched and auto-protected data that can be instantly queried by teams to power unlimited dashboards, 360 views and other analytics projects.

Will companies simplify or "embrace the chaos?" By some estimates, companies are taking the "embrace the chaos" route to find new approaches that concede that enterprise data frameworks will remain complex and L.A.-like. These include data mesh (https://www.eweek.com/enterprise-apps/data-mesh/) frameworks and automation and machine learning systems creating models that integrate different kinds of data. "I think the biggest shift right now in the world of data is that the two worlds, analytical and operational, are colliding," McLarty said. "What's happening now, because of the big data movement and machine learning, is data-derived coding: writing code with data, ingesting data and producing machine learning models based on the data that I can put into my applications."

DeMers said that the dataware paradigm enables data mesh concepts. "Requiring a single team to manage every dataset in the organization is a sure path to failed data governance (https://www.techrepublic.com/article/data-governance-framework/)," he said. He also argued that in a data-centric organization, data stewards should reflect the granularity of your organization chart.
"This approach to federated data governance organized around data domains and data products is the data mesh, and it's a big part of establishing a more agile enterprise," DeMers said. Data silos make this difficult because of the unrestricted point-to-point data integration that they involve.

Liberating data from the application Sylvie Veilleux, former chief information officer of Dropbox, said data silos are a fundamental part of the ecosystem, but that is a problem dataware can solve. "Every app solves a specific and unique purpose, and they are tending toward more and more specialization," she said. "The more SaaS adoption continues, which is very healthy in terms of how the business gets access to tools, the more it's continuously creating a hundred, a thousand or more data silos in larger corporations. This number will continue to grow without us taking a whole new approach to how we think about data applications."

She said dataware and Zero-Copy Integration allow enterprises to eliminate extra data integrations by having the app connect to a network data source. "It changes how we work by pivoting the process from data being the captive of an application to keeping it on a network, thereby letting users collaborate, and giving businesses real-time access to it," Veilleux said. With data repositories moving to the cloud, a boon to collaboration, companies have more flexibility and reduced costs, but at what cost to security? There are guidelines that will help you achieve secure cloud data management for the integrity and privacy of company-owned information.

CIS Benchmark 1.4 "Ensure that activity log alert exists for the Delete Network Security Group Rule"
Hi, we are trying to remediate this policy recommendation as per CIS Benchmark 1.4.0, but cannot see an option to configure this rule. Can anyone please advise where to look for this option to remediate this recommendation? Many thanks

Blocking someone from trying to log in
I was involved in a romance scam. I have my computer set up so that you need to approve login through my cell phone. Now I have reason to believe that he is unsuccessfully trying to log in to my computer from different ISPs all over the world. He tries several times a day, almost every day. I have enabled all the security measures that I can. Is there any way I can block him without the hassle of changing my email everywhere?

New Blog | Introducing Automatic File and URL (Detonation) Analysis
The Microsoft Defender Threat Intelligence (MDTI) team continuously adds new threat intelligence capabilities to MDTI and Defender XDR, giving customers new ways to hunt, research, and contextualize threats. Read up on a new feature that enhances our file and URL analysis (detonation) capabilities in the threat intelligence blade within the Defender XDR user interface. If MDTI cannot return any results when a customer searches for a file or URL, MDTI now automatically detonates it to improve search coverage and add to our corpus of knowledge of the global threat landscape. See the blog post here: Introducing Automatic File and URL (Detonation) Analysis - Microsoft Community Hub

New Blog | 10 essential insights from the Microsoft Digital Defense Report 2023
By Quy Nguyen, published Jan 08 2024 09:19 AM. In an era marked by escalating cyber threats, Microsoft sheds light on the global security landscape through the Microsoft Digital Defense Report 2023. Harnessing extensive security research and a unique vantage point, Microsoft not only comprehends the current state of cybersecurity but also utilizes a diverse range of security data to predict and identify indicators of cyber threats. Read the full blog post here: Microsoft Digital Defense Report 2023