Security Operations
Introducing a Unified Security Operations Platform with Microsoft Sentinel and Defender XDR
Read about our announcement of an exciting private preview that represents the next step in the SOC protection and efficiency journey by bringing together the power of Microsoft Sentinel, Microsoft Defender XDR, and Microsoft Security Copilot into a unified security operations platform.
Level Up Your Security Skills with the New Microsoft Sentinel Ninja Training!

If you've explored our Microsoft Sentinel Ninja Training in the past, it's time to revisit! Our training program has undergone some exciting changes to keep you ahead of the curve in the ever-evolving cybersecurity landscape. Microsoft Sentinel is a cutting-edge, cloud-native SIEM and SOAR solution designed to help security professionals protect their organizations from today's complex threats. Our Ninja Training program is here to guide you through every aspect of this powerful tool.

So, what's new? In addition to the structured security-roles format, the Ninja Training now offers a more interactive experience with updated modules, hands-on labs, and real-world scenarios. Whether you're focusing on threat detection, incident response, or automation, the training ensures you gain the practical skills needed to optimize your security operations.

One of the biggest updates is the integration of Sentinel into the Defender XDR portal, creating a unified security platform. This merger simplifies workflows, speeds up incident response, and minimizes tool-switching, allowing for seamless operations. Other highlights include:

- Step-by-step guidance through the official Microsoft Sentinel documentation.
- Exclusive webinars and up-to-date blog posts from Microsoft experts.

If you're ready to take your Sentinel skills to the next level or want to revisit the program's new features, head over to the blog now and dive into the refreshed Microsoft Sentinel Ninja Training! Don't miss out—your next cybersecurity breakthrough is just a click away!
Use Azure DevOps to manage Sentinel for MSSPs and Multi-tenant Environments

Automate Sentinel resource deployment in multi-tenant scenarios using Azure DevOps and Sentinel Repositories. Enable version control, collaboration, and streamlined updates for consistent and secure configurations.
Ingesting Non-Microsoft Cloud Security Data into Microsoft Sentinel for Government & DIB Customers

For Microsoft Azure Government customers using Microsoft Sentinel to monitor their Azure and Office 365 tenants, the next logical step in modernizing their SOC operations is to ingest security data from non-Microsoft clouds such as Google Cloud Platform (GCP) and Amazon Web Services (AWS) to get a holistic picture of security alerting, monitoring, and response across all cloud environments. This blog series reviews how to ingest security data from non-Microsoft clouds into an Azure Government Sentinel instance. Throughout the series, we'll be covering:

- FedRAMP cloud authorizations, AWS ingestion scenarios, and connector architecture.
- Ingesting AWS Commercial and GovCloud data into Azure Government Sentinel.
- Ingesting AWS Commercial and GovCloud data into Azure Commercial Sentinel.
- AWS ingestion using a CloudFormation template.
- Ingesting GCP data into Sentinel in either Azure Government or Commercial.
Introducing Threat Intelligence Ingestion Rules

Microsoft Sentinel just rolled out a powerful new public preview feature: Ingestion Rules. This feature lets you fine-tune your threat intelligence (TI) feeds before they are ingested into Microsoft Sentinel. You can now set custom conditions and actions on Indicators of Compromise (IoCs), threat actors, attack patterns, identities, and their relationships. Use cases include:

- Filtering out false positives: suppress IoCs from feeds known to generate frequent false positives, ensuring only relevant intel reaches your analysts.
- Extending IoC validity periods for feeds that need longer lifespans.
- Tagging TI objects to match your organization's terminology and workflows.

Get Started Today with Ingestion Rules

To create a new ingestion rule, navigate to "Intel Management" and click on "Ingestion rules". With this feature you have the power to modify or remove indicators before they are integrated into Sentinel; the rules act on indicators currently in the ingestion pipeline. Note: it can take up to 15 minutes for a rule to take effect.

Use Case #1: Delete IoCs with low confidence scores during ingestion

When ingesting IoCs from TAXII feeds, the upload API, or file upload, indicators are imported continuously. With pre-ingestion rules, you can filter out indicators that do not meet a certain confidence threshold. Specifically, you can set a rule to drop all indicators in the pipeline with a confidence score of 0, ensuring that only reliable data makes it through.

Use Case #2: Extend IoC expiration

A rule can be created to automatically extend the expiration date for all indicators in the pipeline where the confidence score is greater than 75. This ensures that these high-value indicators remain active and usable for a longer duration, enhancing the overall effectiveness of threat detection and response.

Use Case #3: Bulk Tagging

Bulk tagging is an efficient way to manage and categorize large volumes of indicators based on their confidence scores. With pre-ingestion rules, you can set up a rule to tag all indicators in the pipeline where the confidence score is greater than 75. This automated tagging process helps in organizing indicators, making it easier to search, filter, and analyze them based on their tags. It streamlines the workflow and improves the overall management of indicators within Sentinel.
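Once rules like these are active, a quick way to spot-check their effect is to query the indicators that actually reached the workspace. Here is a minimal KQL sketch, assuming the classic ThreatIntelligenceIndicator table (your workspace may expose threat intelligence under a different table name):

```
// List recently ingested indicators with the fields the rules above act on:
// confidence score, expiration date, and tags.
ThreatIntelligenceIndicator
| where TimeGenerated > ago(1d)
| summarize arg_max(TimeGenerated, *) by IndicatorId   // keep only the latest version of each indicator
| where Active == true
| project TimeGenerated, SourceSystem, ConfidenceScore, ExpirationDateTime, Tags, ThreatType
| order by ConfidenceScore asc
```

Indicators dropped by a rule like Use Case #1 should not appear at all, while indicators covered by rules like Use Cases #2 and #3 should show the extended expiration date and the expected tags.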
Managing Ingestion Rules

In addition to the specific use cases mentioned, managing ingestion rules gives you control over the entire ingestion process.

1. Reorder rules. You can reorder rules to prioritize certain actions over others, ensuring that the most critical rules are applied first. This flexibility allows for a tailored approach to data ingestion, optimizing the system's performance and accuracy.

2. Create from. Creating new ingestion rules from existing ones can save a significant amount of time and offers the flexibility to incorporate additional logic or remove unnecessary elements. Duplicating rules this way lets you quickly adapt to new requirements, streamline operations, and maintain a high level of efficiency in managing your data ingestion process.

3. Delete ingestion rules. Over time, certain rules may become obsolete or redundant as your organizational needs and security strategies evolve. Note that each workspace is limited to a maximum of 25 ingestion rules. Keeping a clean and relevant set of rules ensures that your data ingestion process remains streamlined and efficient, minimizing unnecessary processing and potential conflicts. Deleting outdated or unnecessary rules allows for a more focused approach to threat detection and response and reduces clutter, which can significantly enhance performance. By regularly reviewing and purging obsolete rules, you maintain a high level of operational efficiency and ensure that only the most critical and up-to-date rules are in place.

Conclusion

By leveraging these pre-ingestion rules effectively, you can enhance the quality and reliability of the IoCs ingested into Sentinel, leading to more accurate threat detection and an improved security posture for your organization.
Integrating Fluent Bit with Microsoft Sentinel

This guide will walk you through the steps required to integrate Fluent Bit with Microsoft Sentinel. Note that in this article we assume you already have a Sentinel workspace, a Data Collection Endpoint, a Data Collection Rule, a Microsoft Entra ID application, and a Fluent Bit installation. The Log Ingestion API supports ingestion both into custom tables and into built-in tables such as CommonSecurityLog, Syslog, WindowsEvent, and more. To check which tables are supported, please see the following article: https://learn.microsoft.com/en-us/azure/azure-monitor/logs/logs-ingestion-api-overview#supported-tables

Prerequisites

Before beginning the integration process, ensure you have the following:

- An active Azure subscription with Microsoft Sentinel enabled.
- A Microsoft Entra ID application, taking note of its Client ID, Tenant ID, and client secret. To create one, check this article: https://learn.microsoft.com/en-us/entra/identity-platform/quickstart-register-app?tabs=certificate
- A Data Collection Endpoint (DCE). To create a data collection endpoint, please check this article: https://learn.microsoft.com/en-us/azure/azure-monitor/essentials/data-collection-endpoint-overview?tabs=portal
- A Data Collection Rule (DCR). Fields in the Data Collection Rule need to match exactly the table columns and the fields coming from the log source. To create a DCR, please check this article: https://learn.microsoft.com/en-us/azure/azure-monitor/essentials/data-collection-rule-create-edit?tabs=cli Depending on the source, this may require creating a custom table or using an existing Log Analytics workspace table.
- Fluent Bit installed on your server or container. If you haven't installed Fluent Bit yet, the following article has instructions per operating system: https://docs.fluentbit.io/manual/installation/getting-started-with-fluent-bit

High-level architecture: Fluent Bit collects logs from its input plugins, and the azure_logs_ingestion output sends them through the Data Collection Endpoint and Data Collection Rule into the target table in the Sentinel workspace.

Step 1: Setting up the Fluent Bit configuration file

Before stepping into the configuration: Fluent Bit has numerous output plugins, and one of them sends data through the Log Analytics Ingestion API to both supported Sentinel tables and custom tables. You can find more information in the Fluent Bit documentation: https://docs.fluentbit.io/manual/pipeline/outputs/azure_logs_ingestion

Moving forward, to configure Fluent Bit to send logs into the Sentinel Log Analytics workspace, take note of the specific input plugin you are using (or intend to use) to receive logs and how its output will be routed to the Sentinel workspace. Most Fluent Bit input plugins let you set a "tag" key, which the output plugin can then match so that only the intended logs are sent. In a scenario where multiple input plugins are used and all of them must send logs to Sentinel, a wildcard match ("*") can be used instead. As another example, where there are multiple input plugins of type "http" and you want to send only a specific one to Sentinel, the "match" field must be set according to the position of that input plugin, for example "match http.2" if it is the third HTTP input in the list; if nothing is specified, "http.0" is assumed by default.

For a better understanding, here's an example of how a Fluent Bit config file could look. The configuration file is located at /etc/fluent-bit/fluent-bit.conf. The first part defines the input plugins, followed by the filter plugins (which you can use, for example, to rename source fields so they match the data collection rule schema and the Sentinel table columns), and finally the output plugins.
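The sketch below is a minimal example rather than the full configuration from the original article: it uses the dummy input and the azure_logs_ingestion output, and every tenant, application, DCE, DCR, and table value is a placeholder to replace with your own (parameter names follow the azure_logs_ingestion plugin documentation).

```
[SERVICE]
    Flush         1
    Log_Level     info

# Dummy input that emits a static JSON record, tagged so the output below can match it
[INPUT]
    Name          dummy
    Tag           dummy.log
    Dummy         {"Message":"this is a sample message for testing fluent bit integration to Sentinel","Activity":"fluent bit dummy input plugin","DeviceVendor":"Ubuntu"}

# Send everything tagged dummy.log to Sentinel through the Log Ingestion API
[OUTPUT]
    Name            azure_logs_ingestion
    Match           dummy.log
    tenant_id       <your-entra-tenant-id>
    client_id       <your-app-registration-client-id>
    client_secret   <your-app-registration-client-secret>
    dce_url         https://<your-dce>.<region>.ingest.monitor.azure.com
    dcr_id          dcr-<your-dcr-immutable-id>
    table_name      <YourCustomTable_CL or a built-in table such as Syslog>
    time_generated  true
```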
INPUT plugins section

In this example we're going to use the dummy input to send sample messages to Sentinel; in your scenario you could leverage other input plugins within the same config file. After everything is configured in the input section, complete the FILTER section if needed, and then move on to the output plugin section.

OUTPUT plugins section

The full configuration this article is based on contains output plugins that write to a local file based on two tags ("dummy.log" and "logger"), an output plugin that prints the output in JSON format, and the output plugin responsible for sending data to Microsoft Sentinel (the only one shown in the sketch above). That last plugin matches the "dummy.log" tag, for which we've set up the message {"Message":"this is a sample message for testing fluent bit integration to Sentinel", "Activity":"fluent bit dummy input plugin", "DeviceVendor":"Ubuntu"}. Make sure you insert the correct parameters in the output plugin, in this scenario the azure_logs_ingestion plugin.

Step 2: Fire Up Fluent Bit

When the file is ready to be tested, execute the following:

sudo /opt/fluent-bit/bin/fluent-bit -c /etc/fluent-bit/fluent-bit.conf

Fluent Bit will start by initializing all the plugins defined in the config file. Your access token should then be retrieved, provided everything is set up correctly in the output plugin (app registration details, data collection endpoint URL, data collection rule ID, Sentinel table, and, importantly, an output plugin name of exactly "azure_logs_ingestion"). Within a couple of minutes you should see the data in your Microsoft Sentinel table, either an existing table or a custom table created for this specific log source.
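To confirm the records arrived, you can run a quick query against the destination table. A minimal sketch, assuming a hypothetical custom table named FluentBit_CL whose DCR passes the dummy fields through unchanged (substitute the built-in or custom table your DCR actually targets):

```
// Confirm that Fluent Bit records reached the workspace in the last 15 minutes
FluentBit_CL
| where TimeGenerated > ago(15m)
| project TimeGenerated, Message, Activity, DeviceVendor
| order by TimeGenerated desc
```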
Summary

Integrating Fluent Bit with Microsoft Sentinel provides a powerful solution for log collection and analysis. By following this guide, we hope you can set up a seamless integration that enhances your organization's ability to monitor and respond to security threats; just carefully ensure that all fields processed in Fluent Bit map exactly to the fields in the Data Collection Rule and the Sentinel table within the Log Analytics workspace.

Special thanks to Bindiya Priyadarshini, who collaborated with me on this blog post. Cheers!

Improve SecOps collaboration with case management

Are you using a third-party case management system for the SecOps work you do in Microsoft Sentinel or Defender XDR? Do you struggle to find a solution that encompasses the specific needs of your security team? We are excited to announce a new case management solution, now in public preview. This is our first step towards providing a native, security-focused case management system that spans all SecOps workloads in the Defender portal, removing customer reliance on third-party SIEM/XDR and ticketing systems. This will be available for all Microsoft Sentinel customers that have onboarded to the unified SecOps platform.