data collection (31 Topics)

Revolutionizing log collection with Azure Monitor Agent
The much-awaited deprecation of the Microsoft Monitoring Agent (MMA) is finally here. While the agent is still sunsetting, this blog post reviews the advantages of AMA, the different deployment options, and important updates to your favorite Windows, Syslog, and CEF events via AMA data connectors.

Enhancing Security Monitoring: Integrating GitLab Cloud Edition with Microsoft Sentinel
Maximize your security operations by combining GitLab Cloud Edition with Microsoft Sentinel. This blog covers how to fill the gap left by the missing native GitLab connector in Sentinel: utilize GitLab's API endpoints, Azure Monitor Data Collection Rules and Data Collection Endpoints, as well as Azure Logic Apps and Key Vault, to simplify log collection and improve immediate threat identification. Our detailed guide will help you integrate smoothly and strengthen your security defences.

Comprehensive coverage and cost-savings with Microsoft Sentinel's new data tier
Microsoft is excited to announce the public preview of a new data tier, Auxiliary Logs, together with Summary Rules in Microsoft Sentinel, to further increase security coverage for high-volume data at an affordable price.

Exciting Announcements: New Data Connectors Released Using the Codeless Connector Framework
Microsoft Sentinel's Codeless Connector Framework, or "CCF" (formerly called the Codeless Connector Platform, CCP), represents a paradigm shift in data ingestion, making it easier than ever for organisations to do more with Microsoft Sentinel by integrating diverse data sources seamlessly. Designed to simplify and expedite the onboarding of data sources, CCF eliminates the need for extensive coding expertise and for maintaining additional services to facilitate ingestion, allowing security teams to focus on what truly matters: safeguarding their environment.

Advantages of the Codeless Connector Framework

The Codeless Connector Framework offers several compelling benefits:

- Ease of Use: CCF's configuration-based templates allow advanced users to create data connectors without writing exhausting code, making the onboarding process quicker and more accessible to a broader audience.
- Flexibility: Users can customise data streams to meet their specific needs, optimizing efficacy while ensuring more control over the data being ingested.
- Scalability: Connectors built using CCF follow a true SaaS auto-expansion model, making them highly scalable and natively reliable for large data volumes.
- Efficiency: By reducing the time and effort required to develop and deploy data connectors, CCF accelerates the availability of critical insights for security monitoring and more rapidly expands the value Microsoft Sentinel provides.

What are we up to?

We recognize that codeless connectors offer substantial advantages over Azure Function App-based ingestion in Microsoft Sentinel in most cases. That motivates us to continue investing in modernizing our ingestion patterns for out-of-box connectors, one connector at a time. Another goal of modernizing these connectors is to replace the deprecated HTTP Data Collector API with the Log Ingestion API to send data to Microsoft Sentinel. A CCF connector is defined declaratively rather than in code, as sketched below.
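To give a feel for what "codeless" means in practice, here is a heavily abridged, illustrative sketch of an API-polling CCF connector definition. The endpoint, names, and IDs are hypothetical, many required properties are omitted, and exact field names and casing should be taken from the CCF reference linked under References below:

```json
{
  "name": "ExampleVendorLogsConnector",
  "kind": "RestApiPoller",
  "properties": {
    "connectorDefinitionName": "ExampleVendorLogs",
    "auth": {
      "type": "APIKey",
      "ApiKeyName": "Authorization",
      "ApiKey": "{{apikey}}"
    },
    "request": {
      "apiEndpoint": "https://api.example-vendor.com/v1/audit-logs",
      "httpMethod": "GET",
      "queryWindowInMin": 5,
      "queryTimeFormat": "yyyy-MM-ddTHH:mm:ssZ"
    },
    "response": {
      "eventsJsonPaths": ["$.records"]
    },
    "dcrConfig": {
      "streamName": "Custom-ExampleVendorLogs_CL",
      "dataCollectionEndpoint": "<DCE ingestion URL>",
      "dataCollectionRuleImmutableId": "<DCR immutable ID>"
    }
  }
}
```

The service runs the polling, paging, and scaling for you; the template only describes where and how to fetch the data and which DCR stream receives it.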
Announcing the General Availability of New Data Connectors

We are continually improving the data collection experience for our customers and are thrilled to announce that the following data connectors are now Generally Available (GA) on the Codeless Connector Framework:

- Atlassian Confluence: Ingesting Confluence audit logs allows organizations to monitor collaboration activity, detect security risks, and troubleshoot configuration issues using Confluence audit records.
- Auth0: With the Auth0 connector, organizations can effortlessly integrate authentication and authorization data from Auth0 into Microsoft Sentinel. This connector provides valuable insights into user activities and access patterns, bolstering identity security and compliance efforts.
- Azure DevOps: Audit logs from Azure DevOps allow security teams to monitor user activities, detect anomalous behavior, and investigate potential threats across DevOps environments.
- Box: The Box connector facilitates the ingestion of file storage and sharing data from Box into Microsoft Sentinel. By leveraging this connector, security teams can monitor file access and sharing activities, ensuring data integrity and preventing unauthorized access.
- Google Cloud Platform Load Balancer: With GCP Load Balancer and Web Application Firewall (Cloud Armor) logs, security teams can monitor inbound network activity, enforce security policies, and detect threats across GCP environments.
- Proofpoint POD: The ingestion of email security logs allows organizations to monitor message traceability, detect threats, and investigate data exfiltration attempts by attackers and malicious insiders.
- Proofpoint TAP: Email threat intelligence logs, including message and click events, provide visibility into malware and phishing activity to support custom alerts, dashboards, and threat investigation.
- SentinelOne: The SentinelOne connector enables seamless ingestion of threat intelligence and endpoint security data from SentinelOne into Microsoft Sentinel. This integration empowers security teams to enhance their threat detection capabilities and respond swiftly to potential threats.

New Connectors in Public Preview

- CrowdStrike Falcon Data Replicator (S3-based polling)
- Google Cloud Platform VPC Flow
- Google Cloud Platform DNS
- Google IAM

These additions are not new out-of-box sources in Microsoft Sentinel, but they do improve how data is collected. The previously Azure Function App-based polling has been upgraded to the Codeless Connector Framework for these products, ensuring data collection adheres to the more scalable, advantageous CCF pattern. As noted previously, the newer version of these connectors replaces the deprecated HTTP Data Collector API with the Log Ingestion API to send data to Microsoft Sentinel.

Call to Action!

Microsoft Sentinel customers collecting data from any of the mentioned sources using Azure Function Apps are advised to migrate their ingestion streams to the newer versions built on the Codeless Connector Framework. While we continue to improve the data collection experience across all connectors, we encourage our customers and partners to join the Microsoft Security Communities to benefit from early insights about the latest and greatest in Microsoft Security.

Call to Action for ISV Partners

We invite our ISV partners to migrate their Azure Function App-based data connectors to the Codeless Connector Framework. By leveraging CCF for data ingestion, we can ensure that our mutual customers benefit from streamlined data integration and enhanced security monitoring in Microsoft Sentinel. We are committed to ensuring partners have all the support needed in this transformation. For any support, please reach out to us at Microsoft Sentinel Partners. Join us in this transformative journey to empower our customers by unlocking the full potential of their security investments with Microsoft Sentinel's Codeless Connector Framework.

References

- Create a codeless connector for Microsoft Sentinel
- Migrate from the HTTP Data Collector API to the Log Ingestion API to send data to Azure Monitor Logs

Transitioning from the HTTP Data Collector API to the Log Ingestion API… What does it mean for me?
This article is co-authored by Andrea Fisher, Brian Delaney, and Jon Shectman (Microsoft Customer Success Unit).

Many customers have recently received an email sharing the information that the HTTP Data Collector API will be retired on September 14, 2026. What exactly does that mean for you? Either you have deployed a built-in Microsoft Sentinel data connector that uses the HTTP Data Collector API, or you have configured a custom connector of your own that uses the API. In this blog, we'll explain why you got (or will receive) this notification, what's at stake, and what actions you need to take.

But first, what is the HTTP Data Collector API anyway?

The HTTP Data Collector API is nothing more than a set of rules and protocols governing (you guessed it!) data collection – in this case to Azure Monitor (a "back end" for Microsoft Sentinel). This API has been deprecated in favor of a newer, improved API: the Azure Monitor Logs Ingestion API.

What actions should I take?

The Account Information section of the notification email only lists the subscription name and ID that are calling the old API. It doesn't state how your organization is calling it. There are three possibilities:

1. You have a custom application that you built or licensed.
2. You have custom data connectors (likely built as either Azure Functions or codeless connectors).
3. You have a data connector from the in-product Content Hub, provided by Microsoft or one of our partner ISVs; those will be rewritten prior to the API deprecation date.

It's also possible that you are using more than one of the above methods in your workspace, or in more than one workspace in your subscription. There are several steps you can take to start discovering your usage of this deprecated API.

In your Log Analytics workspace, navigate to Settings, then Tables, and examine the Type column. Any table built with data from the deprecated API will be of type "Custom table (classic)". Remember, some of these tables may not be in use anymore; there are many ways to identify tables that are in active use. One way is with a simple query, as in this example:

InformationProtectionLogs_CL
| where TimeGenerated > ago(90d)

You could also examine the Usage and estimated costs chart in Log Analytics, or, if you want to check regularly over time, set up a log search alert rule. Built-in data connectors that use the deprecated API generally state as much in the connector details in the Content Hub.

To remediate: if you discover a custom application or data connector, you will need to transition it to the Logs Ingestion API before the retirement date. We recommend that you do not wait, but start the process early to give your organization time to thoroughly test and migrate all applications and connectors. For built-in data connectors, watch the Content Hub for updates and guidance. A minimal call to the new API looks like the sketch below.
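For custom applications, the change is essentially moving from posting to the old Data Collector endpoint to uploading through a Data Collection Endpoint and Data Collection Rule. As a rough sketch (not the official migration sample), here is a minimal upload using the azure-monitor-ingestion Python library; the DCE URL, DCR immutable ID, stream name, and record fields are placeholders you would replace with your own:

```python
# pip install azure-identity azure-monitor-ingestion
from azure.identity import DefaultAzureCredential
from azure.monitor.ingestion import LogsIngestionClient

# Placeholders: substitute your own Data Collection Endpoint and DCR values.
DCE_ENDPOINT = "https://my-dce.eastus-1.ingest.monitor.azure.com"
DCR_IMMUTABLE_ID = "dcr-00000000000000000000000000000000"
STREAM_NAME = "Custom-InformationProtectionLogs_CL"

# Authenticates via Entra ID (service principal, managed identity, etc.).
credential = DefaultAzureCredential()
client = LogsIngestionClient(endpoint=DCE_ENDPOINT, credential=credential)

# Each record's fields must match the schema of the DCR's input stream.
logs = [{"TimeGenerated": "2026-01-01T00:00:00Z", "RawData": "sample event"}]
client.upload(rule_id=DCR_IMMUTABLE_ID, stream_name=STREAM_NAME, logs=logs)
```

Unlike the old API's shared-key authentication, this call is authorized with an Entra ID token scoped to the specific DCR.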
Advantages of the Azure Monitor Logs Ingestion API

There are numerous advantages to using the new API:

- It supports transformations, which enable you to modify the data before it's ingested into the destination table, including filtering and data manipulation (see the sketch after this list).
- It allows you to send data to supported Azure tables or to custom tables that you create. You can extend the schema of Azure tables with custom columns to accept additional data.
- It lets you send data to multiple destinations.
- Last but certainly not least (we are security practitioners, after all): it allows for granular role-based access control (RBAC) to limit the ability to ingest data by data collection rule and identity.
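To illustrate the transformation point: a DCR ingestion-time transformation is just a KQL statement applied to the incoming stream, which is exposed as a virtual table named source. A hypothetical example, assuming the stream carries SeverityLevel and RawData columns:

```kusto
source
| where SeverityLevel != "Verbose"            // filter noise before it is billed and stored
| extend IngestedVia = "Logs Ingestion API"   // enrich records at ingestion time
| project-away RawData                        // drop a column you do not want to retain
```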
In Summary

The transition from the HTTP Data Collector API to the Azure Monitor Logs Ingestion API is crucial for maintaining data ingestion functionality and security. The new API offers several advantages, including secure OAuth-based authentication, the ability to filter and transform data during ingestion, and granular RBAC. Organizations should proactively transition to the new API before the retirement date of September 14, 2026.

Microsoft Sentinel & Cyberint Threat Intel Integration Guide

Explore this comprehensive guide to learn how to integrate Cyberint's advanced threat intelligence with Microsoft Sentinel. The detailed resource walks you through the integration process, enabling you to leverage enriched threat data for improved detection and response. Elevate your security posture and ensure robust protection against emerging threats. Read the guide to streamline your threat management and enhance your security capabilities.

Integrating Fluent Bit with Microsoft Sentinel
This guide will walk you through the steps required to integrate Fluent Bit with Microsoft Sentinel. Note that this article assumes you already have a Sentinel workspace, a Data Collection Endpoint and a Data Collection Rule, an Entra ID application, and a Fluent Bit installation. The Log Ingestion API supports ingestion both into custom tables and into built-in tables such as CommonSecurityLog, Syslog, WindowsEvent, and more. To check which tables are supported, please see the following article: https://learn.microsoft.com/en-us/azure/azure-monitor/logs/logs-ingestion-api-overview#supported-tables

Prerequisites:

Before beginning the integration process, ensure you have the following:

- An active Azure subscription with Microsoft Sentinel enabled.
- A Microsoft Entra ID application, taking note of the Client ID, Tenant ID, and client secret. To create one, check this article: https://learn.microsoft.com/en-us/entra/identity-platform/quickstart-register-app?tabs=certificate
- A Data Collection Endpoint (DCE). To create a data collection endpoint, please check this article: https://learn.microsoft.com/en-us/azure/azure-monitor/essentials/data-collection-endpoint-overview?tabs=portal
- A Data Collection Rule (DCR). Fields in the Data Collection Rule need to match exactly the table columns and the fields from the log source. To create a DCR, please check this article: https://learn.microsoft.com/en-us/azure/azure-monitor/essentials/data-collection-rule-create-edit?tabs=cli Depending on the source, this may require creating a custom table or using an existing table from the Log Analytics workspace.
- Fluent Bit installed on your server or container. If you haven't yet installed Fluent Bit, the following article has instructions per operating system: https://docs.fluentbit.io/manual/installation/getting-started-with-fluent-bit

High-level architecture: logs flow from the source into Fluent Bit, which sends them through the Log Ingestion API to a Data Collection Endpoint, where the Data Collection Rule maps them into the target table in the Sentinel workspace.

Step 1: Setting up the Fluent Bit configuration file

Fluent Bit has numerous output plugins, and one of them sends logs through the Log Analytics Ingestion API, both to supported Sentinel tables and to custom tables. You can find more information in the Fluent Bit documentation: https://docs.fluentbit.io/manual/pipeline/outputs/azure_logs_ingestion

Moving forward, in order to configure Fluent Bit to send logs into the Sentinel Log Analytics workspace, take note of the specific input plugin you are using (or intend to use) to receive logs and of how its records can be routed to the output. Most Fluent Bit input plugins allow you to set a "tag" key, which can then be referenced in the output plugin so that only matching logs are sent. In a scenario where multiple input plugins are used and all are required to send logs to Sentinel, a wildcard match ("*") can be used instead. As another example, where there are multiple input plugins of type "HTTP" and you want to send only a specific one to Sentinel, the "match" field must be set according to the position of the required input plugin, for example "match http.2" if the input plugin were the third in the list of HTTP inputs. If nothing is specified in the "match" field, "http.0" is assumed by default.

The configuration file is located at /etc/fluent-bit/fluent-bit.conf. The first part is the definition of all input plugins; then follow the filter plugins, which you can use, for example, to rename fields from the source to match what exists within the data collection rule schema and the Sentinel table columns; and finally there are the output plugins. For better understanding, here is an example of how a Fluent Bit config file could look.
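The following is a minimal sketch of such a file, assuming the dummy input and the azure_logs_ingestion output described in this post; all IDs, secrets, URLs, and the table name are placeholders to replace with your own values, and parameter names should be verified against the plugin documentation linked above:

```conf
[SERVICE]
    Flush         5
    Log_Level     info

# Emits a fixed sample record so the pipeline can be tested end to end.
[INPUT]
    Name          dummy
    Tag           dummy.log
    Dummy         {"Message":"this is a sample message for testing fluent bit integration to Sentinel","Activity":"fluent bit dummy input plugin","DeviceVendor":"Ubuntu"}

# Writes matching records to a local file, useful for troubleshooting.
[OUTPUT]
    Name          file
    Match         dummy.log
    Path          /var/log/fluent-bit/

# Prints all records to stdout in JSON format.
[OUTPUT]
    Name          stdout
    Match         *
    Format        json_lines

# Sends matching records to Microsoft Sentinel via the Log Ingestion API.
[OUTPUT]
    Name          azure_logs_ingestion
    Match         dummy.log
    tenant_id     <tenant-id>
    client_id     <app-registration-client-id>
    client_secret <client-secret>
    dce_url       https://<dce-name>.<region>.ingest.monitor.azure.com
    dcr_id        dcr-00000000000000000000000000000000
    table_name    FluentBitLogs_CL
    time_generated true
```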
INPUT plugins section: in this example we use the "dummy" input to send sample messages to Sentinel; in your scenario you could leverage other input plugins within the same config file. After everything is configured in the input section, make sure to complete the FILTER section if needed, and then move on to the output plugin section.

OUTPUT plugins section: here we have output plugins that write to a local file based on the two tags "dummy.log" and "logger", an output plugin that prints the output in JSON format, and the output plugin responsible for sending data to Microsoft Sentinel. The latter matches the tag "dummy.log", whose dummy input carries the message {"Message":"this is a sample message for testing fluent bit integration to Sentinel", "Activity":"fluent bit dummy input plugin", "DeviceVendor":"Ubuntu"}. Make sure you insert the correct parameters in the output plugin, in this scenario the "azure_logs_ingestion" plugin.

Step 2: Fire up Fluent Bit

When the file is ready to be tested, execute the following:

sudo /opt/fluent-bit/bin/fluent-bit -c /etc/fluent-bit/fluent-bit.conf

Fluent Bit will start by initializing all the plugins defined in the config file. Your access token should then be retrieved, provided everything is well set up in the output plugin (app registration details, data collection endpoint URL, data collection rule ID, Sentinel table, and, importantly, the output plugin name actually being "azure_logs_ingestion"). Within a couple of minutes you should see this data in your Microsoft Sentinel table, either an existing table or a custom table created for the specific log source. A quick query confirms arrival, as shown below.
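For example, a minimal check, assuming the hypothetical custom table name used in the config sketch above (substitute the table your DCR actually targets):

```kusto
FluentBitLogs_CL
| where TimeGenerated > ago(10m)   // records ingested in the last 10 minutes
| take 10
```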
Summary

Integrating Fluent Bit with Microsoft Sentinel provides a powerful solution for log collection and analysis. By following this guide, we hope you can set up a seamless integration that enhances your organization's ability to monitor and respond to security threats; just carefully ensure that all fields processed in Fluent Bit map exactly to the fields in the Data Collection Rule and the Sentinel table within the Log Analytics workspace. Special thanks to Bindiya Priyadarshini, who collaborated with me on this blog post. Cheers!

Help Protect your Exchange Environment With Microsoft Sentinel

TL;DR: Sentinel + Exchange Servers or Exchange Online = better protected. Meet the new Microsoft Sentinel security solution for Exchange Online and on-premises servers: Microsoft Exchange Security. This content is very useful for any organization concerned with keeping the highest possible security posture and being alerted to suspicious activities involving these critical assets.