We commonly see customers using Azure Monitor in a wide range of scenarios across applications and infrastructure, relying on its data collection, storage, out-of-the-box insights, and analytics queries as part of their core operations. Customers have asked us to bring this immense scale and these high-end services to other logging use cases as well, making Azure Monitor the one-stop shop for all logging needs.
We are happy to launch these capabilities to public preview together with Microsoft Sentinel. With the new announcements, Microsoft Sentinel is reinventing the economics of SIEM and delivering new ways to access and work with security data, making it the most comprehensive and innovative threat hunting solution in the market. To learn more about what’s new in Microsoft Sentinel, read here.
Here are details on the new capabilities:
Azure Monitor Logs is integrated with many infrastructure and application services and includes hundreds of pre-defined data types and their ingestion mechanisms. Though this is a very comprehensive offering, organizations have their own dedicated data types and specific needs. Today we introduce a powerful tool that brings the power of Azure Monitor Logs to your own custom logs. You can define your own schema and ingest your own logs with all the control, flexibility, and richness that serve Azure's built-in logs.
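As a rough illustration, sending custom records means posting a JSON array of your own schema to an ingestion endpoint. The sketch below only builds such a request; the endpoint shape, data collection rule id, stream name, and api-version are hypothetical placeholders, not values from this announcement.

```python
import json

# Hypothetical sketch of a custom-log upload request. The DCE hostname,
# DCR id, stream name, and api-version below are illustrative assumptions.
def build_ingestion_request(endpoint, dcr_id, stream, records):
    """Build the URL and JSON body for a custom-log upload."""
    url = (f"{endpoint}/dataCollectionRules/{dcr_id}"
           f"/streams/{stream}?api-version=2021-11-01-preview")
    body = json.dumps(records)  # records follow the schema you defined
    return url, body

url, body = build_ingestion_request(
    "https://my-dce.eastus.ingest.monitor.azure.com",  # hypothetical endpoint
    "dcr-00000000000000000000000000000000",            # hypothetical DCR id
    "Custom-MyAppLogs_CL",                             # hypothetical stream
    [{"TimeGenerated": "2022-02-01T12:00:00Z", "Message": "user signed in"}],
)
```

The point is that the payload is just your own JSON records; the schema is whatever you defined for the table.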
In many scenarios, logs are not ingested as-is. There is a need to process them in the cloud to filter unnecessary logs, improve the data schema, remove data, or enrich the records. Some organizations have deployed dedicated systems just to handle logs before they are ingested. Today we introduce a new option: write your own transformations and have them run in our cloud at large scale. Transformations are written using a subset of the familiar KQL query language.
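To make the filter/remove/enrich idea concrete, here is the same logic sketched in runnable Python. The KQL in the comment is an illustrative example of what a transformation might look like, not a verbatim transformation from the service; the Python merely mirrors its effect on a record.

```python
# A transformation written in a KQL subset might look like:
#   source
#   | where Level != 'Verbose'
#   | project-away RawPayload
#   | extend Env = 'prod'
# The function below is only an illustration of that logic in Python,
# not how the service executes transformations.
def transform(records):
    out = []
    for r in records:
        if r.get("Level") == "Verbose":
            continue  # filter unnecessary logs
        r = {k: v for k, v in r.items() if k != "RawPayload"}  # remove data
        r["Env"] = "prod"  # enrich the record
        out.append(r)
    return out

sample = [
    {"Level": "Verbose", "Message": "noise", "RawPayload": "..."},
    {"Level": "Error", "Message": "disk full", "RawPayload": "..."},
]
result = transform(sample)  # only the Error record survives, enriched
```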
Not all logs are equal. Organizations have many verbose log sources that are not well curated and are not used for analytics. In some cases these logs are not handled properly; in others, they are ingested into a costly log solution without a real need. Today, Azure Monitor Logs introduces Basic Logs, a new plan for log ingestion that is tailored to high-volume verbose logs. With Basic Logs you can use most of the existing Azure Monitor Logs experiences at a lower cost, and keep all your logs under the same roof with minimal overhead.
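Conceptually, choosing Basic Logs is a per-table setting: you mark a table with the Basic plan instead of the default Analytics plan. The sketch below only builds such a management request; the URL shape, api-version, and property names are assumptions for illustration and should be checked against current documentation.

```python
import json

# Hypothetical sketch: switching a table to the Basic Logs plan.
# URL shape, api-version, and property names are illustrative assumptions.
def basic_logs_patch(subscription, rg, workspace, table):
    url = (f"https://management.azure.com/subscriptions/{subscription}"
           f"/resourceGroups/{rg}/providers/Microsoft.OperationalInsights"
           f"/workspaces/{workspace}/tables/{table}"
           f"?api-version=2021-12-01-preview")
    body = json.dumps({"properties": {"plan": "Basic"}})  # vs. "Analytics"
    return url, body

url, body = basic_logs_patch("sub-id", "my-rg", "my-workspace", "ContainerLogV2")
```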
When logs are fresh, there is a strong need for the full power of analytics to extract insights from them. As logs age, they are rarely used, but still need to be kept. Organizations retain logs for months and years because they want them available in case of an incident, a court order, or for compliance. Today we announce archive logs, which enable storing logs for up to seven years at a significant price reduction. Customers can access archived logs by running a search job or by restoring a chunk of the logs for a limited time. With archive logs you can keep all your logs long-term under the same roof with minimal overhead.
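Archiving can be thought of as two per-table retention values: an interactive retention period during which logs are directly queryable, and a longer total retention that includes the archive tier. The property names below follow that model but are assumptions for illustration.

```python
import json

# Hypothetical sketch of an archive-retention setting for a table.
# Property names are illustrative assumptions, not confirmed API fields.
def archive_patch(table, interactive_days, total_days):
    body = {"properties": {
        "retentionInDays": interactive_days,   # hot, directly queryable period
        "totalRetentionInDays": total_days,    # includes archive; 7y = 2556 days
    }}
    return table, json.dumps(body)

_, body = archive_patch("SecurityEvent", 90, 2556)  # 90 days hot, 7 years total
```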
At the same time that log volumes grow, new security incidents require organizations to scan all of their data, and regular tools are not sufficient at this scale. Today we announce Search Job, a new tool in Azure Monitor Logs to query petabytes of data. A search job can run for more than a few minutes and fetches all relevant records into a new persistent table.
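A search job takes a query plus a time range and materializes the matching records into a new persistent table. The sketch below builds such a request, assuming a results-table naming convention with an "_SRCH" suffix and the request shape shown; both are illustrative assumptions.

```python
import json

# Hypothetical sketch of creating a search job. The "_SRCH" table suffix,
# URL shape, api-version, and field names are illustrative assumptions.
def search_job_request(workspace_url, job_name, query, start, end):
    table = f"{job_name}_SRCH"  # assumed suffix for the results table
    url = f"{workspace_url}/tables/{table}?api-version=2021-12-01-preview"
    body = json.dumps({"properties": {"searchResults": {
        "query": query,              # the records the job should fetch
        "startSearchTime": start,    # scan window start
        "endSearchTime": end,        # scan window end
    }}})
    return url, body

url, body = search_job_request(
    "https://management.azure.com/.../workspaces/my-workspace",  # placeholder
    "OldFailedLogons",
    "SecurityEvent | where EventID == 4625",
    "2021-01-01T00:00:00Z",
    "2021-06-30T00:00:00Z",
)
```

Because the results land in a table, you can keep analyzing them with regular queries after the job finishes.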
Today we announce the general availability of Azure Monitor Logs data export. Data export lets you continuously export data from selected tables in your Log Analytics workspace and send it to an Azure Storage account or Azure Event Hubs as it is collected. While Log Analytics data is used in various experiences, export helps meet additional requirements such as a tamper-protected store and integration with other tools. For example, you can export to Azure Data Lake Storage, where data from other sources is stored for further analysis and insights.
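An export rule essentially pairs a list of table names with a destination resource. The sketch below builds such a rule body; the field names and the sample resource id are illustrative assumptions, not a verbatim API contract.

```python
import json

# Hypothetical sketch of a continuous data export rule: which tables to
# export and where to send them. Field names are illustrative assumptions.
def export_rule(tables, destination_resource_id):
    return json.dumps({"properties": {
        "tableNames": tables,                          # tables to export
        "destination": {"resourceId": destination_resource_id},
    }})

body = export_rule(
    ["SecurityEvent", "Heartbeat"],
    "/subscriptions/sub-id/resourceGroups/my-rg/providers/"
    "Microsoft.Storage/storageAccounts/mystorageaccount",  # hypothetical id
)
```

The destination could equally be an Event Hub resource id for streaming integration with other tools.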