Azure Monitor: Logs Ingestion API Tips & Tricks
Published Apr 20 2023 12:00 AM

Hello followers and welcome back to this new post of mine :cool:


Today I am going to share with you an interesting experience: configuring log ingestion through the new Logs Ingestion API in Azure Monitor, with a data collection rule created using ARM templates.


In my previous post, HTTP Data Collector API in a real customer scenario, I showed a possible use of the old HTTP Data Collector API, but since things have changed a lot with the release of the Azure Monitor Agent, I decided to give the new Logs Ingestion API a try.


I basically followed the public documentation Tutorial: Send data to Azure Monitor Logs using REST API (Resource Manager templates), which is pretty exhaustive. In fact, I found only a few points that took me a bit of time to fully figure out. That is why I would like to share the tips and tricks I learned with you :smile:

Putting them in a list that follows the structure of a Data Collection Rule (DCR), it looks like this:

  1. Custom table fields
  2. streamDeclarations
  3. transformation query
  4. outputStream

But let’s dive deep into each element.


#1 Custom table fields:

TIP: The custom table you create to store the data, whether you create it with the PowerShell script or through the portal, must always have the TimeGenerated field.

TRICK: If TimeGenerated is not explicitly populated with values during ingestion, it can be set to the ingestion time during the transformation phase.
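As a sketch of that trick, the dataFlows section of the DCR could default TimeGenerated to the ingestion time in its transformation (the stream, destination, and table names below are invented for illustration):

```json
"dataFlows": [
  {
    "streams": [ "Custom-MyTableRawData" ],
    "destinations": [ "myWorkspace" ],
    "transformKql": "source | extend TimeGenerated = now()",
    "outputStream": "Custom-MyTable_CL"
  }
]
```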





#2 streamDeclarations section: This section describes the input stream arriving at the ingestion point. Data could come, for instance, from a CSV file or from a text log. Here is where you declare the names and types of the fields coming from your custom source.

TIP: It is crucial to declare only the fields that are part of the input stream. With a custom text log, the expected input could be just the TimeGenerated and RawData fields. In that case you must create a streamDeclarations section with just those 2 fields of the relevant type, such as:

Field Name      Field Type
TimeGenerated   datetime
RawData         string

Should you need to manage an input stream with more fields, make sure you map them correctly. The picture below shows an input stream used to load sample data for a workshop lab environment, where data is contained in a CSV file with named columns:
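As a sketch of what such a multi-column declaration could look like (the stream and column names here are invented for illustration):

```json
"streamDeclarations": {
  "Custom-SampleCsvData": {
    "columns": [
      { "name": "Time", "type": "datetime" },
      { "name": "Computer", "type": "string" },
      { "name": "Status", "type": "string" },
      { "name": "DurationMs", "type": "int" }
    ]
  }
}
```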




TRICK: Field values might arrive as string values. In this case they can be declared as string and cast to the right type during the transformation, to match the table schema.
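For example (a sketch with invented field names), the cast could happen directly in the transformation query, where extend replaces the string columns with typed ones:

```json
"transformKql": "source | extend Time = todatetime(Time), DurationMs = toint(DurationMs)"
```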


TIP: A fundamental point to always keep in mind is that, since we are dealing with custom ingestion, the stream name under streamDeclarations must always start with the Custom- prefix. Not doing this will result in the ingestion point not being able to manage the data through the correct ingestion pipeline.
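Going back to the simple text-log case, a minimal declaration honoring that prefix could look like this (the part of the stream name after Custom- is just an example):

```json
"streamDeclarations": {
  "Custom-MyTableRawData": {
    "columns": [
      { "name": "TimeGenerated", "type": "datetime" },
      { "name": "RawData", "type": "string" }
    ]
  }
}
```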





#3 transformation query: Data collection transformation is a nice new feature of Azure Monitor which allows us to achieve different goals. Writing a transformation query implies understanding the format of the incoming data, as well as the KQL limitations and the supported KQL features that apply here.

One added point to pay attention to is special-character escaping. When dealing with regex in the transformation query, for instance, remember that special characters need to be escaped correctly, otherwise the template deployment will fail. Talking about regex, you first need to apply the same rules that apply to regex in Kusto queries: you need to escape square brackets ([ and ]) using the backslash (\). The backslash itself is a special character and so needs to be escaped with a backslash. So, to escape a square bracket in Kusto you need to write something like this: \\[. As an alternative, you can enclose your regex in @'', making it like @'Your regex'.


TIP: Since the transformation query is part of an ARM template, which is written in JSON, you also need to escape the escapes. In short, the above sequence turns into \\\\[.

Yes, you got it: you need to escape each of the 2 backslashes to make sure they are converted back from 4 to 2 at deployment time, and then from 2 to 1 when the query is executed.

The transformation query below, used in a template,



source | where RawData matches regex '^*\\\\[\\\\[:space:\\\\]\\\\](\\\\[0-9\\\\]+)/(\\\\[0-9\\\\]+)/(\\\\[0-9\\\\]+|-1)/(\\\\[0-9\\\\]+)/(\\\\[0-9\\\\]+|-1)\\\\[\\\\[:space:\\\\]\\\\](\\\\[0-9\\\\]{3}|-1)\\\\[\\\\[:space:\\\\]\\\\]* || ^*\\\\[\\\\[:space:\\\\]\\\\](\\\\[0-9\\\\]+)/(\\\\[0-9\\\\]+)/(\\\\[0-9\\\\]+|-1)\\\\[\\\\[:space:\\\\]\\\\](\\\\[0-9\\\\]+|-1)\\\\[\\\\[:space:\\\\]\\\\]*'



will be deployed as



source | where RawData matches regex '^*\\[\\[:space:\\]\\](\\[0-9\\]+)/(\\[0-9\\]+)/(\\[0-9\\]+|-1)/(\\[0-9\\]+)/(\\[0-9\\]+|-1)\\[\\[:space:\\]\\](\\[0-9\\]{3}|-1)\\[\\[:space:\\]\\]* || ^*\\[\\[:space:\\]\\](\\[0-9\\]+)/(\\[0-9\\]+)/(\\[0-9\\]+|-1)\\[\\[:space:\\]\\](\\[0-9\\]+|-1)\\[\\[:space:\\]\\]*'




and executed correctly.


#4 outputStream: This is the place where we set which table the data must be sent to.

TIP: This is a single line that must start with Custom- and end with the table name (which includes the _CL suffix).
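For instance (the table name is invented), the property would look like:

```json
"outputStream": "Custom-MyTable_CL"
```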




TRICK: Dealing with ARM templates gives you the advantage of referencing parameters and variables, as well as calculating values using string functions. You can make the value of the outputStream property dynamic using a syntax like the one below:
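A sketch of that dynamic syntax, assuming a template parameter named tableName holding a value such as MyTable_CL:

```json
"outputStream": "[concat('Custom-', parameters('tableName'))]"
```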




Is that enough to make your life easier? I hope so, so have fun with the Logs Ingestion API :lol:


The sample scripts are not supported under any Microsoft standard support program or service. The sample scripts are provided AS IS without warranty of any kind. Microsoft further disclaims all implied warranties including, without limitation, any implied warranties of merchantability or of fitness for a particular purpose. The entire risk arising out of the use or performance of the sample scripts and documentation remains with you. In no event shall Microsoft, its authors, or anyone else involved in the creation, production, or delivery of the scripts be liable for any damages whatsoever (including, without limitation, damages for loss of business profits, business interruption, loss of business information, or other pecuniary loss) arising out of the use of or inability to use the sample scripts or documentation, even if Microsoft has been advised of the possibility of such damages.

Last update: May 04 2023