Sep 04 2019 09:43 PM - last edited on Apr 08 2022 10:06 AM by TechCommunityAP
I have written a custom query to produce a report on specific events. The query is as follows:
customEvents
| where todatetime(timestamp) > ago(1d)
| where customDimensions.EventType == "DEBUG"
| where customDimensions.Detail == "[Compare-Events-CalendarToUpdate][UPDATE]"
| extend events = parse_json(tostring(customDimensions.custom_object))
| project Id=events.Id,
Detail=customDimensions.Detail,
Owner=events.Owner,
Subject=events.Subject,
Room=events.Room,
StartDate=events.StartDate,
EndDate=events.EndDate
When executing the query I am getting blank columns. On further investigation, I found that the extended column "events" is of type string rather than object, suggesting that parse_json is not working. This is a rather large object (approx. 21 KB; I have shrunk it and it still has the same issue). Is there a size limitation on what Kusto can read, store and deserialize?
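For what it's worth, one common reason parse_json leaves a column typed as string is that the payload was JSON-encoded twice before being attached to the custom event, so the first parse only yields the inner JSON text. A minimal Python sketch of that effect (the payload shown is illustrative, not your actual event):

```python
import json

# Simulate a payload that was JSON-serialized twice (e.g. an object
# stringified before being attached to customDimensions).
payload = {"Id": 1, "Owner": "alice"}
double_encoded = json.dumps(json.dumps(payload))

# The first parse yields a *string* (the inner JSON text), not an
# object -- analogous to parse_json() leaving the column typed as
# string in Kusto.
once = json.loads(double_encoded)
assert isinstance(once, str)

# A second parse recovers the object.
twice = json.loads(once)
assert twice["Owner"] == "alice"
```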
Many Thanks,
Michael.
Sep 04 2019 11:52 PM
Hi @Michael1145, can you provide sample data? You can use a let statement to build a sample table that reflects your customEvents table.
Sep 05 2019 02:05 PM
Please find attached a sample JSON snippet (this was serialized from PowerShell):
https://gist.github.com/ZanattaMichael/bb4c7c9d5be435bd10b6f7c7b7fc5ea3
The object is quite complex, so later tonight I will try and build some sample data using let.
Thanks for your help!
Michael.
Sep 05 2019 07:42 PM - edited Sep 05 2019 08:12 PM
I have found a few issues:
1. parse_json does not handle serialized PowerShell dates. I have corrected this by double-escaping the dates, and tested it by running parse_json("JSON") within the query window.
2. I noticed that the JSON is incomplete within the query window. However, the full JSON string parses correctly when tested in the browser, so it's not a limitation of parse_json; it may be an issue with storage and retention.
3. parse_json will not throw an error when passed an invalid JSON string; it simply returns the string back.
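The tolerant behavior in point 3 can be sketched in Python. Kusto's parse_json() does not raise on invalid JSON, it just hands the string back, which is why the failure was silent; parse_json_tolerant below is a hypothetical helper mimicking that behavior:

```python
import json

def parse_json_tolerant(text):
    """Mimic Kusto parse_json(): never raise, fall back to the string."""
    try:
        return json.loads(text)
    except ValueError:
        return text  # return the raw input unchanged, as parse_json does

good = parse_json_tolerant('{"Id": 1}')
bad = parse_json_tolerant('{"Id": 1')  # truncated payload

assert isinstance(good, dict)   # parsed into an object
assert isinstance(bad, str)     # silently still a string -- blank columns downstream
```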
Sep 08 2019 09:59 PM
All,
Some more developments: I found that there is an approx. 4000-character limit on custom events. However, I am not sure whether this is driven by the API or by the PS module I am using. So in summary:
1. A 4000-character limit. This could be related to the particular module I am using (ApplicationInsightsCustomEvents) or to the API (I have yet to confirm which).
2. Blank lines in empty arrays/lists: by default PowerShell will include blank lines inside empty arrays. For example:
val : [
]
These need to be removed by using:
$var -replace '\s+\r\n+', "`r`n"
3. DateTimes need to be double-escaped; I am not sure why. To do this:
# Add an additional Escape character to the beginning of the datetime
$var = $params.EventDictionary.custom_object.Replace('"\/Date(','"\\/Date(')
# Add an additional Escape character to the end of the datetime
$var = $params.EventDictionary.custom_object.Replace(')\/"',')\\/"')
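The two clean-up steps above can be sketched end-to-end in Python for illustration (the thread's actual fixes use PowerShell's -replace and .Replace(); the sample string here is made up, with the regex and replacement patterns copied from the steps):

```python
import json
import re

# A serialized event with both problems: a blank line inside an empty
# array, and a .NET-style "\/Date(ms)\/" datetime.
raw = '{"val": [\r\n        \r\n], "d": "\\/Date(1567600000000)\\/"}'

# Step 2: collapse the blank line PowerShell leaves inside empty arrays
# (same pattern as the -replace '\s+\r\n+', "`r`n" above).
cleaned = re.sub(r"\s+\r\n+", "\r\n", raw)

# Step 3: double-escape the date wrapper, mirroring the two .Replace()
# calls above.
cleaned = cleaned.replace('"\\/Date(', '"\\\\/Date(').replace(')\\/"', ')\\\\/"')

obj = json.loads(cleaned)
assert obj["val"] == []
assert "Date(1567600000000)" in obj["d"]
```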
Cheers,
Michael.
Sep 08 2019 11:14 PM
Initially I thought you were using Log Analytics, but it seems you are using Application Insights. The services are similar but still have some differences. With such a big record you are certainly hitting some limits. In situations like this, what I have previously done, where possible, is split a single record into multiple ones. A flat, one-level structure is probably the best scenario; the query language is powerful, so in most cases I try to put the logic in the query rather than build a complex record. Additionally, in the past, when passing a record I have had only the first-level properties created as columns in Log Analytics, and passed any second-level, third-level, etc. properties as strings. It is probably the same in App Insights, and you may already be doing this if you use parse_json. Some of the App Insights limits are described here:
https://docs.microsoft.com/en-us/azure/azure-monitor/app/api-custom-events-metrics#limits
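The flattening approach described above (first-level properties become columns, anything nested is passed as a string) can be sketched in Python; flatten_first_level is a hypothetical helper, not part of any SDK:

```python
import json

def flatten_first_level(record):
    """Keep scalar first-level properties as-is; stringify nested values."""
    out = {}
    for key, value in record.items():
        if isinstance(value, (dict, list)):
            out[key] = json.dumps(value)  # nested levels go in as strings
        else:
            out[key] = value
    return out

event = {"Id": 7, "Owner": "alice", "Rooms": [{"name": "R1"}]}
flat = flatten_first_level(event)

assert flat["Id"] == 7                  # scalars stay as columns
assert isinstance(flat["Rooms"], str)   # nested data travels as a string
```

The stringified properties can then be expanded later in the query with parse_json, keeping each stored record small and flat.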