sneakypanda
Oct 15, 2023 · Copper Contributor
Issue with parsing array of JSON values
Hi, I am working on an ASIM User Management parser for the Okta System log. I have hit issues trying to parse the target field. As shown below, this is a list of JSON objects. "target": [ List of...
rutgersmeets
Oct 16, 2023 · Brass Contributor
Hi!
If the order of object types in the array is guaranteed, this can be done in a relatively simple and performant manner. Do you have access to Okta documentation on the log format to confirm whether that is the case? It would also help to know if there is a maximum of one of each object type in the array.
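If the order is fixed, indexing straight into the array is about as cheap as it gets. A very rough sketch below; the table name (OktaV2_CL), the column holding the array (Target) and the field names are assumptions on my side, so adjust them to your actual schema:

OktaV2_CL
| extend target = parse_json(Target)                        // assumed column holding the raw array
| extend TargetUser = target[0], TargetSecond = target[1]   // relies on a guaranteed order
| extend TargetUserId          = tostring(TargetUser.id),
         TargetUsername        = tostring(TargetUser.alternateId),
         TargetUserDisplayName = tostring(TargetUser.displayName),
         SecondObjectType      = tostring(TargetSecond.type)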
The solution that comes to mind for when neither of these is guaranteed involves serialize -> row_number() -> mv-expand -> your extend extraction -> summarize by row_number. This would be fine for a one-off query, but I don't think it's suitable for a parser because of performance issues. How many logs are you receiving in this table per hour?
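For what it's worth, that fallback would look roughly like the sketch below. Same caveats: the table, column and object-type names are assumptions, and in a real parser you would also need to carry the other event columns through the summarize:

OktaV2_CL
| serialize
| extend rn = row_number()                                  // stable key per original event
| extend target = parse_json(Target)
| mv-expand obj = target                                    // one row per object in the array
| extend objType = tostring(obj.type)
| extend TargetUserId   = iff(objType == "User", tostring(obj.id), ""),
         TargetUsername = iff(objType == "User", tostring(obj.alternateId), ""),
         TargetGroupId  = iff(objType == "UserGroup", tostring(obj.id), "")
| summarize TargetUserId   = max(TargetUserId),             // collapse back to one row per event
            TargetUsername = max(TargetUsername),
            TargetGroupId  = max(TargetGroupId)
  by rn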
btw, you have multiple rows in your output because mv-apply "returns the union of the results of all subqueries".
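If you want to stay with mv-apply, collapsing the subquery's output back into a single row (for example with a summarize inside the subquery) avoids that. Again only a sketch with assumed names:

OktaV2_CL
| extend target = parse_json(Target)                        // assumed raw column name
| mv-apply target on (
    summarize TargetUserId  = maxif(tostring(target.id), tostring(target.type) == "User"),
              TargetGroupId = maxif(tostring(target.id), tostring(target.type) == "UserGroup")
  )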
Kind regards,
Rutger