Forum Discussion

AizaBC
Copper Contributor
Aug 03, 2023

Azure Diagnostic data cannot be processed by Azure Stream Analytics due to InputDeserializerError



Planning to stream Azure resource (Front Door) diagnostic logs to Azure Stream Analytics. However, I'm having trouble with this: data from the AzureDiagnostics category specifically fails to deserialize as input for the Stream Analytics job.


Error:

Error while deserializing input message Id: Partition: [0], Offset: [3663944], SequenceNumber: [285]. Hit following error: Column name: ErrorInfo is already being used. Please ensure that column names are unique (case insensitive) and do not differ only by whitespaces.

It's caused by duplicate columns, errorInfo and ErrorInfo, in the AzureDiagnostics table; I'm unsure what distinguishes them when inspecting their values.


Does anyone have thoughts on how we could simplify or transform these diagnostic logs to remove this duplicate column before they get ingested by the Stream Analytics job?
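One way to picture the fix is to merge any keys that collide case-insensitively before the event reaches ASA. This is a minimal sketch of such a transform in plain Python, assuming the diagnostic event arrives as a JSON object; the merge rule (keep the first key's casing, prefer the first non-empty value) is my own assumption, not anything Azure prescribes:

```python
import json

def dedupe_columns(record):
    """Merge keys that collide case-insensitively (e.g. 'errorInfo' vs
    'ErrorInfo'), keeping the first key's casing and preferring the
    first non-empty value. Surrounding whitespace is also ignored,
    matching the uniqueness rule in the ASA error message."""
    merged = {}
    canonical = {}  # normalized name -> key actually kept in `merged`
    for key, value in record.items():
        norm = key.strip().lower()
        if norm not in canonical:
            canonical[norm] = key
            merged[key] = value
        elif merged[canonical[norm]] in (None, ""):
            # The earlier duplicate was empty; take this value instead.
            merged[canonical[norm]] = value
    return merged

# Example payload resembling an AzureDiagnostics record with the
# colliding columns from the error message (values are made up):
raw = '{"time": "2023-08-03T00:00:00Z", "errorInfo": "", "ErrorInfo": "NoError"}'
clean = dedupe_columns(json.loads(raw))
print(clean)  # {'time': '2023-08-03T00:00:00Z', 'errorInfo': 'NoError'}
```

The same logic could live in whatever hop you put between the source and ASA, e.g. a small Azure Function with an Event Hub trigger (a hypothetical setup, not something from the original pipeline).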


I initially thought of the following solutions, but they aren't straightforward and would probably cost more, so I'd like to hear others' thoughts as well.

1. Transformation using a DCR. I believe this is ideal for sending diagnostic logs to a Log Analytics workspace, but it would mean the logs have to pass through the workspace and then be exported to Stream Analytics, which may require adding more components to the data pipeline.
2. Logic App. I saw a setup where a scheduled Logic App exports data from a Log Analytics workspace using a KQL query and sends it to a storage account; the destination would have to be changed to an event hub instead. Again, too many layers just to pass the data on to ASA.
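If it's unclear which of the two columns actually carries the data, a lighter alternative to both options above is to rename collisions in flight rather than drop one, so both values survive deserialization. A sketch of that per-event transform, assuming events arrive as JSON strings; the `_2` suffix scheme is a hypothetical convention of mine:

```python
import json

def rename_collisions(event_json):
    """Parse one diagnostic event and rename any column whose name
    collides case-insensitively with an earlier one, so ASA can
    deserialize the record while both values are preserved for
    inspection downstream."""
    record = json.loads(event_json)
    seen = {}  # normalized name -> how many times it has appeared
    out = {}
    for key, value in record.items():
        norm = key.strip().lower()
        count = seen.get(norm, 0)
        seen[norm] = count + 1
        # First occurrence keeps its name; later ones get a suffix.
        out[key if count == 0 else f"{key}_{count + 1}"] = value
    return json.dumps(out)

batch = ['{"errorInfo": "client", "ErrorInfo": "NoError"}']
print([rename_collisions(e) for e in batch])
```

Once the values are visible side by side in ASA, it should be easier to decide which column is redundant and drop it in the job's query.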


Is there any other solution you can suggest for refining the incoming data to ASA while minimizing the use of compute resources?

