Hi Nicholas,
I am grateful for the replies to my previous question.
You have told me that Logstash and the Log Analytics agent can be deployed anywhere, and I have seen on this page https://docs.microsoft.com/en-us/azure/sentinel/connect-logstash how to configure Logstash to send logs to a Log Analytics workspace using the Log Analytics output plugin.
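For reference, this is roughly how I understand the pipeline from that page; the syslog input and the table name are just examples of mine, and the workspace ID/key values are placeholders:

input {
  syslog {
    port => 514                                 # example: receive Syslog from my appliances
  }
}
output {
  microsoft-logstash-output-azure-loganalytics {
    workspace_id => "<workspace id>"            # placeholder
    workspace_key => "<workspace primary key>"  # placeholder
    custom_log_table_name => "MyAppliance"      # example name; appears as the custom table MyAppliance_CL
  }
}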
I wish to know:
1.) The above link makes no mention of the Log Analytics agent, which I expected to see in the Logstash-on-VMSS architecture. As per my understanding, the Log Analytics output plugin takes events from Logstash and ingests them into the Log Analytics workspace as a custom table, which I suppose is neither Syslog nor CEF format. Am I correct on this point?
2.) Can we build a model in which Microsoft's entire grand list of data sources (https://techcommunity.microsoft.com/t5/azure-sentinel/azure-sentinel-the-connectors-grand-cef-syslog-direct-agent/ba-p/803891) is ingested into Sentinel in one single common format (CEF/Syslog, not custom) using Logstash? (See the rough sketch after this list for the kind of pipeline I have in mind.)
3.) I read somewhere that Sentinel gives better monitoring, analytics, correlation, incident generation, etc. if data from all sources is ingested in CEF/Syslog format, and that quality can suffer when every data source has its own custom format and table in Sentinel, because inconsistent field names keep Sentinel from doing better analytics on the data. Am I correct on this point?
4.) Can Sentinel perform data correlation and analytics if there are N custom tables, one for each security appliance or data source?
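To make question 2 concrete, this is the kind of model I am imagining, not something from your documentation; the CEF input assumes the logstash-codec-cef plugin is installed, and the ports and table name are my own examples:

input {
  syslog { port => 514 }                        # sources that already speak Syslog
  tcp    { port => 5514 codec => cef }          # sources that forward CEF over TCP (example port)
}
output {
  microsoft-logstash-output-azure-loganalytics {
    workspace_id => "<workspace id>"            # placeholder
    workspace_key => "<workspace primary key>"  # placeholder
    custom_log_table_name => "AllSecurityEvents"  # one shared table, but still a custom _CL table, which is part of my question
  }
}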
Nicholas, I am sorry if I am taking up too much of your time.
Regards,
Simran