If the Sentinel data connectors page does not include the source you need, you may still not need a custom connector. Review the following blog posts for additional sources that can be used with Sentinel without a custom connector:
If you still can't find your source in any of those, custom connectors are the solution.
The fundamental way to get custom data into your Sentinel workspace is the HTTP Data Collector API, which lets you write code to ingest any data into Sentinel. Importantly, this works not just for event data, but also for context and enrichment data such as threat intelligence and user or asset information.
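As a sketch of how the API works: each request is a POST of a JSON array of records, authenticated with an HMAC-SHA256 signature built from the workspace's shared key. The workspace ID, shared key, and log type below are placeholders; the example builds the request components without sending them.

```python
import base64
import datetime
import hashlib
import hmac
import json

def build_signature(workspace_id, shared_key, date, content_length):
    # Shared-key authentication: sign a canonical string with the
    # base64-decoded workspace key using HMAC-SHA256.
    string_to_hash = (
        f"POST\n{content_length}\napplication/json\n"
        f"x-ms-date:{date}\n/api/logs"
    )
    decoded_key = base64.b64decode(shared_key)
    digest = hmac.new(
        decoded_key, string_to_hash.encode("utf-8"), hashlib.sha256
    ).digest()
    return f"SharedKey {workspace_id}:{base64.b64encode(digest).decode()}"

def post_data(workspace_id, shared_key, log_type, records):
    body = json.dumps(records)
    rfc1123_date = datetime.datetime.utcnow().strftime(
        "%a, %d %b %Y %H:%M:%S GMT"
    )
    headers = {
        "Content-Type": "application/json",
        "Log-Type": log_type,  # records land in a custom table named <log_type>_CL
        "x-ms-date": rfc1123_date,
        "Authorization": build_signature(
            workspace_id, shared_key, rfc1123_date, len(body)
        ),
    }
    uri = (
        f"https://{workspace_id}.ods.opinsights.azure.com"
        "/api/logs?api-version=2016-04-01"
    )
    # To actually send: requests.post(uri, data=body, headers=headers)
    return uri, headers, body
```

The same signing logic works for event records and for enrichment data alike; only the `Log-Type` and the record shape change.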
Using Azure Functions to implement the API connector is especially valuable because it keeps the connector serverless. The example How Azure Functions and Log Analytics provided easy and universal app logging for LISA App can help you learn how to implement an API-based connector using Azure Functions.
Too much programming? The Upload-AzMonitorLog PowerShell script lets you stream events or context information to Sentinel from PowerShell. It uses the same API behind the scenes, but is much simpler to work with.
For example, this command will upload a CSV file to Sentinel:
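A hedged sketch of such a command, assuming a local testcsv.csv file and placeholder workspace credentials (the parameter names follow the script's documented usage, but check your copy of the script):

```powershell
# Illustrative only: pipe CSV rows into the script, which sends them
# to the workspace as a custom log table named MyNewCSV_CL.
Import-Csv .\testcsv.csv `
  | .\Upload-AzMonitorLog.ps1 `
      -WorkspaceId '<workspace id>' `
      -WorkspaceKey '<workspace key>' `
      -LogTypeName 'MyNewCSV'
```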
The script takes the following parameters:
Staying with PowerShell, the MaxPatrol connector example shows an alternative PowerShell implementation.
Another alternative is to use Logic Apps to get events or context data to Sentinel. To do that, build a playbook with the following elements:
There are many examples out there for doing so:
Note that while convenient, this method can be costly for large data volumes and should be used only for low-volume sources or for uploading context and enrichment data.
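As a sketch of what such a playbook boils down to: a trigger plus the Azure Log Analytics Data Collector connector's Send Data action. The simplified workflow-definition fragment below is illustrative only; the trigger, log type name, and body expression are assumptions for a generic hourly upload.

```json
{
  "triggers": {
    "Recurrence": {
      "type": "Recurrence",
      "recurrence": { "frequency": "Hour", "interval": 1 }
    }
  },
  "actions": {
    "Send_Data": {
      "type": "ApiConnection",
      "inputs": {
        "host": {
          "connection": {
            "name": "@parameters('$connections')['azureloganalyticsdatacollector']['connectionId']"
          }
        },
        "method": "post",
        "path": "/api/logs",
        "headers": { "Log-Type": "MyCustomLog" },
        "body": "@{triggerBody()}"
      }
    }
  }
}
```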
If you know Logstash, this might be your best bet. Logstash has an output plugin for Sentinel, which lets you use Logstash as a collector for Sentinel. You can then apply all your grok prowess, as well as any Logstash input plugin, to implement your connector. The drawback is that, unlike the other methods above, this one requires a VM and cannot be serverless.
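A minimal pipeline sketch, assuming the Azure Log Analytics output plugin and placeholder credentials (the input, grok pattern, and table name are illustrative):

```
input {
  # Any Logstash input plugin can feed the connector; syslog is just an example.
  syslog { port => 514 }
}
filter {
  # Parse at collection time with grok if desired.
  grok { match => { "message" => "%{COMMONAPACHELOG}" } }
}
output {
  microsoft-logstash-output-azure-loganalytics {
    workspace_id          => "<workspace id>"
    workspace_key         => "<workspace key>"
    custom_log_table_name => "MySyslog"
  }
}
```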
The API, and therefore all the other methods that utilize it, allows defining fields. Using this feature, you can parse the information as part of the connector, whether within Logstash, a Logic App, or your custom code.
However, Sentinel also supports parsing at query time, which offers much more flexibility and simplifies the import process. Query-time parsing lets you push data in its original format and parse it on demand, only when needed. Because the parser runs at query time, updating it applies to data that has already been ingested.
Query-time parsing reduces the overhead of creating a custom connector, since the exact structure of the data does not have to be known beforehand, nor do you need to identify up front which information is vital to extract. Parsing can be implemented at any stage, even ad hoc during an investigation to extract a specific piece of information, and it will apply to already ingested data.
JSON, XML, and CSV are especially convenient, as Sentinel has built-in parsing functions for them, as well as a UI tool to build a JSON parser, as described in the blog post Tip: Easily use JSON fields in Sentinel.
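For instance, if a custom log table holds a JSON payload in its RawData column, the built-in parse_json function extracts fields at query time. The table name (MyVendor_CL) and field names below are illustrative:

```kusto
MyVendor_CL
| extend d = parse_json(RawData)
| project TimeGenerated,
          User     = tostring(d.user),
          SourceIP = tostring(d.src_ip)
```

Because the extraction happens in the query, changing the projected fields later requires no re-ingestion.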
To make parsers easy to use and transparent to analysts, save them as functions and use them in place of Sentinel tables in any query, including hunting and detection queries. The blog post Using KQL functions to speed up analysis in Azure Sentinel describes how to do that.
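For example, a parsing query saved as a function named MyVendorEvents (an illustrative name) can be queried exactly like a table, so analysts never need to see the raw format:

```kusto
MyVendorEvents
| where SourceIP == "10.0.0.1"
| summarize count() by User, bin(TimeGenerated, 1h)
```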
The full documentation for Sentinel parsing can be found here.