How do I create a custom data table and is it necessary in this scenario?


Recently came across some documentation for pushing logs from an AWS S3 bucket to Sentinel using a Lambda function via the Log Analytics API. Looking at the documentation, it seems I would have to set up a custom data table, but nothing in the doc covers that. I'm also not entirely sure where this data will go once it's pushed from the S3 bucket. How would I do this, and is it necessary in this scenario? Link to the docs below.

https://github.com/Azure/Azure-Sentinel/tree/master/DataConnectors/S3-Lambda 

 

I am unable to use the AWS S3 Data Connector from the content hub, as the logs we're pushing (AWS WAF) are not supported by that connector.

It's the CustomLog value (see link) - _CL gets appended to it for you, e.g. if you want the table to be called test_CL, use 'test' as the value.

https://github.com/Azure/Azure-Sentinel/tree/master/DataConnectors/S3-Lambda#edit-the-script
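
For context, here is a minimal Python sketch of the Log Analytics HTTP Data Collector API call that the connector wraps (this is not the repo's own Lambda code, just an illustration of the API it uses). The point is the Log-Type header: its value ("AWSWAF" below is an assumed name) is what _CL gets appended to, so these records would land in a table called AWSWAF_CL. The workspace ID and key are placeholders.

```python
# Sketch: send records to Log Analytics via the HTTP Data Collector API.
# The Log-Type header value becomes the custom table name with _CL appended.
import base64
import hashlib
import hmac
import json
from datetime import datetime, timezone

import requests

WORKSPACE_ID = "<workspace-id>"          # Log Analytics workspace ID (placeholder)
SHARED_KEY = "<workspace-primary-key>"   # workspace primary key (placeholder)
LOG_TYPE = "AWSWAF"                      # assumed name -> table shows up as AWSWAF_CL


def build_signature(date: str, content_length: int) -> str:
    """Build the SharedKey authorization header the Data Collector API requires."""
    string_to_hash = (
        f"POST\n{content_length}\napplication/json\n"
        f"x-ms-date:{date}\n/api/logs"
    )
    decoded_key = base64.b64decode(SHARED_KEY)
    signature = base64.b64encode(
        hmac.new(decoded_key, string_to_hash.encode("utf-8"), hashlib.sha256).digest()
    ).decode("utf-8")
    return f"SharedKey {WORKSPACE_ID}:{signature}"


def post_records(records: list[dict]) -> int:
    """POST a batch of JSON records; the _CL table is created on first ingestion."""
    body = json.dumps(records)
    rfc1123_date = datetime.now(timezone.utc).strftime("%a, %d %b %Y %H:%M:%S GMT")
    headers = {
        "Content-Type": "application/json",
        "Authorization": build_signature(rfc1123_date, len(body)),
        "Log-Type": LOG_TYPE,  # this is the value the _CL suffix is appended to
        "x-ms-date": rfc1123_date,
    }
    uri = (
        f"https://{WORKSPACE_ID}.ods.opinsights.azure.com"
        "/api/logs?api-version=2016-04-01"
    )
    return requests.post(uri, data=body, headers=headers).status_code
```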

Hey Clive,

So when this script is run, it will create the table? I see now in the GitHub doc where it asks for the workspace info and a custom log name.

I'm just a little confused about where this data goes, where it's stored, and whether there's anything more I need to do than simply run this script to get these logs into Sentinel.
Correct - run the script to create the table. The data is stored in Log Analytics just like non-custom log data (Sentinel will retain it for 90 days, assuming you have set the workspace retention to 90 days or more).
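
If it helps, here's a quick sketch of confirming that records have landed in the new custom table once ingestion has run, using the azure-monitor-query Python SDK (the AWSWAF_CL table name is assumed). The same KQL works directly in the Sentinel / Log Analytics Logs blade.

```python
# Sketch: query the new custom table to confirm records are arriving.
# Requires azure-identity and azure-monitor-query; table name is an assumption.
from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

client = LogsQueryClient(DefaultAzureCredential())

response = client.query_workspace(
    workspace_id="<workspace-id>",        # placeholder workspace ID
    query="AWSWAF_CL | take 10",          # same KQL you would run in the Logs blade
    timespan=timedelta(days=1),
)

for table in response.tables:
    for row in table.rows:
        print(row)
```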
Is it possible to view this data in Log Analytics before it's stored in the custom data table the script will create?

@Porter76 

You need to ingest the data into the custom table to be able to query it. The custom table is where it's stored in Log Analytics.

 

You can check the schema before ingestion. 

https://learn.microsoft.com/en-us/azure/sentinel/data-source-schema-reference

@Clive_Watson 

 

After running the script, would I see that data table under "Custom Logs"?

 

[screenshot attached]

 

Thanks so much for all of the help.