Forum Discussion
Porter76
Aug 29, 2023 (Brass Contributor)
How do I create a custom data table and is it necessary in this scenario?
Recently came across some documentation for pushing logs from an AWS S3 bucket to Sentinel using a Lambda function via the Log Analytics API. Looking at the documentation, it looks like I would have to set up a custom data table, but there's nothing that covers that in the doc. I'm also not entirely sure where this data will go once it's pushed from the S3 bucket. How would I do this, and is it necessary in this scenario? Link to the docs below.
https://github.com/Azure/Azure-Sentinel/tree/master/DataConnectors/S3-Lambda
I am unable to use the AWS S3 data connector from the Content Hub, as the logs we're pushing (AWS WAF) are not supported by that connector.
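For context, this is roughly the flow I'm picturing: an S3 ObjectCreated event triggers a Lambda that reads the new WAF log object and forwards its records to Sentinel. Just a rough Python sketch of my side of it, not the repo's script; the place where the records get sent to the Log Analytics API is left as a comment:

```python
import gzip
import json
from urllib.parse import unquote_plus

import boto3

s3 = boto3.client("s3")


def lambda_handler(event, context):
    """Triggered by an S3 ObjectCreated notification for new WAF log objects."""
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = unquote_plus(record["s3"]["object"]["key"])

        body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()

        # WAF logs landing in S3 are often gzip-compressed; fall back to plain text.
        try:
            body = gzip.decompress(body)
        except OSError:
            pass

        # Each line in the object is one JSON log record.
        logs = [json.loads(line) for line in body.decode("utf-8").splitlines() if line]

        # This is where the records would be posted to the Log Analytics API.
        print(f"read {len(logs)} records from s3://{bucket}/{key}")
```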
8 Replies
- Clive_Watson (Bronze Contributor)
It's the CustomLog value (see the link): it will append _CL to this for you, e.g. if you want the table to be called test_CL, use "test" as the value.
https://github.com/Azure/Azure-Sentinel/tree/master/DataConnectors/S3-Lambda#edit-the-script
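If it helps, the table name comes from the Log-Type header on the Data Collector API call: whatever you put there shows up in the workspace with _CL appended. A minimal Python sketch of that call (workspace ID, key, and log type are placeholders; the repo's Lambda does the equivalent for you):

```python
import base64
import hashlib
import hmac
import json
from datetime import datetime, timezone

import requests

WORKSPACE_ID = "<workspace-id>"          # placeholder
SHARED_KEY = "<workspace-primary-key>"   # placeholder
LOG_TYPE = "AWSWAF"                      # data lands in a table named AWSWAF_CL


def build_signature(date, content_length):
    """SharedKey (HMAC-SHA256) authorization header for the Data Collector API."""
    string_to_sign = (
        f"POST\n{content_length}\napplication/json\nx-ms-date:{date}\n/api/logs"
    )
    digest = hmac.new(
        base64.b64decode(SHARED_KEY), string_to_sign.encode("utf-8"), hashlib.sha256
    ).digest()
    return f"SharedKey {WORKSPACE_ID}:{base64.b64encode(digest).decode()}"


def post_logs(records):
    """POST a batch of JSON records; the Log-Type header decides the table name."""
    body = json.dumps(records).encode("utf-8")
    date = datetime.now(timezone.utc).strftime("%a, %d %b %Y %H:%M:%S GMT")
    headers = {
        "Content-Type": "application/json",
        "Authorization": build_signature(date, len(body)),
        "Log-Type": LOG_TYPE,
        "x-ms-date": date,
    }
    uri = (
        f"https://{WORKSPACE_ID}.ods.opinsights.azure.com"
        "/api/logs?api-version=2016-04-01"
    )
    requests.post(uri, data=body, headers=headers).raise_for_status()
```

Once the first batch is ingested, the table (your chosen name plus _CL) appears under Custom Logs in the workspace.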
- Porter76 (Brass Contributor)
Hey Clive,
So when this script is run, it will create the table? I see now in the GitHub doc where it asks for the workspace info and a custom log name.
I'm just a little confused about where this data goes, where it's stored, and whether there's anything more I need to do than simply run this script to get these logs into Sentinel.
- Clive_Watson (Bronze Contributor)
Correct, run the script to create the table. The data is stored in Log Analytics just like non-custom log data (Sentinel will store this for 90 days, assuming you have set the workspace retention to 90 days or more).