02-11-2019 05:35 AM
Our data was pushed to Azure Blob Storage via the shoebox pipeline in the form of JSON/JSON Lines files.
We plan to rely on Log Analytics as our monitoring/reporting solution, so we would like to migrate the historical data stored in blob storage into Log Analytics.
What is the best way to migrate the data from blob storage to Log Analytics?
I came across the Data Collector API for ingesting data into LA. My initial thought was to build a custom tool that fetches data from the blob container and ingests it into LA using this API.
I am wondering whether there is an existing tool that achieves this migration. If not, what is the best way to migrate the data from blob storage to LA?
Any pointers would be great.
02-12-2019 10:21 PM (Solution)
There is no out-of-the-box functionality for this, so you will have to ingest the data by building your own solution. You can use an automation service such as Azure Automation, Logic Apps, or Azure Functions, and from it call the data ingestion API you mentioned.

One important thing to remember about the data ingestion API and Log Analytics is retention. By default, retention is 31 days, and you can increase it to up to two years. Retention purges any data older than the configured period, and the age of a record is determined by its TimeGenerated column.

When you import records through the data ingestion API, you have to set the TimeGenerated field for each record yourself; otherwise, the date of ingestion is written to TimeGenerated. If you import historical logs with TimeGenerated set to the ingestion time, you will end up with a bunch of records carrying the wrong time period, and it will be hard to make sense of that data.

When you set TimeGenerated, also remember that the date must fall within the retention boundaries. If it falls outside them, the API will report a successful ingestion, but you will never see the records, because they are purged immediately.
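To make the TimeGenerated point concrete, here is a minimal Python sketch of posting a batch of records to the HTTP Data Collector API with the original event time preserved. The workspace ID, shared key, log type, and the `EventTime` field name are placeholder assumptions; the SharedKey signing scheme, headers, and endpoint follow the documented Data Collector API contract.

```python
# Sketch: push JSON records exported from blob storage into Log Analytics
# via the HTTP Data Collector API, preserving the original event time.
# WORKSPACE_ID / SHARED_KEY / LOG_TYPE below are placeholders, not real values.
import base64
import datetime
import hashlib
import hmac
import json
import urllib.request

WORKSPACE_ID = "00000000-0000-0000-0000-000000000000"  # your workspace ID
SHARED_KEY = base64.b64encode(b"dummy-key").decode()   # workspace primary key
LOG_TYPE = "MigratedBlobData"  # LA will surface this as MigratedBlobData_CL

def build_signature(workspace_id, shared_key, date, content_length,
                    method="POST", content_type="application/json",
                    resource="/api/logs"):
    """Build the SharedKey Authorization header for the Data Collector API."""
    string_to_hash = (f"{method}\n{content_length}\n{content_type}\n"
                      f"x-ms-date:{date}\n{resource}")
    decoded_key = base64.b64decode(shared_key)
    hashed = hmac.new(decoded_key, string_to_hash.encode("utf-8"),
                      digestmod=hashlib.sha256).digest()
    return f"SharedKey {workspace_id}:{base64.b64encode(hashed).decode()}"

def post_records(records):
    """POST a list of dicts to the workspace; returns the HTTP response."""
    body = json.dumps(records).encode("utf-8")
    rfc1123 = datetime.datetime.utcnow().strftime("%a, %d %b %Y %H:%M:%S GMT")
    headers = {
        "Content-Type": "application/json",
        "Authorization": build_signature(WORKSPACE_ID, SHARED_KEY,
                                         rfc1123, len(body)),
        "Log-Type": LOG_TYPE,
        "x-ms-date": rfc1123,
        # Tell LA which JSON field holds the original event time. Without
        # this header, TimeGenerated is set to the ingestion time instead.
        "time-generated-field": "EventTime",
    }
    url = (f"https://{WORKSPACE_ID}.ods.opinsights.azure.com"
           f"/api/logs?api-version=2016-04-01")
    req = urllib.request.Request(url, data=body, headers=headers,
                                 method="POST")
    return urllib.request.urlopen(req)  # 200 OK on success

# Each record carries its original timestamp (ISO 8601) so TimeGenerated
# is set correctly; the field name must match the time-generated-field header.
records = [{"EventTime": "2019-01-15T08:30:00Z",
            "Level": "Info",
            "Message": "historic event from blob export"}]
```

Note that `post_records(records)` would only succeed against a real workspace, and, per the answer above, any `EventTime` older than the configured retention period is silently purged even though the API returns success.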