SOLVED

Migrating data from blob to Log analytics


Hi,

Our data was pushed to Azure Blob storage via the shoebox pipeline in the form of JSON / JSON Lines files.
We now plan to rely on Log Analytics for our monitoring/reporting solution.

We would like to migrate the historical data stored in blob storage to Log Analytics.
What is the best way to migrate the data from blob storage to Log Analytics?
I came across the Data Collector API for ingesting data into LA. My initial thought was to build a custom tool that fetches data from the blob container and ingests it into LA using this API.
I am wondering whether there is an existing tool to achieve this migration. If not, what is the best way to migrate the data from blob to LA?

Any pointers would be great.


1 Reply
best response confirmed by Stanislav Zhelyazkov (MVP)
Solution

Hi,

There is no out-of-the-box functionality that can do that, so you will have to ingest the data by building your own solution. You can use automation tools like Azure Automation, Logic Apps or Azure Functions, and from those tools call the data ingestion (Data Collector) API you mentioned.

One important thing to remember about the data ingestion API and Log Analytics is retention. By default retention is 31 days, and you can increase it up to two years. Retention purges any data that is older than the set retention period, and the age of a record is determined by its TimeGenerated column.

When you import records with the data ingestion API you will have to set the TimeGenerated column for your records; otherwise the date when you imported the log is put in the TimeGenerated column for that record. If you import the logs with TimeGenerated set to the ingestion date, you will end up with a bunch of records in the wrong time period and it will be hard to make sense of that data. When you set TimeGenerated, remember that the date must be within the boundaries of the retention period. If it is outside those boundaries, the API will report successful ingestion but you will not see the logs, as they will be purged immediately.
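Since you will be calling the Data Collector API yourself, here is a minimal Python sketch of what the ingestion step could look like. The workspace ID, shared key, custom log type name and the EventTime field are placeholders/assumptions you would replace with your own values; the important part is the time-generated-field header, which maps your original timestamps into TimeGenerated.

```python
# Minimal sketch: push JSON records to Log Analytics via the HTTP Data Collector API.
# WORKSPACE_ID, SHARED_KEY, LOG_TYPE and TIME_FIELD are placeholder values.
import base64
import datetime
import hashlib
import hmac
import json

import requests

WORKSPACE_ID = "<workspace-id>"               # Log Analytics workspace ID
SHARED_KEY = "<primary-or-secondary-key>"     # workspace shared key
LOG_TYPE = "BlobArchive"                      # custom table will show up as BlobArchive_CL
TIME_FIELD = "EventTime"                      # hypothetical field holding the original timestamp

def build_signature(date, content_length, method, content_type, resource):
    """Build the SharedKey authorization header required by the Data Collector API."""
    string_to_hash = (f"{method}\n{content_length}\n{content_type}\n"
                      f"x-ms-date:{date}\n{resource}")
    decoded_key = base64.b64decode(SHARED_KEY)
    hashed = hmac.new(decoded_key, string_to_hash.encode("utf-8"), hashlib.sha256)
    signature = base64.b64encode(hashed.digest()).decode("utf-8")
    return f"SharedKey {WORKSPACE_ID}:{signature}"

def post_records(records):
    """POST a batch of records; time-generated-field maps EventTime onto TimeGenerated."""
    body = json.dumps(records)
    rfc1123_date = datetime.datetime.utcnow().strftime("%a, %d %b %Y %H:%M:%S GMT")
    content_type = "application/json"
    resource = "/api/logs"
    signature = build_signature(rfc1123_date, len(body), "POST", content_type, resource)
    uri = (f"https://{WORKSPACE_ID}.ods.opinsights.azure.com{resource}"
           "?api-version=2016-04-01")
    headers = {
        "Content-Type": content_type,
        "Authorization": signature,
        "Log-Type": LOG_TYPE,
        "x-ms-date": rfc1123_date,
        "time-generated-field": TIME_FIELD,   # must fall inside the retention window
    }
    response = requests.post(uri, data=body, headers=headers)
    response.raise_for_status()

# Example: two historical records read from blob; EventTime becomes TimeGenerated.
post_records([
    {"EventTime": "2019-05-01T10:15:00Z", "Operation": "Upload", "DurationMs": 120},
    {"EventTime": "2019-05-02T08:30:00Z", "Operation": "Delete", "DurationMs": 45},
])
```

The same logic can run inside an Azure Function or Automation runbook that loops over your blobs, parses the JSON Lines content and posts it in batches.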
