Configuring the archive period for tables at scale for data retention within Log Analytics Workspace
Published May 23 2024 11:23 AM
Microsoft
  1. Efficient Data Management: This article's primary focus, applying archival settings to multiple tables in bulk within a Log Analytics Workspace, streamlines the management of a diverse range of log data. This efficiency is invaluable for organizations dealing with large volumes of logs from various sources, as it simplifies the management of data retention policies and significantly reduces administrative overhead.
  2. Cost and Complexity Optimization: By leveraging the Log Analytics Workspace itself for archival, organizations can balance cost-effective storage against data accessibility. This approach eliminates the need for more complex and potentially costly alternatives such as Blob Storage or Azure Data Explorer (ADX), reducing both operational complexity and storage expense. It provides a practical solution for long-term data retention while optimizing both cost and management effort.

 

VipulDabhi_14-1713507475812.png

 

VipulDabhi_15-1713507590698.png

 

Consider replicating the above configuration across multiple tables using the PowerShell commands below.

 

VipulDabhi_0-1713506234763.png

 

Step 2: Export the KQL Result-Set:

Export the table list to CSV using the export functionality:

VipulDabhi_1-1713506302289.png

 

VipulDabhi_2-1713506340167.png
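As an alternative to exporting the KQL result set from the portal, the table list can be generated directly with PowerShell. This is a sketch, assuming the Az.OperationalInsights module is installed, `Connect-AzAccount` has been run, and the resource group and workspace names (taken from this article's lab) are replaced with your own:

```powershell
# Sketch: build the CSV of table names directly with PowerShell,
# instead of exporting the KQL result set from the portal.
Get-AzOperationalInsightsTable -ResourceGroupName 'sentineltraining' `
                               -WorkspaceName 'sentineltrainingworkspace' |
    Select-Object @{Name = 'Table'; Expression = { $_.Name }} |
    Export-Csv -Path 'SentinelTable.csv' -NoTypeInformation
```

Because this writes the column header as "Table" directly, it would also let you skip the column and file renaming in the Excel steps below.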

 

Step 4: Rename the "$table" column to "Table":

VipulDabhi_4-1713506390718.png

Rename $table to Table as highlighted:

VipulDabhi_5-1713506424164.png

 

Step 5: Rename the Excel file as well:

VipulDabhi_6-1713506460995.png

 

From "query_data" to "Sentinel", as shown:

VipulDabhi_7-1713506479456.png
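Steps 4 and 5 can also be performed in a single PowerShell pipeline rather than in Excel. This is a sketch; the input file name "query_data.csv" is an assumption based on the screenshots above:

```powershell
# Sketch: rename the '$table' column to 'Table' and save the result
# under the new file name, in one pipeline.
Import-Csv 'query_data.csv' |
    Select-Object @{Name = 'Table'; Expression = { $_.'$table' }} |
    Export-Csv 'SentinelTable.csv' -NoTypeInformation
```

Note that the column name `$table` must be quoted, since an unquoted `$table` would be interpreted as a PowerShell variable.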

 

Step 6: Open Cloud Shell in the Azure portal and upload the new file: 

VipulDabhi_8-1713506513349.png

 

Upload the file from your local machine:

VipulDabhi_9-1713506523030.png

 

Step 7: Verify the uploaded file using the "ls" command:

VipulDabhi_10-1713506586799.png

 

 

Step 8: Once the file upload completes, run the following PowerShell command in Cloud Shell:

 

Import-Csv "SentinelTable.csv" | ForEach-Object {Update-AzOperationalInsightsTable -ResourceGroupName sentineltraining -WorkspaceName sentineltrainingworkspace -TableName $_.Table -TotalRetentionInDays 2556}

 

Before running the command, ensure you update:

* -TotalRetentionInDays, as required for your scenario.

* The resource group name and Log Analytics workspace name, respectively.
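The same command can be written with the values pulled out into variables, which makes those updates easier to spot. This is a sketch; the names shown are this article's lab values and should be replaced with your own:

```powershell
$resourceGroup = 'sentineltraining'          # your resource group
$workspace     = 'sentineltrainingworkspace' # your Log Analytics workspace
$totalDays     = 2556                        # total retention (interactive + archive), in days

# Apply the total retention period to every table listed in the CSV.
Import-Csv 'SentinelTable.csv' | ForEach-Object {
    Update-AzOperationalInsightsTable -ResourceGroupName $resourceGroup `
                                      -WorkspaceName $workspace `
                                      -TableName $_.Table `
                                      -TotalRetentionInDays $totalDays
}
```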

 

VipulDabhi_11-1713506634262.png

 

Step 9: Check the archival settings of the Log Analytics tables:

VipulDabhi_12-1713506836146.png
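The result can also be verified from Cloud Shell instead of the portal. As a sketch, assuming the same lab resource group and workspace names as above, listing each table's interactive and total retention might look like:

```powershell
# Sketch: confirm the change by listing retention per table.
# TotalRetentionInDays should now reflect the value applied in Step 8.
Get-AzOperationalInsightsTable -ResourceGroupName 'sentineltraining' `
                               -WorkspaceName 'sentineltrainingworkspace' |
    Select-Object Name, RetentionInDays, TotalRetentionInDays |
    Format-Table -AutoSize
```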

 

Step 10: The exported tables now have the updated archival period, while the remaining tables keep the default retention per the Log Analytics settings:

Navigation: Log Analytics Workspace > Settings > Tables > Archive Period.

 

VipulDabhi_13-1713506910833.png

 

 

Conclusion:

 

1. This blog covers the default, table-level approach to archival for long-term storage within a Log Analytics workspace.

2. It then covers the steps to scale that archival across multiple tables, which is a key production requirement.

3. All the steps can be implemented in a lab environment, and the archive period can be observed in the Tables blade of the Log Analytics workspace.
