Leveraging Storage Analytics Logs to analyze who accessed the Storage Account
Published Mar 01 2021 01:35 AM


You want to know who accessed, or is accessing, your storage account: perhaps someone created, deleted, or modified some blobs or containers within it. This blog talks about how you can leverage storage logs to troubleshoot such scenarios.



An important source of information for troubleshooting such scenarios is the Storage Analytics log, as it keeps track of the data-plane operations happening on the storage account. Note that there is billing associated with this logging as well.


If storage analytics is enabled, you can leverage the options below, based on the logging format:

  • If the logging format is 2.0 and you are using the OAuth mechanism for authentication and authorization, there is a field UserObjectId. It denotes the object ID used for authentication, which may be any security principal: a user, managed identity, or service principal.
  • If the logging format is 2.0 but you are not using OAuth and are instead authenticating via a SAS token or access keys, you can rely on the fields requester-ip-address and user-agent-header.
  • If the logging format is 1.0, you can only rely on the fields requester-ip-address and user-agent-header.
  • The requester-ip-address field provides the IP address of the requester, including the port number.
  • The user-agent-header field provides user agent details such as the browser, the SDK, etc.
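To see what these fields look like in practice, here is a minimal Python sketch for splitting a single log entry into named fields. The field list below is an abridged, illustrative subset, and the sample line is made up; verify the authoritative field order against the Storage Analytics log format documentation before relying on exact positions.

```python
import csv
import io

# Abridged, illustrative field order -- verify against the Storage
# Analytics log format documentation before relying on exact positions.
FIELDS = [
    "version-number",
    "request-start-time",
    "operation-type",
    "request-status",
    "http-status-code",
]

def parse_log_line(line: str) -> dict:
    """Split one semicolon-delimited log entry into named fields.

    Quoted values (e.g. the request URL) may themselves contain
    semicolons, so we use csv.reader rather than a plain str.split(";").
    """
    values = next(csv.reader(io.StringIO(line), delimiter=";"))
    parsed = dict(zip(FIELDS, values))
    parsed["extra-fields"] = values[len(FIELDS):]  # remaining columns
    return parsed

# Synthetic sample entry, for illustration only:
sample = '1.0;2021-02-28T23:00:01.0000000Z;DeleteBlob;Success;202;"a;b"'
entry = parse_log_line(sample)
print(entry["operation-type"])  # DeleteBlob
```

The same approach extends to the full field list: once every column has a name, picking out requester-ip-address or user-agent-header is a dictionary lookup.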


Let us take a look at the details and the steps you can follow:


  1. Enable Storage Analytics logging, if not enabled already.
  • You first need to enable Storage Analytics logging. You can either enable the classic one or work with storage resource logs. This blog leans towards the classic one; however, the other logs can be leveraged in a similar way.
  • You can set which types of operations you want to log, such as Read, Write, etc. Also take note of the logging version, since the available fields determine which operations you can track. If you opt for version 2.0, it has extra fields for authentication with Azure AD: for the blob service, the UserObjectId field stores the Object ID of the user, group, or service principal that made the call to storage.



  • Once you save the settings, a container named $logs will be created inside the same storage account. It might take some time for logs to appear after the operations are made, and the timestamps are stored in UTC.



NOTE: If logging isn't enabled, you won't be able to backtrack much. You can also consider the above step a prerequisite for the analysis.
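Logging is usually enabled through the portal, but for automation the same settings can be pushed via the Set Blob Service Properties REST operation. As a sketch, the snippet below only builds the XML request body with the standard library; actually sending it (with authentication headers) is left out, and the default parameter values are illustrative choices, not recommendations.

```python
import xml.etree.ElementTree as ET

def build_logging_properties(version: str = "2.0",
                             read: bool = True,
                             write: bool = True,
                             delete: bool = True,
                             retention_days: int = 7) -> bytes:
    """Build the <StorageServiceProperties> body used by the
    Set Blob Service Properties REST operation to turn on
    classic Storage Analytics logging."""
    root = ET.Element("StorageServiceProperties")
    logging = ET.SubElement(root, "Logging")
    ET.SubElement(logging, "Version").text = version
    ET.SubElement(logging, "Delete").text = str(delete).lower()
    ET.SubElement(logging, "Read").text = str(read).lower()
    ET.SubElement(logging, "Write").text = str(write).lower()
    policy = ET.SubElement(logging, "RetentionPolicy")
    ET.SubElement(policy, "Enabled").text = "true"
    ET.SubElement(policy, "Days").text = str(retention_days)
    return ET.tostring(root, encoding="utf-8", xml_declaration=True)

body = build_logging_properties()
print(body.decode())
```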


  2. Parsing the logs generated
  • The logs are created in .log format.
  • You can view the log data using the Azure Portal, a storage explorer such as Microsoft Azure Storage Explorer, or programmatically using the storage client library or PowerShell.
  • Once these have been downloaded, you can make use of the two links below to parse them for further analysis. The first one converts the log file into CSV format, while the second one is a standalone utility. You can leverage either, as per your convenience.
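If you'd rather not use a separate utility, converting a semicolon-delimited .log file into CSV takes only a few lines of Python. This is a sketch; the file names in the example comment are placeholders.

```python
import csv

def log_to_csv(log_path: str, csv_path: str) -> int:
    """Convert a semicolon-delimited $logs file into a CSV file.

    Returns the number of entries written. Quoted values inside
    the log (e.g. request URLs) are handled by csv.reader.
    """
    rows = 0
    with open(log_path, newline="", encoding="utf-8") as src, \
         open(csv_path, "w", newline="", encoding="utf-8") as dst:
        writer = csv.writer(dst)
        for row in csv.reader(src, delimiter=";"):
            writer.writerow(row)
            rows += 1
    return rows

# Example usage (paths are placeholders):
# log_to_csv("000001.log", "000001.csv")
```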




  3. Analyzing the logs
  • We will take a common scenario: someone deleted a blob and you want to track it back.
  • We will first download the logs and then parse them for further analysis.
  • A question arises here: how do you find the time when the operation was performed? For that, you can leverage the Metrics option under the Monitoring pane.
  • You can select the metric "Transactions" and then make use of the 'Apply Splitting' option to split the result based on the available parameters. You will see the count of the various operations happening on the storage account. Hover over the API you are interested in, and it will highlight the times on the graph when that particular operation was called.



  • There is another level of splitting that you can apply, based on the authentication type. This helps segregate calls by how they authenticated: account keys, SAS, OAuth, etc.
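The same splitting can be reproduced offline from the parsed logs. Below is a sketch assuming each parsed entry is a dict keyed by field name (the operation-type and authentication-type names and the sample entries are illustrative).

```python
from collections import Counter

def split_transactions(entries):
    """Count transactions per (operation, authentication type),
    mirroring the metrics 'Apply Splitting' view."""
    return Counter(
        (e["operation-type"], e["authentication-type"]) for e in entries
    )

# Illustrative entries -- in practice these come from the parsed log file.
entries = [
    {"operation-type": "DeleteBlob", "authentication-type": "sas"},
    {"operation-type": "DeleteBlob", "authentication-type": "oauth"},
    {"operation-type": "GetBlob",    "authentication-type": "sas"},
    {"operation-type": "DeleteBlob", "authentication-type": "sas"},
]
counts = split_transactions(entries)
print(counts[("DeleteBlob", "sas")])  # 2
```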



  • For this example, we have parsed the logs into CSV format, which gives us various kinds of information regarding the calls that were made to storage. Let's take a look at two different Delete Blob requests and check some of the fields of interest.


Log Version: The log format version, i.e. either 1.0 or 2.0.

Transaction Start Time: The time when the transaction was initiated.

REST Operation Type: The type of operation that was performed, such as Read, List, Delete, etc.

Authentication Type: The authentication mechanism, such as SAS, OAuth, etc.

Request URL: The request URL for the operation; it can give an idea of the file name involved.

Request ID: The Storage request ID.

Client IP: The IP address from which the request hit the storage account.

User Agent: The user agent details of the client application, e.g. Storage Explorer in the above example.

User Object ID: This field is empty in the first snippet because the authentication happened via SAS, whereas in the second one you get to see the User Object ID and the principal name as well. You can then look up the User Object ID in Azure AD via the portal to check whether it belongs to a single user, a group, or a service principal.
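Putting these fields together, here is a sketch that filters the parsed CSV for Delete Blob entries and pulls out the fields of interest. The column names are assumptions based on a header row; adjust them to whatever your parsing utility actually emitted.

```python
import csv

# Column names are assumptions -- adjust to the header row your
# parsing utility emitted.
INTERESTING = ["request-start-time", "operation-type",
               "authentication-type", "requester-ip-address",
               "user-agent-header", "user-object-id"]

def find_deletes(csv_path: str):
    """Yield the fields of interest for every DeleteBlob entry."""
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            if row.get("operation-type") == "DeleteBlob":
                yield {k: row.get(k, "") for k in INTERESTING}

# Example usage (path is a placeholder):
# for hit in find_deletes("logs.csv"):
#     print(hit["request-start-time"], hit["requester-ip-address"])
```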


  • Similarly, based on the available formats and operations, you can track further activities happening on your Storage Account.
  • If you are observing heavy transaction volume on your storage account, this procedure can help you analyze the trend as well.

For more information, please follow the links below:

Storage Analytics Logging Format, available fields and description

Storage Analytics Logged Operations


Hope this helps!


Version history
Last update:
‎Feb 28 2021 11:42 PM