Lesson Learned #382: Loading multiple Audit Log files using sys.fn_get_audit_file

In Azure SQL Database, the auditing feature enables you to track and monitor database activities, providing valuable insights into the actions performed on your database. One of the key components of auditing is the audit log files, which store the recorded data.

 

However, when dealing with a large number of audit log files stored in a blob storage container, loading them into Azure SQL Database can be a challenging task.

 

This article explores a workaround that uses the sys.fn_get_audit_file function to load multiple audit log files when a file pattern such as *.xel cannot be specified.
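
For reference, when pointed at a single audit file in blob storage, the function reads just that file. The following is a minimal sketch; the blob URL is a placeholder, not a real file:

-- Minimal sketch: read one audit file directly from blob storage
-- (the URL below is a placeholder)
SELECT event_time, action_id, succeeded, statement, server_principal_name, database_name
FROM sys.fn_get_audit_file('https://storageaccountname.blob.core.windows.net/sqldbauditlogs/<path-to-file>.xel', NULL, NULL);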

 

I would like to share the following PowerShell script to automate this process. The script imports the audit log files with the extension "*.xel" from an Azure Storage account into an Azure SQL Database. Let's go through the script and explain each section in detail:

 

1. Importing the Required Module:
The script begins by importing the "Az.Storage" module, which provides cmdlets for managing Azure Storage resources.

 

2. Setting the Connection Variables:
The next step is to set various connection variables required for connecting to the Azure Storage Account and the SQL Server database. These variables include the storage account name, storage account key, container name, blob name, resource group name, and subscription ID.

 

3. Establishing Azure Subscription Context:
The script signs in to Azure with the Connect-AzAccount cmdlet, using device authentication and the specified subscription ID, and then creates a storage context with New-AzStorageContext from the storage account name and key.

4. Obtaining the Connection String:
The script defines the connection string for the Azure SQL Database, specifying the server name, database name, and credentials (user ID and password).

 

5. Establishing the Database Connection:
Using the obtained connection string, the script establishes a connection to the SQL Server database using the System.Data.SqlClient.SqlConnection class.

 

6. Executing SQL Commands:
Once the database connection is established, the script prepares the SQL commands. First, it drops the table named "_xyz_log" if it already exists. The table is then recreated during the first loop iteration from the result of the "sys.fn_get_audit_file" function for the first file path ("$filePath"); the underlying T-SQL pattern is summarized after this walkthrough.

 

7. Iterating Over Audit Log Blobs:
The script loops through each audit log blob retrieved from the Azure Storage Account using the "Get-AzStorageBlob" cmdlet. It filters the blobs based on the specified extension filter ("$extensionFilter"), which in this case is "*.xel".

 

8. Processing Each Blob:
Within the loop, the script extracts the file path of the current blob and writes some output, such as the full path and size of the file. It then loads the content of the audit log file into the "_xyz_log" table using the "sys.fn_get_audit_file" function, with a SELECT ... INTO for the first file and an INSERT ... SELECT for every subsequent file.

 

9. Error Handling:
If an error occurs during the insertion process, the script catches the exception and outputs an error message indicating the failed file and the specific error message.

 

10. Closing the Database Connection:
Finally, the script closes the connection to the database by calling the Close method on the SqlConnection object.
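
On the database side, the commands that the script issues in steps 6 and 8 boil down to the following T-SQL pattern (the file URLs are placeholders):

DROP TABLE IF EXISTS [_xyz_log];

-- First file: SELECT ... INTO creates the staging table with the schema returned by sys.fn_get_audit_file
SELECT * INTO [_xyz_log]
FROM sys.fn_get_audit_file('<URL of the first .xel blob>', NULL, NULL);

-- Every subsequent file: append to the staging table
INSERT INTO [_xyz_log]
SELECT * FROM sys.fn_get_audit_file('<URL of the next .xel blob>', NULL, NULL);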

 

 

# Import the required module
Import-Module Az.Storage

# Set the connection variables
$storageAccountName = "StorageAccountName"
$storageAccountKey = "MES7cU....0WFCfxGxe7cnU1vTxsMvQ=="
$containerName = "sqldbauditlogs"
$blobName = "BlobName"
$resourceGroupName = "ResourceGroupName"
$subscriptionId = "9xxxx-7325-xxxx-bbe2-XYZ"
$extensionFilter = "*.xel"
$bFirstTime = $true

Connect-AzAccount -SubscriptionId $subscriptionId -UseDeviceAuthentication

$storageAccount = Get-AzStorageAccount -ResourceGroupName $resourceGroupName -Name $storageAccountName
$context = New-AzStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $storageAccountKey

$blobs = Get-AzStorageBlob -Container $containerName -Context $context | Where-Object { $_.Name -like $extensionFilter }

$connectionString = "Server=tcp:servername.database.windows.net,1433;Database=dbname;User ID=UserName;Password=Password"

$sqlConnection = New-Object -TypeName System.Data.SqlClient.SqlConnection($connectionString)
$sqlConnection.Open()
$sqlCommand = $sqlConnection.CreateCommand()

# Drop the staging table if it already exists
$sqlCommand.CommandText = "DROP TABLE IF EXISTS [_xyz_log]"
$sqlCommand.ExecuteNonQuery()

foreach ($blob in $blobs) {
    $filePath = $blob.ICloudBlob.Uri.AbsoluteUri
    Write-Output "File: $filePath"
    Write-Output "Size: $($blob.Length) bytes"
    Write-Output "-------------------------------------"

    try {
        if ($bFirstTime) {
            # First file: create the staging table with SELECT ... INTO
            $sqlCommand.CommandText = "SELECT * INTO [_xyz_log] FROM sys.fn_get_audit_file('$filePath', NULL, NULL)"
            $sqlCommand.CommandTimeout = 3600
            $sqlCommand.ExecuteNonQuery()
            $bFirstTime = $false
        }
        else {
            # Subsequent files: append to the staging table
            $sqlCommand.CommandText = "INSERT INTO [_xyz_log] SELECT * FROM sys.fn_get_audit_file('$filePath', NULL, NULL)"
            $sqlCommand.CommandTimeout = 3600
            $sqlCommand.ExecuteNonQuery()
        }
        Write-Output "File loaded successfully: $filePath"
    }
    catch {
        $errorMessage = $_.Exception.Message
        Write-Output "Load failed: $filePath"
        Write-Output "Error Msg: $errorMessage"
    }
}

$sqlConnection.Close()
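
Once the script finishes, all the audit records are available in the staging table and can be queried with regular T-SQL. As a quick sanity check (assuming the table name used above), the file_name column returned by sys.fn_get_audit_file shows how many records were loaded from each file:

SELECT file_name, COUNT(*) AS audit_records
FROM [_xyz_log]
GROUP BY file_name
ORDER BY file_name;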

 

 

Additionally, to read the audit files, remember to create a DATABASE SCOPED CREDENTIAL first for the storage account container that you are going to read from:

 

CREATE DATABASE SCOPED CREDENTIAL 
[https://storageaccountname.blob.core.windows.net/sqldbauditlogs] 
WITH IDENTITY = 'SHARED ACCESS SIGNATURE', 
SECRET = 'sv=2022-11-02&ss=bfqt&srt=sco&sp=rwdlacupiyx&se=2023-06-26T14:00:54Z&st=2023-06-26T06:00:54Z&spr=https&sig=1l.....Xhf%2Fv.....gk%3D'
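
If the credential is missing or its name does not match the container URL, sys.fn_get_audit_file will not be able to read the blobs. To check which database scoped credentials exist and which container each one is bound to, you can query the sys.database_scoped_credentials catalog view:

SELECT name, credential_identity, create_date
FROM sys.database_scoped_credentials;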
