
Azure Database Support Blog

Lesson Learned #369: Size NNN from table 'XYZ' cannot fit within the specified batch size of 24576KB

Jose Manuel Jurado
Jun 12, 2023

Today we worked on a new case where our client encountered the following error message when synchronizing tables using Data Sync: "The row of size 42530KB from table 'dbo.XYZ' cannot fit within the specified batch size of 24576KB". Below, I would like to share a possible workaround: moving the oversized data to blob storage, which also reduces the size of the database and its backups.

 

This error means that a row exceeds the 24 MB data-row limit of Data Sync documented here: What is SQL Data Sync for Azure? - Azure SQL Database | Microsoft Learn. It is important to note that the 24 MB limit covers the total size of all columns within a row, including both variable-length and fixed-length columns.
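Before moving anything, it can help to identify which rows are over the limit. The following is a minimal sketch, assuming a hypothetical table dbo.XYZ with a primary key column Id and one dominant large column ColumnName; replace these placeholders and the connection details with your own:

```powershell
# Hypothetical query to find rows whose large column exceeds the 24 MB (24576 KB) Data Sync limit.
# Table, key, and column names below are placeholders - adjust them to your schema.
$connectionString = "Server=tcp:servername.database.windows.net,1433;Initial Catalog=dbName;User ID=UserName;Password=Password;Encrypt=True"

$query = @"
SELECT Id, DATALENGTH(ColumnName) / 1024 AS SizeKB
FROM dbo.XYZ
WHERE DATALENGTH(ColumnName) > 24576 * 1024
ORDER BY SizeKB DESC;
"@

$connection = New-Object System.Data.SqlClient.SqlConnection($connectionString)
$connection.Open()
$command = New-Object System.Data.SqlClient.SqlCommand($query, $connection)
$reader = $command.ExecuteReader()
while ($reader.Read()) {
    Write-Host ("Row Id {0} uses {1} KB" -f $reader["Id"], $reader["SizeKB"])
}
$reader.Close()
$connection.Close()
```

Note that DATALENGTH measures bytes per column, not the full row, so this is an approximation that works well when a single column accounts for most of the row size.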

 

In such situations, an alternative we have is to save this content in a blob storage file. This approach reduces the size of the database, decreases the backup size, and improves data transfer between the application and the database.

 

The following PowerShell script demonstrates how to connect to an Azure SQL database, retrieve data from a specified table, and upload that data to a Blob Storage container:

 

# Install the required module if it's not already installed
if (-not (Get-Module -Name Az.Storage -ListAvailable)) {
    Install-Module -Name Az.Storage -Force
}

# Set the Azure SQL database connection details
$serverName = "servername.database.windows.net"
$databaseName = "dbName"
$tableName = "TableName"
$columnName = "ColumnName"

# Set the Blob Storage account details
$storageAccountName = "AccountName"
$storageAccountKey = "I7hu3uIIo..=="
$containerName = "ContainerName"

# Connect to the Azure SQL Database
$connectionString = "Server=tcp:$serverName,1433;Initial Catalog=$databaseName;Persist Security Info=False;Encrypt=True;TrustServerCertificate=False;Connection Timeout=30;User ID=UserName;Password=Password"
$connection = New-Object System.Data.SqlClient.SqlConnection($connectionString)
$connection.Open()

# Retrieve data from the table
$query = "SELECT $columnName FROM $tableName"
$command = New-Object System.Data.SqlClient.SqlCommand($query, $connection)
$dataReader = $command.ExecuteReader()

# Create the Blob Storage container
$context = New-AzStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $storageAccountKey

$container = Get-AzStorageContainer -Name $containerName -Context $context -ErrorAction SilentlyContinue

if (-not $container) {
    $container = New-AzStorageContainer -Name $containerName -Context $context -Permission Off
    Write-Host "Created container: $containerName"
} else {
    Write-Host "Container already exists: $containerName"
}


# Loop through the data and upload to Blob Storage
$tempFilePath = "c:\Temp\temp.txt"
if (-not (Test-Path "c:\Temp")) { New-Item -ItemType Directory -Path "c:\Temp" | Out-Null }

while ($dataReader.Read()) {
    $blobData = $dataReader[$columnName]
    $blobName = [Guid]::NewGuid().ToString().Replace("-", "") + ".txt"
    $blobData | Out-File -FilePath $tempFilePath -Encoding UTF8  # Save text to a temporary file
    Set-AzStorageBlobContent -File $tempFilePath -Container $containerName -Blob $blobName -Context $context -Force
    Write-Host "Uploaded file: $blobName"
}
$dataReader.Close()
Remove-Item $tempFilePath -ErrorAction SilentlyContinue

# Close the database connection
$connection.Close()

 

I just wanted to explain the section that creates a connection context ($context) for the Blob Storage account using the provided storage account name and key. It then checks whether the specified container already exists using the Get-AzStorageContainer cmdlet. If the container does not exist, the script creates a new container with the specified name and sets the access permission to "Off" (private). If the container already exists, it simply displays a message indicating that.

 

 
 
 

 

In the upload loop, the script iterates over each row of data retrieved from the table using the data reader. It extracts the value of the specified column ($columnName) for each row and assigns it to the $blobData variable. Then, it generates a unique blob name using NewGuid() and saves the data to a temporary text file ($tempFilePath) using the Out-File cmdlet. Finally, it uploads the temporary file to the Blob Storage container using the Set-AzStorageBlobContent cmdlet.
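For rows in the tens of megabytes, like the 42530 KB row in this error, the default reader behavior materializes the entire value in memory at once. A possible refinement, sketched here under the assumption that the column is textual and is the first column in the SELECT, is to request CommandBehavior.SequentialAccess and stream the value to disk in chunks:

```powershell
# Sketch: stream a large text column in chunks instead of loading it all at once.
# SequentialAccess tells SqlDataReader to expose column data as a forward-only stream.
$command = New-Object System.Data.SqlClient.SqlCommand($query, $connection)
$dataReader = $command.ExecuteReader([System.Data.CommandBehavior]::SequentialAccess)

$bufferSize = 8192
$buffer = New-Object char[] $bufferSize

while ($dataReader.Read()) {
    $writer = New-Object System.IO.StreamWriter("c:\Temp\temp.txt", $false, [System.Text.Encoding]::UTF8)
    $offset = 0
    # GetChars(ordinal, dataIndex, buffer, bufferIndex, length) returns the chars copied;
    # 0 means the end of the value has been reached.
    while (($read = $dataReader.GetChars(0, $offset, $buffer, 0, $bufferSize)) -gt 0) {
        $writer.Write($buffer, 0, [int]$read)
        $offset += $read
    }
    $writer.Close()
    # Upload c:\Temp\temp.txt with Set-AzStorageBlobContent, as in the main script.
}
$dataReader.Close()
```

This keeps the PowerShell process memory flat regardless of how large each row is, at the cost of slightly more verbose reading code.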

 

Also, you have the option to customize the blob name, for example by using the table's unique primary key instead of a generated GUID. Using the key as the blob name gives you a reference that makes it easy to retrieve the file for a given row later.
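As a sketch of that idea, assuming a hypothetical primary key column Id and a new column BlobReference added to the table to record where each value went (both names are placeholders for your schema):

```powershell
# Hypothetical: name each blob after the row's primary key and record it back in the table.
# Rows are buffered first because a single connection runs one command at a time.
$query = "SELECT Id, $columnName FROM $tableName"
$command = New-Object System.Data.SqlClient.SqlCommand($query, $connection)
$dataReader = $command.ExecuteReader()

$rows = @()
while ($dataReader.Read()) {
    $rows += [pscustomobject]@{ Id = $dataReader["Id"]; Data = $dataReader[$columnName] }
}
$dataReader.Close()

foreach ($row in $rows) {
    $blobName = "row-$($row.Id).txt"
    $row.Data | Out-File -FilePath "c:\Temp\temp.txt" -Encoding UTF8
    Set-AzStorageBlobContent -File "c:\Temp\temp.txt" -Container $containerName -Blob $blobName -Context $context -Force

    # Store the blob name in the row so the application can find the file later.
    $update = New-Object System.Data.SqlClient.SqlCommand("UPDATE $tableName SET BlobReference = @blob WHERE Id = @id", $connection)
    $update.Parameters.AddWithValue("@blob", $blobName) | Out-Null
    $update.Parameters.AddWithValue("@id", $row.Id) | Out-Null
    $update.ExecuteNonQuery() | Out-Null
}
```

Once the references are in place, the large column itself can be set to NULL (or dropped) so that Data Sync no longer hits the row-size limit.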

 

Enjoy!

Published Jun 12, 2023
Version 1.0