Sometimes we are asked how much data the client application receives from Azure SQL Database, along with the time spent and the number of rows returned.
I would like to share a simple example that allows us to approximate these details. I hope you find it useful.
I created a table called DownloadMeasure which contains various types of data (text, binary, bit values, and GUIDs):
SQL Script:
CREATE TABLE [dbo].[DownloadMeasure](
[timestamp] [datetime] NULL,
[GUID No_] [uniqueidentifier] NOT NULL,
[Example_Text1] [nvarchar](max) NULL,
[Example_BitValue] [bit] NULL,
[Example_Status] [nvarchar](255) NULL,
[Example_Amount Parameter] [decimal](18, 2) NULL,
[Example_HTML] [varbinary](max) NULL,
CONSTRAINT [PK_DownloadMeasure_GuidNo] PRIMARY KEY CLUSTERED
(
[GUID No_] ASC
)WITH (STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, OPTIMIZE_FOR_SEQUENTIAL_KEY = OFF) ON [PRIMARY]
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]
GO
After that, I added 1,000 rows using the following script, which generates dynamic data for the different fields:
DECLARE @i INT = 1;
DECLARE @maxRows INT = 1000;
DECLARE @randomLengthText INT;
DECLARE @randomLengthHTML INT;
DECLARE @randomText NVARCHAR(MAX);
DECLARE @randomHTML VARBINARY(MAX);
DECLARE @chunkSize INT = 8000;
WHILE @i <= @maxRows
BEGIN
SET @randomLengthText = FLOOR(RAND() * (200000 - 100 + 1)) + 100;
SET @randomLengthHTML = FLOOR(RAND() * (200000 - 100 + 1)) + 100;
SET @randomText = N'';
WHILE @randomLengthText > 0
BEGIN
-- Append 8,000-character chunks until the total length is reached
IF @randomLengthText > @chunkSize
BEGIN
SET @randomText = @randomText + REPLICATE(N'A', @chunkSize);
SET @randomLengthText = @randomLengthText - @chunkSize;
END
ELSE
BEGIN
SET @randomText = @randomText + REPLICATE(N'A', @randomLengthText);
SET @randomLengthText = 0;
END
END
SET @randomHTML = 0x; -- Initialize empty binary
WHILE @randomLengthHTML > 0
BEGIN
-- Append 8,000-byte chunks until the total length is reached
IF @randomLengthHTML > @chunkSize
BEGIN
SET @randomHTML = @randomHTML + CAST(REPLICATE(CONVERT(VARCHAR(MAX), 0xFF), @chunkSize) AS VARBINARY(MAX));
SET @randomLengthHTML = @randomLengthHTML - @chunkSize;
END
ELSE
BEGIN
SET @randomHTML = @randomHTML + CAST(REPLICATE(CONVERT(VARCHAR(MAX), 0xFF), @randomLengthHTML) AS VARBINARY(MAX));
SET @randomLengthHTML = 0;
END
END
INSERT INTO [dbo].[DownloadMeasure]
([timestamp], [GUID No_], [Example_Text1], [Example_BitValue], [Example_Status], [Example_Amount Parameter], [Example_HTML])
VALUES
(GETDATE(), NEWID(), @randomText, CAST(ROUND(RAND(), 0) AS BIT), 'Sample Status', ROUND(RAND() * 1000, 2), @randomHTML);
SET @i = @i + 1;
END;
GO
C# Script:
The idea of this small application is to first retrieve the list of GUIDs and then query the DownloadMeasure table for each of them.
For each GUID, I open a connection and execute the main query (which returns every row with a GUID greater than the current one, so the result size varies per iteration), but I use the SqlDataAdapter.Fill method instead of ExecuteReader. My main goal is to capture the approximate total bytes transferred, the time spent downloading the data, and the number of rows.
Every 5 GUIDs, I log the averages and also query the sys.dm_db_wait_stats DMV to compare the deltas of the different wait types. I'm not using the session-level DMV (sys.dm_exec_session_wait_stats) because I also want to measure the connection time. All this information is written to a log file for review afterward.
using System;
using System.Collections.Generic;
using System.Data;
using System.Diagnostics;
using System.IO;
using System.Linq;
using System.Text.Json;
using System.Threading;
using Microsoft.Data.SqlClient;
class Program
{
const string LogFile = "C:\\C\\DownloadMeasure\\log\\Results.Log";
const string ConfigFile = "C:\\C\\DownloadMeasure\\config.json";
const string queryWaitStats = @"
SELECT wait_type, waiting_tasks_count, wait_time_ms, max_wait_time_ms, signal_wait_time_ms
FROM sys.dm_db_wait_stats ORDER BY wait_type";
// Define the main query with the current GUID No using a parameter
const string queryMain = @"
SELECT [timestamp],
[GUID No_],
[Example_Text1],
[Example_BitValue],
[Example_Status],
[Example_Amount Parameter],
[Example_HTML]
FROM dbo.[DownloadMeasure]
WHERE [GUID No_] > @GuidNo
ORDER BY [GUID No_] ASC";
public class Config
{
public DatabaseConfig Database { get; set; }
}
public class DatabaseConfig
{
public string Server { get; set; }
public string DatabaseName { get; set; }
public string Username { get; set; }
public string Password { get; set; }
}
// Class to represent the data from the dynamic management view for wait stats
class WaitStat
{
public string WaitType { get; set; }
public long WaitingTasksCount { get; set; }
public long WaitTimeMs { get; set; }
public long MaxWaitTimeMs { get; set; }
public long SignalWaitTimeMs { get; set; }
}
static void Main()
{
int i = 0;
long totalTime = 0;
int totalRows = 0;
int queryCount = 0;
double totalDataSize = 0;
long batchTime = 0;
int batchRows = 0;
double batchDataSize = 0;
double dataSize = 0;
string guidNo;
Stopwatch stopwatch;
DataTable mainResults = new DataTable();
DataTable waitStatsTable = new DataTable();
List<WaitStat> previousWaitStats = new List<WaitStat>();
Random random = new Random();
Config config = LoadConfiguration(ConfigFile);
if (config == null)
{
Console.WriteLine("Error loading configuration.");
return;
}
if (string.IsNullOrEmpty(config.Database.Server) ||
string.IsNullOrEmpty(config.Database.DatabaseName) ||
string.IsNullOrEmpty(config.Database.Username) ||
string.IsNullOrEmpty(config.Database.Password))
{
Console.WriteLine("Invalid configuration file. Please check the settings.");
return;
}
Log($"New Process ID:...{Process.GetCurrentProcess().Id}");
string connectionString = $"Server={config.Database.Server};Database={config.Database.DatabaseName};User Id={config.Database.Username};Password={config.Database.Password};Encrypt=True;TrustServerCertificate=False;Connection Timeout=30;MultipleActiveResultSets=false;Application Name=DownloadMeasure";
// Attempt to open the connection with retry policy
SqlConnection connection = ConnectWithRetry(connectionString, 5, 2000,"Obtaining List of Guids"); // 5 retries with 2 seconds delay
if (connection == null)
{
Console.WriteLine("Unable to establish a connection after multiple attempts.");
return;
}
DataTable guidResults = ExecuteSqlQuery("SELECT [GUID No_] FROM dbo.[DownloadMeasure] ORDER BY [GUID No_]", connection);
if (guidResults.Rows.Count == 0)
{
Console.WriteLine("No GUID No_ found in the query.");
return;
}
//connection.Close();
var randomizedRows = guidResults.AsEnumerable().OrderBy(r => random.Next()).ToArray();
SqlCommand commandWaitStats = new SqlCommand(queryWaitStats, connection);
commandWaitStats.CommandTimeout = 120;
commandWaitStats.Prepare();
SqlDataAdapter adapterWaitStats = new SqlDataAdapter(commandWaitStats);
foreach (DataRow row in randomizedRows)
{
guidNo = row["Guid No_"].ToString();
SqlConnection connectionRows = ConnectWithRetry(connectionString, 5, 2000, $"GUID No: {i+1} - {guidNo}"); // 5 retries with 2 seconds delay
if (connectionRows == null)
{
Console.WriteLine("Unable to establish a connection after multiple attempts.");
return;
}
i++;
queryCount++;
stopwatch = Stopwatch.StartNew();
ExecuteSqlQueryWithParameter(queryMain, connectionRows, guidNo, mainResults);
stopwatch.Stop();
// Calculate the size of the downloaded data in MB
dataSize = CalculateDataSize(mainResults) / (1024.0 * 1024.0);
// Update total and batch metrics
totalTime += stopwatch.ElapsedMilliseconds;
totalRows += mainResults.Rows.Count;
totalDataSize += dataSize;
batchTime += stopwatch.ElapsedMilliseconds;
batchRows += mainResults.Rows.Count;
batchDataSize += dataSize;
// Log the execution time, number of rows, and size of the downloaded data
Log($" Time (ms): {stopwatch.ElapsedMilliseconds} Rows: {mainResults.Rows.Count} Data Size (MB): {dataSize:F2}");
// Every 5 GUIDs, display the average
if (i % 5 == 0)
{
Log($"------------------------------- Average after {i} GUIDs -------------------------------");
Log($"Average time per query (ms): {(double)batchTime / 5}");
Log($"Average rows per query: {(double)batchRows / 5}");
Log($"Average data size per query (MB): {(batchDataSize) / 5:F2}");
CompareWaitStats(waitStatsTable, commandWaitStats, adapterWaitStats, ref previousWaitStats);
Log($"---------------------------------------------------------------------------------------");
// Reset batch totals
batchTime = 0;
batchRows = 0;
batchDataSize = 0;
}
connectionRows.Close();
}
connection.Close();
// Calculate and display the overall average
Console.WriteLine($"Total queries executed: {queryCount}");
Console.WriteLine($"Total rows processed: {totalRows}");
Console.WriteLine($"Average time per query (ms): {(double)totalTime / queryCount}");
Console.WriteLine($"Total data size downloaded (MB): {totalDataSize:F2}");
Console.WriteLine($"Press any key to continue");
Console.ReadLine();
}
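// Executes a query on an already-open connection and returns the results in a new DataTable (used above to retrieve the list of GUIDs).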
static DataTable ExecuteSqlQuery(string query, SqlConnection connection)
{
DataTable dataTable = new DataTable();
try
{
using (SqlCommand command = new SqlCommand(query, connection))
{
using (SqlDataAdapter adapter = new SqlDataAdapter(command))
{
adapter.Fill(dataTable);
}
}
}
catch (Exception ex)
{
Console.WriteLine($"Error executing the query: {ex.Message}");
}
return dataTable;
}
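// Fills the supplied DataTable with the results of the main query for the given GUID, so the caller can measure elapsed time, rows, and size.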
static void ExecuteSqlQueryWithParameter(string query, SqlConnection connection, String guidNo, DataTable dataTable)
{
dataTable.Clear();
try
{
using (SqlCommand command = new SqlCommand(query, connection))
{
command.Parameters.Add("@GuidNo", SqlDbType.UniqueIdentifier);
Guid.TryParse(guidNo, out Guid guidValue);
command.Parameters[0].Value = guidValue;
command.CommandTimeout = 120;
using (SqlDataAdapter adapter = new SqlDataAdapter(command))
{
adapter.Fill(dataTable);
}
}
}
catch (Exception ex)
{
Console.WriteLine($"Error executing the query (parameters): {ex.Message}");
}
}
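// Approximates the size of the downloaded data by adding the UTF-16 byte count of each value (byte[] columns are counted by their length).
// This is an estimate of the payload, not the exact number of bytes sent over the wire.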
static long CalculateDataSize(DataTable table)
{
long size = 0;
try
{
foreach (DataRow row in table.Rows)
{
foreach (DataColumn column in table.Columns)
{
object value = row[column];
if (value is byte[])
{
size += ((byte[])value).Length;
}
else
{
size += System.Text.Encoding.Unicode.GetByteCount(value.ToString());
}
}
}
}
catch (Exception ex)
{
Console.WriteLine($"Error calculating the data size: {ex.Message}");
}
return size;
}
// Function to try connecting with a retry policy
static SqlConnection ConnectWithRetry(string connectionString, int maxRetries, int delayMs, string sMsg)
{
int retryCount = 0;
SqlConnection connection = null;
while (retryCount < maxRetries)
{
try
{
Stopwatch stopwatch = Stopwatch.StartNew();
connection = new SqlConnection(connectionString);
connection.Open();
Console.WriteLine($"{sMsg} - Connection successful - Time (ms): {stopwatch.ElapsedMilliseconds}");
return connection;
}
catch (SqlException ex)
{
retryCount++;
Console.WriteLine($"Connection attempt {retryCount} failed. Retrying in {delayMs / 1000} seconds...{ex.Message}");
Thread.Sleep(delayMs); // Wait before retrying
}
}
return null;
}
static void Log(string message)
{
Console.WriteLine(message);
try
{
using (FileStream stream = new FileStream(LogFile, FileMode.Append, FileAccess.Write, FileShare.ReadWrite))
{
using (StreamWriter writer = new StreamWriter(stream))
{
writer.WriteLine($"{DateTime.Now.ToString("yyyy-MM-dd HH:mm:ss.fff")}: {message}");
}
}
}
catch (IOException ex)
{
Console.WriteLine($"Error writing to the log file: {ex.Message}");
}
}
static Config LoadConfiguration(string path)
{
try
{
string jsonString = File.ReadAllText(path);
return JsonSerializer.Deserialize<Config>(jsonString);
}
catch (Exception ex)
{
Console.WriteLine($"Error reading configuration: {ex.Message}");
return null;
}
}
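// Takes a snapshot of sys.dm_db_wait_stats, logs the per-wait-type deltas against the previous snapshot (non-zero deltas only), and keeps the new snapshot for the next comparison.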
static void CompareWaitStats(DataTable waitStatsTable, SqlCommand commandWaitStats, SqlDataAdapter adapterWaitStats, ref List<WaitStat> previousWaitStats)
{
waitStatsTable.Clear();
try
{
adapterWaitStats.Fill(waitStatsTable);
}
catch (Exception ex)
{
Console.WriteLine($"Error executing wait stats query: {ex.Message}");
return;
}
// Convert DataTable to List<WaitStat>
List<WaitStat> currentWaitStats = waitStatsTable.AsEnumerable().Select(row => new WaitStat
{
WaitType = row["wait_type"].ToString(),
WaitingTasksCount = Convert.ToInt64(row["waiting_tasks_count"]),
WaitTimeMs = Convert.ToInt64(row["wait_time_ms"]),
MaxWaitTimeMs = Convert.ToInt64(row["max_wait_time_ms"]),
SignalWaitTimeMs = Convert.ToInt64(row["signal_wait_time_ms"])
}).ToList();
// If the array is empty, fill it and exit
if (previousWaitStats == null || previousWaitStats.Count == 0)
{
previousWaitStats = currentWaitStats;
foreach (var currentStat in currentWaitStats)
{
long deltaWaitingTasks = currentStat.WaitingTasksCount;
long deltaWaitTimeMs = currentStat.WaitTimeMs;
long deltaMaxWaitTimeMs = currentStat.MaxWaitTimeMs;
long deltaSignalWaitTimeMs = currentStat.SignalWaitTimeMs;
if (deltaWaitingTasks != 0 || deltaWaitTimeMs != 0 || deltaMaxWaitTimeMs != 0 || deltaSignalWaitTimeMs != 0)
{
Log($"Wait Type: {currentStat.WaitType}");
Log($"Delta Waiting Tasks: {deltaWaitingTasks}");
Log($"Delta Wait Time (ms): {deltaWaitTimeMs}");
Log($"Delta Max Wait Time (ms): {deltaMaxWaitTimeMs}");
Log($"Delta Signal Wait Time (ms): {deltaSignalWaitTimeMs}");
}
}
return;
}
// If the array is not empty, compare the current values with the previous ones
foreach (var currentStat in currentWaitStats)
{
var previousStat = previousWaitStats.FirstOrDefault(p => p.WaitType == currentStat.WaitType);
if (previousStat != null)
{
// Calculate the deltas (differences)
long deltaWaitingTasks = currentStat.WaitingTasksCount - previousStat.WaitingTasksCount;
long deltaWaitTimeMs = currentStat.WaitTimeMs - previousStat.WaitTimeMs;
long deltaMaxWaitTimeMs = currentStat.MaxWaitTimeMs - previousStat.MaxWaitTimeMs;
long deltaSignalWaitTimeMs = currentStat.SignalWaitTimeMs - previousStat.SignalWaitTimeMs;
// Display the differences
if (deltaWaitingTasks != 0 || deltaWaitTimeMs != 0 || deltaMaxWaitTimeMs != 0 || deltaSignalWaitTimeMs != 0)
{
Log($"Wait Type: {currentStat.WaitType}");
Log($"Delta Waiting Tasks: {deltaWaitingTasks}");
Log($"Delta Wait Time (ms): {deltaWaitTimeMs}");
Log($"Delta Max Wait Time (ms): {deltaMaxWaitTimeMs}");
Log($"Delta Signal Wait Time (ms): {deltaSignalWaitTimeMs}");
}
}
}
// Update the array with the new data
previousWaitStats = currentWaitStats;
}
}
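A note on the measurement itself: CalculateDataSize estimates the downloaded volume by adding up the UTF-16 byte counts of the values in the DataTable, which is close to, but not exactly, what travels over the network. If you prefer to read the bytes actually received by the driver, Microsoft.Data.SqlClient also exposes per-connection statistics. The following is only a minimal sketch of that alternative (the helper name MeasureWithProviderStatistics is mine, not part of the script above):
// Sketch (assumption): a helper that could be added to the Program class above. It relies on the
// provider's connection statistics (SqlConnection.StatisticsEnabled / RetrieveStatistics) to read
// the TDS bytes actually received, instead of estimating the size from the DataTable contents.
static void MeasureWithProviderStatistics(string connectionString, Guid guidNo)
{
    using (SqlConnection connection = new SqlConnection(connectionString))
    {
        connection.StatisticsEnabled = true; // start collecting per-connection statistics
        connection.Open();
        DataTable results = new DataTable();
        using (SqlCommand command = new SqlCommand(queryMain, connection))
        {
            command.Parameters.Add("@GuidNo", SqlDbType.UniqueIdentifier).Value = guidNo;
            command.CommandTimeout = 120;
            using (SqlDataAdapter adapter = new SqlDataAdapter(command))
            {
                adapter.Fill(results);
            }
        }
        // BytesReceived counts the TDS bytes received by the client (including protocol overhead),
        // so it will differ slightly from the sum of column values returned by CalculateDataSize.
        System.Collections.IDictionary stats = connection.RetrieveStatistics();
        Console.WriteLine("BytesReceived: " + stats["BytesReceived"] +
                          "  SelectRows: " + stats["SelectRows"] +
                          "  ExecutionTime (ms): " + stats["ExecutionTime"]);
    }
}
Statistics collection adds a small overhead on the connection, so it is best enabled only while measuring.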
To connect to the database, you can use any authentication method. In this example I am reading the connection details from a JSON file, but you could also use Azure Key Vault, Microsoft Entra ID, etc.
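For reference, a minimal config.json matching the Config and DatabaseConfig classes above could look like this (the values are placeholders; note that System.Text.Json property matching is case-sensitive by default, so the property names must match the C# classes):
{
  "Database": {
    "Server": "yourserver.database.windows.net",
    "DatabaseName": "yourdatabase",
    "Username": "youruser",
    "Password": "yourpassword"
  }
}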
This is an example of the information that I retrieved. Note the ASYNC_NETWORK_IO deltas in particular: this wait type accumulates while the database engine waits for the client application to consume the result set, so it is a useful indicator of the time spent sending data to (and being read by) the client.
2024-09-09 15:24:05.247: ---------------------------------------------------------------------------------------
2024-09-09 15:24:05.384: Time (ms): 129 Rows: 134 Data Size (MB): 25.52
2024-09-09 15:24:05.736: Time (ms): 338 Rows: 402 Data Size (MB): 76.84
2024-09-09 15:24:06.210: Time (ms): 461 Rows: 523 Data Size (MB): 99.92
2024-09-09 15:24:07.078: Time (ms): 846 Rows: 738 Data Size (MB): 142.49
2024-09-09 15:24:07.902: Time (ms): 801 Rows: 884 Data Size (MB): 170.75
2024-09-09 15:24:07.904: ------------------------------- Average after 370 GUIDs -------------------------------
2024-09-09 15:24:07.905: Average time per query (ms): 515
2024-09-09 15:24:07.906: Average rows per query: 536.2
2024-09-09 15:24:07.907: Average data size per query (MB): 103.10
2024-09-09 15:24:07.909: Wait Type: ASYNC_NETWORK_IO
2024-09-09 15:24:07.910: Delta Waiting Tasks: 3362
2024-09-09 15:24:07.910: Delta Wait Time (ms): 1553
2024-09-09 15:24:07.911: Delta Max Wait Time (ms): 0
2024-09-09 15:24:07.912: Delta Signal Wait Time (ms): 40
2024-09-09 15:24:07.914: Wait Type: MEMORY_ALLOCATION_EXT
2024-09-09 15:24:07.915: Delta Waiting Tasks: 12
2024-09-09 15:24:07.916: Delta Wait Time (ms): 0
2024-09-09 15:24:07.916: Delta Max Wait Time (ms): 0
2024-09-09 15:24:07.918: Delta Signal Wait Time (ms): 0
2024-09-09 15:24:07.931: Wait Type: RESERVED_MEMORY_ALLOCATION_EXT
2024-09-09 15:24:07.934: Delta Waiting Tasks: 2
2024-09-09 15:24:07.934: Delta Wait Time (ms): 0
2024-09-09 15:24:07.935: Delta Max Wait Time (ms): 0
2024-09-09 15:24:07.936: Delta Signal Wait Time (ms): 0
2024-09-09 15:24:07.937: Wait Type: SOS_SCHEDULER_YIELD
2024-09-09 15:24:07.937: Delta Waiting Tasks: 1
2024-09-09 15:24:07.938: Delta Wait Time (ms): 0
2024-09-09 15:24:07.939: Delta Max Wait Time (ms): 0
2024-09-09 15:24:07.940: Delta Signal Wait Time (ms): 0
2024-09-09 15:24:07.941: ---------------------------------------------------------------------------------------
2024-09-09 15:24:08.211: Time (ms): 260 Rows: 298 Data Size (MB): 56.50
2024-09-09 15:24:08.956: Time (ms): 723 Rows: 771 Data Size (MB): 148.92
2024-09-09 15:24:09.016: Time (ms): 54 Rows: 68 Data Size (MB): 12.43
2024-09-09 15:24:09.194: Time (ms): 164 Rows: 196 Data Size (MB): 37.55
2024-09-09 15:24:09.494: Time (ms): 292 Rows: 266 Data Size (MB): 50.43
Disclaimer:
This script is provided for educational and testing purposes only. It is designed to demonstrate database performance monitoring and data handling in Azure SQL Database.
Before running this script in a production environment, ensure that you understand its functionality, potential impact on performance, and security implications, especially when dealing with sensitive data. Always test in a controlled, non-production environment before applying changes in production systems.