Database Platform CSE – SQL

Enhancing SQL PaaS Security: Automating Azure Blob SAS Key Rotation Using Azure Runbooks
In today’s cloud environment, security and efficient management of storage resources are paramount. Azure Blob Storage, known for its scalability and robust security, is a popular solution for handling large volumes of unstructured data. Securing access to that data, however, requires careful management of Shared Access Signature (SAS) keys, which grant time-limited access to storage resources. To mitigate the risk of unauthorized access, SAS keys should be rotated regularly. Automating the rotation ensures that outdated keys are retired on time and that new keys are created and distributed securely.

This article walks you through automating the rotation of Azure Blob Storage SAS keys using Azure Automation PowerShell Runbooks, including updating the credentials that SQL uses to access the blob storage. You’ll learn how to configure the necessary resources and set up permissions to ensure smooth execution. This automated approach reduces manual intervention and the potential for human error, aligns with key management best practices, and keeps user access uninterrupted while minimizing administrative effort.

Note: This article uses SQL Managed Instance as an example to outline the steps in detail; the same approach can be applied to Azure SQL Database.

Prerequisites

Before proceeding with the implementation, ensure you meet the following prerequisites:

- An active Azure subscription.
- An Azure Automation Account with sufficient privileges to perform administrative tasks.
- Access to the Azure Storage Account, Azure Key Vault, and Azure SQL Managed Instances that will be involved in the automation process.

Summary

1. Configure Azure Automation Account: Set up an Automation Account in Azure and import the required modules for integration.
2. Grant Required Permissions: Assign the necessary roles to the Automation Account to enable access to Azure Key Vault, Storage Accounts, and SQL Managed Instances.
3. Create Stored Procedure in SQL Managed Instance: Implement a stored procedure to facilitate database credential updates.
4. Create Login and Grant Permissions: Create a login for the Automation Account and grant the permissions required to execute operations in the SQL Managed Instance.

Setup Documentation

Configure Azure Automation Account

To start, configure an Azure Automation Account to run the automation scripts and manage resources.

1. Create an Azure Automation Account: Go to the Azure portal and create a new Azure Automation Account. This account will host the PowerShell Runbooks that automate the SAS key rotation.
2. Import the SQLServer module: Once the Automation Account is created, import the SQLServer module into the Automation Account. This module allows the Runbook to interact with SQL Managed Instances and perform the necessary database operations.
   a) Go to the Azure portal and navigate to your Automation Account.
   b) Select "Modules" under the "Shared Resources" section.
   c) Click "Browse Gallery" and search for "SQLServer" in the gallery.
   d) Select the "SQLServer" module and click Import.
   e) Wait for the import to complete and verify that the module appears under Installed Modules.

Runbook Setup

Create a new Runbook, select PowerShell as the type, choose the recommended runtime version, and enter the script (SASKeyRotationAutomationScript.ps1) shared below as the content.
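Before wiring up permissions, it helps to see the SQL-side objects the rotation will ultimately touch: a database scoped credential holding the SAS secret, and an external data source that uses it. The names, container URL, and secret below are illustrative, not taken from this article's environment:

```sql
-- Illustrative setup (hypothetical names): a database scoped credential holding a
-- SAS token, and an external data source that uses it to reach the blob container.
CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<StrongPassword>';  -- required once per database

CREATE DATABASE SCOPED CREDENTIAL [HRBlobCredential]
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
     SECRET = 'sv=2024-...&ss=b&...';  -- the SAS token WITHOUT the leading '?'

CREATE EXTERNAL DATA SOURCE [HRBlobStorage]
WITH (TYPE = BLOB_STORAGE,
      LOCATION = 'https://prodstorageaccount.blob.core.windows.net/hr-data',
      CREDENTIAL = [HRBlobCredential]);
```

When the SAS expires, only the credential's SECRET needs to change, which is exactly what the runbook and stored procedure below automate.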
Grant Required Permissions

Ensure that the Automation Account has the appropriate permissions to interact with the dependent resources:

- Key Vault: Assign the "Key Vault Secrets Officer" role to the Automation Account on the Key Vault. This role allows the account to manage secrets in the Key Vault, where the SAS keys will be stored.
- Storage Account: Assign the "Storage Account Key Operator Service Role" to the Automation Account on the relevant Storage Account. This role lets the Automation Account work with the account keys needed to generate new SAS tokens.
- SQL Managed Instances: Assign the "Reader" role to the Automation Account on each SQL Managed Instance, so it can read the instance configuration required for the SAS key update.
- Subscription level: Assign the "Reader" role at the subscription level so the Automation Account can discover the resources it needs across the subscription.

Create Stored Procedure in SQL Managed Instance

Next, create a stored procedure in the SQL Managed Instance to handle updates to the database scoped credentials as part of the SAS key rotation. Implement the stored procedure (UpdateDatabaseScopedCredential.sql, shared below) within your SQL Managed Instance; it will be invoked by the Automation Account during Runbook execution.

Create Login and Grant Permissions

Create a login for the Automation Account within the SQL Managed Instance and assign the permissions necessary to execute the stored procedure and other required tasks.
Execute the following SQL commands:

```sql
USE [master]
GO
CREATE LOGIN [AutomationAccount] FROM EXTERNAL PROVIDER;
GO

USE [DBA]
GO
CREATE USER [AutomationAccount] FOR LOGIN [AutomationAccount];
GRANT EXECUTE ON dbo.UpdateDatabaseScopedCredential TO [AutomationAccount];
GRANT CREATE TABLE TO [AutomationAccount];
```

Explanation: These commands create a login for the Automation Account from an external provider (Microsoft Entra ID), map it to a database user, and grant the permissions required to execute the stored procedure and, if necessary, create tables during the process.

With these steps completed, your Azure environment is ready to automate SAS key rotation and securely manage credentials through Azure Automation. Next, run the Runbook manually to confirm that it performs as expected. After a successful test execution, schedule the Runbook according to your requirements.

PowerShell Script File for SAS Key Rotation

SASKeyRotationAutomationScript.ps1

```powershell
#________________________________
# Input parameters
#________________________________
$subscriptionId = "xxxxx-459a-409f-8339-xxxxxxxx"
$resourceGroupName = "ProdRG"
$storageAccountName = "Prodstorageaccount"
$keyVaultName = "Prodkeyvault"
$secretName = "SASForHRTK"
$SASexpiryDate = 90 # days
$automationAccountName = "AutomationAccount"
$global:NewSASKey = ""
# Database that hosts the UpdateDatabaseScopedCredential stored procedure (created in [DBA] above)
$database = "DBA"

Connect-AzAccount -Identity
Set-AzContext -SubscriptionId $subscriptionId | Out-Null

# Retrieve SQL Managed Instances and construct their public endpoint DNS names
$sqlInstances = Get-AzSqlInstance | Select-Object ManagedInstanceName, DnsZone
$SQLMIList = $sqlInstances | ForEach-Object {
    $dnsName = "$($_.ManagedInstanceName).public.$($_.DnsZone).database.windows.net,3342"
    [PSCustomObject]@{ DNSName = $dnsName }
}

#_______________________________________________________________
# Log input parameters
#_______________________________________________________________
Write-Output "[$(Get-Date)]: __________________ Beginning of SAS Key Rotation Script __________________"
Write-Output "[$(Get-Date)]: Input Parameters for Execution:"
Write-Output "`n[$(Get-Date)]: Subscription ID      : $subscriptionId"
Write-Output "[$(Get-Date)]: Resource Group Name  : $resourceGroupName"
Write-Output "[$(Get-Date)]: Storage Account Name : $storageAccountName"
Write-Output "[$(Get-Date)]: Key Vault Name       : $keyVaultName"
Write-Output "[$(Get-Date)]: Secret Name          : $secretName"
Write-Output "[$(Get-Date)]: SAS Expiry (days)    : $SASexpiryDate"
Write-Output "[$(Get-Date)]: Connected to subscription: [$( (Get-AzSubscription -SubscriptionId $subscriptionId).Name )]`n"
Write-Output "[$(Get-Date)]: SQL Managed Instances identified in the current subscription `n"
Write-Output ("_" * 60 + "`n"); $SQLMIList.DNSName; Write-Output ("`n" + ("_" * 60))

#_______________________________________________________________
# Generate a new SAS token and update Key Vault
#_______________________________________________________________
function GenerateSASToken_And_UpdateKeyVault {
    $storageKey = (Get-AzStorageAccountKey -ResourceGroupName $resourceGroupName -Name $storageAccountName -WarningAction SilentlyContinue)[0].Value
    $context = New-AzStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $storageKey -WarningAction SilentlyContinue
    $expiryTime = (Get-Date).AddDays($SASexpiryDate).ToUniversalTime().ToString("yyyy-MM-ddTHH:mm:ssZ")

    try {
        $sasToken = New-AzStorageAccountSASToken -Service Blob -ResourceType Container,Object -Permission "rwdlact" -ExpiryTime $expiryTime -Context $context
        Write-Output "[$(Get-Date)]: SAS Token generated successfully"
        $global:NewSASKey = $sasToken
    }
    catch {
        $ErrorOnSASkeyGeneration = $True
        Write-Output "[$(Get-Date)]: Failed to generate SAS Token. Error: $_"
        return ("ErrorOnSASkeyGeneration")
    }

    $securesasToken = $sasToken | ConvertTo-SecureString -AsPlainText -Force

    if ($ErrorOnSASkeyGeneration -ne $True) {
        # Check if the secret exists in Key Vault
        $secret = Get-AzKeyVaultSecret -VaultName $keyVaultName -Name $secretName -ErrorAction SilentlyContinue
        try {
            if ($secret) {
                # Update the existing secret with the new SAS token
                $res = Set-AzKeyVaultSecret -VaultName $keyVaultName -Name $secretName -SecretValue $securesasToken -Expires $expiryTime -ContentType ("SASToken_Created_" + (Get-Date -Format "yyyy-MM-dd_HH:mm:ss"))
                Write-Output "[$(Get-Date)]: Secret updated in Key Vault - $keyVaultName with key - $secretName."
            }
            else {
                # Create a new secret in Key Vault
                $res = Set-AzKeyVaultSecret -VaultName $keyVaultName -Name $secretName -SecretValue $securesasToken -Expires $expiryTime -ContentType ("SASToken_Created_" + (Get-Date -Format "yyyy-MM-dd_HH:mm:ss"))
                Write-Output "[$(Get-Date)]: New Secret created in Key Vault - $keyVaultName with key - $secretName."
            }
        }
        catch {
            # Handle the error
            Write-Output "[$(Get-Date)]: An error occurred while updating or creating the secret in Key Vault."
            Write-Output "Error Message: $_"
            $exception = $True
        }
    }
}

#_______________________________________________________________
# Update database scoped credentials on a SQL Managed Instance
#_______________________________________________________________
function UpdateDatabaseScopeCredential {
    param ([string]$sqlmi)

    Write-Output "[$(Get-Date)]: Connecting to the Server : $sqlmi"
    try {
        # Authenticate with the Automation Account's managed identity
        $secureToken = Get-AzAccessToken -ResourceUrl "https://database.windows.net/" -AsSecureString
        $token = [System.Runtime.InteropServices.Marshal]::PtrToStringAuto([System.Runtime.InteropServices.Marshal]::SecureStringToBSTR($secureToken.Token))

        $Conn = New-Object System.Data.SqlClient.SqlConnection("Server=$sqlmi;Database=$database;Encrypt=True;Connection Timeout=30;")
        $Conn.AccessToken = $token
        $Conn.Open()

        $Command = $Conn.CreateCommand()
        $Command.CommandText = "EXEC $database.[dbo].[UpdateDatabaseScopedCredential] '$storageAccountName', '$NewSASKey'"
        $rowsAffected = $Command.ExecuteNonQuery()

        if ($rowsAffected -eq -1) {
            Write-Output "[$(Get-Date)]: The command executed successfully."
        }
        else {
            Write-Output "[$(Get-Date)]: $rowsAffected rows were affected."
        }
        Write-Output "[$(Get-Date)]: Updated the SASKey on $sqlmi"
    }
    catch {
        Write-Output "[$(Get-Date)]: An error occurred: $_"
    }
    finally {
        if ($Conn.State -eq 'Open') { $Conn.Close() }
    }
}

Write-Output "[$(Get-Date)]: Attempting to generate a new SASToken for Storage account :[$storageAccountName] and update Keyvault :[$keyVaultName]"
GenerateSASToken_And_UpdateKeyVault

Write-Output "[$(Get-Date)]: Attempting to update the database scoped credentials for SQL Managed Instances"
Write-Output "$SQLMIList`n"
foreach ($SQLMI in $SQLMIList) {
    UpdateDatabaseScopeCredential -sqlmi $SQLMI.DNSName
}
Write-Output "[$(Get-Date)]: __________________ End of SAS Key Rotation Script __________________"
```

SQL Script to Update a Database Scoped Credential

UpdateDatabaseScopedCredential.sql

```sql
USE [DBA]
GO
CREATE OR ALTER PROCEDURE UpdateDatabaseScopedCredential
    @Storactaccount NVARCHAR(128),
    @NewKey NVARCHAR(MAX)
WITH ENCRYPTION
AS
BEGIN
    SET NOCOUNT ON;

    -- Variables for dynamic SQL
    DECLARE @sql NVARCHAR(MAX);
    DECLARE @dbName NVARCHAR(128);

    -- The logging table lives in [DBA], the database that hosts this procedure
    IF OBJECT_ID('dbo.SASKeyRotationLogging', 'U') IS NULL
    BEGIN
        CREATE TABLE dbo.SASKeyRotationLogging (Timestamp DATETIME DEFAULT GETUTCDATE(), Details NVARCHAR(MAX));
    END

    PRINT REPLICATE('_', 100) + CHAR(13) + CHAR(10) + '[' + CONVERT(NVARCHAR(30), GETUTCDATE(), 126) + '] Starting the script to update the credentials on Server [' + @@SERVERNAME + '].';

    INSERT INTO DBA.dbo.SASKeyRotationLogging (Details)
    VALUES ('[' + CONVERT(NVARCHAR(30), GETUTCDATE(), 126) + '] Starting the script to update the credentials on Server [' + @@SERVERNAME + '].');

    INSERT INTO DBA.dbo.SASKeyRotationLogging (Details)
    SELECT '[' + CONVERT(NVARCHAR(30), GETUTCDATE(), 126) + '] Database name: ' + name
    FROM sys.databases
    WHERE name LIKE '%_ems';

    -- Cursor to iterate over the target databases
    DECLARE db_cursor CURSOR FOR
        SELECT name FROM sys.databases WHERE name LIKE '%_ems';

    OPEN db_cursor;
    FETCH NEXT FROM db_cursor INTO @dbName;

    WHILE @@FETCH_STATUS = 0
    BEGIN
        -- Construct the dynamic SQL to be executed in each database
        SET @sql = '
        USE [' + @dbName + '];

        DECLARE @Result TABLE (CredentialName NVARCHAR(128));

        -- Find SAS credentials backing external data sources that point at the storage account
        INSERT INTO @Result (CredentialName)
        SELECT c.name AS CredentialName
        FROM sys.database_scoped_credentials AS c
        INNER JOIN sys.external_data_sources AS ds
            ON ds.credential_id = c.credential_id
        WHERE c.credential_identity = ''SHARED ACCESS SIGNATURE''
          AND ds.type = 5
          AND ds.location LIKE ''%' + @Storactaccount + '%'';

        -- Update the credentials
        DECLARE @credName NVARCHAR(128);
        DECLARE cred_cursor CURSOR FOR SELECT CredentialName FROM @Result;
        OPEN cred_cursor;
        FETCH NEXT FROM cred_cursor INTO @credName;
        WHILE @@FETCH_STATUS = 0
        BEGIN
            -- Construct the ALTER DATABASE SCOPED CREDENTIAL command
            DECLARE @alterSql NVARCHAR(MAX);
            SET @alterSql = ''ALTER DATABASE SCOPED CREDENTIAL ['' + @credName + ''] WITH IDENTITY = ''''SHARED ACCESS SIGNATURE'''', SECRET = ''''' + @NewKey + ''''';'';
            EXEC sp_executesql @alterSql;

            PRINT ''['' + CONVERT(NVARCHAR(30), GETUTCDATE(), 126) + ''] [Database: ' + @dbName + '] Updated the key for credential ['' + @credName + ''].'';
            INSERT INTO DBA.dbo.SASKeyRotationLogging (Details)
            VALUES (''['' + CONVERT(NVARCHAR(30), GETUTCDATE(), 126) + ''] [Database: ' + @dbName + '] Updated the key for credential ['' + @credName + ''].'');

            FETCH NEXT FROM cred_cursor INTO @credName;
        END

        -- Release the inner cursor before moving to the next database
        CLOSE cred_cursor;
        DEALLOCATE cred_cursor;
        ';

        -- Execute the dynamic SQL
        EXEC sp_executesql @sql;

        FETCH NEXT FROM db_cursor INTO @dbName;
    END;

    CLOSE db_cursor;
    DEALLOCATE db_cursor;

    PRINT '[' + CONVERT(NVARCHAR(30), GETUTCDATE(), 126) + '] End of script to update the credentials on Server [' + @@SERVERNAME + '].' + CHAR(13) + CHAR(10) + REPLICATE('_', 100);

    INSERT INTO DBA.dbo.SASKeyRotationLogging (Details)
    VALUES ('[' + CONVERT(NVARCHAR(30), GETUTCDATE(), 126) + '] Ending the script to update the credentials on Server [' + @@SERVERNAME + '].');
END
```

Enhancing SQL Server Transactional Replication Initialization Performance
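MaxBCPThreads is supplied on the replication agent's command line or through its agent profile. As a sketch, assuming the Snapshot Agent runs as a SQL Agent job whose name and step you have already identified (the job name and step number below are hypothetical), the switch can be appended to the job step command:

```sql
-- Sketch: append -MaxBcpThreads to a Snapshot Agent job step.
-- 'MyPub-SnapshotAgent' and step_id 2 are hypothetical; look them up in msdb first.
DECLARE @cmd NVARCHAR(3200);

SELECT @cmd = s.command
FROM msdb.dbo.sysjobs AS j
JOIN msdb.dbo.sysjobsteps AS s ON s.job_id = j.job_id
WHERE j.name = N'MyPub-SnapshotAgent' AND s.step_id = 2;

IF @cmd NOT LIKE N'%-MaxBcpThreads%'
    EXEC msdb.dbo.sp_update_jobstep
         @job_name = N'MyPub-SnapshotAgent',
         @step_id  = 2,
         @command  = @cmd + N' -MaxBcpThreads 8';
```

The value can also be set through the agent profile in Replication Monitor; values well beyond the number of available processor cores rarely help.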
Initializing SQL Server Transactional Replication with default settings can be significantly slow for large table migrations. Increasing MaxBCPThreads allows SQL Server to perform multiple bulk-copy tasks concurrently, making better use of the hardware and greatly reducing the time needed to synchronize data between the publisher and subscribers. In some cases, tuning this parameter has cut initialization time by up to 75%.

Monitoring CDC using Extended Events in SQL Server 2022
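The CDC-related extended events this post discusses can be collected with a session such as the following sketch; the sqlserver package prefix, the session name, and the file target are assumptions:

```sql
-- Sketch (assumed event/package names): capture CDC capture-job and cleanup-job activity.
CREATE EVENT SESSION [cdc_monitoring] ON SERVER
    ADD EVENT sqlserver.repl_logscan_session,
    ADD EVENT sqlserver.cdc_cleanup_job_status
    ADD TARGET package0.event_file (SET filename = N'cdc_monitoring.xel')
    WITH (STARTUP_STATE = ON);

ALTER EVENT SESSION [cdc_monitoring] ON SERVER STATE = START;
```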
What do the new SQL Server 2022 CDC extended events track? CDC relies on two jobs: the capture job, which captures changes into the capture tables, and the clean-up job, which removes older data from them. Each event covers one of these job activities: repl_logscan_session reports on capture job progress, and cdc_cleanup_job_status reports on clean-up job progress.

Resolving ADF Copy Upsert Failures in SQL DB for XML Columns
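The stored procedure sink pattern covered in this post pairs a user-defined table type with a procedure that performs the upsert explicitly, avoiding the Copy activity's built-in Upsert and its XML limitation. All object names below are illustrative:

```sql
-- Illustrative table type and upsert procedure for an ADF Copy activity
-- with a Stored Procedure sink (all names are hypothetical).
CREATE TYPE dbo.OrderDocType AS TABLE
(
    OrderId  INT          NOT NULL PRIMARY KEY,
    Payload  XML          NULL,
    Modified DATETIME2(3) NOT NULL
);
GO
CREATE OR ALTER PROCEDURE dbo.usp_UpsertOrderDoc
    @rows dbo.OrderDocType READONLY
AS
BEGIN
    SET NOCOUNT ON;

    -- Update existing rows first, then insert the new ones.
    UPDATE t
       SET t.Payload  = s.Payload,
           t.Modified = s.Modified
      FROM dbo.OrderDoc AS t
      JOIN @rows AS s ON s.OrderId = t.OrderId;

    INSERT INTO dbo.OrderDoc (OrderId, Payload, Modified)
    SELECT s.OrderId, s.Payload, s.Modified
      FROM @rows AS s
     WHERE NOT EXISTS (SELECT 1 FROM dbo.OrderDoc AS t WHERE t.OrderId = s.OrderId);
END
```

In the Copy activity sink, the stored procedure name, the table type name, and the table type parameter name are then mapped to these objects.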
Customers modernizing their Oracle, Mainframe/Midrange Db2, Sybase, and other databases to Azure SQL often use Azure Data Factory (ADF) for data transfer. For large datasets, an initial copy is made a few days prior, followed by incremental data copies during cutover. The ADF Copy activity's Upsert method is common for these incremental transfers, but it fails for tables that contain XML columns. This blog explains how to use Stored Procedure as the sink behavior in the Copy activity to perform the upsert into Azure SQL Database. The same method can be used when the target is Azure SQL Managed Instance or SQL Server on an Azure VM.

Azure SQL Managed Instance – Sync Agent Jobs and Logins in Failover Groups
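For SQL-authenticated logins, the essence of the login-sync workaround is recreating each login on the secondary with the same SID, so that the database users replicated by the failover group still map correctly after failover. A sketch with a hypothetical login name:

```sql
-- On the primary: capture the login's SID (login name is illustrative).
SELECT name, CONVERT(NVARCHAR(MAX), sid, 1) AS sid_hex
FROM sys.server_principals
WHERE name = N'app_login' AND type = 'S';

-- On the secondary: recreate the login with the same SID so the
-- replicated database users resolve to it after failover.
CREATE LOGIN [app_login]
WITH PASSWORD = N'<SamePassword>',
     SID = 0x0105...;  -- paste the sid_hex value captured above
```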
In an Azure SQL Managed Instance setup with a failover group, the schema, data, and database-level users are always synced between the primary and secondary instances. As of October 2021, there is no sync mechanism for SQL Server Agent jobs or server logins/roles, because system databases are not replicated. This document provides guidance and scripts to implement a workaround that keeps Agent jobs and server logins/roles synced between the primary and secondary instances.

Troubleshooting page-related performance issues in Azure SQL
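One documented alternative available in Azure SQL (and SQL Server 2019+) is sys.dm_db_page_info, which exposes page header information without DBCC PAGE. A common pattern combines it with sys.fn_PageResCracker to identify the page behind a waiting request:

```sql
-- Identify the page behind PAGELATCH/PAGEIOLATCH waits and inspect its header.
SELECT r.session_id,
       r.wait_type,
       pi.database_id,
       pi.file_id,
       pi.page_id,
       pi.object_id,
       pi.index_id,
       pi.page_type_desc
FROM sys.dm_exec_requests AS r
CROSS APPLY sys.fn_PageResCracker(r.page_resource) AS prc
CROSS APPLY sys.dm_db_page_info(prc.db_id, prc.file_id, prc.page_id, 'DETAILED') AS pi
WHERE r.page_resource IS NOT NULL;
```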
Azure SQL is a family of managed, secure, and intelligent products that use the SQL Server database engine in the Azure cloud. Although Azure SQL is built on the familiar SQL Server engine, there are some differences, such as the availability of certain diagnostic commands like DBCC PAGE. DBCC PAGE is very useful in SQL Server for troubleshooting and inspecting the internal structure of data pages, but it is not supported in Azure SQL due to differences in the underlying infrastructure and management approach. This limitation can present challenges for database administrators and developers who depend on DBCC PAGE for troubleshooting. Nevertheless, Azure SQL provides alternative methods and tools that still allow effective troubleshooting. This article explores these alternatives, though they do not fully replace DBCC PAGE.

Recovering and Validating Data After Unexpected SQL Server Failovers
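A useful first validation step after an unplanned failover is comparing hardened LSNs across replicas to gauge how much work on the old primary may never have replicated. A sketch using the availability group DMVs:

```sql
-- Compare replication progress per database replica; a gap between the old
-- primary's last_hardened_lsn and the new primary's indicates unreplicated work.
SELECT ag.name AS ag_name,
       ar.replica_server_name,
       drs.database_id,
       drs.synchronization_state_desc,
       drs.last_hardened_lsn,
       drs.last_commit_time
FROM sys.dm_hadr_database_replica_states AS drs
JOIN sys.availability_replicas AS ar ON ar.replica_id = drs.replica_id
JOIN sys.availability_groups  AS ag ON ag.group_id  = drs.group_id
ORDER BY ag.name, ar.replica_server_name;
```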
SQL Server geographic replication using distributed availability groups (DAGs) provides an extra layer of data protection over local synchronous availability groups (AGs), allowing recovery from catastrophic events that cause complete data center failures. Recovering transactions from the old primary server that were not replicated before the failure requires some extra steps, which are the focus of this blog.

Data Archiving Strategies for SQL Server
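The simplest pruning technique in this space is deleting in small batches so that log growth and blocking stay manageable. A sketch with a hypothetical table and retention period:

```sql
-- Batched purge: delete old rows in chunks to limit log growth and blocking.
-- dbo.AuditLog and the 2-year retention are illustrative.
DECLARE @batch INT = 5000;

WHILE 1 = 1
BEGIN
    DELETE TOP (@batch)
      FROM dbo.AuditLog
     WHERE LoggedAt < DATEADD(YEAR, -2, GETUTCDATE());

    IF @@ROWCOUNT < @batch BREAK;   -- last partial batch removed; done

    WAITFOR DELAY '00:00:01';       -- brief pause to let other workloads through
END
```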
As databases grow, keeping them performing well becomes increasingly difficult. Pruning data that is no longer required is often a better strategy than scaling up. This blog explores, at a high level, different techniques to remove or archive data. Some techniques depend on the flavor of SQL Server you are running (the box product, Azure SQL DB/Hyperscale, or Azure SQL Managed Instance), but others apply to all.

Part 1 - Azure SQL DB Hyperscale Table Partitioning - Best Practices & Recommendations
Implementing table partitioning on an exceptionally large table in Azure SQL Database Hyperscale is not trivial, due to the large data movement operations involved and the potential downtime needed to accomplish them efficiently. To help customers implement partitioning on these large tables, this two-part article goes over the details. Part 1 covers table partitioning fundamentals alongside related database features and concepts, in preparation for implementing partitioning on an actual table, which is covered in Part 2.
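The partitioning fundamentals discussed in Part 1 always start from the same two building blocks: a partition function defining the boundary values and a partition scheme mapping partitions to filegroups. A minimal sketch with illustrative names and monthly boundaries:

```sql
-- Monthly RANGE RIGHT partitioning on an order-date column (names illustrative).
CREATE PARTITION FUNCTION pf_OrderDate (date)
AS RANGE RIGHT FOR VALUES ('2024-01-01', '2024-02-01', '2024-03-01');

-- Hyperscale exposes a single PRIMARY filegroup, so map every partition to it.
CREATE PARTITION SCHEME ps_OrderDate
AS PARTITION pf_OrderDate ALL TO ([PRIMARY]);

CREATE TABLE dbo.Orders
(
    OrderId   BIGINT IDENTITY NOT NULL,
    OrderDate DATE            NOT NULL,
    Amount    DECIMAL(12, 2)  NOT NULL,
    CONSTRAINT PK_Orders PRIMARY KEY CLUSTERED (OrderId, OrderDate)
) ON ps_OrderDate (OrderDate);
```

Note that the partitioning column (OrderDate here) must be part of the clustered key, which is one of the design considerations the article discusses.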