When trying to execute a long-running SQL query or stored procedure using the Logic App SQL connector, you will most likely hit the connector's 110-second execution timeout limit: SQL Server - Connectors | Microsoft Docs
To overcome this timeout limit, there are multiple workarounds using Azure services; one of them is to use the Azure Automation service to execute a PowerShell script that calls the SQL query or stored procedure.
In this blog, we will see how to execute a long-running SQL stored procedure from a Logic App with the help of an Automation account.
Automation Account Implementation:
1. First, create an Automation account: Create a standalone Azure Automation account | Microsoft Docs
2. After the Automation account is created, click on 'Credentials' in the left menu pane to store the SQL Server login credentials (the SQL username and password); this credential asset will be passed as the SqlCredential parameter of the PowerShell script.
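If you prefer to script this step instead of using the portal, a minimal sketch using the Az.Automation module is shown below; it assumes you are already signed in with Connect-AzAccount, and the resource group, Automation account, and credential asset names are placeholders:
# Prompt for the SQL username and password
$sqlLogin = Get-Credential
# Store them as a credential asset in the Automation account (placeholder names)
New-AzAutomationCredential `
    -ResourceGroupName "myResourceGroup" `
    -AutomationAccountName "myAutomationAccount" `
    -Name "SqlServerLogin" `
    -Value $sqlLogin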
3. Then click on 'Runbooks' to create a new runbook under the Automation account; this is where we add the PowerShell script to execute. Add a name for the runbook, ensure the Runbook type is set to PowerShell Workflow, and then click on the Create button.
4. After the runbook is created, click on the Edit icon to add the PowerShell script.
5. Inside the PowerShell editor, paste the following code. Here we use ADO.NET objects to create a SQL connection and execute the stored procedure; to ensure the command object does not time out, CommandTimeout is set to 0.
workflow ExecuteSQLStoredProcedure
{
    param(
        [parameter(Mandatory=$True)]
        [string] $SqlServer,

        [parameter(Mandatory=$False)]
        [int] $SqlServerPort = 1433,

        [parameter(Mandatory=$True)]
        [string] $Database,

        [parameter(Mandatory=$True)]
        [string] $Sproc,

        [parameter(Mandatory=$True)]
        [PSCredential] $SqlCredential
    )

    # Get the username and password from the SQL Credential
    $SqlUsername = $SqlCredential.UserName
    $SqlPass = $SqlCredential.GetNetworkCredential().Password

    inlinescript {
        # Define the connection to the SQL Database
        $SQLConn = New-Object System.Data.SqlClient.SqlConnection("Server=tcp:$using:SqlServer,$using:SqlServerPort;Database=$using:Database;User ID=$using:SqlUsername;Password=$using:SqlPass;Trusted_Connection=False;Encrypt=True;Connection Timeout=30;")

        # Open the SQL connection
        $SQLConn.Open()

        # Define the SQL command to run; CommandTimeout = 0 means the command never times out
        $Cmd = New-Object System.Data.SqlClient.SqlCommand
        $Cmd.Connection = $SQLConn
        $Cmd.CommandTimeout = 0
        $Cmd.CommandType = [System.Data.CommandType]::StoredProcedure
        $Cmd.CommandText = $using:Sproc
        $Cmd.ExecuteNonQuery()

        # Close the SQL connection
        $SQLConn.Close()
    }
}
6. Then save the code and make sure to publish it after saving.
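As an alternative to creating and publishing the runbook in the portal, you can import and publish it with the Az.Automation module. A minimal sketch, assuming the workflow above is saved locally as ExecuteSQLStoredProcedure.ps1 and using the same placeholder names as before:
# Import the local .ps1 file as a PowerShell Workflow runbook
# (the runbook name is taken from the file name, which must match the workflow name)
Import-AzAutomationRunbook `
    -ResourceGroupName "myResourceGroup" `
    -AutomationAccountName "myAutomationAccount" `
    -Type PowerShellWorkflow `
    -Path ".\ExecuteSQLStoredProcedure.ps1"

# Publish the imported runbook so it can be started by the Logic App
Publish-AzAutomationRunbook `
    -ResourceGroupName "myResourceGroup" `
    -AutomationAccountName "myAutomationAccount" `
    -Name "ExecuteSQLStoredProcedure"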
7. To add a Logic App to execute the Runbook:
- After the trigger, search for 'automation' and select the 'Create job' action of the Azure Automation connector. Create the API connection by providing the subscription, resource group, and Automation account, and select 'Yes' for the 'Wait for Job' parameter so that the action waits for the job to finish before completing.
- Then click on 'Add new parameter', select 'Runbook', and choose the runbook you created for executing the SQL stored procedure. Once you select the runbook, the parameters defined in the PowerShell script are shown. 'Runbook Parameter SQL Server Port' uses the default SQL Server TCP port 1433 (which we specified in the PowerShell script); provide values for 'Runbook Parameter Database', 'Runbook Parameter SQL Server', 'Runbook Parameter SQL Credential', and 'Runbook Parameter Sproc' (the stored procedure name).
- After the 'Create job' action, to get the results of the SQL stored procedure, add the 'Get job output' action of the Azure Automation connector. Specify the subscription, resource group, and Automation account (the same Automation account in which you are creating the job), and use the 'Job ID' dynamic content from the 'Create job' action.
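If you want to verify the runbook outside of the Logic App, a rough PowerShell equivalent of the 'Create job' and 'Get job output' actions is sketched below. All names are placeholders; note that a PSCredential runbook parameter is supplied as the name of the credential asset created in step 2.
# Start a job for the runbook (roughly what the 'Create job' action does)
$job = Start-AzAutomationRunbook `
    -ResourceGroupName "myResourceGroup" `
    -AutomationAccountName "myAutomationAccount" `
    -Name "ExecuteSQLStoredProcedure" `
    -Parameters @{
        SqlServer     = "myserver.database.windows.net"
        Database      = "myDatabase"
        Sproc         = "dbo.MyLongRunningProcedure"
        SqlCredential = "SqlServerLogin"   # name of the credential asset from step 2
    }

# Wait for the job to finish (the Logic App does this when 'Wait for Job' is set to Yes)
do {
    Start-Sleep -Seconds 30
    $job = Get-AzAutomationJob -Id $job.JobId `
        -ResourceGroupName "myResourceGroup" -AutomationAccountName "myAutomationAccount"
} while ($job.Status -notin 'Completed', 'Failed', 'Stopped', 'Suspended')

# Read the Output stream (roughly what the 'Get job output' action returns)
Get-AzAutomationJobOutput -Id $job.JobId -Stream Output `
    -ResourceGroupName "myResourceGroup" -AutomationAccountName "myAutomationAccount"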
Testing:
To confirm that the Logic App and the SQL stored procedure do not time out, run the Logic App; the output of the stored procedure will be shown in the 'Get job output' action.
Note: To return a record set as JSON from the PowerShell script, replace the code $Cmd.ExecuteNonQuery() with the following:
$reader = $Cmd.ExecuteReader()
$table = New-Object System.Data.DataTable
$table.Load($reader)

#-- Write the data to the job output stream as JSON --#
$table | Select-Object $table.Columns.ColumnName | ConvertTo-Json
If you return a very large record set, you may exceed the memory allocated to the runbook job; in that scenario, batching the results will help, as sketched below.
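One way to batch is to stream rows from the data reader and emit them in fixed-size chunks instead of loading the whole record set into a single DataTable. This is a rough sketch only, and the batch size of 500 is an arbitrary placeholder:
$reader = $Cmd.ExecuteReader()
$batchSize = 500   # arbitrary batch size for illustration
$rows = New-Object 'System.Collections.Generic.List[object]'

while ($reader.Read()) {
    # Copy the current row into a PSCustomObject keyed by column name
    $row = @{}
    for ($i = 0; $i -lt $reader.FieldCount; $i++) {
        $row[$reader.GetName($i)] = $reader.GetValue($i)
    }
    $rows.Add([pscustomobject]$row)

    # Emit a batch of rows as JSON once the batch size is reached
    if ($rows.Count -ge $batchSize) {
        $rows | ConvertTo-Json
        $rows.Clear()
    }
}

# Emit any remaining rows
if ($rows.Count -gt 0) {
    $rows | ConvertTo-Json
}
$reader.Close()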
References:
- Azure Automation Connector document: Azure Automation - Connectors | Microsoft Docs