First published on MSDN on May 23, 2018
As we continue our march towards General Availability (GA) of SQL Server Integration Services (SSIS) in Azure Data Factory (ADF), we are excited to announce the release of Execute SSIS Package activity as the most direct and code-free method to invoke SSIS package executions within ADF pipelines.
Now ADF customers can simply drag and drop the first-class Execute SSIS Package activity into their ADF pipeline, select an Azure-SSIS Integration Runtime (IR) to run their package, and specify the path to their deployed package in SSISDB:
The Execute SSIS Package activity lets you do the following:
Schedule SSIS package executions on Azure-SSIS IR by triggering runs of ADF pipelines that contain Execute SSIS Package activities. As you may already know, SQL Server on premises and Azure SQL Managed Instance (MI) have SQL Server Agent and MI Agent, respectively, as a first-class SSIS job scheduler. Azure SQL DB, on the other hand, does not have any. Consequently, if you use Azure SQL DB to host SSISDB, you can use ADF as a first-class SSIS job scheduler.
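As a rough sketch of what this scheduling looks like in JSON, an ADF pipeline containing an Execute SSIS Package activity can be paired with a schedule trigger that runs it daily. The pipeline, trigger, and IR names and the package path below are illustrative placeholders, not values from this post:

```json
{
    "name": "RunMyPackagePipeline",
    "properties": {
        "activities": [
            {
                "name": "Run SSIS Package",
                "type": "ExecuteSSISPackage",
                "typeProperties": {
                    "packageLocation": {
                        "packagePath": "MyFolder/MyProject/MyPackage.dtsx"
                    },
                    "connectVia": {
                        "referenceName": "MyAzureSsisIR",
                        "type": "IntegrationRuntimeReference"
                    }
                }
            }
        ]
    }
}
```

A schedule trigger then invokes the pipeline on a recurrence, for example once a day:

```json
{
    "name": "DailyRunTrigger",
    "properties": {
        "type": "ScheduleTrigger",
        "typeProperties": {
            "recurrence": {
                "frequency": "Day",
                "interval": 1,
                "startTime": "2018-05-24T00:00:00Z"
            }
        },
        "pipelines": [
            {
                "pipelineReference": {
                    "referenceName": "RunMyPackagePipeline",
                    "type": "PipelineReference"
                }
            }
        ]
    }
}
```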
Modernize your ETL (Extract – Transform – Load) workflows by chaining/grouping Execute SSIS Package activities with other activities within ADF pipelines. For example, you can create Web activities with webhooks to an Azure Automation PowerShell (PSH) runbook and chain an Execute SSIS Package activity between them to start and stop your Azure-SSIS IR on demand/just in time, before and after your package execution.
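A minimal sketch of such a chained pipeline, assuming two Azure Automation webhooks (the webhook URLs, activity names, package path, and IR reference are all placeholders), could look like this in JSON:

```json
{
    "name": "StartRunStopPipeline",
    "properties": {
        "activities": [
            {
                "name": "Start Azure-SSIS IR",
                "type": "WebActivity",
                "typeProperties": {
                    "url": "https://<webhook-url-for-start-runbook>",
                    "method": "POST",
                    "body": "{}"
                }
            },
            {
                "name": "Run SSIS Package",
                "type": "ExecuteSSISPackage",
                "dependsOn": [
                    { "activity": "Start Azure-SSIS IR", "dependencyConditions": [ "Succeeded" ] }
                ],
                "typeProperties": {
                    "packageLocation": { "packagePath": "MyFolder/MyProject/MyPackage.dtsx" },
                    "connectVia": { "referenceName": "MyAzureSsisIR", "type": "IntegrationRuntimeReference" }
                }
            },
            {
                "name": "Stop Azure-SSIS IR",
                "type": "WebActivity",
                "dependsOn": [
                    { "activity": "Run SSIS Package", "dependencyConditions": [ "Succeeded" ] }
                ],
                "typeProperties": {
                    "url": "https://<webhook-url-for-stop-runbook>",
                    "method": "POST",
                    "body": "{}"
                }
            }
        ]
    }
}
```

The `dependsOn` entries are what chain the activities, so the package only runs after the IR has started and the IR only stops after the package finishes.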
Complement your ELT (Extract – Load – Transform) workflows by splicing/injecting built-in SSIS connectors, tasks, and data transformations within ADF pipelines. If there is no built-in SSIS component that suits your purpose, you can always use custom/open-source/third-party components instead. In short, you can mix and match various SSIS and ADF components to get the best of both the ETL and ELT worlds, completing/enhancing your data integrations along the way.
If you use SSIS project/package parameters in your packages, you can assign values to them when running Execute SSIS Package activities within ADF pipelines in the following manners:
Create SSIS execution environments in SSISDB using SQL Server Management Studio (SSMS), with variables whose values correspond to your project/package parameters. Configure your SSIS packages in SSMS to associate those environment variables with your project/package parameters. You can then switch between those environments when running Execute SSIS Package activities by specifying different paths to those environments in SSISDB on the Settings tab of the ADF User Interface (UI)/app (see the above screenshot), essentially assigning different environment variable values to your project/package parameters.
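In the activity's JSON, this environment selection boils down to an environment path specified alongside the package path. The folder, project, package, and environment names below are placeholders:

```json
"typeProperties": {
    "packageLocation": {
        "packagePath": "MyFolder/MyProject/MyPackage.dtsx"
    },
    "environmentPath": "MyFolder/TestEnvironment"
}
```

Pointing `environmentPath` at a different environment, e.g. a production one, swaps in that environment's variable values for the same package.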
Assign values/expressions/functions that can refer to ADF system variables to your project/package parameters in JSON format on the Advanced tab of the ADF UI/app. For example, you can assign ADF pipeline parameters to your SSIS project/package parameters:
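As a sketch, such an assignment in the activity's JSON maps each SSIS parameter name to a value that can be an ADF expression. The parameter names below are hypothetical; `@pipeline().parameters.SourceFolder` references an ADF pipeline parameter and `@pipeline().TriggerTime` an ADF system variable:

```json
"typeProperties": {
    "projectParameters": {
        "SourceFolder": { "value": "@pipeline().parameters.SourceFolder" }
    },
    "packageParameters": {
        "RunDate": { "value": "@pipeline().TriggerTime" }
    }
}
```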
In the near future, we will provide more direct and code-free methods on ADF UI/app to assign values to SSIS project/package parameters, connection managers, and property overrides just like using SSMS.
Speaking of SSMS, once you run your pipelines containing Execute SSIS Package activities, you can monitor the pipeline runs on the ADF UI/app, obtain the relevant SSISDB operation/execution IDs from the output of your activity runs, and then check more comprehensive execution logs/error messages in SSMS:
One more thing: we have also released a new scheduling feature in SSMS that will fast-track the introduction of ADF capabilities to SSIS users. This feature lets you schedule SSIS package executions on Azure-SSIS IR by automatically generating ADF pipelines, Execute SSIS Package activities within those pipelines, and ADF triggers that schedule the pipeline runs, all through a familiar UI just like using SQL Server Agent on premises:
The auto-generated ADF pipelines/activities/triggers can then be monitored/edited/extended on ADF UI/app.
We hope you will find these new features useful for modernizing and extending your ETL/ELT workflows, so please do not hesitate to contact us if you have any feedback/questions/issues and we will follow up ASAP. Thank you as always for your support.