Azure PaaS Blog

Create an ADF event trigger that runs an ADF pipeline in response to Azure Storage events.

RidhimaSinha
Microsoft
May 13, 2022

The Storage Event Trigger in Azure Data Factory is the building block of an event-driven ETL/ELT architecture (EDA). Data Factory's native integration with Azure Event Grid lets you trigger processing pipelines based on certain events. Currently, Storage Event Triggers support the Blob Created and Blob Deleted events on Azure Data Lake Storage Gen2 and General-purpose version 2 storage accounts.

 

Event-driven architecture (EDA) is a common data integration pattern that involves the production, detection, consumption of, and reaction to events. Data integration scenarios often require customers to trigger pipelines based on events happening in a storage account, such as the arrival or deletion of a file in an Azure Blob Storage account. Data Factory and Synapse pipelines natively integrate with Azure Event Grid, which lets you trigger pipelines on such events.

 

This blog demonstrates how we can use ADF triggers to run an ADF pipeline in response to Azure Storage events.

 

Prerequisites:

  • An ADLS Gen2 storage account or GPv2 Blob Storage Account
    Create a storage account - Azure Storage | Microsoft Docs
  • The integration described in this article depends on Azure Event Grid. Make sure that your subscription is registered with the Event Grid resource provider. For more info, see Resource providers and types. You must be able to do the Microsoft.EventGrid/eventSubscriptions/* action. This action is part of the EventGrid EventSubscription Contributor built-in role.

    To do so, the Resource Provider 'Microsoft.EventGrid' needs to be registered in the Subscription as per the below screenshot: 

 

  • If the blob storage account resides behind a private endpoint and blocks public network access, you need to configure network rules to allow communication from blob storage to Azure Event Grid. You can either grant storage access to trusted Azure services, such as Event Grid, following the Storage documentation, or configure private endpoints for Event Grid that map to your VNet address space, following the Event Grid documentation.
     
  • The Storage Event Trigger currently supports only Azure Data Lake Storage Gen2 and General-purpose version 2 storage accounts.

 

  • To create a new or modify an existing Storage Event Trigger, the Azure account used to log into the service and publish the storage event trigger must have appropriate role-based access control (Azure RBAC) permissions on the storage account.

  • The service principal for the Azure Data Factory does not need special permissions on either the storage account or Event Grid.

 

Demo:                                    

 

Step 1:

Create an ADF resource on the Azure Portal. If you are new to ADF, please refer to this link on how to create one:
Create an Azure data factory using the Azure Data Factory UI - Azure Data Factory | Microsoft Docs

 

Step 2:

Once the Data Factory is created, navigate to Azure Data Factory Studio from the Overview section:

 

 

Step 3:

Once we land on the ADF portal, create a linked service for the storage account as per the below screenshots:

 

Once you click on ‘+ New’, we first need to select the data store. If you are using a GPv2 Blob Storage account, use ‘Azure Blob Storage’; if you are working with an ADLS Gen2 account, use ‘Azure Data Lake Storage Gen2’. I’ve used Gen2 in this demo.



After selecting the data store, fill in the required details as below:

 

 

Once the Test Connection is successful, click on ‘Create’. This will create the storage account linked service.
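If you prefer to see what the UI produces, an ADLS Gen2 linked service corresponds roughly to a JSON definition along these lines (the name, URL placeholder, and key-based authentication are illustrative assumptions; other authentication types such as managed identity work too):

```json
{
    "name": "AzureDataLakeStorage1",
    "properties": {
        "type": "AzureBlobFS",
        "typeProperties": {
            "url": "https://<storage account name>.dfs.core.windows.net",
            "accountKey": {
                "type": "SecureString",
                "value": "<account key>"
            }
        }
    }
}
```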

 

Step 4:

Creating Input and Output Datasets

In this demo, we will create a simple ADF pipeline that copies an ‘emp.txt’ file from one folder (‘input’) to another folder (‘output’) within a container. Hence, we need input and output datasets in ADF that map to the blobs in the input and output folders. So let’s create InputDataset and OutputDataset:

 

Go to ‘Author’ on ADF portal and click on ‘New Dataset’ as per below screenshot:

 

Then click on ‘Ok’.

Similarly, you can create the OutputDataset as below:
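For reference, the InputDataset created above corresponds roughly to a JSON definition like the following (the Binary dataset type, linked service name, and container placeholder are assumptions based on this demo; OutputDataset is the same with `folderPath` set to `output` and no fixed file name):

```json
{
    "name": "InputDataset",
    "properties": {
        "type": "Binary",
        "linkedServiceName": {
            "referenceName": "AzureDataLakeStorage1",
            "type": "LinkedServiceReference"
        },
        "typeProperties": {
            "location": {
                "type": "AzureBlobFSLocation",
                "fileSystem": "<container name>",
                "folderPath": "input",
                "fileName": "emp.txt"
            }
        }
    }
}
```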

 

 

Step 5:

Create the ADF pipeline to copy data from ‘input’ to ‘output’ folder as per the below screenshots:

 

 

Give the pipeline a name and drag the ‘Copy Data’ activity onto the designer surface. Name the activity:

 

 

Select Source and Sink as below:

 

 

 

Now ‘Validate’ the pipeline and ‘Debug’ to check whether it works as expected.
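Behind the UI, the pipeline built above corresponds roughly to this JSON definition (the pipeline and activity names and the Binary source/sink types are assumptions based on this demo):

```json
{
    "name": "CopyPipeline",
    "properties": {
        "activities": [
            {
                "name": "CopyEmpFile",
                "type": "Copy",
                "inputs": [
                    { "referenceName": "InputDataset", "type": "DatasetReference" }
                ],
                "outputs": [
                    { "referenceName": "OutputDataset", "type": "DatasetReference" }
                ],
                "typeProperties": {
                    "source": { "type": "BinarySource" },
                    "sink": { "type": "BinarySink" }
                }
            }
        ]
    }
}
```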

 

Step 6:

Once the pipeline is validated, let’s create a BlobCreated event trigger as per the below screenshot:

 

 

Choose Trigger--> New:
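The trigger configured in this dialog corresponds roughly to the following JSON definition (the trigger name, container and subscription placeholders, and the `.txt` path filter are assumptions based on this demo):

```json
{
    "name": "BlobCreatedTrigger",
    "properties": {
        "type": "BlobEventsTrigger",
        "typeProperties": {
            "blobPathBeginsWith": "/<container name>/blobs/input/",
            "blobPathEndsWith": ".txt",
            "ignoreEmptyBlobs": true,
            "events": [ "Microsoft.Storage.BlobCreated" ],
            "scope": "/subscriptions/<subscription id>/resourceGroups/<resource group>/providers/Microsoft.Storage/storageAccounts/<storage account name>"
        },
        "pipelines": [
            {
                "pipelineReference": {
                    "referenceName": "CopyPipeline",
                    "type": "PipelineReference"
                }
            }
        ]
    }
}
```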

 

 

After clicking on ‘Continue’, you will get a ‘Data Preview’. This shows the blobs that match the event trigger filters, so you can verify whether the filter is correct. Click ‘Continue’ and you will see the ‘Parameters’ section. This is helpful when you want to pass parameters to the pipeline. Skip this step, as we are not using parameters in this demo, and click ‘Ok’.
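The matching that Data Preview performs can be sketched in plain Python: the begins-with/ends-with filters are evaluated against the path `/<container>/blobs/<blob name>`. A minimal sketch (the container name `sample` is an assumption for this demo):

```python
def matches_trigger(container: str, blob_name: str,
                    begins_with: str = "", ends_with: str = "") -> bool:
    """Approximate the storage event trigger's path filters.

    The filters are checked against the path '/<container>/blobs/<blob name>',
    which is the form used by blobPathBeginsWith / blobPathEndsWith.
    """
    path = f"/{container}/blobs/{blob_name}"
    return path.startswith(begins_with) and path.endswith(ends_with)

# The demo trigger: fire for .txt blobs landing in the 'input' folder.
print(matches_trigger("sample", "input/emp.txt",
                      begins_with="/sample/blobs/input/", ends_with=".txt"))   # True
print(matches_trigger("sample", "output/emp.txt",
                      begins_with="/sample/blobs/input/", ends_with=".txt"))   # False
```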

 

Now we have all the components in place, and the next step is to ‘Publish’ all the changes.

 

Once publish is completed, let’s test the trigger.

 

Upload the file ‘emp.txt’ to the input folder; this fires the BlobCreated event, which in turn fires the ADF trigger.

 

 

File copied to output folder:

 

 

ADF Trigger run:

 

 

Pipeline run:

 

 

As we see from the result screenshots above, the BlobCreated trigger works as expected and runs the attached ADF pipeline.

 

Similarly, a trigger for the BlobDeleted event can be created.

 

Reference link:

Create event-based triggers - Azure Data Factory & Azure Synapse | Microsoft Docs

 

Hope this helps!

 

  • himanshu1711:

    Hi, Ridhima

     

    Thank you for the explanation.

     

    I already have an Event Grid resource in my subscription. The problem I am facing is that when I create a new storage account trigger, a new Event Grid resource is created in my subscription instead of allowing me to select an existing Event Grid resource.

     

    Is there a way to select an existing Event Grid resource already in the Subscription instead of Data Factory automatically creating a new one for each Storage Trigger?

    Thank you,

    Himanshu Punjabi

  • JAugustus:

    I'm currently moving everything away from Data Factory, as I've had several production outages that Microsoft has apologised for, but never explained. I can't rely on it actually working in production.

  • Rezaal860:

    Great article… can you expand on the comment you made on private networking? How does the native triggering work when the storage account is behind a private endpoint? Based on your recommendation, we need to create a VNet-injected Event Grid topic, correct? How does the storage account know to publish the event to that topic? As far as I know, private eventing is not supported in storage accounts as of writing this comment.