Forum Discussion

AjayGopu
Copper Contributor
May 10, 2024

How to Achieve a Couple of Functionalities in Azure Data Factory Dynamically

Hi Team.

 

I have the below scenarios as part of my business requirement. These requirements have to be achieved dynamically using Azure Data Factory Data Flows or Pipelines.

 

Note: The requirement is not to use Function Apps, Databricks, or any other API calls.

 

I have a blob storage container that constantly receives CSV files with varying headers (the headers and the content change from file to file). I want to convert these CSV files to Parquet files after performing a couple of validations, as described below.

 

  1. Need to loop through each file in the source blob folder.
  2. Need to get the count of rows inside each file dynamically (a pipeline Lookup can't be used, as the data runs into millions of rows; a sketch of the counting logic I have in mind follows the example below).
  3. Use that count as conditional logic to decide whether to continue to the next step.
  4. In the next step I need to validate the CSV data to find any invalid rows. For example, I'm using the comma (,) as the column delimiter in my dataset, so any string that is not enclosed in double quotes ("") and has a comma (,) within it will be treated as a new column without any header column name. Rows like this should be treated as invalid and moved to another blob storage folder as a ".csv" file (a sketch of this rule also follows the example below).
  5. For example, the source CSV file may look like this (the first data row has four values against the three-column header, so it is the invalid one):

TestColumn1,TestColumn2,TestColumn3
BUDGETS,-1431654712,jgdsgfj,sdfds
BUDGETS,-1431654712,
BUDGETS,-1431654712,
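
To make the rule in step 4 concrete, here is a minimal sketch of the logic in Python (the function name and file path are my own, purely for illustration; the actual implementation has to stay within Data Flows or pipelines):

import csv

def split_valid_invalid(path):
    # Separate rows whose field count matches the header from rows where an
    # unquoted comma produced an extra, unnamed column (the rule in step 4).
    valid, invalid = [], []
    with open(path, newline="", encoding="utf-8") as f:
        reader = csv.reader(f, delimiter=",", quotechar='"')
        header = next(reader)
        for row in reader:
            # A comma inside a value that is not wrapped in double quotes
            # yields more fields than the header defines, so the whole row
            # is treated as invalid.
            (valid if len(row) == len(header) else invalid).append(row)
    return header, valid, invalid

# With the sample above, the first data row has 4 fields against the
# 3-column header, so it lands in the invalid list; the other two rows
# keep 3 fields (the last one empty) and stay valid.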

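And this is the count-then-branch logic I'm after in steps 2 and 3, again only sketched in Python to show the intent (the file name and the non-zero threshold are assumptions on my part):

import csv

def count_data_rows(path):
    # Stream the file and count data rows without loading them into memory,
    # since the files hold millions of rows and a pipeline Lookup is ruled out.
    with open(path, newline="", encoding="utf-8") as f:
        reader = csv.reader(f)
        next(reader, None)             # skip the header row
        return sum(1 for _ in reader)

row_count = count_data_rows("source.csv")   # hypothetical file name
if row_count > 0:                            # assumed threshold for step 3
    print("continue to the validation step")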