Forum Discussion
How can I pass multiple connected variables from ADF to my Databricks notebook?
Hi,
I need three connected variables to use in my Databricks notebook.
This is the context for each of the variables I need:
filepath: root/sid=test1/foldername=folder1/
sid: path identifier; this is needed so only the correct files in the folder are loaded
foldername: every "sid" may have multiple folders, so I need to identify the foldername. I will also use this to partition my data when removing duplicates
tablename: this is the target table where I need to load the files
Basically, two of the variables are predefined and stored in a CSV file:
sid   | tablename
test1 | test1_tbl
test2 | test2_tbl
My source, which is in blob storage, looks like this:
And this is what I want for my variables:
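To make it concrete, this is roughly the logic I imagine ending up with in the notebook once the variables are available. The CSV path, file format, and example values below are only placeholders, not my real setup:

    # Example values for the connected variables (placeholders only).
    sid = "test1"
    foldername = "folder1"

    # filepath is derived from sid and foldername.
    filepath = f"root/sid={sid}/foldername={foldername}/"

    # tablename comes from the predefined CSV mapping (path is a placeholder).
    mapping = spark.read.option("header", True).csv("root/config/sid_tablename.csv")
    tablename = mapping.filter(mapping.sid == sid).first()["tablename"]

    # Load only the files under this sid/foldername and write them to the target table.
    df = spark.read.parquet(filepath)   # file format here is just an example
    df.write.mode("append").saveAsTable(tablename)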
Looking forward to your help.
- dileepkaranki (Copper Contributor)
zll_0091, you can pass them to Databricks as parameters, and inside the Databricks notebook you can read them using dbutils.
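For example, inside the notebook something like this should work, assuming the ADF Databricks Notebook activity passes base parameters named sid, foldername, and tablename (the parameter names are an assumption; match them to whatever your pipeline defines):

    # dbutils and spark are available by default inside a Databricks notebook.
    # Read the base parameters passed from the ADF Databricks Notebook activity.
    # The widget names must match the parameter names configured in ADF.
    sid = dbutils.widgets.get("sid")
    foldername = dbutils.widgets.get("foldername")
    tablename = dbutils.widgets.get("tablename")

    # Rebuild the connected file path from the two path variables.
    filepath = f"root/sid={sid}/foldername={foldername}/"

    print(sid, foldername, tablename, filepath)

If you want to run the notebook interactively outside ADF, you can first register defaults with dbutils.widgets.text("sid", "test1") and so on, since dbutils.widgets.get raises an error when a widget has neither been defined nor passed in.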