Latest Discussions
Wildcard path in ADF Dataflow
I have a file that comes into a folder daily. The file name contains the current date, and I need to use a wildcard path to use that file as the source for the data flow. The file name always starts with AR_Doc followed by the current date. The file sits inside a folder called `Daily_Files`, and the path is `container/Daily_Files/file_name`. I'm not sure what the wildcard pattern should be. Thank you!
harry619 · Sep 14, 2021 · Copper Contributor · 21K Views · 0 likes · 1 Comment
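
A minimal sketch of what the wildcard could look like in the Data Flow source options, assuming the files are CSVs named like AR_Doc20210914.csv (the .csv extension and the yyyyMMdd date format are assumptions):

```
Wildcard path:  Daily_Files/AR_Doc*.csv
```

If only the current day's file should ever be matched, an alternative is to build the file name dynamically with a pipeline expression such as `@concat('AR_Doc', formatDateTime(utcNow(), 'yyyyMMdd'), '.csv')` and pass it to the data flow as a parameter.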

How to backup and Restore - Azure Mapping Data Flow
Hi All, I have some Mapping Data Flows that I want to back up and restore to another Azure Data Factory. I do not want to export the whole ARM template, just specific Data Flows. Is there a way I can accomplish that other than redoing all the work? Any help would be really appreciated. Thanks
MujtabaTirmazi · May 04, 2020 · Copper Contributor · 8.6K Views · 0 likes · 2 Comments
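
One way to move individual Data Flows without touching the ARM template is the Data Factory REST API: GET the JSON definition from the source factory, then PUT it into the target factory. A sketch, assuming api-version 2018-06-01 and placeholder subscription, resource group, factory, and data flow names:

```
GET https://management.azure.com/subscriptions/{subscriptionId}/resourceGroups/{sourceRG}/providers/Microsoft.DataFactory/factories/{sourceFactory}/dataflows/{dataFlowName}?api-version=2018-06-01

PUT https://management.azure.com/subscriptions/{subscriptionId}/resourceGroups/{targetRG}/providers/Microsoft.DataFactory/factories/{targetFactory}/dataflows/{dataFlowName}?api-version=2018-06-01
(request body: the "properties" object returned by the GET above)
```

Any linked services and datasets the data flow references would need to exist in the target factory as well.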

Access User Properties - Copy Activity
Hi, I am trying to access or pass values from "User Properties" (Copy Activity). Is there any @activity property that can help me get those values? For example, `@activity('copyactivity1').userproperties.source.value`.
MujtabaTirmazi · May 19, 2020 · Copper Contributor · 8K Views · 0 likes · 1 Comment

linked service ADLS Gen2 via Key Vault error The specified account key is invalid.
Hi, sorry but I am stuck. I have an ADLS Gen2 account with a storage firewall. I set up a Key Vault with a secret for the ADLS account containing the connection string from key 1. ADF is allowed on the Key Vault via an access policy. For some reason I get this error: "The specified account key is invalid. Check your ADF configuration. The input is not a valid Base-64 string as it contains a non-base 64 character, more than two padding characters, or an illegal character among the padding characters." This happens on both keys, by the way, and even when I regenerate the keys. I cannot find anything on this anywhere. Any ideas? Thanks
rocksde · Jun 25, 2020 · Copper Contributor · 6.7K Views · 0 likes · 2 Comments
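
For reference, a sketch of an ADLS Gen2 linked service that reads the account key from Key Vault (all names are placeholders). Note that the `accountKey` field expects the bare storage account key, which is Base-64; a secret holding the full connection string would not parse as Base-64 and could produce exactly this error:

```json
{
  "name": "LS_ADLSGen2",
  "properties": {
    "type": "AzureBlobFS",
    "typeProperties": {
      "url": "https://<storageaccount>.dfs.core.windows.net",
      "accountKey": {
        "type": "AzureKeyVaultSecret",
        "store": {
          "referenceName": "<KeyVaultLinkedService>",
          "type": "LinkedServiceReference"
        },
        "secretName": "<secret containing only the account key>"
      }
    }
  }
}
```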

Select text from split function
Hi, hope someone can help (I also hope I can explain this issue). I created a pipeline to bring in a CSV, stick it in blob storage, and then modify it and stick it in a SQL database. But while using a data flow to help tidy the contents up I've come unstuck. I created a derived column to split rdfsLabel, which contains names of stuff in different languages, each separated with a |. The issue is that there's no consistency in what order each language is in, and each time I run the pipeline the order can change from source. Can someone give me a pointer on how to populate a column with the text from the string with @en at the end? Once I get this I can duplicate it for each of the languages, then create another derived column and trim out the language identifiers. I'm hoping it's something really silly that I've missed. Thanks in advance, John
Solved · John Dorrian · Jan 28, 2021 · Brass Contributor · 6.4K Views · 0 likes · 9 Comments
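
A sketch of a derived-column expression in Mapping Data Flow that could pick out the English entry, assuming rdfsLabel looks something like `house@en|maison@fr|Haus@de` with the order varying between runs:

```
find(split(rdfsLabel, '|'), endsWith(#item, '@en'))
```

split breaks the string on the pipes, and find returns the first array item for which the endsWith condition holds. Wrapping the whole thing in `replace(..., '@en', '')` would also trim the language identifier off the result.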

ADF Web Activity Missing Response Headers
I may be overlooking something, but I can't seem to figure out why the ADF Web activity does not include the response headers. Scenario: I'm using the REST API via an ADF Web activity to refresh an AAS model. I can call the Refresh (POST) API successfully, but it doesn't provide the refresh ID in the response. Looking at the docs for AAS POST /Refreshes, it states the refresh ID is included in the response header. However, the output in ADF does not seem to contain that value. As a result, I have to use Refreshes (GET) to get the status of the last set of refreshes and fish out the refresh ID from there. Not ideal, especially since the GET response isn't ordered chronologically.
jonesb321 · Sep 01, 2021 · Copper Contributor · 6.1K Views · 0 likes · 5 Comments
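
If the Web activity output does surface response headers (the `ADFWebActivityResponseHeaders` property name below is an assumption, not something confirmed in this thread), the refresh ID could in principle be pulled out of the Location header with a pipeline expression along these lines, where `RefreshAAS` is a placeholder activity name:

```
@last(split(activity('RefreshAAS').output.ADFWebActivityResponseHeaders.Location, '/'))
```

The Location header returned by the AAS refresh POST ends with the refresh ID, so taking the last segment of the split URL yields it.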

Auto Create SQL table from Imported CSV in ADF?
Hi All, wondering if it's possible to automatically create a SQL table from an imported CSV in Data Factory? To make things a little more complicated, not all CSVs will have the same headings, or the same number of headings. If so, can anyone point me in the direction of a how-to guide? Case: I've been ingesting CSVs using an HTTP connector in ADF, storing the CSV data in a SQL table (manually created), and then transforming and cleaning said data into a datastore SQL table that is also manually created. I know I'm a little slow to the party, but I've been looking at using parameters and was wondering about pulling the CSVs into blob storage and then transforming from there. Then I'd only need to create one SQL table, and if I could automate the entire process in ADF that would save future me a lot of time. I have another issue, but I'll post that separately as it's a slightly different topic. Thanks for reading, hope someone can point me in the right direction. John
Solved · 5.8K Views · 1 like · 1 Comment
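
If the destination is Azure SQL Database, the Copy activity sink has a table option that creates the destination table from the source schema when it does not already exist. A sketch of the relevant sink fragment (dataset and table names omitted):

```json
"sink": {
  "type": "AzureSqlSink",
  "tableOption": "autoCreate"
}
```

Combined with a parameterized dataset for the table name, each incoming CSV can land in its own automatically created staging table.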

No publish changes detected from collaboration branch
I have a development data factory with GitHub enabled that is tied to Azure DevOps CI/CD for deployment. I:
- created a new feature branch,
- created a new pipeline and tested it,
- created a pull request and merged it into master (everything looks okay at this point),
- hit the publish button in the ADF web tool to deploy to live.
I am getting the message 'no new changes to publish from collaboration branch'. If I switch to live mode, I can clearly see the new pipeline I created in master is not there. What are the proper steps to remedy this? I thought about disconnecting and reconnecting the repo, and I noticed an 'overwrite live mode' button in the git configuration menu. Would this fix it? Could there be any negative side effects?
Solved · Donovan82 · Sep 28, 2022 · Copper Contributor · 5.2K Views · 0 likes · 4 Comments

This request is not authorized to perform this operation using this permission.\nErrorCode=
Hi, I created a linked service (Data Lake) dynamically by passing the URL and access key. It works fine with a Copy Data activity, but with a Data Flow activity I am facing the error below.
DF-SYS-01 at Source 'source1': HEAD (https: StatusCode=403 StatusDescription=This request is not authorized to perform this operation using this permission. ErrorCode= ErrorMessage=
Please help me with this. Thanks in advance, Kishore Makke.
kishoremakke · Sep 01, 2020 · Copper Contributor · 5.1K Views · 1 like · 0 Comments

How to call SSAS tabular cube from Azure Data Factory
Hi, I need to refresh an SSAS tabular cube in an Azure Data Factory pipeline after the ETL process completes. Has someone done this before who can help me with how to refresh a particular SSAS cube from ADF? Thanks, Nakul Garg
nakulgar · Mar 03, 2021 · Copper Contributor · 5.1K Views · 0 likes · 1 Comment
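
If the model is hosted in Azure Analysis Services, one common pattern is a Web activity that calls the asynchronous refresh REST API after the ETL activities finish (region, server, and model names below are placeholders, and the identity calling the API would need admin rights on the model):

```
POST https://<region>.asazure.windows.net/servers/<servername>/models/<modelname>/refreshes

{
  "Type": "Full",
  "CommitMode": "transactional",
  "MaxParallelism": 2,
  "RetryCount": 2
}
```

For an on-premises SSAS instance a different approach would be needed (for example an Automation runbook or a custom activity), since the REST endpoint above is specific to Azure Analysis Services.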
Tags
- Azure Data Factory (141 Topics)
- Azure ETL (36 Topics)
- Copy Activity (33 Topics)
- Azure Data Integration (33 Topics)
- Mapping Data Flows (23 Topics)
- Azure Integration Runtime (21 Topics)
- azure data factory v2 (3 Topics)
- ADF (2 Topics)
- REST (2 Topics)
- azure monitor (2 Topics)