ADF Data Flow Fails with "Path does not resolve to any file" — Dynamic Parameters via Trigger
Hi guys, I'm running into an issue with my Azure Data Factory pipeline triggered by a Blob event. The trigger passes dynamic folderPath and fileName values into a parameterized dataset and mapping data flow. Everything works perfectly when I debug the pipeline manually, or when I trigger the pipeline manually and pass in the folderPath and fileName values directly. However, when the pipeline is triggered automatically by the blob event, the data flow fails with the following error:

Error Message: Job failed due to reason: at Source 'CSVsource': Path /financials/V02/Forecast/ForecastSampleV02.csv does not resolve to any file(s). Please make sure the file/folder exists and is not hidden. At the same time, please ensure special character is not included in file/folder name, for example, name starting with _

I've verified that:
- The blob file exists.
- The trigger fires correctly and passes the parameters.
- The path looks valid.
- The dataset is parameterized correctly with @dataset().folderPath and @dataset().fileName.

I've attached screenshots of:
🔵 00-Pipeline Trigger Configuration On Blob creation
🔵 01-Trigger Parameters
🔵 02-Pipeline Parameters
🔵 03-Data flow Parameters
🔵 04-Data flow Parameters without default value
🔵 05-Data flow CSVsource parameters
🔵 06-Data flow Source Dataset
🔵 07-Data flow Source dataset Parameters
🔵 08-Data flow Source Parameters
🔵 09-Parameters passed to the pipeline from the trigger
🔵 10-Data flow error message

https://primeinnovativetechnologies-my.sharepoint.com/:b:/g/personal/john_primeinntech_com/EYoH5Sm_GaFGgvGAOEpbdXQB7QJFeXvbFmCbZiW85PwrNA?e=0yjeJR

What could be causing the data flow to fail on file path resolution only when triggered, even though the exact same parameters succeed during manual debug runs? Could this be related to:
- Extra slashes or encoding in the trigger output?
- Misuse of @dataset().folderPath and @dataset().fileName in the dataset?
- Limitations in how blob trigger outputs are parsed?

Any insights would be appreciated!
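One frequent cause of exactly this symptom (not confirmed from the screenshots, so treat it as a hypothesis) is that a Blob event trigger emits @triggerBody().folderPath as container/subfolders, while a dataset that already names the container expects only the subfolder part; a hand-typed debug path works, but the trigger-supplied one doubles up or mis-roots the container. In ADF you would fix this with the expression language, but the logic can be sketched in Python; the container name financials and the helper function are assumptions for illustration:

```python
def normalize_trigger_path(folder_path: str, file_name: str, container: str) -> tuple[str, str]:
    """Strip the container prefix and stray slashes from a blob-trigger folderPath.

    Blob event triggers emit folderPath as '<container>/<subfolders>', while a
    dataset that already names the container expects only '<subfolders>'.
    """
    folder = folder_path.strip("/")
    prefix = container.strip("/") + "/"
    if folder.startswith(prefix):
        folder = folder[len(prefix):]
    elif folder == container.strip("/"):
        folder = ""
    return folder, file_name.strip("/")

# Example using the path from the error message (hypothetical container name):
folder, name = normalize_trigger_path(
    "financials/V02/Forecast", "ForecastSampleV02.csv", "financials"
)
print(folder, name)  # V02/Forecast ForecastSampleV02.csv
```

Comparing the parameter values shown in the monitoring view of a triggered run against the values you type during debug (screenshot 09 vs. your manual input) is usually the quickest way to confirm or rule this out.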
Thank you

Edge on iOS, does not download files
Hi Microsoft team, I am using an API request to download a PDF (or a .doc) file from our servers. Most browsers support createObjectURL(blob); however, for IE and Edge we have been using msSaveOrOpenBlob to save the blob data. The iOS Edge browser, however, neither supports msSaveOrOpenBlob nor responds to createObjectURL. What should I do to download a file in iOS Edge?

Thank you,
Koushik Kuppanna

Unable to unzip file using Extract archive to folder
Hi Team, I used a Logic App to download a zip file from a URL and saved it in a blob. Now I want to unzip the file, so I used the "Extract archive to folder" action in the Logic App. But since my zip file is larger than 50 MB, the extraction fails and I am facing the below error:

{
  "status": 413,
  "message": "The file contains 50.306 megabytes which exceeds the maximum 50 megabytes.\r\nclientRequestId: abcd",
  "error": {
    "message": "The file contains 50.306 megabytes which exceeds the maximum 50 megabytes."
  },
  "source": "azureblob-ci.azconn-ci.p.azurewebsites.net"
}

How can I increase this 50 MB threshold so that my action succeeds? Please help.

Regards,
Mitesh Agrawal

Error when creating an index from a data source in Azure Cognitive Search
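The 50 MB cap is a limit of the Azure Blob connector action itself rather than a configurable threshold, so the usual workaround is to move the extraction into code the Logic App calls, for example an Azure Function that pulls the zip from blob storage and unpacks it with a standard library. A minimal sketch of the extraction step only, using Python's zipfile module on an in-memory archive (the blob download/upload calls, e.g. via azure-storage-blob, are omitted and assumed):

```python
import io
import zipfile

def extract_zip_bytes(zip_bytes: bytes) -> dict[str, bytes]:
    """Extract every file member of a zip archive held in memory.

    In a real Azure Function, zip_bytes would be downloaded from blob storage
    and each extracted member uploaded back to a destination container; here
    the members are simply returned as a name -> contents mapping.
    """
    extracted = {}
    with zipfile.ZipFile(io.BytesIO(zip_bytes)) as archive:
        for member in archive.infolist():
            if member.is_dir():
                continue
            extracted[member.filename] = archive.read(member)
    return extracted

# Build a tiny archive in memory to demonstrate the round trip:
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr("folder/report.csv", "a,b\n1,2\n")
files = extract_zip_bytes(buf.getvalue())
print(sorted(files))  # ['folder/report.csv']
```

For archives much larger than available memory, the same approach works by opening the blob as a seekable stream instead of reading all bytes up front.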
I was recently tasked with indexing our Azure Blob Storage account (1.2 TB). I am currently using the free tier of Cognitive Search (50 MB storage limit). The issue I'm having is that after I import data and create an indexer, I get the following:

Import configuration failed, error creating Index
Error creating Index: ""

What causes this issue and how do I fix it? Is it a problem that I'm using the free tier, so the storage isn't enough?

Lesson Learned #25: Export/Import Azure SQL Database using Azure File Service?
First published on MSDN on Apr 03, 2017. In some situations we need to import or export our Azure SQL Database using SqlPackage but, unfortunately, we cannot specify a blob storage path for either the source or the destination file, in case we want to save the file in that storage.

Could a Blob Trigger in a Function "Miss"?
I have a function (2.0, written in C#) that has a blob trigger. In this function, whenever a blob is created (to the tune of about 100,000 blobs every 90 minutes), I inspect the blob properties and write some data to a database (SQL Server) table. I am seeing instances where a blob is created but there is no corresponding row in the table for the blob's properties. Is it possible that the blobs are being written so quickly that the trigger "misses" and fails to fire?
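The polling-based blob trigger works by scanning containers and blob receipts, and Microsoft's documentation notes it can miss or delay blobs at high event rates; the usual mitigations are an Event Grid–based trigger or a periodic reconciliation sweep that finds blobs with no matching database row. The sweep logic is trivial to sketch (in Python here rather than the function's C#; the blob and row listings are placeholder data — in practice they would come from listing the container and querying the table):

```python
def find_unprocessed(blob_names, processed_names):
    """Return blob names that have no corresponding database row.

    blob_names would come from enumerating the storage container, and
    processed_names from a query against the SQL table; both are treated
    as plain iterables of strings here.
    """
    return sorted(set(blob_names) - set(processed_names))

# Hypothetical example:
blobs = ["2024/01/a.json", "2024/01/b.json", "2024/01/c.json"]
rows = ["2024/01/a.json", "2024/01/c.json"]
print(find_unprocessed(blobs, rows))  # ['2024/01/b.json']
```

A timer-triggered function running this comparison on a schedule can then reprocess the missed blobs, turning occasional trigger misses into a bounded delay instead of lost data.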