<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>Azure Data Factory topics</title>
    <link>https://techcommunity.microsoft.com/t5/azure-data-factory/bd-p/AzureDataFactory</link>
    <description>Azure Data Factory topics</description>
    <pubDate>Tue, 28 Apr 2026 02:24:44 GMT</pubDate>
    <dc:creator>AzureDataFactory</dc:creator>
    <dc:date>2026-04-28T02:24:44Z</dc:date>
    <item>
      <title>Web activity failure due to Invoking endpoint failed with HttpStatusCode - 403 -- help?</title>
      <link>https://techcommunity.microsoft.com/t5/azure-data-factory/web-activity-failure-due-to-invoking-endpoint-failed-with/m-p/4497130#M957</link>
      <description>&lt;P&gt;Hi,&lt;BR /&gt;I have an Azure Data Factory (ADF) instance that I am using to create a Pipeline to ingest external (cloud-based) 3rd-party data into my Azure SQL Server database. I am a novice with ADF and have only used it to ingest some external SQL data into my SQL database - it did work.&lt;BR /&gt;The external source I'm attempting to extract from uses an OAuth 2.0 API, and an API is something I've not used before.&lt;BR /&gt;&lt;BR /&gt;Using Postman (never used this software before this attempt), I passed the external source's base_url, client_id, and client_secret, and in return successfully received an access token. This tells me that the base_url, client_id, and client_secret values I passed are correct and accepted by the target source/application.&lt;BR /&gt;&lt;BR /&gt;Feeling encouraged to use the same values in ADF, I first created a Linked Service whose test connection returned successfully - see below. This Linked Service uses the same values as the Postman entry which granted an access token.&lt;/P&gt;&lt;img /&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I then created a Pipeline with a Web activity object within it. The General and User Properties tabs don't have any configuration; only the Settings tab does, which is shown below. Again, the URL, Client ID and Client Secret configured here are the same as those used in Postman (and the Linked Service).&lt;/P&gt;&lt;img /&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I executed the Web activity and it returned a failure - see below.&lt;/P&gt;&lt;img /&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;The error states the endpoint refused the request (for an access token). Is this accurate, given I was able to receive an access token via Postman when using the same credentials?&amp;nbsp; I don't understand why via Postman I can receive an access token but via ADF it errors. I'm wondering if I've completed the ADF parts incorrectly, or if there is more needed just to receive an access token, or if it's something else.&lt;BR /&gt;Are you able to advise what's taking place here?&lt;BR /&gt;Thanks.&lt;/P&gt;
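&lt;P&gt;For reference, this is roughly the Web activity definition I believe I have built, expressed as pipeline JSON (the URL and credential values are placeholders, not my real ones, and I am assuming a standard client_credentials grant sent as a form-encoded body):&lt;/P&gt;&lt;LI-CODE lang="json"&gt;{
    "name": "Get access token",
    "type": "WebActivity",
    "typeProperties": {
        "url": "https://example-source.com/oauth/token",
        "method": "POST",
        "headers": {
            "Content-Type": "application/x-www-form-urlencoded"
        },
        "body": "grant_type=client_credentials&amp;amp;client_id=MY_CLIENT_ID&amp;amp;client_secret=MY_CLIENT_SECRET"
    }
}&lt;/LI-CODE&gt;</description>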
      <pubDate>Wed, 25 Feb 2026 17:21:42 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/azure-data-factory/web-activity-failure-due-to-invoking-endpoint-failed-with/m-p/4497130#M957</guid>
      <dc:creator>AzureNewbie1</dc:creator>
      <dc:date>2026-02-25T17:21:42Z</dc:date>
    </item>
    <item>
      <title>Unable to INSERT rec into Table.</title>
      <link>https://techcommunity.microsoft.com/t5/azure-data-factory/unable-to-insert-rec-into-table/m-p/4495191#M956</link>
      <description>&lt;P&gt;I am using the ADF Execute Pipeline activity.&lt;BR /&gt;My pipeline is parameterized.&lt;BR /&gt;&lt;BR /&gt;My stage table has the record below:&lt;BR /&gt;&lt;BR /&gt;&lt;/P&gt;&lt;DIV&gt;&lt;table&gt;&lt;tbody&gt;&lt;tr&gt;&lt;td&gt;network_org_name&lt;/td&gt;&lt;td&gt;partner_name&lt;/td&gt;&lt;td&gt;parent_partner_name&lt;/td&gt;&lt;td&gt;CorPart_number&lt;/td&gt;&lt;td&gt;pipeline_run_id&lt;/td&gt;&lt;td&gt;activity_run_id&lt;/td&gt;&lt;/tr&gt;&lt;tr&gt;&lt;td&gt;Se&amp;nbsp; Network Producing&lt;/td&gt;&lt;td&gt;UB&amp;nbsp; SA&lt;/td&gt;&lt;td&gt;S Reine Company Ltd&lt;/td&gt;&lt;td&gt;1226133&lt;/td&gt;&lt;td&gt;ebf961f9-3afc-4abc-9ee0-111330c9f1eb&lt;/td&gt;&lt;td&gt;858c1b5c-a699-4e73-84d8-3a42cab115d2&lt;/td&gt;&lt;/tr&gt;&lt;/tbody&gt;&lt;/table&gt;&lt;/DIV&gt;&lt;LI-CODE lang="sql"&gt;WITH CodeSourceTableCTE AS
	(
	SELECT DISTINCT p.[partner_number]
		  ,n.[network_org_number]
	FROM [utl].[stg_partner_network_org] stg
	LEFT JOIN [utl].[partner] p1
	ON stg.[parent_partner_name] = p1.[partner_name]
	INNER JOIN [utl].[partner] p
	ON stg.[partner_name] = p.[partner_name] AND ISNULL(p.[delete_ind], '') &amp;lt;&amp;gt; 'Y' 
    AND ISNULL(p.[parent_partner_number], '') = ISNULL(p1.[partner_number], '') AND ISNULL(p1.[delete_ind], '') &amp;lt;&amp;gt; 'Y' 
	AND ISNULL(stg.[corpart_number], '') = ISNULL(p.[external_id], '')
	INNER JOIN [utl].[network_org] n
	ON stg.[network_org_name] = n.[network_org_name]
	WHERE stg.[partner_name] IS NOT NULL
	)
			
	MERGE [utl].[partner_network_org] T
	USING CodeSourceTableCTE S 
	   ON T.[partner_number] = S.[partner_number]
	  AND T.[network_org_number] = S.[network_org_number]

	  WHEN MATCHED AND (ISNULL(T.[delete_ind], '') &amp;lt;&amp;gt; 'N') THEN
		  UPDATE 
			 SET T.[delete_ind] = 'N',
				 T.[last_modified_date] = GETUTCDATE(),
				 T.[last_modified_by] = '@{pipeline().parameters.Source_File_Name}'
			
	 WHEN NOT MATCHED BY TARGET THEN
		  INSERT ([partner_number], [network_org_number], [created_date], [created_by], [last_modified_date], [last_modified_by], [delete_ind])
		  VALUES (S.[partner_number], S.[network_org_number], GETUTCDATE(), '@{pipeline().parameters.Source_File_Name}', GETUTCDATE(), '@{pipeline().parameters.Source_File_Name}', 'N');

SELECT COUNT(*) FROM [utl].[partner_network_org] WHERE [created_by] = '@{pipeline().parameters.Source_File_Name}'&lt;/LI-CODE&gt;&lt;P&gt;But when I try to check using the query below&amp;nbsp;&lt;BR /&gt;&lt;BR /&gt;&lt;/P&gt;&lt;LI-CODE lang="sql"&gt;select * from utl.partner_network_org
where partner_number in (select partner_number from utl.partner where last_modified_by like '%corpart%')&lt;/LI-CODE&gt;&lt;P&gt;&lt;BR /&gt;I don't see any records. Any help is appreciated.&lt;/P&gt;
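&lt;P&gt;For reference, this is the direct check I would expect to surface the inserted rows, based on the audit columns the MERGE itself writes (the file name literal is a hypothetical example of what the Source_File_Name parameter might resolve to):&lt;/P&gt;&lt;LI-CODE lang="sql"&gt;-- Query the MERGE target by the columns populated from the pipeline parameter.
-- 'corpart_20260216.csv' is a hypothetical value; substitute the actual
-- Source_File_Name value the pipeline run used.
SELECT *
FROM utl.partner_network_org
WHERE created_by = 'corpart_20260216.csv'
   OR last_modified_by = 'corpart_20260216.csv';&lt;/LI-CODE&gt;</description>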
      <pubDate>Mon, 16 Feb 2026 10:38:02 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/azure-data-factory/unable-to-insert-rec-into-table/m-p/4495191#M956</guid>
      <dc:creator>DRN_SDE</dc:creator>
      <dc:date>2026-02-16T10:38:02Z</dc:date>
    </item>
    <item>
      <title>Azure Data Factory - help needed to ingest data, that uses an API, into a SQL Server instance</title>
      <link>https://techcommunity.microsoft.com/t5/azure-data-factory/azure-data-factory-help-needed-to-ingest-data-that-uses-an-api/m-p/4490111#M955</link>
      <description>&lt;P&gt;Hi,&lt;BR /&gt;I need to ingest 3rd-party data, which is accessed via an API, into my Azure SQL Server instance (ASSi) using Azure Data Factory (ADF).&lt;BR /&gt;I'm a report developer, so this area is unfamiliar to me, although I have previously integrated external, on-premises SQL Server data into our ASSi using ADF, so I do have some exposure to the tool (just not API connections).&lt;BR /&gt;&lt;BR /&gt;The 3rd-party data belongs to a company named 'iLevel' (in case this is relevant). iLevel have provided some API documentation which is targeted at an experienced data engineer who understands API connections. This documentation has just a few sections before the&amp;nbsp;&lt;EM&gt;connection&lt;/EM&gt;-focused details end. I'll list them below:&lt;BR /&gt;1) It says to download a 'Postman Collection' and mentions no more than this. I've never heard of a Postman Collection and I don't know why it's needed. Through limited exposure online, I don't understand its purpose, certainly not in my scenario.&lt;BR /&gt;&lt;BR /&gt;2) It has the title 'Access to the API' and then lists four URLs which are the endpoints (I don't know which to use and will need to ask iLevel about this but, I guess, any will do for testing purposes).&lt;BR /&gt;&lt;BR /&gt;3) Authentication and Authorization&lt;BR /&gt;a) Generate a 'client id' and 'client secret' by logging into iLevel and generating these values with a few clicks of a button. I've successfully generated both these values.&lt;BR /&gt;b) Obtain an Access Token - it'll be easier to screenshot the instructions for this (I've blanked part of the URL for confidentiality).&lt;/P&gt;&lt;img /&gt;&lt;P&gt;These are all the instructions on connecting to the 3rd-party data. Unfortunately for me, my lack of experience in this area means these instructions don't help me. I don't believe I'm any closer to connecting to the 3rd-party source data.&lt;BR /&gt;&lt;BR /&gt;Taking the above instructions into consideration, but choosing to trial-and-error in ADF, a tool I'm a little more familiar with, I've performed the following steps:&lt;BR /&gt;1) Created a Linked Service.&lt;BR /&gt;I understand the iLevel solution is in the cloud and therefore the 'AutoResolveIntegrationRuntime' option has been selected as the 'Connect via integration runtime' value. For the 'Base URL' I've entered one of the four URL endpoints that were listed in the documentation (again, I will need to confirm which endpoint to use).&lt;/P&gt;&lt;img /&gt;&lt;P&gt;The 'Test Connection' returns a successful result, but I think it means nothing, because if I place 'xxx' at the end of the Base URL and test the connection, it still returns successful when I know the URL with the 'xxx' postfix isn't legit.&lt;BR /&gt;&lt;BR /&gt;2) Created an ADF Pipeline containing 'Web activity' and 'Set variable' objects.&lt;BR /&gt;The only configuration under the Web activity is the 'Settings' pane, which has:&lt;/P&gt;&lt;img /&gt;&lt;P&gt;The 'Body' property has (the client id and client secret taken from the iLevel solution are included in the body but blanked out):&lt;/P&gt;&lt;img /&gt;&lt;P&gt;If the Web activity is successful then the Pipeline's next object (the Set variable) should assign the access token to a variable - as I understand it, this is what the Web activity is providing:&lt;/P&gt;&lt;img /&gt;&lt;P&gt;&lt;BR /&gt;The 'Value' property has:&lt;/P&gt;&lt;img /&gt;&lt;P&gt;This is as far as I've got in my efforts on this integration task, because the Web activity object fails when executed. The error message does state it is to do with an invalid 'client id' or 'client secret' - see below:&lt;/P&gt;&lt;img /&gt;&lt;P&gt;You may direct me to focus on the incorrect client id or client secret; however, I don't have any confidence that I understand how to configure ADF to obtain an access token, and I'm maybe missing something given I see no need for Postman Collection use.&lt;BR /&gt;&lt;BR /&gt;What is a Postman Collection and do I need it for what I'm trying to achieve?&amp;nbsp; If yes, can anyone provide training material that suits my need?&lt;BR /&gt;Have I configured ADF correctly and it is indeed an issue with the client id or client secret, or is the error message received just a by-product of an incorrect ADF configuration?&lt;BR /&gt;&lt;BR /&gt;Your help will be most appreciated. Many thanks.&lt;/P&gt;
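&lt;P&gt;For clarity, this is the shape I understand the 'Set variable' value should take, expressed as pipeline JSON (the activity name 'Get Token', the variable name, and the access_token field name are my assumptions, since the real names are blanked out of the screenshots):&lt;/P&gt;&lt;LI-CODE lang="json"&gt;{
    "name": "Set access token variable",
    "type": "SetVariable",
    "typeProperties": {
        "variableName": "AccessToken",
        "value": {
            "value": "@activity('Get Token').output.access_token",
            "type": "Expression"
        }
    }
}&lt;/LI-CODE&gt;</description>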
      <pubDate>Wed, 28 Jan 2026 15:22:42 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/azure-data-factory/azure-data-factory-help-needed-to-ingest-data-that-uses-an-api/m-p/4490111#M955</guid>
      <dc:creator>AzureNewbie1</dc:creator>
      <dc:date>2026-01-28T15:22:42Z</dc:date>
    </item>
    <item>
      <title>ADF unable to ingest partitioned Delta data from Azure Synapse Link (Dataverse/FnO)</title>
      <link>https://techcommunity.microsoft.com/t5/azure-data-factory/adf-unable-to-ingest-partitioned-delta-data-from-azure-synapse/m-p/4484300#M953</link>
      <description>&lt;P&gt;We are ingesting Dynamics 365 Finance &amp;amp; Operations (FnO) data into ADLS Gen2 using Azure Synapse Link for Dataverse, and then attempting to load that data into Azure SQL Database using Azure Data Factory (ADF).&lt;/P&gt;&lt;P&gt;This is part of a migration effort as Export to Data Lake is being deprecated.&lt;/P&gt;&lt;P&gt;Source details:&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;Source: ADLS Gen2&lt;/LI&gt;&lt;LI&gt;Data generated by: Azure Synapse Link for Dataverse (FnO)&lt;/LI&gt;&lt;LI&gt;Format on lake: Delta / Parquet, partitioned folder structure (e.g. PartitionId=xxxx)&lt;/LI&gt;&lt;LI&gt;Destination: Azure SQL Database&lt;/LI&gt;&lt;/UL&gt;&lt;P&gt;Issue observed in ADF: when configuring ADF pipelines using an ADLS Gen2 dataset with:&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;Delta / Parquet&lt;/LI&gt;&lt;LI&gt;Recursive folder traversal&lt;/LI&gt;&lt;LI&gt;Wildcard paths&lt;/LI&gt;&lt;/UL&gt;&lt;P&gt;we encounter either no data returned in Data Preview, or a runtime error such as: “No partitions information found in metadata file”.&lt;/P&gt;&lt;P&gt;Despite this:&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;The data is present in ADLS&lt;/LI&gt;&lt;LI&gt;The same data can be successfully queried using Synapse serverless SQL&lt;/LI&gt;&lt;/UL&gt;&lt;P&gt;Key question for ADF / Synapse engineers: what is the recommended and supported ADF ingestion pattern for partitioned Delta/Parquet data produced by Azure Synapse Link for Dataverse?&lt;/P&gt;&lt;P&gt;Specifically:&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;Should ADF read Delta tables directly, or use Synapse serverless SQL external tables/views as an intermediate layer?&lt;/LI&gt;&lt;LI&gt;Is there a reference architecture for: Synapse Link → ADLS → ADF → Azure SQL?&lt;/LI&gt;&lt;LI&gt;Are there ADF limitations when consuming Synapse Link–generated Delta tables?&lt;/LI&gt;&lt;/UL&gt;&lt;P&gt;Many customers are now forced to migrate due to the Export to Data Lake deprecation, but current ADF documentation does not clearly explain how to replace existing ingestion pipelines when using Synapse Link for FnO.&lt;/P&gt;&lt;P&gt;Any guidance, patterns, or official documentation would be greatly appreciated.&lt;/P&gt;
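&lt;P&gt;For context, the serverless SQL query that does work against the same lake folder looks roughly like this (the storage account, container, and table path below are placeholder values, not our real ones):&lt;/P&gt;&lt;LI-CODE lang="sql"&gt;-- Synapse serverless SQL can read the Synapse Link Delta output directly.
-- All names below are placeholders for our real storage account and table folder.
SELECT TOP 100 *
FROM OPENROWSET(
    BULK 'https://mystorageaccount.dfs.core.windows.net/dataverse-container/salestable/',
    FORMAT = 'DELTA'
) AS rows;&lt;/LI-CODE&gt;</description>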
      <pubDate>Fri, 09 Jan 2026 07:10:44 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/azure-data-factory/adf-unable-to-ingest-partitioned-delta-data-from-azure-synapse/m-p/4484300#M953</guid>
      <dc:creator>Raheelislam</dc:creator>
      <dc:date>2026-01-09T07:10:44Z</dc:date>
    </item>
    <item>
      <title>Copy Data Activity Failed with Unreasonable Cause</title>
      <link>https://techcommunity.microsoft.com/t5/azure-data-factory/copy-data-activity-failed-with-unreasonable-cause/m-p/4474745#M951</link>
      <description>&lt;P&gt;It is a simple setup, but it has baffled me a lot. I'd like to copy data to a data lake via an API. Here are the steps I've taken:&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;Created an HTTP linked service as below:&lt;/LI&gt;&lt;/UL&gt;&lt;img /&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;Created a dataset with an HTTP Binary data format as below:&lt;/LI&gt;&lt;/UL&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;img /&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;Created a pipeline with a Copy Data activity only, as shown below:&lt;/LI&gt;&lt;/UL&gt;&lt;img /&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;Made sure the linked service and dataset were all working fine, as below:&lt;/LI&gt;&lt;/UL&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;img /&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;Created a Sink dataset with 3 parameters as shown below:&lt;/LI&gt;&lt;/UL&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;img /&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;img /&gt;&lt;UL&gt;&lt;LI&gt;Passed parameters from the pipeline to the Sink dataset as below:&lt;/LI&gt;&lt;/UL&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;img /&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;That's all. Simple, right? But the pipeline failed with the message "usually this is caused by invalid credentials.", as below:&lt;/LI&gt;&lt;/UL&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;img /&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Summary: there is no need to worry about the Sink-side parameters etc.; I have used the same approach for years on other pipelines and all succeeded. This time the copy failed to reach the data lake from the source side, saying "invalid credentials". In Step 4 above you can see the linked service and dataset connections succeeded, i.e. the credentials had already been checked and passed. How come it fails in the Copy Data activity complaining about invalid credentials? Pretty weird. Any advice and suggestions are welcome.&lt;/P&gt;
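&lt;P&gt;For reference, this is roughly the shape of the Copy Data activity as pipeline JSON, as far as I understand it (the dataset reference names are placeholders; the screenshots above hold the real values):&lt;/P&gt;&lt;LI-CODE lang="json"&gt;{
    "name": "Copy API file to lake",
    "type": "Copy",
    "typeProperties": {
        "source": {
            "type": "BinarySource",
            "storeSettings": {
                "type": "HttpReadSettings",
                "requestMethod": "GET"
            }
        },
        "sink": {
            "type": "BinarySink",
            "storeSettings": {
                "type": "AzureBlobFSWriteSettings"
            }
        }
    },
    "inputs": [ { "referenceName": "HttpBinarySource", "type": "DatasetReference" } ],
    "outputs": [ { "referenceName": "LakeBinarySink", "type": "DatasetReference" } ]
}&lt;/LI-CODE&gt;</description>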
      <pubDate>Wed, 03 Dec 2025 02:59:45 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/azure-data-factory/copy-data-activity-failed-with-unreasonable-cause/m-p/4474745#M951</guid>
      <dc:creator>AJ81</dc:creator>
      <dc:date>2025-12-03T02:59:45Z</dc:date>
    </item>
    <item>
      <title>User Properties of Activities in ADF: How to add dynamic content in it?</title>
      <link>https://techcommunity.microsoft.com/t5/azure-data-factory/user-properties-of-activities-in-adf-how-to-add-dynamic-content/m-p/4469628#M950</link>
      <description>&lt;P&gt;In ADF, I am using a For Each loop in which I am using an Execute Pipeline activity which gets executed across different iterations as per the values of the items provided to the For-Each loop.&lt;BR /&gt;&lt;BR /&gt;I am stuck on a scenario which requires me to add a Dynamic Content expression to the User Properties of individual activities in ADF. Specific to my case, I want to add a Dynamic Content expression to the User Properties of the Execute Pipeline activity so that I can see the individual runs of these activities in Azure Monitor with a specific label attached through their User Properties.&lt;BR /&gt;&lt;BR /&gt;I need Dynamic Content in the User Properties because each execution in the respective iterations corresponds to a particular Step from a set of Steps configured for the Data Load Job as a whole, which has been orchestrated through ADF. To identify the association with the respective Job-Step, I need to add a Dynamic Content expression to its User Properties.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Any sort of response regarding this is highly appreciated.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Thank You!&lt;/P&gt;
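&lt;P&gt;To illustrate what I am after, a sketch of the activity JSON (the property name 'JobStep', the item field 'StepName', and the pipeline name are made-up examples, and I am assuming user-property values accept the same @{...} string-interpolation syntax as other dynamic string fields):&lt;/P&gt;&lt;LI-CODE lang="json"&gt;{
    "name": "Execute Load Step",
    "type": "ExecutePipeline",
    "userProperties": [
        {
            "name": "JobStep",
            "value": "@{item().StepName}"
        }
    ],
    "typeProperties": {
        "pipeline": {
            "referenceName": "LoadStepPipeline",
            "type": "PipelineReference"
        },
        "waitOnCompletion": true
    }
}&lt;/LI-CODE&gt;</description>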
      <pubDate>Thu, 13 Nov 2025 12:17:10 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/azure-data-factory/user-properties-of-activities-in-adf-how-to-add-dynamic-content/m-p/4469628#M950</guid>
      <dc:creator>manuj</dc:creator>
      <dc:date>2025-11-13T12:17:10Z</dc:date>
    </item>
    <item>
      <title>Data flow sink to Blob storage not writing to subfolder</title>
      <link>https://techcommunity.microsoft.com/t5/azure-data-factory/data-flow-sink-to-blob-storage-not-writing-to-subfolder/m-p/4461441#M946</link>
      <description>&lt;P&gt;Hi Everybody&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;This seems like it should be straightforward, but it just doesn't seem to work... I have a file containing JSON data, one document per line, with many different types of data. Each type is identified by a field named "OBJ", which tells me what kind of data it contains. I want to split this file into separate files in Blob storage for each object type prior to doing some downstream processing. So, I have a very simple data flow - a source which loads the whole file, and a sink which writes the data back to separate files. In the sink settings, I've set the "File name option" setting to "Name file as column data" and selected my OBJ column for the "Column Data", and this basically works - it writes out a separate file for each OBJ value, containing the right data. So far, so good.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;However, what &lt;EM&gt;doesn't&lt;/EM&gt; seem to work is the very simplest thing - I want to write the output files to a folder in my Blob storage container, but the sink seems to completely ignore the "Folder path" setting and just writes them into the root of the container. I can write my output files to a different container, but not to a subfolder inside the same container. It even creates the folder if it's not there already, but doesn't use it.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Am I missing something obvious, or does the "Folder path" setting just not work when naming files from column data? Is there a way around this?&lt;/P&gt;</description>
      <pubDate>Tue, 14 Oct 2025 10:21:58 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/azure-data-factory/data-flow-sink-to-blob-storage-not-writing-to-subfolder/m-p/4461441#M946</guid>
      <dc:creator>DuncanKing</dc:creator>
      <dc:date>2025-10-14T10:21:58Z</dc:date>
    </item>
    <item>
      <title>Problem with Linked Service to SQL Managed Instance</title>
      <link>https://techcommunity.microsoft.com/t5/azure-data-factory/problem-with-linked-service-to-sql-managed-instance/m-p/4457147#M945</link>
      <description>&lt;P&gt;Hi&lt;/P&gt;&lt;P&gt;I'm trying to create a Linked Service to a SQL Managed Instance. The Managed Instance is configured with a Vnet_local endpoint.&lt;BR /&gt;&lt;BR /&gt;&lt;/P&gt;&lt;img /&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;If I try to connect with an auto-resolve IR or a SHIR, I get the following error:&lt;BR /&gt;&lt;BR /&gt;The value of the property '' is invalid: 'The remote name could not be resolved: 'SQL01.public.ec9fbc2870dd.database.windows.net''.&lt;BR /&gt;The remote name could not be resolved: 'SQL01.public.ec9fbc2870dd.database.windows.net'&lt;BR /&gt;&lt;BR /&gt;Is there a way to connect to it without resorting to a private endpoint?&lt;BR /&gt;&lt;BR /&gt;Cheers&lt;/P&gt;&lt;P&gt;Alex&lt;/P&gt;
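&lt;P&gt;In case it helps frame the question: my understanding is that the public endpoint of a Managed Instance listens on port 3342 rather than 1433, so the connection string I would expect a Linked Service to need looks something like the sketch below (the database name is a placeholder, and 'AzureSqlMI' is the linked service type as I understand it):&lt;/P&gt;&lt;LI-CODE lang="json"&gt;{
    "name": "SqlMiOverPublicEndpoint",
    "properties": {
        "type": "AzureSqlMI",
        "typeProperties": {
            "connectionString": "Server=tcp:SQL01.public.ec9fbc2870dd.database.windows.net,3342;Initial Catalog=MyDatabase;"
        }
    }
}&lt;/LI-CODE&gt;</description>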
      <pubDate>Fri, 26 Sep 2025 06:31:27 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/azure-data-factory/problem-with-linked-service-to-sql-managed-instance/m-p/4457147#M945</guid>
      <dc:creator>alexp01482</dc:creator>
      <dc:date>2025-09-26T06:31:27Z</dc:date>
    </item>
    <item>
      <title>ADF connection issue with Cassandra</title>
      <link>https://techcommunity.microsoft.com/t5/azure-data-factory/adf-connection-issue-with-cassandra/m-p/4453099#M943</link>
      <description>&lt;P&gt;Hi,&amp;nbsp;&lt;/P&gt;&lt;P&gt;I am trying to connect to a Cassandra DB hosted in Azure Cosmos DB. I created the linked service but am getting the below error on test connection.&amp;nbsp;&lt;/P&gt;&lt;P&gt;I have already checked the Cassandra DB and its public network access is set to all networks.&amp;nbsp;&lt;/P&gt;&lt;P&gt;Google suggested enabling SSL, but there is no such option in the linked service.&amp;nbsp;&lt;/P&gt;&lt;P&gt;Please help.&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;EM&gt;Failed to connect to the connector. Error code: 'Unknown', message: 'Failed to connect to Cassandra server due to, ErrorCode: InternalError'&lt;/EM&gt;&lt;BR /&gt;&lt;EM&gt;Failed to connect to the connector. Error code: 'InternalError', message: 'Failed to connect to Cassandra server due to, ErrorCode: InternalError'&lt;/EM&gt;&lt;BR /&gt;&lt;EM&gt;Failed to connect to Cassandra server due to, ErrorCode: InternalError&lt;/EM&gt;&lt;BR /&gt;&lt;EM&gt;All hosts tried for query failed (tried 51.107.58.67:10350: SocketException 'A request to send or receive data was disallowed because the socket is not connected and (when sending on a datagram socket using a sendto call) no address was supplied')&lt;/EM&gt;&lt;/P&gt;
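&lt;P&gt;For reference, this is roughly the linked service definition I have, as JSON (the host and account names are placeholders; the port matches the error above, and I could not find any SSL-related property to add here):&lt;/P&gt;&lt;LI-CODE lang="json"&gt;{
    "name": "CassandraOnCosmos",
    "properties": {
        "type": "Cassandra",
        "typeProperties": {
            "host": "myaccount.cassandra.cosmos.azure.com",
            "port": 10350,
            "authenticationType": "Basic",
            "username": "myaccount"
        }
    }
}&lt;/LI-CODE&gt;</description>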
      <pubDate>Wed, 10 Sep 2025 22:00:34 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/azure-data-factory/adf-connection-issue-with-cassandra/m-p/4453099#M943</guid>
      <dc:creator>panksume</dc:creator>
      <dc:date>2025-09-10T22:00:34Z</dc:date>
    </item>
    <item>
      <title>User configuration issue</title>
      <link>https://techcommunity.microsoft.com/t5/azure-data-factory/user-configuration-issue/m-p/4445545#M939</link>
      <description>&lt;P&gt;Hi,&amp;nbsp;&lt;/P&gt;&lt;P&gt;I am getting the below error:&lt;BR /&gt;"Execution fail against sql server. Please contact SQL Server team if you need further support. Sql error number: 3930. Error Message: The current transaction cannot be committed and cannot support operations that write to the log file. Roll back the transaction"&lt;BR /&gt;I have never faced this kind of error before.&lt;BR /&gt;Could anyone please let me know what I can do and what I need to do?&amp;nbsp;&lt;BR /&gt;I am a beginner, so please explain it to me.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Thanks.&lt;/P&gt;</description>
      <pubDate>Tue, 19 Aug 2025 12:27:01 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/azure-data-factory/user-configuration-issue/m-p/4445545#M939</guid>
      <dc:creator>Aviavi-123</dc:creator>
      <dc:date>2025-08-19T12:27:01Z</dc:date>
    </item>
    <item>
      <title>Help with Partial MongoDB Update via Azure Data Factory Data Flow</title>
      <link>https://techcommunity.microsoft.com/t5/azure-data-factory/help-with-partial-mongodb-update-via-azure-data-factory-data/m-p/4443596#M937</link>
      <description>&lt;P&gt;Hello, everyone!&lt;BR /&gt;&lt;BR /&gt;I have a complex question about how to perform a partial update on a MongoDB collection using Data Flow in Azure Data Factory. My goal is to modify only some nested fields without overwriting the entire document.&lt;BR /&gt;My flow reads JSON files with the following structure:&lt;/P&gt;&lt;LI-CODE lang=""&gt;{
  "_id": { "$oid": "1xp3232to" },
  "root_field": "root_value",
  "main_array": [
    {
      "array_id": "id001",
      "status": "PENDING",
      "nested_array": []
    }
  ],
  "numeric_value": { "$numberDecimal": "10.99" }
}&lt;/LI-CODE&gt;&lt;P&gt;I need Data Flow to make two changes in a single run:&lt;BR /&gt;&lt;BR /&gt;1) Change the status field from "PENDING" to "SENT".&lt;BR /&gt;2) Add a new object to the nested_array with the following data:&lt;BR /&gt;event: "SENT"&lt;BR /&gt;description: "FILE GENERATED"&lt;BR /&gt;timestamp: (current date and time)&lt;BR /&gt;system: "Sis Test"&lt;BR /&gt;&lt;BR /&gt;I've tried some expressions with update and append in the Derived Column transformation, but I can't get the syntax right to make both changes at the same time.&lt;BR /&gt;&lt;BR /&gt;My biggest concern is with the MongoDB Sink: how do I configure it so that Data Flow performs a partial update and doesn't overwrite the entire document, losing root_field, numeric_value, etc.?&lt;BR /&gt;&lt;BR /&gt;My questions are:&lt;BR /&gt;1) What is the correct expression for the Derived Column that makes these two nested modifications in a single step?&lt;BR /&gt;&lt;BR /&gt;2) How should I configure the MongoDB Sink to ensure the update is partial, using _id as the key?&lt;BR /&gt;&lt;BR /&gt;I really appreciate the community's help!&lt;/P&gt;
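&lt;P&gt;To make the target state concrete, this is the equivalent single-document update expressed in mongosh syntax (the collection name is a placeholder, and the _id and array_id values come from the sample above):&lt;/P&gt;&lt;LI-CODE lang=""&gt;// The partial update I want Data Flow to reproduce: set the status and
// push an audit entry into nested_array, leaving all other fields untouched.
db.myCollection.updateOne(
  { "_id": ObjectId("1xp3232to") },
  {
    $set: { "main_array.$[elem].status": "SENT" },
    $push: {
      "main_array.$[elem].nested_array": {
        event: "SENT",
        description: "FILE GENERATED",
        timestamp: new Date(),
        system: "Sis Test"
      }
    }
  },
  { arrayFilters: [ { "elem.array_id": "id001" } ] }
);&lt;/LI-CODE&gt;</description>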
      <pubDate>Thu, 14 Aug 2025 16:29:48 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/azure-data-factory/help-with-partial-mongodb-update-via-azure-data-factory-data/m-p/4443596#M937</guid>
      <dc:creator>leopoldinoex</dc:creator>
      <dc:date>2025-08-14T16:29:48Z</dc:date>
    </item>
    <item>
      <title>Getting an Oauth2 API access token using client_id and client_secret - help</title>
      <link>https://techcommunity.microsoft.com/t5/azure-data-factory/getting-an-oauth2-api-access-token-using-client-id-and-client/m-p/4443568#M936</link>
      <description>&lt;P&gt;Hi,&lt;BR /&gt;I'm attempting to integrate external data into our SQL Server. The third-party data is from a solution called iLevel. They use token-based OAuth2 APIs for access. The integration tool is ADF Pipelines.&lt;BR /&gt;I'm not a data engineer, but it has fallen upon me to complete this exercise. What I've attempted so far is failing and I don't know why. I would like your help on this. I'll explain what I've configured so far, in the order I configured it.&lt;BR /&gt;&lt;BR /&gt;1) To generate a client_id and client_secret, I logged on to the iLevel solution itself and generated these for my account (call it the 'Joe' account) and the Team account (call it the 'Data team' account). I've recorded the client_id and client_secret for both users/accounts in Notepad for reference.&lt;BR /&gt;2) I logged in to Azure Data Factory using my 'Joe Admin' admin account (this is the account I need to log in with for any ADF development).&lt;BR /&gt;3) I created a Linked Service with the following configuration. Note how the Test connection was successful. I guess this means our ADF instance can connect to iLevel's Base URL.&lt;/P&gt;&lt;img /&gt;&lt;P&gt;&lt;BR /&gt;&lt;BR /&gt;4) I then created a dataset for iLevel. I configured this based on an online example I was following which I can't get working, so this configuration may be incorrect.&lt;/P&gt;&lt;img /&gt;&lt;img /&gt;&lt;P&gt;5) I then created a Pipeline which contains a 'Web' activity and a 'Set variable' activity.&lt;BR /&gt;The Pipeline has a variable as shown below.&lt;/P&gt;&lt;img /&gt;&lt;P&gt;The 'Web' activity has the following configuration:&lt;/P&gt;&lt;img /&gt;&lt;P&gt;&lt;STRONG&gt;URL &lt;/STRONG&gt;= iLevel's token URL (it is different from the Base URL used in the Linked Service).&lt;BR /&gt;&lt;STRONG&gt;Body &lt;/STRONG&gt;= I've blocked out the client_id and client_secret (I'm using the client_id and client_secret generated for the 'Data team' account - remember, I'm logged into ADF using the 'Joe Admin' account - not sure if this makes a difference) but have placed red brackets around where the start and end of each value is. I'm not wrapping the values in any single or double quotes - not sure if I'm meant to.&lt;/P&gt;&lt;img /&gt;&lt;P&gt;I'm not sure if I have configured the &lt;STRONG&gt;Body &lt;/STRONG&gt;correctly. The iLevel documentation states to use an Authorization header, Content-Type header and Body - it states the following is needed to obtain an access token, but it doesn't state exactly how to submit the information (i.e. how to format it). Notice how, in my configuration, I haven't used an Authorization header - this is partly because an online example I've followed doesn't use one. If iLevel state to use one then I think I should, but I don't know how to format it - any ideas?&lt;/P&gt;&lt;img /&gt;&lt;P&gt;The 'Set variable' activity has the following configuration. The idea is that the access token is retrieved from the 'Web' activity and placed in the 'Set variable' "iLevel access token" variable.&lt;/P&gt;&lt;img /&gt;&lt;P&gt;&lt;BR /&gt;At this point I validate all and it comes back with no errors found. I then Debug it to see if it does indeed work, but it returns an error stating the request contains an invalid client_id or client_secret. The client_id and client_secret values used are the exact same ones I generated from within the iLevel solution just a few hours ago.&lt;/P&gt;&lt;img /&gt;&lt;P&gt;&lt;BR /&gt;Is anyone able to point out to me why this isn't working?&lt;BR /&gt;Have I populated all that I need to (as mentioned, iLevel say to use an Authorization header, which I haven't, but I don't know how to format it if I were to use one)?&lt;BR /&gt;What can I do to get this working?&lt;BR /&gt;&lt;BR /&gt;I'm just trying to get the access token at the moment. I've not even attempted to extract the iLevel data and can't until I get a working token. iLevel's tokens have a 1-hour time-to-live, so the Pipeline needs to generate a new token each time it's executed.&lt;BR /&gt;&lt;BR /&gt;Your help will be most appreciated. Thanks.&lt;/P&gt;
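&lt;P&gt;Regarding the Authorization header question above, my current understanding (an assumption on my part, based on how OAuth2 client authentication is commonly documented) is that it would be HTTP Basic auth over the two credentials, i.e. the word 'Basic' followed by base64(client_id:client_secret). In Web activity terms, the headers would then look something like this sketch, where the value shown is the base64 of the placeholder pair MY_CLIENT_ID:MY_CLIENT_SECRET:&lt;/P&gt;&lt;LI-CODE lang="json"&gt;{
    "headers": {
        "Content-Type": "application/x-www-form-urlencoded",
        "Authorization": "Basic TVlfQ0xJRU5UX0lEOk1ZX0NMSUVOVF9TRUNSRVQ="
    },
    "body": "grant_type=client_credentials"
}&lt;/LI-CODE&gt;</description>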
      <pubDate>Thu, 14 Aug 2025 15:42:48 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/azure-data-factory/getting-an-oauth2-api-access-token-using-client-id-and-client/m-p/4443568#M936</guid>
      <dc:creator>AzureNewbie1</dc:creator>
      <dc:date>2025-08-14T15:42:48Z</dc:date>
    </item>
    <item>
      <title>Dynamics AX connector stops getting records after amount of time</title>
      <link>https://techcommunity.microsoft.com/t5/azure-data-factory/dynamics-ax-connector-stops-getting-records-after-amount-of-time/m-p/4438621#M934</link>
      <description>&lt;P&gt;Hello everyone,&lt;BR /&gt;&lt;BR /&gt;I am using the Dynamics AX connector to get data out of Finance. After a certain amount of time it suddenly doesn't get any new records anymore, and it keeps running until it reaches the general timeout.&lt;BR /&gt;&lt;BR /&gt;It gets 290,000 records in about 01:30:00 and then keeps running without getting any new records. Sometimes it gets stuck earlier or later.&lt;BR /&gt;&lt;BR /&gt;&lt;/P&gt;&lt;img /&gt;&lt;img /&gt;&lt;P&gt;Sometimes it also gives me this error:&lt;BR /&gt;&lt;BR /&gt;Failure happened on 'Source' side. ErrorCode=ODataRequestTimeout,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Fail to get response from odata service in a expected time.,Source=Microsoft.DataTransfer.Runtime.ODataConnector,''Type=System.Threading.Tasks.TaskCanceledException,Message=A task was canceled.,Source=mscorlib,'&lt;BR /&gt;&lt;BR /&gt;This is my pipeline JSON:&lt;/P&gt;&lt;LI-CODE lang="json"&gt;{
  "name": "HICT - Init Sync SalesOrders",
  "properties": {
    "activities": [
      {
        "name": "Get FO SalesOrders",
        "type": "Copy",
        "dependsOn": [],
        "policy": {
          "timeout": "0.23:00:00",
          "retry": 0,
          "retryIntervalInSeconds": 30,
          "secureOutput": false,
          "secureInput": false
        },
        "userProperties": [],
        "typeProperties": {
          "source": {
            "type": "DynamicsAXSource",
            "query": "$filter=FM_InterCompanyOrder eq Microsoft.Dynamics.DataEntities.NoYes'No' and dataAreaId eq 'prev'&amp;amp;$select=SalesOrderNumber,SalesOrderName,IsDeliveryAddressPrivate,FormattedInvoiceAddress,FormattedDeliveryAddress,ArePricesIncludingSalesTax,RequestedReceiptDate,QuotationNumber,PriceCustomerGroupCode,PBS_PreferredInvoiceDate,PaymentTermsBaseDate,OrderTotalTaxAmount,OrderTotalChargesAmount,OrderTotalAmount,TotalDiscountAmount,IsInvoiceAddressPrivate,InvoiceBuildingCompliment,InvoiceAddressZipCode,LanguageId,IsDeliveryAddressOrderSpecific,IsOneTimeCustomer,InvoiceAddressStreetNumber,InvoiceAddressStreet,InvoiceAddressStateId,InvoiceAddressPostBox,InvoiceAddressLongitude,InvoiceAddressLatitude,InvoiceAddressDistrictName,InvoiceAddressCountyId,InvoiceAddressCountryRegionISOCode,InvoiceAddressCity,FM_Deadline,Email,DeliveryTermsCode,DeliveryModeCode,DeliveryBuildingCompliment,DeliveryAddressCountryRegionISOCode,DeliveryAddressZipCode,DeliveryAddressStreetNumber,SalesOrderStatus,DeliveryAddressStreet,DeliveryAddressStateId,SalesOrderPromisingMethod,DeliveryAddressPostBox,DeliveryAddressName,DeliveryAddressLongitude,DeliveryAddressLocationId,DeliveryAddressLatitude,DeliveryAddressDunsNumber,DeliveryAddressDistrictName,DeliveryAddressDescription,DeliveryAddressCountyId,DeliveryAddressCity,CustomersOrderReference,IsSalesProcessingStopped,CustomerRequisitionNumber,SalesOrderProcessingStatus,CurrencyCode,ConfirmedShippingDate,ConfirmedReceiptDate,SalesOrderOriginCode,URL,OrderingCustomerAccountNumber,InvoiceCustomerAccountNumber,ContactPersonId,FM_WorkerSalesTaker,FM_SalesResponsible,PaymentTermsName,DefaultShippingSiteId,DefaultShippingWarehouseId,DeliveryModeCode,dataAreaId,FM_InterCompanyOrder&amp;amp;cross-company=true",
            "httpRequestTimeout": "00:15:00",
            "additionalHeaders": {
              "Prefer": "odata.maxpagesize=1000"
            },
            "retrieveEnumValuesAsString": true
          },
          "sink": {
            "type": "JsonSink",
            "storeSettings": {
              "type": "AzureBlobStorageWriteSettings",
              "copyBehavior": "FlattenHierarchy"
            },
            "formatSettings": {
              "type": "JsonWriteSettings"
            }
          },
          "enableStaging": false,
          "enableSkipIncompatibleRow": true,
          "logSettings": {
            "enableCopyActivityLog": true,
            "copyActivityLogSettings": {
              "logLevel": "Warning",
              "enableReliableLogging": false
            },
            "logLocationSettings": {
              "linkedServiceName": {
                "referenceName": "AzureBlobStorage",
                "type": "LinkedServiceReference"
              },
              "path": "ceexports"
            }
          }
        },
        "inputs": [
          {
            "referenceName": "AX_SalesOrders_Dynamics_365_FO_ACC",
            "type": "DatasetReference"
          }
        ],
        "outputs": [
          {
            "referenceName": "Orders_FO_D365_Data_JSON",
            "type": "DatasetReference"
          }
        ]
      },
      {
        "name": "Get_All_CE_Table_Data",
        "type": "ForEach",
        "dependsOn": [
          {
            "activity": "Get FO SalesOrders",
            "dependencyConditions": [
              "Completed"
            ]
          }
        ],
        "userProperties": [],
        "typeProperties": {
          "items": {
            "value": "@pipeline().parameters.CE_Tables",
            "type": "Expression"
          },
          "activities": [
            {
              "name": "Copy_CE_TableData",
              "type": "Copy",
              "dependsOn": [],
              "policy": {
                "timeout": "0.12:00:00",
                "retry": 0,
                "retryIntervalInSeconds": 30,
                "secureOutput": false,
                "secureInput": false
              },
              "userProperties": [],
              "typeProperties": {
                "source": {
                  "type": "CommonDataServiceForAppsSource"
                },
                "sink": {
                  "type": "DelimitedTextSink",
                  "storeSettings": {
                    "type": "AzureBlobStorageWriteSettings",
                    "copyBehavior": "FlattenHierarchy"
                  },
                  "formatSettings": {
                    "type": "DelimitedTextWriteSettings",
                    "quoteAllText": true,
                    "fileExtension": ".txt"
                  }
                },
                "enableStaging": false
              },
              "inputs": [
                {
                  "referenceName": "CE_Look_Up_Tables",
                  "type": "DatasetReference",
                  "parameters": {
                    "entiryName": "@item().sourceDataset"
                  }
                }
              ],
              "outputs": [
                {
                  "referenceName": "CE_GenericBlobSink",
                  "type": "DatasetReference",
                  "parameters": {
                    "sinkPath": {
                      "value": "@item().sinkPath",
                      "type": "Expression"
                    }
                  }
                }
              ]
            }
          ]
        }
      },
      {
        "name": "Transform_Create_CE_JSON",
        "type": "ExecuteDataFlow",
        "dependsOn": [
          {
            "activity": "Get_All_CE_Table_Data",
            "dependencyConditions": [
              "Succeeded"
            ]
          }
        ],
        "policy": {
          "timeout": "0.12:00:00",
          "retry": 0,
          "retryIntervalInSeconds": 30,
          "secureOutput": false,
          "secureInput": false
        },
        "userProperties": [],
        "typeProperties": {
          "dataflow": {
            "referenceName": "FO_Transform_CE_Select",
            "type": "DataFlowReference"
          },
          "compute": {
            "coreCount": 16,
            "computeType": "General"
          },
          "traceLevel": "Fine"
        }
      }
    ],
    "parameters": {
      "CE_Tables": {
        "type": "array",
        "defaultValue": [
&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; {&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; "name": "D365_CE_ACC_AccountRelations",&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; "sourceDataset": "crmp_accountrelation",&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; "sinkPath": "ce-exports/D365_CE_ACC_AccountRelations.json"&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; },&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; {&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; "name": "D365_CE_ACC_ContactRelations",&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; "sourceDataset": "crmp_contactrelation",&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; "sinkPath": "ce-exports/D365_CE_ACC_ContactRelations.json"&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; },&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; {&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; "name": "D365_CE_ACC_PriceCustomerGroup",&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; "sourceDataset": "msdyn_pricecustomergroup",&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; "sinkPath": "ce-exports/D365_CE_ACC_PriceCustomerGroup.json"&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; },&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; {&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; "name": "D365_CE_ACC_SalesOrderOrigin",&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; "sourceDataset": "odin_salesorderorigin",&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; "sinkPath": "ce-exports/D365_CE_ACC_SalesOrderOrigin.json"&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; },&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; {&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; 
&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; "name": "D365_CE_ACC_ShipVia",&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; "sourceDataset": "msdyn_shipvia",&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; "sinkPath": "ce-exports/D365_CE_ACC_ShipVia.json"&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; },&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; {&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; "name": "D365_CE_ACC_SystemUser",&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; "sourceDataset": "systemuser",&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; "sinkPath": "ce-exports/D365_CE_ACC_SystemUser.json"&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; },&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; {&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; "name": "D365_CE_ACC_TermsOfDelivery",&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; "sourceDataset": "msdyn_termsofdelivery",&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; "sinkPath": "ce-exports/D365_CE_ACC_TermsOfDelivery.json"&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; },&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; {&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; "name": "D365_CE_ACC_Worker",&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; "sourceDataset": "cdm_worker",&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; "sinkPath": "ce-exports/D365_CE_ACC_Worker.json"&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; },&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; {&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; "name": "D365_CE_ACC_TransactionCurrency",&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; 
&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; "sourceDataset": "transactioncurrency",&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; "sinkPath": "ce-exports/D365_CE_ACC_TransactionCurrency.json"&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; },&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; {&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; "name": "D365_CE_ACC_Warehouse",&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; "sourceDataset": "msdyn_warehouse",&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; "sinkPath": "ce-exports/D365_CE_ACC_Warehouse.json"&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; },&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; {&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; "name": "D365_CE_ACC_OperationalSite",&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; "sourceDataset": "msdyn_operationalsite",&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; "sinkPath": "ce-exports/D365_CE_ACC_OperationalSite.json"&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; },&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; {&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; "name": "D365_CE_ACC_PaymentTerms",&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; "sourceDataset": "odin_paymentterms",&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; "sinkPath": "ce-exports/D365_CE_ACC_PaymentTerms.json"&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; }&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; ]&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; }&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; },&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; "annotations": [],&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; "lastPublishTime": "2025-07-30T12:55:32Z"&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; },&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; "type": 
"Microsoft.DataFactory/factories/pipelines"&lt;/P&gt;&lt;P&gt;}&lt;/P&gt;</description>
      <pubDate>Thu, 31 Jul 2025 10:44:40 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/azure-data-factory/dynamics-ax-connector-stops-getting-records-after-amount-of-time/m-p/4438621#M934</guid>
      <dc:creator>boydVos</dc:creator>
      <dc:date>2025-07-31T10:44:40Z</dc:date>
    </item>
    <item>
      <title>Azure Data Factory ForEach Loop Fails Despite Inner Activity Error Handling - Seeking Best Practices</title>
      <link>https://techcommunity.microsoft.com/t5/azure-data-factory/azure-data-factory-foreach-loop-fails-despite-inner-activity/m-p/4434806#M932</link>
      <description>&lt;PRE&gt;Hello Azure Data Factory Community,&lt;BR /&gt;&lt;BR /&gt;I'm encountering a persistent issue with my ADF pipeline where a ForEach loop is failing, even though I've implemented error handling for the inner activities. I'm looking for insights and best practices on how to prevent internal activity failures from propagating up and causing the entire ForEach loop (and subsequently the pipeline) to fail, while still logging all outcomes.&lt;BR /&gt;&lt;BR /&gt;&lt;U&gt;&lt;STRONG&gt;My Setup:&lt;/STRONG&gt;&lt;/U&gt;&lt;BR /&gt;&lt;BR /&gt;My pipeline processes records using a ForEach loop. Inside the loop, I have a Web activity (Sample_put_record) that calls an external API. This API call can either succeed or fail for individual records.&lt;BR /&gt;&lt;BR /&gt;My current error handling within the ForEach iteration is structured as follows:&lt;BR /&gt;&lt;BR /&gt;1.&lt;STRONG&gt;Sample_put_record (Web Activity):&lt;/STRONG&gt; Makes the API call.&lt;BR /&gt;&lt;BR /&gt;2.&lt;STRONG&gt;Conditional Logic:&lt;/STRONG&gt; I've tried two main approaches:&lt;BR /&gt;&lt;BR /&gt;•&lt;STRONG&gt;Approach A (Direct Success/Failure Paths):&lt;/STRONG&gt; The Sample_put_record activity has a green arrow (on success) leading to a Log Success Items (Script activity) and a red arrow (on failure) leading to a Log Failed Items (Script activity). Both logging activities are followed by Wait activities (Dummy Wait For Success/Failure).&lt;BR /&gt;&lt;BR /&gt;•&lt;STRONG&gt;Approach B (If Condition Wrapper):&lt;/STRONG&gt; I've wrapped the Sample_put_record activity and its success/failure logging within an If Condition activity. The If Condition's expression is @equals(activity('Sample_put_record').status, 'Succeeded'). The True branch contains the success logging, and the False branch contains the failure logging. The intention here was for the If Condition to always report success, regardless of the Sample_put_record outcome, to prevent the ForEach from failing.&lt;BR /&gt;&lt;BR /&gt;&lt;STRONG&gt;The Problem:&lt;/STRONG&gt;&lt;BR /&gt;&lt;BR /&gt;Despite these error handling attempts, the ForEach loop (and thus the overall pipeline) still fails when an Sample_put_record activity fails. The error message I typically see for the ForEach activity is "Activity failed because an inner activity failed." When using the If Condition wrapper, the If Condition itself sometimes fails with the same error, indicating that an activity within its True or False branch is still causing a hard failure.&lt;BR /&gt;&lt;BR /&gt;For example, a common failure for Sample_put_record is: "valid":false,"message":"WARNING: There was no xxxxxxxxxxxxxxxxxxxxxxxxx scheduled..." (a user configuration/data issue). Even when my Log Failed Items script attempts to capture this, the ForEach still breaks.&lt;BR /&gt;&lt;BR /&gt;What I've Ensured/Considered:&lt;BR /&gt;&lt;BR /&gt;•&lt;STRONG&gt;Wait Activity Configuration:&lt;/STRONG&gt; Wait activities are configured with reasonable durations and do not appear to be the direct cause of failure.&lt;BR /&gt;&lt;BR /&gt;•&lt;STRONG&gt;No Unhandled Exceptions:&lt;/STRONG&gt; I'm trying to ensure no unhandled exceptions are propagating from my error handling activities.&lt;BR /&gt;&lt;BR /&gt;•&lt;STRONG&gt;Pipeline Status Goal:&lt;/STRONG&gt; My ultimate goal is for the overall pipeline status to be Succeeded as long as the pipeline completes its execution, even if some Sample_put_record calls fail and are logged. 
I need to rely on the logs to identify actual failures, not the pipeline status.&lt;BR /&gt;&lt;BR /&gt;&lt;STRONG&gt;My Questions to the Community:&lt;/STRONG&gt;&lt;BR /&gt;&lt;BR /&gt;1. What is the definitive best practice in Azure Data Factory to ensure a ForEach loop never fails due to an inner activity failure, assuming the inner activity's failure is properly logged and handled within that iteration?&lt;BR /&gt;&lt;BR /&gt;2. Are there specific nuances or common pitfalls with If Condition activities or Script activities within ForEach loops that could still cause failure propagation, even with try-catch and success exits?&lt;BR /&gt;&lt;BR /&gt;3. How do you typically structure your ADF pipelines to achieve this level of resilience, where internal failures are logged but don't impact the overall pipeline success status?&lt;BR /&gt;&lt;BR /&gt;4. Are there any specific configurations on the ForEach activity itself (e.g., a Continue on error setting, if it exists for ForEach?) or other activities that I might be overlooking?&lt;BR /&gt;&lt;BR /&gt;Any detailed examples, architectural patterns, or debugging tips would be greatly appreciated. Thank you in advance for your help!&lt;BR /&gt;&lt;BR /&gt;&lt;/PRE&gt;
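&lt;P&gt;For clarity, here is a trimmed sketch of how the failure path in Approach A is wired (activity bodies omitted; only the dependency structure is shown):&lt;/P&gt;&lt;LI-CODE lang="json"&gt;{
    "name": "Log Failed Items",
    "type": "Script",
    "dependsOn": [
        {
            "activity": "Sample_put_record",
            "dependencyConditions": [
                "Failed"
            ]
        }
    ]
}&lt;/LI-CODE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>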
      <pubDate>Sun, 20 Jul 2025 09:29:40 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/azure-data-factory/azure-data-factory-foreach-loop-fails-despite-inner-activity/m-p/4434806#M932</guid>
      <dc:creator>vijaybandari</dc:creator>
      <dc:date>2025-07-20T09:29:40Z</dc:date>
    </item>
    <item>
      <title>Copy Activity Successful, But Times Out</title>
      <link>https://techcommunity.microsoft.com/t5/azure-data-factory/copy-activity-successful-but-times-out/m-p/4432469#M931</link>
      <description>&lt;P&gt;This appears to be an edge case, but I wanted to share.&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;A copy activity is successful, but times out.&amp;nbsp; Duration is 1:58:55.&amp;nbsp; Times out at 2:00:12.&amp;nbsp; Runs a second time time and is successful, loading duplicate records.&amp;nbsp; The duplicate records is the undesired result.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Copy Activity&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;General&lt;UL&gt;&lt;LI&gt;Timeout: 0.02:00:00&lt;/LI&gt;&lt;LI&gt;Retry: 2&lt;/LI&gt;&lt;/UL&gt;&lt;/LI&gt;&lt;LI&gt;Source&lt;UL&gt;&lt;LI&gt;mySQL Parameterized&lt;/LI&gt;&lt;LI&gt;SQL Parameterized&lt;/LI&gt;&lt;/UL&gt;&lt;/LI&gt;&lt;LI&gt;Sink&lt;UL&gt;&lt;LI&gt;Synapse SQL Pool Parameterized&lt;/LI&gt;&lt;LI&gt;Copy method: COPY command&lt;/LI&gt;&lt;/UL&gt;&lt;/LI&gt;&lt;LI&gt;Settings&lt;UL&gt;&lt;LI&gt;Use V2 Hiearchy storage for staging&lt;/LI&gt;&lt;/UL&gt;&lt;/LI&gt;&lt;/UL&gt;&lt;P&gt;General&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;Synapse/ADF Managed Network&lt;BR /&gt;&lt;BR /&gt;&lt;BR /&gt;&lt;img /&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;/LI&gt;&lt;/UL&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Fri, 11 Jul 2025 20:14:56 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/azure-data-factory/copy-activity-successful-but-times-out/m-p/4432469#M931</guid>
      <dc:creator>istock-ewh</dc:creator>
      <dc:date>2025-07-11T20:14:56Z</dc:date>
    </item>
    <item>
      <title>Advice requested: how to capture full SQL CDC changes using Dataflow and ADLS gen2</title>
      <link>https://techcommunity.microsoft.com/t5/azure-data-factory/advice-requested-how-to-capture-full-sql-cdc-changes-using/m-p/4429917#M929</link>
      <description>&lt;P&gt;Hi,&lt;/P&gt;&lt;P&gt;I'm working on a fairly simple ETL process using Dataflow in Azure Data Factory, where I want to capture the changes in a CDC-enabled SQL table, and store those in Delta Lake format in a ADLS gen2 sink. The resulting dataset will be further processed, but for me this is the end of the line. I don't have an expert understanding of all the details of the Delta Lake format, but I do know that I can use it to store changes to my data over time. So in the sink, I enabled all Update methods (Insert, Delete, Upsert, Update), since my CDC source should be able to figure out the correct row transformation. Key columns are set to the primary key columns in SQL.&lt;/P&gt;&lt;P&gt;All this works fine as long as I configure my source to use CDC with 'netChanges: true'. That yields a single change row for each record, which is correctly stored in the sink.&lt;/P&gt;&lt;P&gt;But I want to capture all changes since the previous run, so I want to set the source to netChanges: false. That yields rows for every change since the previous time the dataflow ran. But for every table that actually has records with more than one change, the dataflow fails saying "Cannot perform Merge as multiple source rows matched and attempted to modify the same target row in the Delta table in possibly conflicting ways."&lt;/P&gt;&lt;P&gt;I take that to mean that my dataflow is, as it is, not smart enough to loop through all changes in the source, and apply them to the sink in order. So apparently something else has to be done. My intuition says that, since CDC actually provides all the metadata to make this possible, there's probably an out-of-the-box way to achieve what I want. But I can't readily find that magic box I should tick 😉.&lt;/P&gt;&lt;P&gt;I can probably build it out 'by hand', by somehow looping over all changes and applying them in order, but before I go down that route, I came here to learn from the experts whether this is indeed the only way, or, preferably, that there is a neat trick I missed to get this done easily.&lt;/P&gt;&lt;P&gt;Thanks so much for your advice!&lt;/P&gt;&lt;P&gt;BR&lt;/P&gt;</description>
      <pubDate>Thu, 03 Jul 2025 16:02:49 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/azure-data-factory/advice-requested-how-to-capture-full-sql-cdc-changes-using/m-p/4429917#M929</guid>
      <dc:creator>AnnejanBarelds</dc:creator>
      <dc:date>2025-07-03T16:02:49Z</dc:date>
    </item>
    <item>
      <title>Solution: Handling Concurrency in Azure Data Factory with Marker Files and Web Activities</title>
      <link>https://techcommunity.microsoft.com/t5/azure-data-factory/solution-handling-concurrency-in-azure-data-factory-with-marker/m-p/4428127#M928</link>
      <description>&lt;P&gt;Hi everyone,&lt;/P&gt;&lt;P&gt;I wanted to share a concurrency issue we encountered in Azure Data Factory (ADF) and how we resolved it using a small but effective enhancement—one that might be useful if you're working with &lt;STRONG&gt;shared Blob Storage&lt;/STRONG&gt; across multiple environments (like Dev, Test, and Prod).&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;Background: Shared Blob Storage &amp;amp; Marker Files&lt;/STRONG&gt;&lt;/P&gt;&lt;P&gt;In our ADF pipelines, we extract data from various sources (e.g., SharePoint, Oracle) and store them in &lt;STRONG&gt;Azure Blob Storage&lt;/STRONG&gt;. That Blob container is &lt;STRONG&gt;shared across multiple environments&lt;/STRONG&gt;.&lt;/P&gt;&lt;P&gt;To prevent duplicate extractions, we use marker files:&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;started.marker → created when a copy begins&lt;/LI&gt;&lt;LI&gt;completed.marker → created when the copy finishes successfully&lt;/LI&gt;&lt;/UL&gt;&lt;P&gt;If both markers exist, pipelines reuse the existing file (caching logic). This mechanism was already in place and worked well under normal conditions.&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;The Issue: Race Conditions&lt;/STRONG&gt;&lt;/P&gt;&lt;P&gt;We observed that simultaneous executions from multiple environments sometimes led to:&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;Overlapping attempts to create the same started.marker&lt;/LI&gt;&lt;LI&gt;Duplicate copy activities&lt;/LI&gt;&lt;LI&gt;Corrupted Blob files&lt;/LI&gt;&lt;/UL&gt;&lt;P&gt;This became a serious concern because the Blob file was later loaded into &lt;STRONG&gt;Azure SQL Server&lt;/STRONG&gt;, and any corruption led to failed loads.&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;The Fix: Web Activity + REST API&lt;/STRONG&gt;&lt;/P&gt;&lt;P&gt;To solve this, we modified only the creation of started.marker by:&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;Replacing Copy Activity with a &lt;STRONG&gt;Web Activity that calls the Azure Storage REST API&lt;/STRONG&gt;&lt;/LI&gt;&lt;LI&gt;The API uses Azure Blob Storage's conditional header &lt;STRONG&gt;If-None-Match: *&lt;/STRONG&gt; to safely create the file only if it doesn't exist&lt;/LI&gt;&lt;/UL&gt;&lt;P&gt;If the file already exists, the API returns "&lt;STRONG&gt;BlobAlreadyExists&lt;/STRONG&gt;", which the pipeline handles by skipping.&lt;/P&gt;&lt;P&gt;The Copy Activity is still used to copy the data and create the completed.marker—no changes needed there.&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;Updated Flow&lt;/STRONG&gt;&lt;/P&gt;&lt;OL&gt;&lt;LI&gt;Check marker files:&lt;UL&gt;&lt;LI&gt;If both exist (started and completed) → use cached file&lt;/LI&gt;&lt;LI&gt;If only started.marker → wait and retry&lt;/LI&gt;&lt;LI&gt;If none → continue to step 2&lt;/LI&gt;&lt;/UL&gt;&lt;/LI&gt;&lt;LI&gt;Web Activity calls REST API to create started.marker&lt;UL&gt;&lt;LI&gt;Success → proceed with copy in step 3&lt;/LI&gt;&lt;LI&gt;Failure → another run already started → skip/retry&lt;/LI&gt;&lt;/UL&gt;&lt;/LI&gt;&lt;LI&gt;Copy Activity performs the data extract&lt;/LI&gt;&lt;LI&gt;Copy Activity creates completed.marker&lt;/LI&gt;&lt;/OL&gt;&lt;P&gt;&lt;STRONG&gt;Benefits&lt;/STRONG&gt;&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;Atomic creation of started.marker → no race conditions&lt;/LI&gt;&lt;LI&gt;Minimal change to existing pipeline logic with marker files&lt;/LI&gt;&lt;LI&gt;Reliable downstream loads into Azure SQL Server&lt;/LI&gt;&lt;LI&gt;Preserves existing architecture (no full redesign)&lt;/LI&gt;&lt;/UL&gt;&lt;P&gt;&lt;STRONG&gt;Would love to 
hear:&lt;/STRONG&gt;&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;Have you used similar marker-based patterns in ADF?&lt;/LI&gt;&lt;LI&gt;Any other approaches to concurrency control that worked for your team?&lt;/LI&gt;&lt;/UL&gt;&lt;P&gt;Thanks for reading! Hope this helps someone facing similar issues.&lt;/P&gt;
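&lt;P&gt;For illustration, here is a minimal sketch of the Web Activity that creates started.marker (the account, container, and API version are placeholders; we authenticate with the factory's managed identity):&lt;/P&gt;&lt;LI-CODE lang="json"&gt;{
    "name": "Create_Started_Marker",
    "type": "WebActivity",
    "typeProperties": {
        "url": "https://mystorageaccount.blob.core.windows.net/shared/started.marker",
        "method": "PUT",
        "headers": {
            "x-ms-version": "2021-08-06",
            "x-ms-blob-type": "BlockBlob",
            "If-None-Match": "*"
        },
        "body": "",
        "authentication": {
            "type": "MSI",
            "resource": "https://storage.azure.com/"
        }
    }
}&lt;/LI-CODE&gt;&lt;P&gt;If the blob already exists, the call fails with 409 (BlobAlreadyExists), which the pipeline treats as "another run got there first" and skips or retries.&lt;/P&gt;</description>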
      <pubDate>Sun, 29 Jun 2025 03:45:17 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/azure-data-factory/solution-handling-concurrency-in-azure-data-factory-with-marker/m-p/4428127#M928</guid>
      <dc:creator>mkoralage</dc:creator>
      <dc:date>2025-06-29T03:45:17Z</dc:date>
    </item>
    <item>
      <title>Blob Storage Event Trigger Disappears</title>
      <link>https://techcommunity.microsoft.com/t5/azure-data-factory/blob-storage-event-trigger-disappears/m-p/4426713#M926</link>
      <description>&lt;P&gt;Yesterday I ran into an odd situation where there was a resource lock and I was unable to rename pipelines or drop/create storage event triggers.&amp;nbsp; An admin cleared the lock and I was able to remove and clean up the triggers and pipelines.&amp;nbsp; Today, when I try to recreate the blob storage trigger to process a file when it appears in a container, the trigger creates just fine but on refresh, it disappears.&amp;nbsp; If I try to recreate it again with the same name as the one that went away ADF UI says it already exists.&amp;nbsp; I cannot assign it to a pipeline because the UI does not see it.&amp;nbsp; Any insight as to where it is, how I can see it, or even what logs would have such activity recorded to give a clue as to what is going on.&amp;nbsp; This seems like a bug.&lt;/P&gt;</description>
      <pubDate>Tue, 24 Jun 2025 13:13:21 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/azure-data-factory/blob-storage-event-trigger-disappears/m-p/4426713#M926</guid>
      <dc:creator>RobDuMo</dc:creator>
      <dc:date>2025-06-24T13:13:21Z</dc:date>
    </item>
    <item>
      <title>Best practice to integrate to Azure DevOps?</title>
      <link>https://techcommunity.microsoft.com/t5/azure-data-factory/best-practice-to-integrate-to-azure-devops/m-p/4421860#M922</link>
      <description>&lt;P&gt;Different sources suggesting different recommendations regarding ADF and ADO integration. Some say to use 'adf_publish' branch, while some suggest to use 'main' branch to be source for triggering yaml pipelines and disabling 'Publish' function in ADF. I guess practices are changing and setup could be different. The problem is finding all this information on the Internet makes it so confusing.&amp;nbsp;&lt;/P&gt;&lt;P&gt;So, the question is what is the best practice now (taking into account all the latest changes in ADO) regarding branches? How you set up your ADF and ADO integrations?&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Sun, 08 Jun 2025 17:43:27 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/azure-data-factory/best-practice-to-integrate-to-azure-devops/m-p/4421860#M922</guid>
      <dc:creator>alwaysLearner</dc:creator>
      <dc:date>2025-06-08T17:43:27Z</dc:date>
    </item>
    <item>
      <title>Parameterization of Linked Services</title>
      <link>https://techcommunity.microsoft.com/t5/azure-data-factory/parameterization-of-linked-services/m-p/4421859#M921</link>
      <description>&lt;P&gt;I am trying to parameterize Linked Service in ADF. Probably got confused, and hope someone will make it clear.&amp;nbsp;&lt;/P&gt;&lt;P&gt;Two questions:&lt;/P&gt;&lt;OL&gt;&lt;LI&gt;I have two parameters: 'url' and 'secretName'. However, in ARM template I only see 'url' parameter, but not 'secretName'. Why 'secretName' is not parameterized?&lt;/LI&gt;&lt;LI&gt;How do I supply a value for the 'url' parameter when I will deploy ARM template to another environment (let's say 'Test' environment)?&amp;nbsp;&amp;nbsp;&lt;/LI&gt;&lt;/OL&gt;&lt;P&gt;These are files:&lt;/P&gt;&lt;P&gt;Linked Service:&lt;/P&gt;&lt;LI-CODE lang="json"&gt;{
    "name": "LS_DynamicParam",
    "properties": {
        "parameters": {
            "SA_URL": {
                "type": "String",
                "defaultValue": "https://saforrisma.dfs.core.windows.net/"
            },
            "SecretName": {
                "type": "String",
                "defaultValue": "MySecretInKeyVault"
            }
        },
        "annotations": [],
        "type": "AzureBlobFS",
        "typeProperties": {
            "url": "@{linkedService().SA_URL}",
            "accountKey": {
                "type": "AzureKeyVaultSecret",
                "store": {
                    "referenceName": "LS_AKV",
                    "type": "LinkedServiceReference"
                },
                "secretName": {
                    "value": "@linkedService().SecretName",
                    "type": "Expression"
                }
            }
        }
    }
}&lt;/LI-CODE&gt;&lt;P&gt;ARMTemplateParametersForFactory.json:&lt;/P&gt;&lt;LI-CODE lang="json"&gt;{
    "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentParameters.json#",
    "contentVersion": "1.0.0.0",
    "parameters": {
        "factoryName": {
            "value": "ADF-Dev"
        },
        "LS_AKV_properties_typeProperties_baseUrl": {
            "value": "https://kv-forrisma.vault.azure.net/"
        },
        "LS_MAINStorage_properties_typeProperties_connectionString_secretName": {
            "value": "storageaccount-adf-dev"
        },
        "LS_DynamicParam_properties_typeProperties_url": {
            "value": "@{linkedService().SA_URL}"
        }
    }
}&lt;/LI-CODE&gt;
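&lt;P&gt;For question 2, my working assumption (please correct me) is that the release pipeline supplies an environment-specific parameters file when deploying the ARM template, e.g. for 'Test' (factory name and URL are placeholders):&lt;/P&gt;&lt;LI-CODE lang="json"&gt;{
    "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentParameters.json#",
    "contentVersion": "1.0.0.0",
    "parameters": {
        "factoryName": {
            "value": "ADF-Test"
        },
        "LS_DynamicParam_properties_typeProperties_url": {
            "value": "https://satest.dfs.core.windows.net/"
        }
    }
}&lt;/LI-CODE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>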
      <pubDate>Sun, 08 Jun 2025 17:28:04 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/azure-data-factory/parameterization-of-linked-services/m-p/4421859#M921</guid>
      <dc:creator>alwaysLearner</dc:creator>
      <dc:date>2025-06-08T17:28:04Z</dc:date>
    </item>
  </channel>
</rss>

