ADF
11 Topics

Parameterization of Linked Services
I am trying to parameterize a linked service in ADF. I have probably got confused, and I hope someone can make it clear. Two questions:

1. I have two parameters, 'url' and 'secretName'. However, in the ARM template I only see the 'url' parameter, not 'secretName'. Why is 'secretName' not parameterized?
2. How do I supply a value for the 'url' parameter when I deploy the ARM template to another environment (let's say a 'Test' environment)?

These are the files:

Linked Service:

{
    "name": "LS_DynamicParam",
    "properties": {
        "parameters": {
            "SA_URL": {
                "type": "String",
                "defaultValue": "https://saforrisma.dfs.core.windows.net/"
            },
            "SecretName": {
                "type": "String",
                "defaultValue": "MySecretInKeyVault"
            }
        },
        "annotations": [],
        "type": "AzureBlobFS",
        "typeProperties": {
            "url": "@{linkedService().SA_URL}",
            "accountKey": {
                "type": "AzureKeyVaultSecret",
                "store": {
                    "referenceName": "LS_AKV",
                    "type": "LinkedServiceReference"
                },
                "secretName": {
                    "value": "@linkedService().SecretName",
                    "type": "Expression"
                }
            }
        }
    }
}

ARMTemplateParametersForFactory.json:

{
    "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentParameters.json#",
    "contentVersion": "1.0.0.0",
    "parameters": {
        "factoryName": {
            "value": "ADF-Dev"
        },
        "LS_AKV_properties_typeProperties_baseUrl": {
            "value": "https://kv-forrisma.vault.azure.net/"
        },
        "LS_MAINStorage_properties_typeProperties_connectionString_secretName": {
            "value": "storageaccount-adf-dev"
        },
        "LS_DynamicParam_properties_typeProperties_url": {
            "value": "@{linkedService().SA_URL}"
        }
    }
}
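One way to supply a different 'url' value for another environment is to override the generated ARM template parameters at deployment time. Below is a minimal sketch using the Azure SDK for Python, assuming the standard ARMTemplateForFactory.json that ADF publishes to the adf_publish branch; the subscription ID, resource group, factory name and Test storage URL are hypothetical placeholders:

import json
from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import ResourceManagementClient

# Hypothetical identifiers for a 'Test' environment -- replace with real values.
SUBSCRIPTION_ID = "00000000-0000-0000-0000-000000000000"
RESOURCE_GROUP = "rg-adf-test"

client = ResourceManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Template exported when ADF publishes to the adf_publish branch.
with open("ARMTemplateForFactory.json") as f:
    template = json.load(f)

# Override only the parameters that differ per environment.
parameters = {
    "factoryName": {"value": "ADF-Test"},
    "LS_DynamicParam_properties_typeProperties_url": {
        "value": "https://satestexample.dfs.core.windows.net/"  # hypothetical Test storage URL
    },
}

poller = client.deployments.begin_create_or_update(
    RESOURCE_GROUP,
    "adf-test-deployment",
    {
        "properties": {
            "mode": "Incremental",
            "template": template,
            "parameters": parameters,
        }
    },
)
poller.result()  # block until the deployment finishes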
Parsing error while calling ServiceNow Table API

Hi, I am trying to copy data from ServiceNow to Blob Storage using the ServiceNow Table API (https://www.servicenow.com/docs/bundle/xanadu-api-reference/page/integrate/inbound-rest/concept/c_TableAPI.html). When I call the API from Postman, I get JSON output with data, but when I configure the same in ADF using the ServiceNow linked service, as described in https://learn.microsoft.com/en-us/azure/data-factory/connector-servicenow?tabs=data-factory, I get this error:

"Failed to parse the API response. Failed at: TableAPIClient FetchTableDataAsync Error message: Unexpected character encountered while parsing value: }. Path 'result[3716]', line 1, position 17316023"

How can I resolve this error?
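To compare what ADF receives with what Postman returns, a direct call to the Table API can help narrow down whether the response itself is malformed or simply very large. The sketch below is a rough illustration, with a hypothetical instance, table and credentials, and uses sysparm_limit/sysparm_offset paging so each response stays small enough to inspect:

import requests

# Hypothetical instance, table and credentials -- replace with real values.
INSTANCE = "dev00000"
TABLE = "incident"
AUTH = ("api.user", "api.password")

url = f"https://{INSTANCE}.service-now.com/api/now/table/{TABLE}"
offset, page_size = 0, 1000

while True:
    resp = requests.get(
        url,
        auth=AUTH,
        headers={"Accept": "application/json"},
        params={"sysparm_limit": page_size, "sysparm_offset": offset},
        timeout=60,
    )
    resp.raise_for_status()
    records = resp.json().get("result", [])  # Table API wraps rows in a 'result' array
    if not records:
        break
    print(f"offset {offset}: fetched {len(records)} rows")
    offset += page_size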
Process your data in seconds with new ADF real-time CDC

In January, we announced that we've elevated our Change Data Capture features front-and-center in ADF. Up until just today, the lowest latency we were allowing for CDC processing was 15 minutes. But today, I am super-excited to announce that we have enabled the real-time option!
Using Azure Data Factory to orchestrate Kusto query-ingest

In this blog post, we'll explore how Azure Data Factory (ADF) can be used to orchestrate large query ingestions. With this approach you will learn how to split one large query-ingest into multiple partitions, orchestrated with ADF.
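As a rough illustration of the partitioned query-ingest pattern (driven here from Python rather than from an ADF ForEach activity), the sketch below issues one .set-or-append command per partition; the cluster, database, table names and the Timestamp column are hypothetical:

from azure.kusto.data import KustoClient, KustoConnectionStringBuilder

# Hypothetical cluster and database -- replace with real names.
CLUSTER = "https://mycluster.westeurope.kusto.windows.net"
DATABASE = "mydb"

kcsb = KustoConnectionStringBuilder.with_aad_device_authentication(CLUSTER)
client = KustoClient(kcsb)

# Split one large query-ingest into daily partitions so each command stays small.
partitions = ["2024-01-01", "2024-01-02", "2024-01-03"]

for day in partitions:
    command = (
        ".set-or-append TargetTable <| "
        "SourceTable "
        f"| where Timestamp >= datetime({day}) "
        f"| where Timestamp < datetime_add('day', 1, datetime({day}))"
    )
    client.execute_mgmt(DATABASE, command)  # each partition runs as its own query-ingest
    print(f"ingested partition {day}")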
Empty file is getting created in ADF

I have an ADF pipeline which has data flows. The data flow reads an Excel file and puts the records into a SQL DB. The incorrect records are pushed to a Blob Storage sink as a CSV file. Even when all the records are correct, an empty .csv file is getting created and pushed to Blob. How can I avoid the creation of this empty file?
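One possible workaround, independent of the data flow itself, is to delete zero-byte blobs after the pipeline run (for example from a custom activity or a small script). The sketch below uses the azure-storage-blob SDK with hypothetical connection, container and folder names; the size check would need adjusting if the sink also writes a header row:

from azure.storage.blob import ContainerClient

# Hypothetical connection string, container and folder -- replace with real values.
CONN_STR = "DefaultEndpointsProtocol=https;AccountName=myaccount;AccountKey=mykey;EndpointSuffix=core.windows.net"
CONTAINER = "rejected-records"
PREFIX = "errors/"

container = ContainerClient.from_connection_string(CONN_STR, container_name=CONTAINER)

# Remove zero-byte CSV files left behind when no rows reached the error sink.
for blob in container.list_blobs(name_starts_with=PREFIX):
    if blob.name.endswith(".csv") and blob.size == 0:
        container.delete_blob(blob.name)
        print(f"deleted empty file {blob.name}")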
How to calculate the cost of copying data from Azure Blob to Azure Data Explorer with Azure Data Factory

Azure Data Factory is a fully managed, cloud-based data integration service. You can use the service to populate your Azure Data Explorer database with data from various locations and save time when building your analytics solutions. This post shows how to estimate the total time and end-to-end cost of ingesting Azure Blob data into Azure Data Explorer with Azure Data Factory.
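As a back-of-the-envelope illustration of this kind of estimate, the sketch below multiplies the copy duration by the DIUs used and a per-DIU-hour rate. All figures, including the rate, are hypothetical placeholders; the real numbers come from the copy activity run details and the current ADF pricing page, and orchestration/activity-run charges are not included:

# All numbers below are hypothetical placeholders for illustration only.
dius = 4                    # data integration units reported by the copy activity run
copy_duration_hours = 0.75  # duration of the copy from Blob to Azure Data Explorer
price_per_diu_hour = 0.25   # placeholder rate -- check the current ADF pricing page

data_movement_cost = dius * copy_duration_hours * price_per_diu_hour
print(f"Estimated data movement cost: ${data_movement_cost:.2f} "
      f"for {dius} DIUs over {copy_duration_hours} h")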