Data Factory
21 Topics

HTTP request was forbidden with client authentication when connecting to Dataverse
Hi, I'm trying to set up a linked service that connects to Dataverse. This Dataverse environment was created through the Microsoft Developer Program. Because my account requires multi-factor authentication to log in, I tried using the Service Principal authentication type. When I test the connection, I get the following error:

The HTTP request was forbidden with client authentication scheme 'Anonymous'. => The remote server returned an error: (403) Forbidden. Unable to Login to Dynamics CRM.

Below is what I used for the settings (data has been masked). In the Azure AD app registration, I have assigned the following API permissions. I have created a certificate and secret for this application and used the secret value as the "service principal key" (I used the Application (client) ID for the "service principal ID"). I tried the secret ID as well, but I always get the same error. Is there another setting I need to check to fix the problem? Could it be because the Dataverse environment is a developer version?

Jason
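For reference, a minimal sketch of what a Dataverse (Common Data Service for Apps) linked service with service-principal-key authentication looks like in ADF JSON is shown below. The property names follow the published connector schema, but the URL, IDs, and secret are placeholders to replace with your own values:

```json
{
    "name": "DataverseLinkedService",
    "properties": {
        "type": "CommonDataServiceForApps",
        "typeProperties": {
            "deploymentType": "Online",
            "serviceUri": "https://<your-org>.crm.dynamics.com",
            "authenticationType": "AADServicePrincipal",
            "servicePrincipalCredentialType": "ServicePrincipalKey",
            "servicePrincipalId": "<application (client) ID>",
            "servicePrincipalCredential": {
                "type": "SecureString",
                "value": "<client secret value, not the secret ID>"
            }
        }
    }
}
```

Note that the service principal key must be the secret's value, not its ID, and that Azure AD API permissions on their own are typically not enough: the app registration usually also has to be added as an application user in the Dataverse environment (via the Power Platform admin center) and given a security role before the 403 error goes away.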
Private Endpoint to Dedicated Pool

Any advice on how to connect a private endpoint to a dedicated pool? Within ADF I created an integration runtime and connected it to my DB, but now I am struggling with connecting it to my dedicated pool. I was able to connect without issue in a development workspace, but now we are trying to do the same in a production workspace and I keep getting an error while provisioning: "failed to create private endpoint for client xxxxxxxxx: ID=xxxxxxx, name=xxxxxxx". I don't know why I was able to connect without issue in the test environment but cannot in the production subscription. Could it be that something already existed that allowed the connection? Thanks in advance!
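If the connection goes through an Azure IR in a managed virtual network, the piece that gets provisioned is a managed private endpoint. A minimal sketch of its definition is below, assuming the dedicated pool sits on a standalone logical SQL server (for a pool inside a Synapse workspace the target resource is the workspace and the group ID is "Sql" instead); all names and IDs are placeholders:

```json
{
    "name": "mpe-dedicated-pool",
    "properties": {
        "privateLinkResourceId": "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Sql/servers/<server-name>",
        "groupId": "sqlServer"
    }
}
```

A managed private endpoint also has to be approved on the target resource after it is created, and the identity provisioning it needs sufficient permissions on the target subscription, which is one common reason the same setup works in a development subscription but fails in production.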
Designing system to enable ad hoc queries

Hi, we are designing a data processing system in which the data goes through the three stages shown below. What Azure platforms or technologies do you recommend for a dynamic scenario like this, where the input file format can change all the time, the transformations applied are not standard, and the reports generated vary every time?

Extract: Data size can be around 1 GB. Data can be in various formats and come from various sources such as FTP, API, etc.
Transform: Transformations are applied to the data.
Results: After the transformations, results are exported to a final report table from which reports are generated.
Data Factory REST API Parameters in URL

Hi there,

In Azure Data Factory I want to get data from a REST API and push it to Azure Table Storage. My challenge is that I need to call two different API endpoints to get the details of each purchase order.

First I get the header-level information from the Purchase List endpoint, no problem doing that. Sample API endpoint used: https://private-anon-92de43072c-dearinventory.apiary-mock.com/ExternalApi/v2/purchaseList
Purchase List documentation here: https://dearinventory.docs.apiary.io/#reference/purchase/purchase-list/get

Then I need to make a call to the Purchase Order endpoint using the ID from the Purchase List to get the detailed purchase order information. Purchase List [ID] > Purchase Order [TaskID], where ID = TaskID: https://private-anon-92de43072c-dearinventory.apiary-mock.com/ExternalApi/v2/purchase/order?TaskID=TaskID

So my question is: how do I grab the ID from the Purchase List and, in a ForEach, pass it to the second API endpoint within the Purchase Order API URL? Purchase Order API documentation: https://dearinventory.docs.apiary.io/#reference/purchase/purchase-order/get

Where does the TaskID get defined and how do I use it? Against the linked service? The dataset? The pipeline? Any help is much appreciated.

Purchase List sample of what gets returned:

{
  "Total": 2,
  "Page": 1,
  "PurchaseList": [
    {
      "ID": "60b75408-f432-407d-bc69-66a0df49bc4c",
      "BlindReceipt": false,
      "OrderNumber": "PO-00026",
      "Status": "INVOICED",
      "OrderDate": "2018-02-22T00:00:00Z",
      "InvoiceDate": "2018-02-22T00:00:00Z",
      "Supplier": "Bayside Club",
      "SupplierID": "807b6bb2-69d5-46dc-8a6b-1bfe73519dca",
      "InvoiceNumber": "345345",
      "InvoiceAmount": 15,
      "PaidAmount": 15,
      "InvoiceDueDate": "2018-03-24T00:00:00Z",
      "RequiredBy": null,
      "BaseCurrency": "USD",
      "SupplierCurrency": "USD",
      "CreditNoteNumber": "CR-00026",
      "OrderStatus": "AUTHORISED",
      "StockReceivedStatus": "DRAFT",
      "UnstockStatus": "NOT AVAILABLE",
      "InvoiceStatus": "PAID",
      "CreditNoteStatus": "DRAFT",
      "LastUpdatedDate": "2018-04-19T04:03:06.52Z",
      "Type": "Simple Purchase",
      "CombinedInvoiceStatus": "INVOICED",
      "CombinedPaymentStatus": "PAID",
      "CombinedReceivingStatus": "NOT RECEIVED",
      "IsServiceOnly": false,
      "DropShipTaskID": null
    },
    {
      "ID": "4e60f55a-1690-4024-a58c-29d62106a645",
      "BlindReceipt": false,
      "OrderNumber": "PO-00080",
      "Status": "COMPLETED",
      "OrderDate": "2018-04-12T00:00:00Z",
      "InvoiceDate": "2018-04-10T00:00:00Z",
      "Supplier": "Bayside Wholesale",
      "SupplierID": "ca1e53f5-0560-4de6-8956-fac50a32540b",
      "InvoiceNumber": "INV-00080",
      "InvoiceAmount": 54.31,
      "PaidAmount": 0,
      "InvoiceDueDate": "2018-05-10T00:00:00Z",
      "RequiredBy": null,
      "BaseCurrency": "USD",
      "SupplierCurrency": "USD",
      "CreditNoteNumber": "CR-00080",
      "OrderStatus": "AUTHORISED",
      "StockReceivedStatus": "AUTHORISED",
      "UnstockStatus": "AUTHORISED",
      "InvoiceStatus": "PAID",
      "CreditNoteStatus": "AUTHORISED",
      "LastUpdatedDate": "2018-04-12T05:36:13.177Z",
      "Type": "Simple Purchase",
      "CombinedInvoiceStatus": "INVOICED / CREDITED",
      "CombinedPaymentStatus": "PAID",
      "CombinedReceivingStatus": "FULLY RECEIVED",
      "IsServiceOnly": false,
      "DropShipTaskID": "31760d16-c2e5-4620-9e09-0b236b776cef"
    }
  ]
}

Purchase Order sample of what gets returned (https://private-anon-92de43072c-dearinventory.apiary-mock.com/ExternalApi/v2/purchase/order?TaskID=02b08cd2-51d2-41e6-ab97-85bcd13e7136):

{
  "TaskID": "02b08cd2-51d2-41e6-ab97-85bcd13e7136",
  "CombineAdditionalCharges": false,
  "Memo": "",
  "Status": "AUTHORISED",
  "Lines": [
    {
      "ProductID": "c08b3876-89cc-46c4-af52-b77f058fdf81",
      "SKU": "Bread",
      "Name": "Baked Bread",
      "Quantity": 2,
      "Price": 2,
      "Discount": 0,
      "Tax": 0,
      "TaxRule": "Sales Tax on Imports",
      "SupplierSKU": "",
      "Comment": "",
      "Total": 4
    },
    {
      "ProductID": "c08b3876-89cc-46c4-af52-b77f058fdf81",
      "SKU": "Bread",
      "Name": "Baked Bread",
      "Quantity": 1,
      "Price": 2,
      "Discount": 0,
      "Tax": 0,
      "TaxRule": "Sales Tax on Imports",
      "SupplierSKU": "",
      "Comment": "",
      "Total": 2
    }
  ],
  "AdditionalCharges": [
    {
      "Description": "Half day training - Microsoft Office",
      "Reference": "",
      "Price": 3,
      "Quantity": 1,
      "Discount": 0,
      "Tax": 0,
      "Total": 3,
      "TaxRule": "Sales Tax on Imports"
    }
  ],
  "TotalBeforeTax": 9,
  "Tax": 0,
  "Total": 9
}
Azure Data Factory pipeline - copy data from on-prem SQL Server and Dynamics CRM

Hello,

I need to transfer data from an on-premises SQL Server into CDS (Dynamics 365). This can be achieved with a Copy Data activity if all the data to be transferred is already stored in SQL Server. Unfortunately, this is not my case. The steps for my requirement are:
- create a CRM record (the parent record);
- the GUID of this record must then be included in the source data for the Copy Data activity, because all the data transferred into CRM by the Copy Data activity needs to be child records of that parent record.

How can I accomplish this data flow, please?

Best regards,
Radu Antonache
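One possible pattern, sketched below under several assumptions, is to create the parent record with a Web activity against the Dataverse Web API (asking it to return the created record so its GUID is available in the activity output) and then reference that GUID in the Copy activity's source query. The entity names, field names, API version, and authentication method are placeholders to adapt, and the on-premises SQL source also needs a self-hosted integration runtime:

```json
{
    "activities": [
        {
            "name": "CreateParentRecord",
            "description": "Creates the parent record; Prefer: return=representation makes the API return the created record, including its GUID",
            "type": "WebActivity",
            "typeProperties": {
                "url": "https://<your-org>.crm.dynamics.com/api/data/v9.2/accounts",
                "method": "POST",
                "headers": {
                    "Content-Type": "application/json",
                    "Prefer": "return=representation"
                },
                "body": { "name": "Parent record created by ADF" },
                "authentication": { "type": "MSI", "resource": "https://<your-org>.crm.dynamics.com" }
            }
        },
        {
            "name": "CopyChildRecords",
            "description": "Copies the child rows, injecting the parent GUID into the source query as an extra column",
            "type": "Copy",
            "dependsOn": [ { "activity": "CreateParentRecord", "dependencyConditions": [ "Succeeded" ] } ],
            "inputs": [ { "referenceName": "OnPremSqlChildRows", "type": "DatasetReference" } ],
            "outputs": [ { "referenceName": "DataverseChildEntity", "type": "DatasetReference" } ],
            "typeProperties": {
                "source": {
                    "type": "SqlServerSource",
                    "sqlReaderQuery": {
                        "value": "SELECT c.*, '@{activity('CreateParentRecord').output.accountid}' AS parentid FROM dbo.ChildRows c",
                        "type": "Expression"
                    }
                },
                "sink": { "type": "CommonDataServiceForAppsSink", "writeBehavior": "upsert" }
            }
        }
    ]
}
```

In the Copy activity's mapping, the extra parentid column is then mapped to the lookup field that ties the child records to the parent. The exact output property holding the GUID (accountid above) depends on the entity, and the quoting inside the expression is worth verifying in the ADF expression builder.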
Azure Data Factory July new features update

We are glad to announce that Azure Data Factory has added more new features in July, including:
- Preview for Data Management Gateway high availability and scalability
- Skipping or logging incompatible rows during copy for fault tolerance
- Service principal authentication support for Azure Data Lake Analytics

We go through each of these new features one by one in the full post on the Azure blog: https://azure.microsoft.com/en-us/blog/azure-data-factory-july-new-features-update/
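As an illustration of the fault-tolerance item, the skip/redirect behavior is expressed through copy activity properties like the following. This is a sketch based on the current copy activity documentation; the exact shape can differ between Data Factory versions, and the linked service and path are placeholders:

```json
{
    "typeProperties": {
        "source": { "type": "BlobSource" },
        "sink": { "type": "SqlSink" },
        "enableSkipIncompatibleRow": true,
        "redirectIncompatibleRowSettings": {
            "linkedServiceName": {
                "referenceName": "AzureStorageForLogs",
                "type": "LinkedServiceReference"
            },
            "path": "redirect/incompatiblerows"
        }
    }
}
```

With enableSkipIncompatibleRow set to true, rows that fail conversion or constraint checks are skipped instead of failing the copy, and the redirect settings write the skipped rows to the specified storage path for later inspection.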
Azure Analysis Services now available in Japan East and UK South

Last October we released the preview of Azure Analysis Services, which is built on the proven analytics engine in Microsoft SQL Server Analysis Services. With Azure Analysis Services you can host semantic data models in the cloud. Users in your organization can then connect to your data models using tools like Excel, Power BI, and many others to create reports and perform ad hoc data analysis. We are excited to share that the preview of Azure Analysis Services is now available in two additional regions: Japan East and UK South. This means that Azure Analysis Services is now available in the following regions: Australia Southeast, Canada Central, Brazil South, Southeast Asia, North Europe, West Europe, West US, South Central US, North Central US, East US 2, West Central US, Japan East, and UK South. Read about it on the Azure blog: https://azure.microsoft.com/en-us/blog/azure-analysis-services-now-available-in-japan-east-and-uk-south/
Lift SQL Server Integration Services packages to Azure with Azure Data Factory

Data is vital to every app and experience we build today. With increasing amounts of data, organizations do not want to be tied down by the increasing infrastructure costs that come with it. Data engineers and developers are realizing the need to start moving their on-premises workloads to the cloud to take advantage of its massive scale and flexibility. Azure Data Factory capabilities are now generally available for SQL Server Integration Services (SSIS) customers to easily lift SSIS packages to Azure, gaining scalability, high availability, and lower TCO, while ADF manages the resources for them. Read more about it on the Azure blog: https://azure.microsoft.com/en-us/blog/lift-sql-server-integration-services-packages-to-azure-with-azure-data-factory/
Azure Data Factory new capabilities are now generally available

Microsoft is excited to announce the general availability of new Azure Data Factory (ADF V2) features that will make data integration in the cloud easier than ever before. With a new browser-based user interface (https://docs.microsoft.com/en-us/azure/data-factory/quickstart-create-data-factory-portal), you can accelerate your time to production by building and scheduling your data pipelines using drag and drop. Manage and monitor the health of your data integration projects at scale, wherever your data lives, in the cloud or on-premises, with enterprise-grade security. ADF comes with support for over 70 connectors (https://docs.microsoft.com/en-us/azure/data-factory/data-factory-data-movement-activities) and enables you to easily dispatch data transformation jobs at scale to turn raw data into processed data that is ready for consumption by business analysts using their favorite BI tools or custom applications. For existing SQL Server Integration Services (SSIS) users (https://docs.microsoft.com/en-us/sql/integration-services/lift-shift/ssis-azure-lift-shift-ssis-packages-overview), ADF now allows you to easily lift and shift your SSIS packages into the cloud and run SSIS as a service with minimal changes required to your existing packages. ADF will manage your SSIS resources for you so you can increase productivity and lower total cost of ownership. Meet your security and compliance needs while taking advantage of extensive capabilities and paying only for what you use. Read about it on the Azure blog: https://azure.microsoft.com/en-us/blog/azure-data-factory-new-capabilities-are-now-generally-available/
Event trigger based data integration with Azure Data Factory

Event-driven architecture (EDA) is a common data integration pattern that involves the production, detection, consumption of, and reaction to events. Today, we are announcing support for event-based triggers in your Azure Data Factory (ADF) pipelines (https://docs.microsoft.com/en-us/azure/data-factory/). Many data integration scenarios require Data Factory customers to trigger pipelines based on events. A typical event could be a file landing in, or getting deleted from, your Azure storage account. Now you can simply create an event-based trigger in your Data Factory pipeline. Read about it on the Azure blog: https://azure.microsoft.com/en-us/blog/event-trigger-based-scheduling-in-azure-data-factory/
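For context, a blob-event trigger definition in ADF JSON looks roughly like the sketch below; the storage account scope, paths, pipeline name, and parameters are placeholders to replace with your own:

```json
{
    "name": "BlobCreatedTrigger",
    "properties": {
        "type": "BlobEventsTrigger",
        "typeProperties": {
            "scope": "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/<storage-account>",
            "events": [ "Microsoft.Storage.BlobCreated" ],
            "blobPathBeginsWith": "/incoming/blobs/",
            "blobPathEndsWith": ".csv"
        },
        "pipelines": [
            {
                "pipelineReference": { "referenceName": "ProcessNewFile", "type": "PipelineReference" },
                "parameters": {
                    "fileName": "@triggerBody().fileName",
                    "folderPath": "@triggerBody().folderPath"
                }
            }
        ]
    }
}
```

The trigger is attached to one or more pipelines and passes the file name and folder path of the blob that raised the event into pipeline parameters; the Microsoft.Storage.BlobDeleted event covers the deletion scenario mentioned above.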