Announcing Public Preview of the SAP CDC solution in Azure Data Factory and Azure Synapse Analytics
UPDATE! We've launched the new SAP connector in ADF today, June 30, 2022, and updated the blog post below accordingly. For decades, companies have relied on Microsoft and SAP software to run their most mission-critical operations. Today, we're excited to launch the public preview of SAP Change Data Capture (CDC) in Azure Data Factory (ADF). Combining a new data connector with predefined data flow templates, this solution streamlines the integration of SAP data with core Azure services like Azure Synapse Analytics and Azure Machine Learning. The new SAP ODP connector leverages the SAP Operational Data Provisioning (ODP) framework, an established best practice for data integration within SAP landscapes. ODP provides access to a wide range of sources across all major SAP applications and comes with built-in CDC capabilities. Combined with the predefined data flow templates that process the changed records and apply them to any sink, this makes SAP data integration into Azure very straightforward.
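To make the change handling concrete, here is a purely conceptual sketch in Python (not an actual ADF artifact and not the template itself) of the kind of merge the data flow templates perform: each changed record carries an operation indicator, and the sink is upserted or deleted accordingly. The field names and indicator values below are illustrative assumptions only.

```python
# Conceptual sketch only: NOT the ADF data flow template.
# It illustrates the merge logic a CDC-style template applies to a sink:
# every changed record arrives with an operation indicator, and the target
# is updated accordingly. Field names and indicator values are hypothetical.

changed_records = [
    {"op": "I", "key": 1001, "payload": {"material": "M-01", "qty": 10}},  # insert
    {"op": "U", "key": 1001, "payload": {"material": "M-01", "qty": 15}},  # update
    {"op": "D", "key": 1002, "payload": None},                             # delete
]

# The "sink" here is just an in-memory dict keyed by business key.
sink: dict[int, dict] = {1002: {"material": "M-02", "qty": 3}}

for record in changed_records:
    if record["op"] in ("I", "U"):      # insert or update -> upsert the row
        sink[record["key"]] = record["payload"]
    elif record["op"] == "D":           # delete -> remove the row if present
        sink.pop(record["key"], None)

print(sink)  # {1001: {'material': 'M-01', 'qty': 15}}
```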
SharePoint Online Multiple Files (Folder) Copy with Http Connector

This blog shows how to copy multiple files from a SharePoint Online folder using ADF. For background on copying a single file, go through this public documentation: Copy data from SharePoint Online List by using Azure Data Factory - Azure Data Factory | Microsoft Docs
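As a rough illustration of what the pipeline does under the hood, the sketch below shows the SharePoint REST calls that the Web/Lookup activity and the ForEach + Copy activity (HTTP source) end up issuing. It is not the ADF pipeline itself, and the site URL, folder path, and bearer-token acquisition are placeholders you would replace with your own app-registration setup, as in the single-file documentation linked above.

```python
# Minimal sketch, assuming an AAD app registration that can issue a bearer
# token for SharePoint Online (in ADF this token typically comes from a Web
# activity). Site URL and folder path are placeholders.
import requests

SITE_URL = "https://contoso.sharepoint.com/sites/marketing"   # placeholder
FOLDER = "/sites/marketing/Shared Documents/reports"          # placeholder
TOKEN = "<bearer-token-from-aad-app-registration>"            # placeholder

headers = {
    "Authorization": f"Bearer {TOKEN}",
    "Accept": "application/json;odata=verbose",
}

# 1) Enumerate the files in the folder (what a Lookup/Web activity would do).
list_url = f"{SITE_URL}/_api/web/GetFolderByServerRelativeUrl('{FOLDER}')/Files"
files = requests.get(list_url, headers=headers).json()["d"]["results"]

# 2) Download each file's content (what the ForEach + Copy activity with an
#    HTTP source does per file, using the ServerRelativeUrl from step 1).
for f in files:
    content_url = (
        f"{SITE_URL}/_api/web/GetFileByServerRelativeUrl"
        f"('{f['ServerRelativeUrl']}')/$value"
    )
    data = requests.get(content_url, headers=headers).content
    with open(f["Name"], "wb") as out:
        out.write(data)
```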
General availability of SAP CDC capabilities for Azure Data Factory and Azure Synapse Analytics

Customers use SAP systems for their business-critical operations. Today, customers want to be able to combine their SAP data with non-SAP data for their analytics needs. Azure Data Factory (ADF) is an industry-leading data integration service that enables customers to ingest data from diverse data sources (e.g., multi-cloud, SaaS, on-premises), transform data at scale, and more. ADF works seamlessly to combine data and prepare it at cloud scale. Customers are using ADF to ingest data from different SAP data sources (e.g., SAP ECC, SAP HANA, SAP Table, SAP BW Open Hub, SAP BW via MDX, SAP Cloud for Customer) and combine it with data from other operational stores (e.g., Cosmos DB, the Azure SQL family, and more). This enables customers to gain deep insights from both SAP and non-SAP data. Today, we are excited to announce the General Availability of SAP CDC support in Azure Data Factory and Azure Synapse Analytics.
Oracle 2.0 Upgrade Woes with Self-Hosted Integration Runtime

This past weekend my ADF instance finally got the prompt to upgrade linked services that use the Oracle 1.0 connector, so I thought, "no problem!" and got to work upgrading my self-hosted integration runtime to 5.50.9171.1.

Most of my connections use service_name during authentication, so per https://learn.microsoft.com/en-us/azure/data-factory/connector-oracle?tabs=data-factory, I should be able to connect using the Easy Connect (Plus) naming convention. When I do, I encounter this error:

Test connection operation failed. Failed to open the Oracle database connection. ORA-50201: Oracle Communication: Failed to connect to server or failed to parse connect string ORA-12650: No common encryption or data integrity algorithm https://docs.oracle.com/error-help/db/ora-12650/

I did some digging on this error code, and the troubleshooting doc suggests that I reach out to my Oracle DBA to update the Oracle server settings. Which I did, but I have zero confidence the DBA will take any action. https://learn.microsoft.com/en-us/azure/data-factory/connector-troubleshoot-oracle

Then I happened across this documentation about the upgraded connector: https://learn.microsoft.com/en-us/azure/data-factory/connector-oracle?tabs=data-factory#upgrade-the-oracle-connector Is this for real? ADF won't be able to connect to old versions of Oracle? If so, I'm effed, because my company is so, so legacy and all of our Oracle servers are at 11g.

I also tried adding additional connection properties in my linked service connection like this, but I honestly have no idea what I'm doing:

Encryption client: accepted
Encryption types client: AES128, AES192, AES256, 3DES112, 3DES168
Crypto checksum client: accepted
Crypto checksum types client: SHA1, SHA256, SHA384, SHA512

But no matter what, the issue persists. :( Am I missing something stupid? Are there ways to handle the encryption type mismatch client-side from the VM that runs the self-hosted integration runtime? I would hate to be in the business of managing an Oracle environment and tnsnames.ora files, but I also don't want to re-engineer almost 100 pipelines because of a connector incompatibility.
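For reference, this is roughly what I understand the DBA would need to put in sqlnet.ora on the server side so that the old server and the new client agree on at least one encryption and one integrity algorithm. These values are my best guess from the Oracle docs, not something I've verified against our 11g patch level:

```
# sqlnet.ora on the Oracle server (illustrative values only; confirm which
# algorithms your specific 11g patch level actually supports)
SQLNET.ENCRYPTION_SERVER = accepted
SQLNET.ENCRYPTION_TYPES_SERVER = (AES256, AES192, AES128)
SQLNET.CRYPTO_CHECKSUM_SERVER = accepted
SQLNET.CRYPTO_CHECKSUM_TYPES_SERVER = (SHA1)
```

From what I can tell, the newer managed driver behind the 2.0 connector no longer negotiates the older DES/MD5-era algorithms, so the overlap has to come from the AES and SHA families, but I'd welcome a correction from anyone who has actually gotten this working against 11g.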