Analytics and Integration for SAP Global Instance running on-premises with ADF

Published Jun 01 2020 10:24 AM 10.4K Views
Microsoft

How a retail customer unleashed the power of SAP data and improved data ingestion with ADF data extraction in Azure.  

 

A US-based multinational manufacturing and retail company had its legacy systems, aged infrastructure, and traditional data extraction tools running on-premises. With this current infrastructure, they were unable to perform real time analytics to derive meaningful business insights. They grappled with challenges around reducing operational cost and using quality data-driven insights in real-time to increase revenue.

The company decided to take advantage of Azure platform services like Azure Data Factory (ADF) to overcome these challenges while their SAP estate remained on-premises. They also wanted to leverage modern technologies like Azure Machine Learning to predict sales patterns and customer requirements to increase revenue.

Read this article to learn how Cognizant teamed up with Microsoft to help this multinational manufacturing and retail company unleash the power of SAP data and improve data ingestion with ADF data extraction in Azure.

 

This blog is co-authored by Madhan Kumar Munipandian, Solution Architect – Data Analytics and Rajib Mandal, Data Engineering - Lead, Cognizant; Prabhjot Kaur, Senior CSA, Microsoft.

 

Legacy deployment and its challenges

The customer’s legacy data warehouse was designed a decade ago, resulting in a very slow and inefficient data extraction process.

The following challenges were presented by the customer to the team:

  • The legacy data warehouse connected directly to the transactional database. This design degraded the performance of the transactional systems during the data extraction process, causing long-running transactions for users
  • To reduce the performance impact of these direct pulls, the legacy data warehouse pulled data only once a week or month
  • The legacy data warehouse extracted the full dataset in each batch; no real-time data extraction was available
  • Incremental pull capabilities weren’t configured. Performing a full extract each time took a long time and consumed a lot of storage space
  • Analytics and reporting tools were not intuitive for visualizing the data for business analysis

 

Customer requirements

Speed and accuracy are the basic requirements for many customers performing ETL (Extract, Transform, and Load) operations. This retail customer had the following requirements:

  • Derive accurate insights from sales and revenue data to unblock hindering factors and potentially increase revenue
  • Reduce operational costs in the supply chain by leveraging cost-effective infrastructure and retiring legacy on-premises infrastructure that is expensive to support
  • Improve data quality, reduce redundancy, and increase accuracy by pulling data from source-of-truth systems (redundant datasets were being extracted from various SAP systems)
  • Ability to build intuitive dashboards and visualizations for supply chain operational cost, customer demand, store inventory, etc.
  • Real-time and near real-time data processing
  • Leverage machine learning capabilities to predict sales patterns
  • Offer Data as a Service (DaaS) to its suppliers for product info and inventory

 

Proposed Solutions

To modernize the business, meet the requirements, and take advantage of analytics and integration platform services, the company decided to migrate their on-premises enterprise data warehouse workload to Azure.

One of the important requirements was to seamlessly integrate SAP (running on-premises) and non-SAP systems and extract data from various sources, which would then be analyzed by business decision makers to make key decisions. To fulfill this requirement, the company decided to embark on a digital transformation initiative to build a data lake in the cloud and leverage Azure data analytics services, resolving the challenges faced with the on-premises legacy data warehouse solution.

 

Here are the primary business drivers for the solution(s):

 

  • Leverage Azure PaaS offerings to reduce cost and save time spent on infrastructure provisioning and management
  • SAP data extraction must work seamlessly with Azure Databricks for data analytics and data standardization
  • Highly available, scalable, and proven platform to extract data from the global instances of the SAP systems: SAP ECC, Supply Chain (APO), and SAP BW (Business Warehouse)
  • Ingest data into Azure Data Lake Storage (ADLS) with high performance and throughput
  • Configurable capabilities to extract incremental data from the source systems
  • Must support dynamic configuration (a metadata-driven extraction and ingestion framework) to extract full and incremental data from SAP systems
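A metadata-driven extraction framework like the one called out above is typically a control table that drives a small number of generic pipelines. Here is a minimal sketch in Python of how such metadata could resolve to a full vs. incremental selection clause; the table names, watermark columns, and function are hypothetical illustrations, not the actual framework the team built.

```python
# Hedged sketch of a metadata-driven extraction framework: a control table
# lists each SAP source and whether it is pulled in full or incrementally.
# Table names, fields, and watermark values below are hypothetical examples.

extraction_metadata = [
    {"source_table": "VBAK", "mode": "incremental", "watermark_column": "AEDAT"},
    {"source_table": "T001", "mode": "full", "watermark_column": None},
]

def build_row_filter(entry, last_watermark=None):
    """Build the row-selection clause a generic copy pipeline could pass to
    the SAP Table connector as a server-side filter (rfcTableOptions-style)."""
    if entry["mode"] == "incremental" and last_watermark:
        return f"{entry['watermark_column']} GT '{last_watermark}'"
    return ""  # an empty filter means a full extract

print(build_row_filter(extraction_metadata[0], "20200101"))  # AEDAT GT '20200101'
print(build_row_filter(extraction_metadata[1]))              # empty: full extract
```

A generic pipeline would loop over `extraction_metadata`, apply the filter per table, and record the new watermark after each successful run.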

 

The solution must also satisfy the following technical requirements:

 

  • SAP extracted data sizes vary widely, from a couple of MBs to 100 GB
  • The SAP ECC and supply chain systems are highly transactional; don’t extract data directly from the database layer, both to comply with SAP support requirements and to avoid negative performance impact on the SAP systems
  • Data extraction from SAP systems must be done from the SAP application layer using the RFC protocol
  • The SAP source systems had various technical configurations (process connection timeout, memory allocation, maximum process runtime, etc.) that could impact the extraction process

 

A pilot phase was executed to compare different Integration Platform as a Service (iPaaS) solutions, evaluating SAP data extraction and design patterns for different data scenarios such as source data size and delta extraction. Based on the pilot results, performance benchmarks, and the primary business drivers as selection criteria, ADF was chosen as the SAP data extraction tool for full batches and mini batches. After extensive analysis, ADF emerged as the ETL engine that could fulfill the business requirements and overcome the current technical challenges.

 

The big question that still needed to be addressed was: “How do we efficiently leverage Azure Data Factory to extract data from the different SAP applications: SAP ECC (ERP Central Component), SAP APO (Advanced Planning and Optimization), and SAP BW (Business Warehouse)?”

 

Solution Deployment

ADF was deployed with the following connectors and configuration. However, to meet the business SLA, we had to make some configuration changes (documented below):

 

  1. SAP Table/Open Hub connector: The SAP Table connector was used to extract the data from the SAP systems. During performance testing, we noticed a huge drop in copy throughput (from MB/s to KB/s) when the source data size was above 2 GB. We worked with the Microsoft engineering team and SAP support team to identify the root cause: copy throughput was low because the classic RFC protocol was not designed to handle payloads larger than 2 GB. The solution was to change the protocol from RFC to BASXML to handle large (>2 GB) data copies (if you use the latest self-hosted IR, this protocol configuration is already in place). After the protocol change in the self-hosted IR, performance improved significantly:
  • Before the change: 2 hours 25 minutes; throughput approx. 90 KB/s
  • After the change: 2 minutes 45 seconds; throughput approx. 3 MB/s

With just this small change, the process was 48 times faster and throughput was 35 times higher.

  2. Database connector: SAP systems like ECC and supply chain are transactional in nature; directly extracting data from the transactional database may degrade the overall system. To overcome the performance bottleneck on the primary database, the extraction was performed on the secondary database of the highly available database configuration of the Oracle and SAP HANA systems. This also helped overcome the SAP technical configuration challenge mentioned in the technical requirements section above.
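For reference, the copy-activity source for the SAP Table connector is configured as JSON in ADF. A hedged sketch of the options typically involved, shown as a Python dict with hypothetical filter and partition values (the property names follow the ADF `SapTableSource` schema; the BASXML change itself happens on the self-hosted IR, not in this JSON):

```python
# Hedged sketch of an ADF copy-activity source for the SAP Table connector.
# Property names follow the ADF SapTableSource schema; all values are
# hypothetical examples, not the customer's actual configuration.
sap_table_source = {
    "type": "SapTableSource",
    "rfcTableOptions": "AEDAT GE '20200101'",      # server-side row filter
    "partitionOption": "PartitionOnCalendarDate",  # split large extracts
    "partitionColumnName": "AEDAT",
    "partitionLowerBound": "20200101",
    "partitionUpperBound": "20200531",
    "maxPartitionsNumber": 10,
}

print(sap_table_source["partitionOption"])  # PartitionOnCalendarDate
```

Partitioning lets ADF issue several parallel RFC reads instead of one large one, which is another lever (besides the protocol) for keeping throughput high on multi-GB tables.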

 

Additionally, the OData connector was evaluated; however, it was not deployed to production as it didn’t meet the customer’s requirements.

 

The following diagram illustrates the reference architecture of the deployed solution:

 

[Reference architecture diagram: Sachin-Ghorpade_0-1591031391440.png]

 

 

Here are the various components and their roles in the architecture:

  • RAW: File storage layer where data is stored as-is from the source systems
  • Enriched: File storage layer where data is cleansed, standardized, and checked for quality
  • Curated: Database storage layer where business rules and transformations are applied and data is stored for easy consumption by reports, dashboards, and visualizations
  • ADF: Data extraction and orchestration
  • Databricks: Compute for business transformation, integration, cleansing, and standardization
  • Azure Synapse Analytics: Enterprise integrated storage for low-latency queries
  • Azure SQL Database: Storage layer for operational data sources and data that can’t be integrated
  • Azure API App: Delivers data as a REST API to external third-party organizations
  • Power BI: Visualization, dashboards, and reporting
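The RAW → Enriched → Curated flow above is a standard multi-zone lake layout. A minimal sketch of the kind of path convention such a design might use in ADLS; the container, account, and folder names here are hypothetical, not the customer's actual layout:

```python
from datetime import date

# Hedged sketch of a zone/path convention for the RAW -> Enriched -> Curated
# data lake layers described above; account and folder names are examples.
def lake_path(zone: str, source_system: str, dataset: str, load_date: date) -> str:
    """Build an ADLS Gen2 path for one dataset in one lake zone, partitioned
    by load date so incremental loads land in separate folders."""
    assert zone in {"raw", "enriched", "curated"}, "unknown lake zone"
    return (f"abfss://{zone}@datalake.dfs.core.windows.net/"
            f"{source_system}/{dataset}/{load_date:%Y/%m/%d}")

print(lake_path("raw", "sap_ecc", "sales_orders", date(2020, 6, 1)))
# abfss://raw@datalake.dfs.core.windows.net/sap_ecc/sales_orders/2020/06/01
```

A convention like this lets ADF sink paths and Databricks read paths be generated from the same metadata rather than hard-coded per pipeline.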

 

 

Lessons Learned

ADF is a great solution to extract data from various source systems (SAP and non-SAP), perform transformation, and load to the target data sink.

We performed extensive analysis, comparison, and performance tests, and here are some recommendations (based on our learning) on when to use which ADF connectors for efficient data extraction from SAP applications:            

  1. SAP Table/Open Hub – You can use the RFC-based protocol if the data is less than 2 GB in size; for more than 2 GB of data, the BASXML protocol is recommended. Since you can’t define two protocols in the same configuration, use the BASXML protocol to serve both small and large data sizes.
  2. Native DB connector – A great option for extracting data from a secondary database. Ensure that SAP supports the database extraction method and that the database provider supports the read-from-secondary feature.

 

Here are some learnings/best practices to consider while using the various data factory connectors.

 

SAP Application: SAP ECC, SAP SCM/APO, SAP BW
Data Factory connector: SAP Table
Key features and key considerations:
  • Data extraction through the SAP application layer using the ADF SAP Table connector
  • Supports full and delta extraction
  • Maximum runtime configuration to be considered when defining the data extraction pattern
  • Supports low to high data sizes
  • Use the BASXML protocol

SAP Application: SAP ECC, SAP SCM/APO, SAP BW
Data Factory connector: Native DB connector
Key features and key considerations:
  • Data extraction directly from database objects
  • Primary database access is restricted in the SAP transactional applications (SAP ECC & APO), so leverage the secondary/high-availability database instance for extraction
  • Supports full and delta extraction
  • Replication frequency between the primary and secondary instance to be considered

SAP Application: SAP ECC
Data Factory connector: SAP OData
Key features and key considerations:
  • Recommended only for very low data volumes (a few thousand records)
  • Extracts entities exposed by SAP ECC OData services
  • OData services internally create objects in SAP ECC for each service

SAP Application: SAP BW
Data Factory connector: SAP Open Hub
Key features and key considerations:
  • Data needs to be moved from SAP ECC to SAP BW first (using BW Business Content or a custom extractor)
  • Data extraction through the SAP BW application layer using the Open Hub connector
  • SAP BW becomes an intermediate storage layer holding transformed and aggregated data
  • Supports full and delta extraction
  • An Open Hub Destination object needs to be built in SAP BW
  • Supports low to high data sizes
  • Use the BASXML protocol

SAP Application: SAP BW on HANA
Data Factory connector: SAP HANA
Key features and key considerations:
  • Applicable only when data is stored in SAP HANA
  • Supports full and delta extraction
  • Supports low to high data sizes
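The guidance above can be condensed into a simple lookup, useful when a metadata-driven framework needs to pick a connector per source. This is a hedged sketch encoding the recommendations from the table; the scenario keys are our own labels, not ADF terminology:

```python
# Hedged sketch condensing the connector-selection guidance above into a
# lookup. Keys are (SAP application, scenario); scenario labels are our own.
CONNECTOR_GUIDE = {
    ("SAP ECC", "application-layer extract"): "SAP Table",
    ("SAP SCM/APO", "application-layer extract"): "SAP Table",
    ("SAP ECC", "secondary-database extract"): "Native DB connector",
    ("SAP ECC", "small OData entity"): "SAP OData",
    ("SAP BW", "Open Hub destination"): "SAP Open Hub",
    ("SAP BW on HANA", "HANA-resident data"): "SAP HANA",
}

def suggest_connector(application: str, scenario: str) -> str:
    # Default to SAP Table, the connector this project selected for
    # full and mini-batch extraction.
    return CONNECTOR_GUIDE.get((application, scenario), "SAP Table")

print(suggest_connector("SAP BW", "Open Hub destination"))  # SAP Open Hub
```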

 

Conclusion

With the existing tools and services offered in Azure, you can rapidly deploy, build, and configure your solutions. These solutions were successfully deployed at the customer site, and many more customers are in the pipeline to have this solution implemented.

 

You can integrate data silos with Azure Data Factory. Easily construct ETL and ELT processes code-free or write your own code. Visually integrate data sources using more than 90 natively built and maintenance-free connectors at no added cost. Focus on your data—the serverless integration service does the rest. Read “Azure Data Factory - Hybrid data integration service that simplifies ETL at scale” for more information.

 

 

 

 

26nbsp%3B%20Data%20extraction%20through%20%3CSTRONG%3ESAP%20ECC%20Application%3C%2FSTRONG%3E%20layer%20using%20ADF%20SAP%20Table%20Connector%3C%2FP%3E%0A%3CP%3E%E2%80%A2%26nbsp%3B%26nbsp%3B%26nbsp%3B%26nbsp%3B%26nbsp%3B%26nbsp%3B%26nbsp%3B%20Support%20%3CSTRONG%3Efull%20and%20Delta%20extraction%3C%2FSTRONG%3E%3C%2FP%3E%0A%3CP%3E%E2%80%A2%26nbsp%3B%26nbsp%3B%26nbsp%3B%26nbsp%3B%26nbsp%3B%26nbsp%3B%26nbsp%3B%20Maximum%20%3CSTRONG%3Eruntime%20configuration%3C%2FSTRONG%3E%20to%20be%20considered%20when%20defining%20data%20extraction%20pattern%3C%2FP%3E%0A%3CP%3E%E2%80%A2%26nbsp%3B%26nbsp%3B%26nbsp%3B%26nbsp%3B%26nbsp%3B%26nbsp%3B%26nbsp%3B%20Support%20low%20to%20%3CSTRONG%3Ehigh%20data%20size%3C%2FSTRONG%3E%3C%2FP%3E%0A%3CP%3E%E2%80%A2%26nbsp%3B%26nbsp%3B%26nbsp%3B%26nbsp%3B%26nbsp%3B%26nbsp%3B%26nbsp%3B%20%3CSTRONG%3EUse%20BASXML%20protocol%3C%2FSTRONG%3E%3C%2FP%3E%0A%3C%2FTD%3E%0A%3C%2FTR%3E%0A%3CTR%3E%0A%3CTD%20width%3D%22171%22%3E%3CP%3E%3CSTRONG%3ESAP%20ECC%2C%20SAP%20SCM%2FAPO%2C%20SAP%20BW%3C%2FSTRONG%3E%3C%2FP%3E%0A%3C%2FTD%3E%0A%3CTD%20width%3D%22152%22%3E%3CP%3ENative%20DB%20connector%3C%2FP%3E%0A%3C%2FTD%3E%0A%3CTD%20width%3D%22294%22%3E%3CP%3E%E2%80%A2%26nbsp%3B%26nbsp%3B%26nbsp%3B%26nbsp%3B%26nbsp%3B%26nbsp%3B%26nbsp%3B%20Data%20extraction%20directly%20%3CSTRONG%3Efrom%20Database%20objects%3C%2FSTRONG%3E%3C%2FP%3E%0A%3CP%3E%E2%80%A2%26nbsp%3B%26nbsp%3B%26nbsp%3B%26nbsp%3B%26nbsp%3B%26nbsp%3B%26nbsp%3B%20Primary%20database%20access%20is%20restricted%20in%20SAP%20Transaction%20applications%20(SAP%20ECC%20%26amp%3B%20APO)%2C%20so%20leverage.%20%3CSTRONG%3ESecondary%20database%2FHigh%20Availability%20instance%20%3C%2FSTRONG%3Efor%20data%20extraction%3C%2FP%3E%0A%3CP%3E%E2%80%A2%26nbsp%3B%26nbsp%3B%26nbsp%3B%26nbsp%3B%26nbsp%3B%26nbsp%3B%26nbsp%3B%20Support%20%3CSTRONG%3Efull%20and%20Delta%20extraction%3C%2FSTRONG%3E%3C%2FP%3E%0A%3CP%3E%E2%80%A2%26nbsp%3B%26nbsp%3B%26nbsp%3B%26nbsp%3B%26nbsp%3B%26nbsp%3B%26nbsp%3B%20Frequency%20of%20replication%20between%20Primar
y%20to%20Secondary%20instance%20to%20be%20considered%3C%2FP%3E%0A%3C%2FTD%3E%0A%3C%2FTR%3E%0A%3CTR%3E%0A%3CTD%20width%3D%22171%22%3E%3CP%3E%3CSTRONG%3ESAP%20%E2%80%93%20ECC%3C%2FSTRONG%3E%3C%2FP%3E%0A%3C%2FTD%3E%0A%3CTD%20width%3D%22152%22%3E%3CP%3ESAP%20OData%3C%2FP%3E%0A%3C%2FTD%3E%0A%3CTD%20width%3D%22294%22%3E%3CP%3E%E2%80%A2%26nbsp%3B%26nbsp%3B%26nbsp%3B%26nbsp%3B%26nbsp%3B%26nbsp%3B%26nbsp%3B%20Recommended%20for%20only%20%3CSTRONG%3ESmall%20data%20volume%3C%2FSTRONG%3E%3C%2FP%3E%0A%3CP%3E%E2%80%A2%26nbsp%3B%26nbsp%3B%26nbsp%3B%26nbsp%3B%26nbsp%3B%26nbsp%3B%26nbsp%3B%20Entities%20exposed%20%3CSTRONG%3Eby%20SAP%20ECC%20OData%20services%3C%2FSTRONG%3E%3C%2FP%3E%0A%3CP%3E%E2%80%A2%26nbsp%3B%26nbsp%3B%26nbsp%3B%26nbsp%3B%26nbsp%3B%26nbsp%3B%26nbsp%3B%20OData%20Services%20internally%20creates%20objects%20in%20SAP%20ECC%20for%20each%20ODATA%20services.%3C%2FP%3E%0A%3CP%3E%E2%80%A2%26nbsp%3B%26nbsp%3B%26nbsp%3B%26nbsp%3B%26nbsp%3B%26nbsp%3B%26nbsp%3B%20Recommended%20for%20very%20low%20volume%20(%3CSTRONG%3Ea%20few%20thousands%20records)%3C%2FSTRONG%3E%3C%2FP%3E%0A%3C%2FTD%3E%0A%3C%2FTR%3E%0A%3CTR%3E%0A%3CTD%20width%3D%22171%22%3E%3CP%3E%3CSTRONG%3ESAP%20%E2%80%93%20BW%3C%2FSTRONG%3E%3C%2FP%3E%0A%3C%2FTD%3E%0A%3CTD%20width%3D%22152%22%3E%3CP%3ESAP%20Open%20Hub%3C%2FP%3E%0A%3C%2FTD%3E%0A%3CTD%20width%3D%22294%22%3E%3CP%3E%E2%80%A2%26nbsp%3B%26nbsp%3B%26nbsp%3B%26nbsp%3B%26nbsp%3B%26nbsp%3B%26nbsp%3B%20Data%20need%20to%20be%20%3CSTRONG%3Emoved%20from%20SAP%20ECC%20to%20SAP%20BW%20%3C%2FSTRONG%3Eusing%20(BW%20Business%20content%20or%20customer%20extractor)%3C%2FP%3E%0A%3CP%3E%E2%80%A2%26nbsp%3B%26nbsp%3B%26nbsp%3B%26nbsp%3B%26nbsp%3B%26nbsp%3B%26nbsp%3B%20Data%20extraction%20through%20%3CSTRONG%3ESAP%20BW%20Application%20layer%3C%2FSTRONG%3E%20using%20OpenHub%20Connector%3C%2FP%3E%0A%3CP%3E%E2%80%A2%26nbsp%3B%26nbsp%3B%26nbsp%3B%26nbsp%3B%26nbsp%3B%26nbsp%3B%26nbsp%3B%20SAP%20BW%20will%20become%20%3CSTRONG%3Eintermediate%20data%20storage%20layer%3C%2FSTRONG%3E%20with%20tr
ansformed%20and%20Aggregated%20data%3C%2FP%3E%0A%3CP%3E%E2%80%A2%26nbsp%3B%26nbsp%3B%26nbsp%3B%26nbsp%3B%26nbsp%3B%26nbsp%3B%26nbsp%3B%20Support%20full%20and%20Delta%20extraction%3C%2FP%3E%0A%3CP%3E%E2%80%A2%26nbsp%3B%26nbsp%3B%26nbsp%3B%26nbsp%3B%26nbsp%3B%26nbsp%3B%26nbsp%3B%20%3CSTRONG%3EOpen%20Hub%20Destination%20Object%3C%2FSTRONG%3E%20need%20to%20be%20built%20in%20%3CSTRONG%3ESAP%20BW%3C%2FSTRONG%3E%3C%2FP%3E%0A%3CP%3E%E2%80%A2%26nbsp%3B%26nbsp%3B%26nbsp%3B%26nbsp%3B%26nbsp%3B%26nbsp%3B%26nbsp%3B%20Support%20for%20low%20to%20%3CSTRONG%3Ehigh%20data%20size%3C%2FSTRONG%3E%3C%2FP%3E%0A%3CP%3E%E2%80%A2%26nbsp%3B%26nbsp%3B%26nbsp%3B%26nbsp%3B%26nbsp%3B%26nbsp%3B%26nbsp%3B%20%3CSTRONG%3EUse%20BASXML%20protocol%3C%2FSTRONG%3E%3C%2FP%3E%0A%3C%2FTD%3E%0A%3C%2FTR%3E%0A%3CTR%3E%0A%3CTD%20width%3D%22171%22%3E%3CP%3E%3CSTRONG%3ESAP%20%E2%80%93%20BW%20on%20HANA%3C%2FSTRONG%3E%3C%2FP%3E%0A%3C%2FTD%3E%0A%3CTD%20width%3D%22152%22%3E%3CP%3ESAP%20HANA%3C%2FP%3E%0A%3C%2FTD%3E%0A%3CTD%20width%3D%22294%22%3E%3CP%3E%E2%80%A2%26nbsp%3B%26nbsp%3B%26nbsp%3B%26nbsp%3B%26nbsp%3B%26nbsp%3B%26nbsp%3B%20Only%20when%20data%20stored%20in%20SAP%20HANA%3C%2FP%3E%0A%3CP%3E%E2%80%A2%26nbsp%3B%26nbsp%3B%26nbsp%3B%26nbsp%3B%26nbsp%3B%26nbsp%3B%26nbsp%3B%20Support%20%3CSTRONG%3Efull%20and%20Delta%20extraction%3C%2FSTRONG%3E%3C%2FP%3E%0A%3CP%3E%E2%80%A2%26nbsp%3B%26nbsp%3B%26nbsp%3B%26nbsp%3B%26nbsp%3B%26nbsp%3B%26nbsp%3B%20Support%20for%20low%20to%20%3CSTRONG%3Ehigh%20data%20size%3C%2FSTRONG%3E%3C%2FP%3E%0A%3C%2FTD%3E%0A%3C%2FTR%3E%0A%3C%2FTBODY%3E%0A%3C%2FTABLE%3E%0A%3CP%3E%26nbsp%3B%3C%2FP%3E%0A%3CH2%20id%3D%22toc-hId--1775110530%22%20id%3D%22toc-hId--1775110530%22%20id%3D%22toc-hId--1775110530%22%20id%3D%22toc-hId--1775110530%22%3EConclusion%3C%2FH2%3E%0A%3CP%3EWith%20the%20existing%20tools%20and%20services%20offered%20in%20Azure%2C%20you%20can%20rapidly%20deploy%2C%20build%2C%20and%20configure%20your%20solutions.%20These%20solutions%20were%20successfully%20deployed%20to%20the%20customer%20site%2
C%20and%20many%20more%20customers%20are%20in%20pipeline%20to%20get%20this%20solution%20implemented.%3C%2FP%3E%0A%3CP%3E%26nbsp%3B%3C%2FP%3E%0A%3CP%3EYou%20can%20integrate%20data%20silos%20with%20Azure%20Data%20Factory.%20Easily%20construct%20ETL%20and%20ELT%20processes%20code-free%20or%20write%20your%20own%20code.%20Visually%20integrate%20data%20sources%20using%20more%20than%2090%2B%20natively%20built%20and%20maintenance-free%20connectors%20at%20no%20added%20cost.%20Focus%20on%20your%20data%E2%80%94the%20serverless%20integration%20service%20does%20the%20rest.%20Read%20%E2%80%9C%3CA%20href%3D%22https%3A%2F%2Fazure.microsoft.com%2Fen-us%2Fservices%2Fdata-factory%2F%22%20target%3D%22_blank%22%20rel%3D%22noopener%20noreferrer%22%3EAzure%20Data%20Factory%20-%20Hybrid%20data%20integration%20service%20that%20simplifies%20ETL%20at%20scale%3C%2FA%3E%E2%80%9D%20for%20more%20information.%3C%2FP%3E%0A%3CP%3E%26nbsp%3B%3C%2FP%3E%0A%3CP%3E%26nbsp%3B%3C%2FP%3E%0A%3CP%3E%26nbsp%3B%3C%2FP%3E%0A%3CP%3E%26nbsp%3B%3C%2FP%3E%3C%2FLINGO-BODY%3E%3CLINGO-TEASER%20id%3D%22lingo-teaser-1431602%22%20slang%3D%22en-US%22%3E%3CP%3E%3CSPAN%3EA%20US-based%20multinational%26nbsp%3Bmanufacturing%20and%20retail%20company%20had%20its%20legacy%20systems%2C%20aged%20infrastructure%2C%20and%20traditional%20data%20extraction%20tools%20running%20on-premises.%20With%20this%20current%20infrastructure%2C%20they%20were%20unable%20to%20perform%20real%20time%20analytics%20to%20derive%20meaningful%20business%20insights.%3C%2FSPAN%3E%3C%2FP%3E%3C%2FLINGO-TEASER%3E%3CLINGO-LABS%20id%3D%22lingo-labs-1431602%22%20slang%3D%22en-US%22%3E%3CLINGO-LABEL%3EADF%3C%2FLINGO-LABEL%3E%3C%2FLINGO-LABS%3E%3CLINGO-SUB%20id%3D%22lingo-sub-1943575%22%20slang%3D%22en-US%22%3ERE%3A%20Analytics%20and%20Integration%20for%20SAP%20Global%20Instance%20running%20on-premises%20with%20ADF%3C%2FLINGO-SUB%3E%3CLINGO-BODY%20id%3D%22lingo-body-1943575%22%20slang%3D%22en-US%22%3EOK%2C%20thanks%20Sachin.%20We%20are%20using%20the%20latest%20SHIR%20
but%20still%20have%20the%20memory%20error%20and%20performance%20problem.%20We've%20raised%20a%20ticket%20with%20support.%3C%2FLINGO-BODY%3E
Version history
Last update:
‎Nov 30 2020 09:56 AM