Introduction
This article presents a simple solution for converting Mainframe and Midrange EBCDIC files to ASCII format. By leveraging the serverless Azure Logic Apps PaaS, the conversion runs without provisioning any Virtual Machines or IaaS resources, and the entire process is achieved without writing a single line of code. With this easy-to-implement approach, businesses can streamline their operational workflows and load the converted data into the Azure data tier from the same GUI.
The diagram below shows the Azure services used in this solution to convert an EBCDIC data file to an ASCII file.
EBCDIC to ASCII Conversion Workflow Diagram:
EBCDIC to ASCII conversion using Logic App - detailed steps:
These activities can be conducted using the Azure Portal or VS Code.
Overview
This solution enables easy EBCDIC to ASCII data conversion using the Azure Logic Apps PaaS service. Azure Logic Apps has a built-in connector, IBM Host File - Parse Host File Contents, which can be used to convert data in EBCDIC format to ASCII format. The Azure Logic App workflow below shows how the conversion is performed.
Logic App Workflow to convert EBCDIC to ASCII using Parse Host File contents connector
Logic App connectors present in the above diagram:
Each step above is described in detail in the sections below.
Step 1: Recurrence trigger
The Recurrence trigger is used to schedule execution of the Logic App workflow at a defined frequency (every second, minute, hour, day, etc.). Additional details about the Recurrence trigger can be found here: link.
For test purposes, we have set it to run every hour, as shown below:
Depending on requirements, this Logic App workflow can also be triggered by other events, e.g., the arrival of a file in Azure Blob Storage using the Logic App blob trigger action.
Step 2: Get blob content (V2)
The Get blob content (V2) action of the Azure Blob Storage connector retrieves the blob content as a binary stream of bytes into the Azure Logic App.
To get data from Azure Blob Storage, we first need to create a connection to the storage account.
Click on Get blob content (V2) --> Parameters --> Connection.
Click on Create Connection:
You can also use other authentication types such as Service Principal; we are using Azure AD Integrated for this test.
Provide the name of the file that contains the EBCDIC data in the Blob parameter, as shown below.
Input EBCDIC file content - sample:
Copybook Layout:
As shown above, this copybook contains Name as an alphanumeric field, Age as native binary, and Salary as packed decimal. The data shown below was created in EBCDIC format using this copybook.
Input sample data from blob storage in EBCDIC format viewed in Hex Editor:
The Mainframe EBCDIC file can be transferred to Azure Storage using the ADF FTP connector, Mainframe-to-Azure SFTP, or other methods.
Step 3: Parse Host File Contents
This is the most important step in the overall Logic App data conversion workflow. It parses the contents of a file in IBM EBCDIC format, i.e., it converts the EBCDIC data to ASCII output in JSON layout. The steps involved in configuring this connector are described below.
Development Steps:
The steps below are required to generate an HIDX file from a COBOL copybook. As part of the development activities, the copybook HIDX should be generated and saved to Azure Storage so that it can be referenced inside the Azure Logic App to interpret EBCDIC data files.
Create HIDX layout from COBOL copybook:
The Parse Host File Contents connector expects an HIDX layout and an EBCDIC data file as input and generates ASCII data as output. HIDX stands for Host Integration Designer XML; it is an XML representation of the COBOL copybook. A COBOL copybook can be converted to an HIDX layout using the HIS Designer for Logic Apps software. The steps involved in this conversion are:
1. Download the HIS Designer for Logic Apps software: link
2. Design a metadata artifact as shown here: link
3. Save the Host Integration Designer XML (HIDX) metadata artifact: link
Logic App workflow parameters for Parse Host File Contents:
Parameters

Name | Key | Description | Remark
HIDX Name | hidx | Name of an HIDX file containing metadata about the IBM file format. | The HIDX created during the development phase should be copied to the Logic App --> Artifacts --> Maps section as shown below.
Schema Name | schema | Name of a schema contained in the HIDX file. | DefaultSchema
Binary Contents | contents | Binary data representing the contents of an IBM file. | File Content

Employee.hidx file contents:
All parameters used during test execution are shown below:
Returns

Name | Path | Type | Description
Rows | rows | string | The result of parsing the binary file contents.
Output rows (only a few rows shown):
As shown below, every output row is in ASCII JSON format.
Step 4: Create CSV table
Step 3 creates the output rows in ASCII JSON format, held in an array as shown above. This step converts them into records in CSV layout.
Output of Create CSV Table connector:
As shown below, the rows are converted from the input JSON format to CSV records with newline characters (\r\n) at the end.
Step 5: Create blob (V2)
This step writes the output CSV file to blob storage.
The screenshot below shows the parameters that should be set so that the file is written to blob storage.
Output converted file in ASCII format stored in Azure Storage - sample:
Step 6: Execute stored procedure (V2)
This step executes a stored procedure that performs a bulk insert of data from the ASCII CSV file stored in blob storage into Azure SQL DB.
Steps to create the stored procedure with BULK INSERT statements:
a) Create Table to store ASCII converted data
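A minimal sketch of such a table is shown below; the table name dbo.Employee and the column data types are assumptions based on the copybook fields (Name, Age, Salary) described earlier, so adjust them to match your actual layout.

-- Hypothetical target table; the name and data types are assumptions
-- derived from the copybook fields (Name, Age, Salary).
CREATE TABLE dbo.Employee
(
    Name   VARCHAR(50)    NOT NULL,
    Age    INT            NOT NULL,
    Salary DECIMAL(11, 2) NOT NULL
);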
b) Create Master Key for Encryption
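A database master key is required before a database scoped credential can be created. A minimal sketch, with a placeholder password, is shown below.

-- The password is only a placeholder; replace it with a strong password of your own.
CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<YourStrongPassword1!>';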
c) Create Database scoped credential
Detailed instructions for creating a Shared Access Signature for Blob Storage can be found here: link
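A minimal sketch of the credential is shown below; the credential name BlobStorageCredential is an assumption, and the secret is a placeholder for the SAS token generated as per the link above (specified without the leading '?').

-- Hypothetical credential name; the secret is a placeholder SAS token.
CREATE DATABASE SCOPED CREDENTIAL BlobStorageCredential
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
     SECRET = '<SAS-token-without-leading-?>';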
d) Create External Data Source as Azure Blob Storage
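A minimal sketch is shown below; the data source name EbcdicBlobStorage and the storage account and container placeholders are assumptions and should point to the container that holds the converted ASCII CSV file.

-- Hypothetical data source; replace the storage account and container placeholders.
CREATE EXTERNAL DATA SOURCE EbcdicBlobStorage
WITH (
    TYPE = BLOB_STORAGE,
    LOCATION = 'https://<storageaccount>.blob.core.windows.net/<container>',
    CREDENTIAL = BlobStorageCredential
);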
e) Create Stored Procedure to Bulk Insert ASCII converted data from Azure blob storage to Azure SQL DB
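A minimal sketch of the stored procedure is shown below; the procedure name, file name, and BULK INSERT options are assumptions based on the CSV file produced in Step 5 (comma-separated fields, rows terminated by \r\n, and a header row emitted by the Create CSV table action).

-- Hypothetical procedure and file name; adjust the options to match the CSV file
-- written by the Create blob (V2) step.
CREATE PROCEDURE dbo.usp_LoadEmployeeFromCsv
AS
BEGIN
    BULK INSERT dbo.Employee
    FROM 'employee_ascii.csv'           -- blob (file) name inside the container
    WITH (
        DATA_SOURCE = 'EbcdicBlobStorage',
        FORMAT = 'CSV',
        FIRSTROW = 2,                   -- skip the CSV header row, if present
        FIELDTERMINATOR = ',',
        ROWTERMINATOR = '\r\n'          -- matches the Create CSV table output
    );
END;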
The screenshot below shows the parameters that should be set to execute the stored procedure on SQL DB, which performs the bulk insert. A connection to Azure SQL DB needs to be created by following the steps here: link
Execute Logic App to test EBCDIC to ASCII conversion results
Run the Logic App to perform the end-to-end process of reading the EBCDIC file from Azure Blob Storage, converting it to ASCII format, and loading the ASCII data into Azure SQL DB. Below is a successful execution of the Logic App.
Check Results:
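The load can be verified with a couple of simple queries; the table and column names below are the ones assumed in step a) above.

-- Record count of the loaded data.
SELECT COUNT(*) AS RecordCount FROM dbo.Employee;

-- A few rows of the converted data.
SELECT TOP (10) Name, Age, Salary FROM dbo.Employee;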
Record Count Output:
Rows from SQL DB Table Output:
Important points for Consideration
The above example illustrates the important steps needed to perform EBCDIC to ASCII conversion using Logic Apps; it is not a complete solution. An actual implementation may include error handling, splitting large files into smaller files for processing, parameterized file names and stored procedures, etc.
Limitations
Feedback and suggestions
If you have feedback or suggestions for improving this data migration asset, please send an email to the Azure Databases SQL Customer Success Engineering Team. Thank you for your support!