
Diagnosing and Resolving Oracle to Azure PostgreSQL Migration Failures with Ora2Pg

shaypatel
Nov 13, 2024

Introduction

Migrating from Oracle to Azure PostgreSQL is a complex endeavor involving multiple coordinated steps. Ora2Pg is a widely used tool that accelerates both schema conversion and data migration.

To install Ora2Pg, see the step-by-step installation guide for Windows and Linux here.

After a successful schema migration, data migration can become complicated and time-consuming, especially for tables containing millions or billions of rows. The process may also fail because of data errors related to platform, data type, or character set incompatibilities. Identifying these errors can be challenging, but by applying the following debugging methods and troubleshooting steps, you can resolve them efficiently during large and complex migrations.

Steps to debug the errors:

  1. Locate the Configuration File: Open your ora2pg.conf file. This file is typically found in the directory where you initialized your Ora2Pg project.
  2. Find the LOG_ON_ERROR parameter in the Ora2Pg configuration file; if it doesn't exist, add it. Set LOG_ON_ERROR to 1 to enable error logging. A brief example of the relevant ora2pg.conf settings appears after this list.
  3. Specify the Output Directory: Ensure that you have specified an output directory where the error logs will be saved. This can be done using the OUTPUT directive in the same configuration file.
  4. Proceed with the data import. When PostgreSQL reports an error on a COPY or INSERT statement, Ora2Pg logs the problematic statements and data to a file named TABLENAME_error.log in the output directory, and the migration continues. By default, this setting is disabled, and the import aborts on the first error.
  5. Continue with the next batch of data. This approach allows you to troubleshoot and manually reload the error log file after fixing the issues. 
  6. Review each column of the row to identify the problematic data. If you still can't determine the issue, update the log file by removing the SQL syntax and keeping the data in a tab-delimited format.
  7. Use the pgAdmin import utility to load the file and check if it generates the same error for the specific row.
  8. Capture the problematic data rows and attempt to insert them into a test environment with the same table structure or recreate the table.
  9. Run Ora2Pg to import the data and check whether the issue can be reproduced.
  10. If you have specific errors that you want to handle differently, you might need to adjust other parameters or review the logged errors to make necessary corrections before re-running the migration. 
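
For reference, here is a minimal ora2pg.conf excerpt covering steps 2 and 3; the directory path is illustrative, and OUTPUT_DIR is one way to set where the dump and error files land:

    # ora2pg.conf -- excerpt

    # Log failing COPY/INSERT statements and their data to
    # TABLENAME_error.log instead of aborting the import
    # (0, the default, aborts on the first error)
    LOG_ON_ERROR    1

    # Base directory where dump files and *_error.log files are written
    OUTPUT_DIR      /migration/ora2pg/data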

Use Case:

1. The data migration fails, and the debug logs show an error message.

2. Enable the LOG_ON_ERROR parameter in the configuration file, then re-run the data migration using Ora2Pg (a sample command follows); this generates a file containing the failing data in the output directory.
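
A minimal sketch of the re-run, assuming the configuration file is named ora2pg.conf in the current directory; -t COPY restricts the run to the data export:

    # Re-run the data-only migration (use -t INSERT for INSERT statements)
    ora2pg -c ora2pg.conf -t COPY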

3. Review the log file to identify the problematic data and apply the appropriate remedies. Search for errors and possible solutions here.

4. If no solution is available to fix the data errors, consider one of the following methods or tools to insert the data manually.

  • Use Azure Data Factory.
  • Fix the bad data manually in the tablename_error file and import it using pgAdmin (a psql-based sketch follows this list).
  • Export the table to a flat file using Oracle SQL Developer and import it using pgAdmin.
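
A minimal sketch of the pgAdmin route from the command line, assuming a hypothetical hr.employees table and a cleaned, tab-delimited employees_error.dat file (see step 6 above); psql's \copy is the scriptable equivalent of the pgAdmin import dialog:

    -- FORMAT text uses tab as its default delimiter, matching the
    -- tab-delimited data kept from the error log
    \copy hr.employees FROM 'employees_error.dat' WITH (FORMAT text)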

Let’s go through some scenarios to illustrate why data copying might fail with Ora2Pg but succeed with Azure Data Factory or manual insertion using pgAdmin:

Scenario 1: Data Type Handling

Issue: Ora2Pg encounters an error while migrating a table with a NUMBER data type that has precision and scale values unsupported by PostgreSQL.

  • Ora2Pg: Throws an error due to the unsupported precision and scale.
  • Azure Data Factory: Automatically converts the NUMBER data type to a compatible PostgreSQL data type, such as NUMERIC.
  • pgAdmin: Allows manual intervention to adjust the data type during insertion, avoiding errors caused by precision mismatches between NUMBER and NUMERIC.
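
A hedged illustration with hypothetical table and column names: Oracle allows NUMBER declarations, such as a scale larger than the precision, that PostgreSQL versions before 15 reject; one common remedy is to drop the precision/scale constraint on the target column:

    -- Oracle source column: amount NUMBER(3,5) is legal in Oracle,
    -- but numeric(3,5) is rejected by PostgreSQL versions before 15.
    ALTER TABLE sales ALTER COLUMN amount TYPE numeric;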

Scenario 2: Character Set Issues

Issue: The table contains special characters or emojis that are not supported by the configured character set in PostgreSQL.

  • Ora2Pg: Fails to migrate the data due to character set incompatibility.
  • Azure Data Factory: Handles the character set conversion more gracefully, ensuring the special characters are correctly encoded.
  • pgAdmin: Allows manual encoding adjustments during data insertion, bypassing the character set issue.
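
When encoding is the culprit, Ora2Pg's character set directives are the first place to look. A minimal ora2pg.conf excerpt with illustrative values, assuming a UTF-8 source and target:

    # Oracle client character set; should match the source database
    NLS_LANG        AMERICAN_AMERICA.AL32UTF8

    # PostgreSQL client encoding used during the import
    CLIENT_ENCODING UTF8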

Scenario 3: Platform-Specific Incompatibilities

Issue: The table includes Oracle-specific functions or features that are not directly compatible with PostgreSQL (for example, built-in functions, views, or UDFs).

  • Ora2Pg: Encounters errors due to the strict enforcement of platform-specific rules.
  • Azure Data Factory: Uses built-in transformations to convert Oracle-specific features to their PostgreSQL equivalents.
  • pgAdmin: Allows manual editing of the data or SQL queries to ensure compatibility with PostgreSQL.
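
As a small hedged example (the emp table and bonus column are hypothetical), Oracle-only functions such as NVL and SYSDATE have to be rewritten using their PostgreSQL equivalents:

    -- Oracle:      SELECT NVL(bonus, 0), SYSDATE FROM emp;
    -- PostgreSQL equivalent:
    SELECT COALESCE(bonus, 0), CURRENT_TIMESTAMP FROM emp;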

Scenario 4: Error Handling and Logging

Issue: A specific row contains invalid data that violates PostgreSQL constraints.

  • Ora2Pg: Stops the migration process and logs the error, requiring manual intervention to fix the data.
  • Azure Data Factory: Logs the error but continues with the migration, allowing you to address the problematic rows later.
  • pgAdmin: Provides immediate feedback during manual insertion, allowing you to correct the data on the spot.
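
A hedged sketch with hypothetical names: one practical pattern is to load the rejected rows into a constraint-free staging table, query for the values that violate the target's constraints (here, a NOT NULL on email), and repair them before the final insert:

    -- Find rows that would violate the NOT NULL constraint on email
    SELECT * FROM staging.employees WHERE email IS NULL;

    -- Repair them (placeholder shown) before inserting into the target
    UPDATE staging.employees SET email = 'unknown' WHERE email IS NULL;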

Scenario 5: Configuration Differences

Issue: Different default settings for data transfer and conversion.

  • Ora2Pg: Uses strict default settings that might not be optimal for all data types and structures.
  • Azure Data Factory: Offers more flexible configuration options that can be adjusted to handle various data scenarios.
  • pgAdmin: Allows for manual configuration and adjustments during the data insertion process.
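
For reference, a few ora2pg.conf directives whose defaults commonly need tuning for data transfer; the values shown are illustrative, not recommendations:

    # Rows fetched and committed per batch during data export
    DATA_LIMIT        10000

    # Convert NUMBER(p,s) to native types (integer, bigint, real,
    # double precision) instead of numeric(p,s)
    PG_NUMERIC_TYPE   1

    # Target type for NUMBER columns declared without precision
    DEFAULT_NUMERIC   bigint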

Feedback and suggestions

If you have feedback or suggestions for improving this data migration asset, please email the Azure Databases CSE Engineering Team.

Updated Nov 14, 2024
Version 2.0