Migration
Removing barriers to migrating databases to Azure with Striim's Unlimited Database Migration program
Alok Pareek, co-founder and Executive Vice President of Product and Engineering at Striim
Shireesh Thota, Corporate Vice President of Databases at Microsoft

Every modernization strategy starts with data. It's what enables advanced analytics and AI agents today, and prepares enterprises for what's to come in the future. But before services like Microsoft Fabric, Azure AI Foundry, or Copilot can create that value, the underlying data needs to move into Microsoft's cloud platforms. It's within that first step, database migration, where the real complexity often lies.

To simplify the process, Microsoft has expanded its investment in the Striim partnership. Striim continuously replicates data from existing databases into Azure in real time, enabling online migrations with zero downtime. Through this partnership, we have collaborated to enable modernization and migration into Azure at no additional cost to our customers. We've designed this Unlimited Database Migration program to accelerate adoption by making migrations easier to start, easier to scale, and easier to complete, all without disrupting business operations.

Since launch, this joint program has already driven significant growth in customer adoption, indicating the demand for faster, more seamless modernization. And with Microsoft's continued investment in this partnership, enterprises now have a proven, repeatable path to modernize their databases and prepare their data for the AI era. Watch or listen to our recent podcast episode (Apple Podcasts, Spotify, YouTube) to learn more.

Striim's Unlimited Migration Program

Striim's Unlimited Database Migration Program was designed to make modernization as straightforward as possible for Microsoft customers. Through this initiative, enterprises gain unlimited Striim licenses to migrate as many databases as they need at no additional cost. Highlights and benefits of the program include:

- Zero-downtime, zero-data-loss migrations. Supported sources include SQL Server, MongoDB, Oracle, MySQL, PostgreSQL, and Sybase. Supported targets include Azure Database for MySQL, Azure Database for PostgreSQL, Azure Cosmos DB, and Azure SQL.
- Support for mission-critical, heterogeneous workloads across SQL, Oracle, NoSQL, and open-source (OSS) databases.
- Faster AI adoption: once migrated, data is ready for analytics and AI.

Access is streamlined through Microsoft's Cloud Factory Accelerator team, which manages program enrollment and coordinates the distribution of licenses. Once onboarded, customers receive installation walkthroughs, an enablement kit, and direct support from Striim architects. Cutover support, hands-on labs, and escalation paths are all built in to help migrations run smoothly from start to finish. Enterprises can start migrations quickly, scale across business units, and keep projects moving without slowing down for procurement hurdles. Now, migrations can begin when the business is ready, not when budgets or contracts catch up.

How Striim Powers Online Migrations

Within Striim's database migrations, schema changes and metadata evolution are automatically detected and applied, preserving data accuracy and referential integrity. As the migration progresses, Striim automatically coordinates both the initial bulk load of historical data and the ongoing synchronization of live transactions. This ongoing synchronization keeps source and target systems in sync for as long as needed to actively test the target applications with real data before performing the cutover, thereby minimizing risk.
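This continuous synchronization relies on reading the source database's transaction log, as the next section describes. As a generic illustration only (not Striim-specific tooling; Striim's own documentation defines the exact prerequisites it enforces per source), the following checks confirm that two common source databases expose their logs for change capture. Host and user values are placeholders.

```bash
# Generic checks that a source database exposes its transaction log for change capture.
# Illustrative only and not Striim-specific; consult Striim's documentation for the
# exact source prerequisites it requires.

# MySQL: binary logging must be enabled (ROW format is what log-based CDC tools typically read).
mysql -h "$SOURCE_HOST" -u "$SOURCE_USER" -p \
  -e "SHOW VARIABLES LIKE 'log_bin'; SHOW VARIABLES LIKE 'binlog_format';"

# PostgreSQL: logical decoding of the WAL must be available, with free replication slots.
psql "host=$SOURCE_HOST user=$SOURCE_USER dbname=postgres" \
  -c "SHOW wal_level;" -c "SHOW max_replication_slots;"
```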
The foundation of Striim's approach is log-based Change Data Capture (CDC), which streams database changes in real time from source to target with sub-second latency. Rather than moving a static snapshot of a database, migrations continuously replicate every update as it happens, so both environments remain aligned with minimal impact on operational systems throughout the process. While the snapshot (initial load) is being applied to the target system, Striim captures all the changes that occur. Once the initial load is complete, Striim applies those changes using CDC, and from that point on, the source and target systems are in sync. This eliminates the need to shut down the source system during the initial load and enables customers to complete their migrations without any downtime of the source database.

Striim is also designed to work across hybrid and multi-cloud architectures. It can seamlessly move workloads from on-premises databases, SaaS applications, or other clouds into Microsoft databases. By maintaining exactly-once delivery and ensuring downstream systems stay in sync, Striim reduces risk and accelerates the path to modernization.

Striim is available in the Azure Marketplace, giving customers a native, supported way to integrate it directly into their Azure environment. This means migrations can be deployed quickly, governed centrally, and scaled as business needs evolve, all while still aligning with Azure's security and compliance standards.

From Migration to Value

With workloads fully landed in Azure, enterprises can immediately take advantage of the broader Microsoft data ecosystem. Fabric, Azure AI Foundry, and Copilot become available as extensions of the database foundation, allowing teams to analyze, visualize, and enrich data without delay. Enterprises can begin adopting Microsoft AI services with data that is current, trusted, and governed. Instead of treating migration as an isolated project, customers gain an integrated pathway to analytics and AI, creating value as soon as databases go live in Azure.

How Enterprises Are Using the Program Today

Across industries, we're already seeing how this program changes the way enterprises approach modernization.

Financial Services
Moving from Oracle to Azure SQL, one global bank used Striim to keep systems in sync throughout the migration. With transactions flowing in real time, they stood up a modern fraud detection pipeline on Azure that identifies risks as they happen.

Logistics
For a logistics provider, shifting package-tracking data from MongoDB to Azure Cosmos DB meant customers could monitor shipments in real time. Striim's continuous replication kept data consistent throughout the cutover, so the company didn't have to trade accuracy for speed.

Healthcare
A provider modernizing electronic medical records from Sybase to Azure SQL relied on Striim to ensure clinicians never lost access. With data now in Azure, they can meet compliance requirements while building analytics that improve patient care.

Technology
InfoCert, a leading provider of digital trust services specializing in secure digital identity solutions, opted to migrate its critical Legalmail Enterprise application from Oracle to Azure Database for PostgreSQL. Using Striim and Microsoft, they successfully migrated 2 TB of data across 12 databases and completed the project within a six-month timeframe, lowering licensing costs, enhancing scalability, and improving security.
What unites these stories is a common thread: once data is in Azure, it becomes part of a foundation that's ready for analytics and AI.

Accelerate Your Path to Azure

Now, instead of database migration being the bottleneck for modernization, it's the starting point for what comes next. With the Unlimited Database Migration Program, Microsoft and Striim have created a path that removes friction and clears the way for innovation. Most customers can simply reach out to their Microsoft account team or seller to begin the process. Your Microsoft representative will validate that your migration scenario is supported by Striim, and Striim will allocate the licenses, provide installation guidance, and deliver ongoing support. If you're unsure who your Microsoft contact is, you can connect directly with Striim, and we'll coordinate with Microsoft on your behalf. There's no lengthy procurement cycle or complex setup to navigate. With Microsoft and Striim jointly coordinating the program, enterprises can begin migrations as soon as they're ready, with confidence that support is in place from start to finish.

Simplify your migration and move forward with confidence. Talk to your Microsoft representative or book a call with the Striim team today to take advantage of the Unlimited Database Migration Program and start realizing the value of Azure sooner. Or, if you're attending Microsoft Ignite, visit Striim at booth 6244 to learn more, ask questions, and see how Striim and Microsoft can help accelerate your modernization journey together.
General Availability - DMS's PowerShell, Azure CLI, and Python SDK
We're excited to announce the General Availability (GA) of the DMS client tools: PowerShell, Azure CLI, Python SDK, and more. This milestone unlocks efficient, stable, and scalable automation options for database migration workflows, making it easier than ever to integrate DMS into your DevOps pipelines and enterprise migration strategies.

💡 Introduction

With the general availability of the DMS client tools, users can now use the stable releases of:

- PowerShell module 1.0.0 (https://www.powershellgallery.com/packages/Az.DataMigration/1.0.0)
- Azure CLI extension 1.0.0 (https://learn.microsoft.com/en-us/cli/azure/datamigration?view=azure-cli-latest)
- DMS V2 APIs (version 2025-06-30)
- SDKs for multiple languages (listed below)

SDK releases:

- .NET: https://www.nuget.org/packages/Azure.ResourceManager.DataMigration/1.0.0
- Java: https://central.sonatype.com/artifact/com.azure.resourcemanager/azure-resourcemanager-datamigration/1.1.0
- Go: https://pkg.go.dev/github.com/Azure/azure-sdk-for-go/sdk/resourcemanager/datamigration/armdatamigration/v2
- Python: azure-mgmt-datamigration · PyPI
- JavaScript: https://www.npmjs.com/package/@azure/arm-datamigration/v/3.0.0

🔧 What's New?

Three new commands have been introduced in the latest releases of the SDK, PowerShell module, and CLI extension, as outlined below.

New CLI commands:

- az datamigration sql-db retry - Retry failed SQL DB migrations.
- az datamigration sql-managed-instance delete - Delete an Azure SQL Managed Instance's Database Migration resource.
- az datamigration sql-vm delete - Delete an Azure SQL VM's Database Migration resource.

New PowerShell commands:

- Invoke-AzDataMigrationRetryToSqlDb - Retry failed SQL DB migrations.
- Remove-AzDataMigrationToSqlManagedInstance - Delete an Azure SQL Managed Instance's Database Migration resource.
- Remove-AzDataMigrationToSqlVM - Delete an Azure SQL VM's Database Migration resource.

🚀 Conclusion

With this GA/stable release, users can now:

- Configure and execute migrations with full control.
- Automate migrations: DevOps teams can embed migration steps into CI/CD pipelines.
- Integrate DMS into custom applications and orchestration tools.

These tools support all the DMS migration scenarios, from simple lift-and-shift operations to complex logical migrations, while ensuring stability and repeatability.

For more details, refer to:

- Documentation: Migrate databases at scale using Azure PowerShell / CLI
- PowerShell: Az.DataMigration Module
- Azure CLI: az datamigration
- Python SDK: azure-mgmt-datamigration · PyPI
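As a quick, hedged illustration of invoking one of the new CLI commands listed above from a shell: the extension name comes from the links above, but the parameter names used in the retry call are assumptions for illustration only, so check the --help output for the actual signature.

```bash
# Install (or upgrade) the Data Migration CLI extension and inspect the new retry command.
az extension add --name datamigration --upgrade
az datamigration sql-db retry --help

# Hypothetical retry of a failed SQL DB migration -- the parameter names below are
# assumptions; confirm them against the --help output above.
az datamigration sql-db retry \
  --resource-group myResourceGroup \
  --sqldb-instance-name my-target-sql-server \
  --target-db-name MyDatabase
```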
Making Azure DMS More Secure: Azure Portal Permission Enhancements

Migrating databases to Azure SQL Managed Instance or Azure SQL Virtual Machine is a critical step in modernizing enterprise infrastructure. With security and compliance top of mind, Azure Database Migration Service (DMS) has introduced key changes to its Azure portal experience, especially around permissions for blob container access.

Why the Change?

Previously, the DMS Azure portal experience relied on account key-based access to Azure Blob Storage for listing and accessing backup files on the migration configuration page. While functional, this approach is weaker from a security standpoint, especially for industries that prohibit the use of shared keys. Now, the DMS Azure portal uses the security context of the currently signed-in user to list and access backup files in the blob container, which is a better security approach.

Impact of the Change

When migrating to Azure SQL Managed Instance or Azure SQL Virtual Machine via the Azure portal, make sure the currently signed-in user has the Storage Blob Data Reader role on the blob container that contains the backup files. This permission is needed to list folders and files in the blob container during migration setup via the Azure portal only.

If the currently signed-in user lacks the Storage Blob Data Reader role on the blob container, they will encounter the following error:

Error: "Blob container selection error: Error listing the contents of the container: This request is not authorized to perform this operation using this permission."

Solution: Make sure the currently signed-in user has the "Storage Blob Data Reader" role on the blob container that contains the backup files.

For more information, refer to:

- Tutorial: Migrate SQL Server to Azure SQL Managed Instance - Azure Database Migration Service | Microsoft Learn
- Tutorial: Migrate SQL Server to SQL Server on Azure Virtual Machine Using Azure Data Studio - Azure Database Migration Service | Microsoft Learn
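If the role is missing, it can be granted at the container scope with the Azure CLI. This is a minimal sketch, assuming you have permission to create role assignments; the subscription, resource group, storage account, and container names are placeholders.

```bash
# Look up the object ID of the currently signed-in user.
signedInUser=$(az ad signed-in-user show --query id -o tsv)

# Grant "Storage Blob Data Reader" scoped to the container that holds the backup files.
az role assignment create \
  --assignee "$signedInUser" \
  --role "Storage Blob Data Reader" \
  --scope "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/<storage-account>/blobServices/default/containers/<container>"
```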
General Availability of Online Migration to Azure Database for PostgreSQL Flexible server

Online migration minimizes downtime by keeping your source database operational during the migration process, with continuous data synchronization until cutover.

How can I use Online migration?

Online migration is available in the Azure portal on the Migration setup screen, in the "Migration mode" drop-down selection box, once you initiate a migration from the Flexible Server page.

Figure 1: Screenshot from the Azure portal of the Migration setup page, where you can select the "Online" migration mode to migrate from any of the listed PostgreSQL sources to Azure Database for PostgreSQL - Flexible Server.

It can also be used from the Azure CLI by specifying the 'migration-mode' parameter as 'Online'.

How does Online migration work?

In an online database migration to Azure Database for PostgreSQL - Flexible Server, the application connecting to your Postgres source is not stopped while your database(s) are copied to the Flexible Server target. Instead, the initial copy of the database(s) is followed by replication to keep the Flexible Server in sync with the Postgres source. A cutover is performed when the Azure Database for PostgreSQL - Flexible Server is in complete sync with the Postgres source, resulting in minimal downtime.

Figure 2: Cutover in Online migration: Screenshot of the Migration status screen, where you can execute the cutover and complete the migration. A latency of zero indicates that the target Postgres Flexible Server is in sync with the source Postgres instance.

In the 'OnlineMigrationDemo' above, the latency is 0, which indicates that the Azure Database for PostgreSQL - Flexible Server is in sync with the source Postgres instance. Similarly, Online migration can be executed using the command-line interface (CLI) as well.

Figure 3: Online migration through the CLI: Screenshot of executing 'show' to get the migration status, which displays latency for the individual databases.

In the 'OnlineMigrationDemo' above, the latency is 0 for the 'customer-info' database being migrated, which indicates that the target is in sync with the source.

Whether you execute the migration from the portal or the CLI, once the latency parameter decreases to 0 or close to 0, you can go ahead and execute the cutover to complete the migration. Before you execute the cutover, it is essential that you:

- Stop all writes at the source Postgres instance
- Validate the data that has been migrated to the target Flexible Server
- Copy any custom server parameters and connection security details from the source to the target server

Once you execute the cutover, the migration shows successful completion. At that point, ensure that you update your application so that all connection strings point to the Flexible Server.

What are the differences between Offline and Online migration?

The following comparison gives an overview of the Offline and Online modes of migration:

- Ideal for small databases: Offline
- Simple to execute, with no manual intervention for cutover: Offline
- Migrate without logical replication restrictions: Offline
- Ideal for production databases: Online
- Minimal downtime to the application and a better user experience: Online

Depending on the nature of your workload, you can choose either Offline or Online migration.

Get started with Online migration

If you're looking to migrate to Flexible Server from any of the listed PostgreSQL sources, you'll find the Migration service overview quite useful.
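For teams scripting the process, here is a rough sketch of the CLI flow described above. The argument names and the properties file are approximations based on the steps shown in the figures; the exact signatures and request-body format are documented in the tutorials linked below.

```bash
# Rough sketch of an online migration driven from the Azure CLI. Argument names and the
# properties file layout are assumptions for illustration; consult the migration tutorials
# for the authoritative syntax of your CLI extension version.

# 1. Kick off the migration in Online mode against the target Flexible Server.
az postgres flexible-server migration create \
  --resource-group myResourceGroup \
  --name my-target-flexserver \
  --migration-name OnlineMigrationDemo \
  --properties migration-properties.json \
  --migration-mode online

# 2. Check status; once the reported latency drops to 0 (or close to it),
#    the target is in sync with the source.
az postgres flexible-server migration show \
  --resource-group myResourceGroup \
  --name my-target-flexserver \
  --migration-name OnlineMigrationDemo

# 3. After stopping writes on the source and validating the target, trigger the cutover
#    (exposed via the update command; the exact cutover option may vary by version).
az postgres flexible-server migration update \
  --resource-group myResourceGroup \
  --name my-target-flexserver \
  --migration-name OnlineMigrationDemo \
  --cutover
```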
If you only have a small downtime window and want to minimize the downtime of moving your production workloads from any compatible PostgreSQL source to Flexible Server, then Online migration could be a good fit for your situation.

Where to find more info about Online migration for Azure Database for PostgreSQL – Flexible Server?

Overview:
- How to migrate from your PostgreSQL source to Flexible Server

Tutorials:
- How to migrate online from your Azure VM/on-premises instance to Flexible Server
- How to migrate online from your Amazon RDS instance to Flexible Server
- How to migrate online from your Amazon Aurora instance to Flexible Server
- How to migrate online from your Google Cloud SQL for PostgreSQL instance to Flexible Server

We're always eager to hear from you, so please reach out to us at migrationpm@service.microsoft.com.

PostgreSQL Discovery and Assessment in Azure Migrate – Public Preview
We're excited to announce the public preview of PostgreSQL discovery and assessment in Azure Migrate! This feature helps organizations plan their migration journey to Azure by providing deep insights into on-premises PostgreSQL environments.

Why This Matters

Migrating PostgreSQL workloads to Azure can be challenging without visibility into your current environment. Azure Migrate now offers a unified experience to:

- Discover PostgreSQL instances across your infrastructure.
- Assess migration readiness and identify potential blockers.
- Get configuration-based SKU recommendations for Azure Database for PostgreSQL.
- Estimate Azure costs for your PostgreSQL workloads.

Key Capabilities

Comprehensive Discovery

- Inventory: Catalog PostgreSQL versions and related components.
- Discovery: Collect database parameters, configurations, table structures, and storage details.

Assessment Features

Readiness Rules: Determine if your PostgreSQL instances are:

- Ready: The instance can be migrated to Azure Database for PostgreSQL without any migration issues.
- Ready with Conditions: The instance has one or more migration issues. Review the identified issues and apply the recommended remediation steps before migration.
- Not Ready: The assessment did not identify an Azure Database for PostgreSQL configuration that meets the desired performance and configuration requirements. Review the recommendations provided to make the PostgreSQL instance ready for migration.
- Unknown: Azure Migrate can't assess readiness because discovery is still in progress or there are issues that need to be resolved. To fix discovery issues, check the Notifications blade for details. If the issue persists, contact Microsoft support.

Configuration-Based SKU Recommendations: Based on vCores and memory from the machine and storage from the PostgreSQL instance. Example: Memory Optimized – E20ds_v5.

Pricing Estimates: Approximate Azure cost for the recommended SKUs.

Database Parameter Collections: Deep insights into database parameters.

How to Get Started?

To begin using the PostgreSQL Discovery and Assessment feature in Azure Migrate, follow this four-step onboarding process:

1. Create an Azure Migrate project: Initiate your migration journey by setting up a project in the Azure portal.
2. Configure the Azure Migrate appliance: Install the Windows-based Azure Migrate appliance to obtain a software inventory of servers, PostgreSQL instances, and their attributes, and perform discovery.
3. Review discovered inventory: Examine the detailed attributes of the discovered PostgreSQL instances.
4. Create an assessment: Evaluate readiness and get detailed recommendations for migration to Azure Database for PostgreSQL.

Benefits of Using Azure Migrate for PostgreSQL

- Single Pane of Glass: Manage PostgreSQL migrations alongside servers, apps, and other databases.
- Simple Setup: Lightweight collector, no heavy appliances.
- Actionable Insights: Readiness rules and SKU recommendations tailored to your configuration.

For comprehensive, step-by-step instructions, please refer to the discovery and assessment tutorials in the documentation:

- Provide server credentials to discover software inventory, dependencies, web apps, and SQL Server instances and databases - Azure Migrate | Microsoft Learn
- Discovery methods in Azure Migrate - Azure Migrate | Microsoft Learn
- Assessing On-Premises PostgreSQL for Migration to Azure Flexible Server - Azure Migrate | Microsoft Learn

Join the Preview and Share Your Feedback!
The PostgreSQL Discovery and Assessment feature in Azure Migrate enables you to effortlessly discover, assess, and plan your PostgreSQL database migrations to Azure. Try the features out in public preview and fast-track your migration journey! If you have any queries, feedback, or suggestions, please let us know by leaving a comment below or by contacting us directly at askazurepostgresql@microsoft.com. We are eager to hear your feedback and support you on your journey to Azure.

Enhanced SQL Migration Tracking & Bringing SQL Server Arc Assessments to Azure Data Studio
In the ever-evolving landscape of data management, ensuring seamless and efficient migrations is crucial for businesses. Migration is a multi-step process and typically requires multiple tools to complete. To enhance the migration experience, we are introducing a new feature that improves migration tracking across tools. In this blog post, we delve into the benefits this feature offers to streamline the migration tracking process. This feature is available for Azure Data Studio users via the latest Azure SQL Migration extension v1.5.8.

The Azure SQL Migration extension now offers two new features in Azure Data Studio to help users in their migration journey:

- The ability to view Arc assessments and SKU recommendations in Azure Data Studio
- The ability to track the migration via the SQL Server instance (at no additional cost)

Viewing Arc assessments in Azure Data Studio for SQL Server enabled by Azure Arc

If you are planning to migrate SQL Server instances that are Arc-enabled, Azure Data Studio now helps you jump-start the migration by providing the ability to view the pre-computed Arc assessments. Azure Data Studio provides a link to the pre-computed assessments in the Arc experience in the Azure portal, which provides a migration readiness assessment, SKU recommendations, and pricing information (coming soon). Users can continue with the rest of their migration journey in the Arc experience.

To view the pre-computed assessments generated by Arc, users have to select Yes to "Is your SQL Server instance tracked in Azure?" and fill in the Azure resource details of the SQL Server instance enabled by Azure Arc. The pre-computed assessments and SKU recommendations are surfaced as a navigation link, as shown below.

Ability to streamline the migration tracking process

For SQL Server instances that are not Arc-enabled, this feature provides the ability to track the migration by creating an Azure resource at no additional cost. Once this resource has been created, users can select the migration resource created for their source in successive migrations and take advantage of the assessment and readiness benefits. This migration tracking ability is available in both Azure Data Studio and the Database Migration Service portal. The images below show the experience for this capability in the Azure DMS portal.

Release Announcement of SQL Server Migration Assistant (SSMA) v 10.1
Overview

SQL Server Migration Assistant (SSMA) for Access, DB2, MySQL, Oracle, and SAP ASE (formerly SAP Sybase ASE) allows users to convert a database schema to a Microsoft SQL Server schema, deploy the schema, and then migrate data to the target SQL Server (see below for supported versions).

What's new?

Enhanced monitoring experience for migrations using DMS [SSMA for Oracle]

Migrating Oracle workloads using Azure Database Migration Service is in preview; please refer to the documentation for more details. For Oracle workloads migrated using DMS in SSMA, we are bringing an enhanced monitoring experience through a tabular user interface where users can view a live list of migrations that are in progress or completed. Each entry represents a migration activity along with the start time of the migration, the DMS instance used, and the status. Users can view brief migration information for individual tables in real time, including table name, schema name, copy duration, number of rows copied, and status. For users requiring more granular monitoring information about their migration activity, SSMA provides a link to the Azure Database Migration Service portal page (View Comprehensive Report), where they can view details like data reads, writes, rows copied, and throughput along with copy duration.

Code Conversion Improvements [DB2, Oracle]

- Conversion enhancements for identity columns from Db2 z/OS to SQL Server 2019
- Improved conversion of the Db2 stored procedure WITH RETURN clause to Azure SQL Database
- Improved database object load for Db2
- Appropriate error handling for conversion of the identifier REPLACE(STRING, CHAR, CHAR) in Db2
- Detection of CHAR length in the Oracle VARCHAR2 data type

Downloads

- SSMA for Access
- SSMA for DB2
- SSMA for MySQL
- SSMA for Oracle
- SSMA for SAP ASE

Supported sources and target versions

Source: For the list of supported sources, please review the information on the Download Center for each of the SQL Server Migration Assistant downloads above.

Target: SQL Server 2016, SQL Server 2017, SQL Server 2019, Azure SQL Database, and Azure SQL Database managed instance.

Resources

SQL Server Migration Assistant documentation

Migration Data Factory pipelines between tenants
Hi everybody. I need your help, please. I'm trying to migrate several Data Factory pipelines between two different Fabric tenants. I'm using Azure DevOps to move all the workspaces, and I created the connections with the same names, but when I try to restore the Data Factory pipelines it returns an error saying the pipelines can't be created because it can't find the connections. I tried to update the connection ID, but I don't find it in the JSON file. How can I migrate these Data Factory pipelines and reconnect them to the new connections?

Public Preview announcement - Unified migration experience in Azure DMS
We are excited to announce that Azure Database Migration Service (DMS) now supports seamless migration of your MySQL on-premises or virtual machine (VM) workloads to Azure Database for MySQL - Flexible Server. This new feature, now available in public preview, allows you to use physical backup files of the MySQL server for migration. By restoring your physical data files directly to your target Flexible Server, you can migrate multi-terabyte workloads quickly and effortlessly with minimal downtime, ensuring a smooth and efficient transition to Azure Database for MySQL - Flexible Server and enabling you to take full advantage of the platform's capabilities.

To migrate your workloads using the Physical Online Data Migration option in Azure DMS, you need to take backups of the workload on the source server using the Percona XtraBackup utility. After taking a backup, upload the backup files to Azure Blob Storage. DMS reads the uploaded backup files from Azure Blob Storage and applies them to the target Flexible Server for rapid movement of large workloads to MySQL Flexible Server.

To get started, go to your DMS project and choose "[Preview] Physical Online Data Migration" for migrating your workloads from on-premises or VMs.

Limitations:

- You must create and configure the target Flexible Server prior to migrating your physical backup files.
- Migration of encrypted backups isn't supported.
- Migration cancellation during the import operation is not supported.

For more information about using physical online migration with Azure DMS, please follow our detailed step-by-step instructions in our documentation: https://aka.ms/dmsPhysicalImportOnlineMigration

If you have any feedback or questions about the information provided above, please leave a comment below or email us at AskAzureDBforMySQL@service.microsoft.com. Thank you!
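For orientation, here is a rough sketch of the backup-and-upload flow described above, assuming Percona XtraBackup and AzCopy are installed. Host, credential, storage account, and SAS token values are placeholders, and the step-by-step documentation linked above is authoritative for version-specific options.

```bash
# 1. Take a physical backup of the source MySQL server with Percona XtraBackup.
xtrabackup --backup \
  --host=127.0.0.1 --user=backup_user --password='***' \
  --target-dir=/backups/full

# 2. Prepare the backup so the data files are transactionally consistent
#    (check the DMS documentation for whether this step is expected before upload).
xtrabackup --prepare --target-dir=/backups/full

# 3. Upload the backup folder to the Azure Blob Storage container that DMS will read from.
azcopy copy "/backups/full" \
  "https://<storage-account>.blob.core.windows.net/<container>?<sas-token>" \
  --recursive
```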