SQL Migration
Client Connection Summary for SQL Server enabled by Azure Arc
Overview

We are excited to announce our new SQL Server client connection summary feature for SQL Server enabled by Azure Arc. This addition provides visibility into the clients connecting to your SQL Server instances, helping to simplify monitoring, security, and management.

Today, customers who want to determine which clients are connecting to their SQL Server instances need to poll the sys.dm_exec_sessions dynamic management view (DMV) themselves to identify those connections and track activity. With this feature in SQL Server enabled by Azure Arc, the agent captures this information for you, providing visibility into which clients are connecting to your SQL Server instances. The system automatically polls the DMV at hourly intervals and aggregates the connections, enabling more informed decision-making and streamlined operations.

How does it work?

With this new feature:

Hourly Data Collection: The monitoring agent collects client connection data at hourly intervals from the sys.dm_exec_sessions dynamic management view. (A sample query of the kind the agent automates is shown at the end of this post.)

30-Day Data Storage: The data is stored in telemetry for 30 days, allowing for extended analysis.

Data Summarization: The unique client connections are summarized over a time period of your choice within the past 30 days.

How to Use the Client Connection Summary

Using this feature is simple. Follow these steps:

Step 1: Enable the Client Connection Feature. Navigate to the Azure portal and select your SQL Server enabled by Azure Arc instance. Under the "Monitoring" tab, open the SQL Server Connections view and ensure that the feature is enabled. If it is not, click the button to enable data collection. This starts the task that collects data at hourly intervals.

Step 2: Access the Client Connection Summary. Once enabled, you will soon start seeing an overview of client connections.

Step 3: Analyze the Data. Filter for specific program name values of interest, change the Time Range as required, and examine the data to understand connection patterns.

Call to Action

We encourage you to use the Client Connection Summary feature of SQL Server enabled by Azure Arc. Here are a few steps to get started:

Explore the Feature: Visit the Azure portal and enable client connection tracking on your SQL Server instances. Take some time to familiarize yourself with the view and the insights it provides.

Join the Community: Share your experiences and insights with the Azure Arc community. Join our forums and participate in discussions to learn how others are leveraging this feature to optimize their server environments.

Stay Informed: Keep up with the latest updates and best practices for SQL Server enabled by Azure Arc by following our blog and subscribing to our newsletter. We regularly share tips, tutorials, and success stories from our users.

We are confident that the Client Connection Summary will be a valuable addition to your SQL Server management toolkit. We look forward to your feedback and success stories as you begin to explore this new feature.
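If you are curious what the underlying data looks like before enabling the feature, the query below is a minimal, illustrative example of the kind of sys.dm_exec_sessions polling the agent automates. It is not the exact query the agent runs; it simply summarizes the client sessions connected at the moment you run it.

-- Illustrative only: point-in-time summary of client sessions by program, host, and login.
-- Requires VIEW SERVER STATE permission.
SELECT
    program_name,
    host_name,
    login_name,
    COUNT(*)        AS session_count,
    MAX(login_time) AS most_recent_login
FROM sys.dm_exec_sessions
WHERE is_user_process = 1            -- exclude system sessions
GROUP BY program_name, host_name, login_name
ORDER BY session_count DESC;

Because a manual query like this only captures the sessions open at that instant, the hourly collection and 30-day retention provided by the feature give a much more complete picture of connection patterns.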
Migration Dashboard for SQL Server enabled by Azure Arc

Overview

We are excited to announce the new Migration Dashboard feature in the Azure portal. SQL Server enabled by Azure Arc automatically produces an assessment for migration to Azure, and this assessment plays a vital role in the success of your cloud migration and modernization journey. The dashboard provides visibility into assessment coverage and migration readiness across all your SQL Server instances enabled by Azure Arc.

How to view the Migration Dashboard

Using this feature is simple. Navigate to the Azure portal and search for Azure Arc, then click through into the Azure Arc center. Under Data Services, click on SQL Server Instances. You will be presented with the list of SQL Server instances enabled by Azure Arc. Switch to the Migration Dashboard tab in the right pane to get an at-scale summary of your instances, their migration assessment coverage, and the migration readiness of your SQL Server enabled by Azure Arc.

Understand the Migration Dashboard

The dashboard:

Provides an overview of the SQL Server instances and databases discovered.
Provides an overview of the SQL Server instances which have assessments generated.
Provides a migration readiness summary for each of the Azure SQL offerings.
Provides rich filtering capabilities, enabling you to tailor the view to your needs.

The first section of the migration dashboard presents an overview of all SQL Server instances and databases accessible to you. You can also see the distribution of the instances by version and edition (a quick query for spot-checking these values on an individual instance appears at the end of this post). The next section provides an overview of the migration assessment and migration readiness of the instances of SQL Server enabled by Azure Arc. You can see how many instances have assessments available, and the migration readiness for each Azure SQL offering is shown separately.

Call to Action

We encourage you to use the Migration Dashboard to:

Track migration assessment coverage across your SQL Server instances enabled by Azure Arc.
Understand your migration readiness for the Azure SQL offerings.

Here are a few steps to get started:

Explore the Feature: Visit the Azure portal and take some time to familiarize yourself with the dashboard and the insights it provides.

Join the Community: Share your experiences and insights with the Azure Arc community. Join our forums and participate in discussions to learn how others are leveraging this feature to optimize their server environments.

Stay Informed: Keep up with the latest updates and best practices for SQL Server enabled by Azure Arc by following our blog and subscribing to our newsletter. We regularly share tips, tutorials, and success stories from our users.

We are confident that the Migration Dashboard will be a valuable addition to your SQL Server migration journey to Azure. We look forward to your feedback and success stories as you begin to explore this new feature.
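If you want to spot-check the version and edition breakdown shown on the dashboard for an individual instance, the following illustrative query returns the same server properties. The dashboard populates these values automatically; this query is purely for manual verification.

-- Illustrative only: report the instance's version, product level, and edition.
SELECT
    SERVERPROPERTY('ServerName')     AS instance_name,
    SERVERPROPERTY('ProductVersion') AS product_version,
    SERVERPROPERTY('ProductLevel')   AS product_level,
    SERVERPROPERTY('Edition')        AS edition;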
SQL Migration assessments now include Next-gen General Purpose Azure SQL MI

Announcing the inclusion of the Next-gen General Purpose service tier in migration assessments for SQL Server migration to Azure SQL Managed Instance. All database migration tooling now includes this capability. This service tier is more customizable, allowing the configuration to be tailored more closely to your resource requirements.
A quick way to generate Azure SQL migration assessments

Overview

Azure Migrate does not currently allow users to import SQL Server inventory and discovery information from sources other than the Azure Migrate appliance. As a result, users who cannot deploy the Azure Migrate appliance to discover their SQL Server estate, whether for technical, business, or compliance reasons, or who have already discovered their SQL Server estate using tools like RVTools and want a quick, approximate way to check readiness and total cost savings of moving to Azure, cannot leverage the rich SQL assessment and business case capabilities that Azure Migrate offers.

To help customers quickly assess migration readiness and the cost savings of migrating to Azure, we are introducing the "Import SQL Server Discovery data as CSV" feature in Azure Migrate, which allows users to import their SQL Server inventory as a CSV file and use that discovery data to generate assessments and business cases. This feature is launched in private preview. In this blog post, we will do a deep dive on how to use this feature to generate quick assessments for your off-Azure SQL Server estate.

Here are the high-level steps for importing your off-Azure SQL Server inventory as a CSV:

1) Step-1: Identify the SQL Server instances and extract the discovery data into a CSV file.
2) Step-2: Validate and upload the inventory data to Azure Migrate.
3) Step-3: Generate and view assessments.

Pre-requisites:

The Import SQL Discovery data as a CSV feature is built on top of the new Azure Migrate experience, which is also in private preview. To use it, you must onboard to the private preview of the new Azure Migrate experience; here is the form to sign up: Azure Migrate Private Preview Sign-Up Survey. We will share the onboarding instructions via the contact details you provide.
Make sure you have the credentials to connect to the SQL Server instances of interest.
Make sure you have the requisite roles and privileges on each SQL Server instance to access and extract the required metadata from dynamic management views. To generate a login with the required privileges, please run the scripts listed here: Configure custom login for SQL Server Discovery. (An illustrative sketch of such a login follows this list.)
An Azure subscription. Please note you don't need to deploy an Azure Migrate appliance to use the SQL import CSV feature.
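For orientation, here is a minimal sketch of the kind of read-only login such a discovery script sets up. The login name and password below are placeholders, and the exact grants made by the official "Configure custom login for SQL Server Discovery" scripts may differ, so treat those scripts as the source of truth. The two server-level grants shown cover the main catalog views and DMVs queried by the Step-1 extraction script below.

-- Illustrative placeholder only; use the official scripts linked above for production.
USE master;
GO
CREATE LOGIN [azmigrate_discovery] WITH PASSWORD = N'<replace-with-a-strong-password>';
GO
GRANT VIEW SERVER STATE   TO [azmigrate_discovery];  -- e.g. sys.dm_os_sys_info
GRANT VIEW ANY DEFINITION TO [azmigrate_discovery];  -- e.g. sys.master_files metadata
GO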
Step-1: Identify and extract the discovery data of the SQL Server instances you intend to migrate to Azure SQL into a CSV file

Identify the list of all off-Azure SQL Server instances you intend to migrate to Azure SQL, whether from on-premises infrastructure or from other cloud platforms like Amazon Web Services (AWS) or Google Cloud Platform (GCP). To extract the discovery data, run the SQL script below against each SQL Server instance you would like to import into Azure Migrate via CSV and store the output in a CSV file.

/* PLEASE ENTER THE FULLY QUALIFIED DOMAIN NAME OF THE HOST IN THE LINE BELOW BEFORE RUNNING THE QUERY */
Declare @FullyQualifiedDomainName varchar(200) = '';

With ServerCores AS (
    SELECT cpu_count AS [Logical CPU Count],
           hyperthread_ratio AS [Hyperthread Ratio],
           cpu_count/hyperthread_ratio AS [Physical CPU Count],
           sqlserver_start_time
    FROM sys.dm_os_sys_info
),
Config AS (
    SELECT value_in_use AS [Max Server Memory(MB) In Use]
    FROM sys.configurations
    WHERE name = 'max server memory (MB)'
),
UserCount AS (
    SELECT count(database_id) AS [Number User Databases]
    FROM sys.databases
    WHERE database_id > 4
),
DbSize AS (
    SELECT SUM(CAST(Size AS bigint))*8/1024 AS [Max server memory (in MB)]
    FROM sys.master_files
    WHERE database_id > 4
),
TempDbSize AS (
    SELECT SUM(CAST(Size AS bigint))*8/1024 AS [TempDb Size (In MB)]
    FROM sys.master_files
    WHERE db_name(database_id) = 'tempdb'
),
Logins AS (
    SELECT count(name) AS [NumOfLogins] FROM sys.syslogins
),
Properties AS (
    SELECT db.name AS database_name,
           db.database_id AS database_id,
           cast(db.compatibility_level as int) AS database_compatibility_level,
           db.collation_name AS database_collation,
           db.is_broker_enabled AS is_service_broker_enabled,
           '0' AS [Number of Nics],
           SERVERPROPERTY ('servername') AS instance_name,
           SERVERPROPERTY ('productversion') AS instance_version,
           SERVERPROPERTY ('edition') AS instance_edition,
           SERVERPROPERTY ('productlevel') AS instance_level,
           SERVERPROPERTY ('IsHadrEnabled') AS is_hadr_enabled,
           SERVERPROPERTY ('collation') AS server_collation,
           CASE WHEN SERVERPROPERTY ('IsClustered') = 1 THEN 'TRUE' ELSE 'FALSE' END AS is_failover_clustered,
           COALESCE(SERVERPROPERTY ('IsIntegratedSecurityOnly'), 0) AS is_integrated_security_only,
           COALESCE(SERVERPROPERTY ('IsPolyBaseInstalled'), 0) AS is_polybase_installed,
           CASE WHEN 'A' = 'a' THEN 0 ELSE 1 END AS is_server_case_sensitive,
           /* '' + @host_platform + '' as host_platform, */
           db.state AS database_state,
           /* db.is_auto_close_on as is_auto_close_on,'+ CASE when @check_cdc = 1 THEN 'db.is_cdc_enabled' else '0' END + ' as is_cdc_enabled, */
           CASE WHEN db.user_access = 0 THEN 1 ELSE 0 END AS is_multi_user,
           CASE WHEN db.recovery_model = 0 THEN 0 ELSE 1 END AS is_full_recovery_model,
           db.is_read_only AS is_read_only,
           CASE WHEN source_database_id IS NULL THEN 0 ELSE 1 END AS is_snapshot,
           CASE WHEN is_published = 1 OR is_subscribed = 1 OR is_merge_published = 1 OR is_distributor = 1 THEN 1 ELSE 0 END AS is_replication_enabled,
           db.is_encrypted
    FROM sys.databases db
    WHERE db.name not in ('master', 'tempdb', 'model', 'msdb') and is_distributor <> 1
),
EachDbSize AS (
    SELECT database_name = DB_NAME(database_id),
           CAST(SUM(CAST(size AS bigint)) * 8. / 1024 AS DECIMAL(12,2)) AS [Database size (in MB)],
           convert(varchar(12), CAST((sum(case when physical_name like '%ldf' then 0 else size end)*8.0)/1024.0 AS numeric(8,2))) AS [Data file size (in MB)],
           convert(varchar(12), CAST((sum(case when physical_name like '%ldf' then size else 0 end)*8.0)/1024.0 AS numeric(8,2))) AS [Log file size (in MB)]
    FROM sys.master_files
    WHERE database_id > 4
    GROUP BY database_id
)
SELECT @FullyQualifiedDomainName AS [Fully qualified domain name],
       instance_name AS [Instance Name],
       p.database_name AS [Database Name],
       database_state AS [Database Status],
       instance_edition AS [SQL Edition],
       instance_version AS [SQL Version],
       [Logical CPU Count],
       [Hyperthread Ratio],
       [Physical CPU Count],
       [TempDb size (in MB)],
       is_failover_clustered AS [Is FCI Enabled],
       '' AS [Failover cluster name],
       '' AS [Failover cluster instance role],
       '' AS [Failover cluster shared disk count],
       '' AS [Is failover cluster multi subnet],
       [Max server memory (in MB)],
       [Number of Nics],
       instance_level AS [Service pack],
       '' AS [ESU status],
       'FALSE' AS [IS AG enabled],
       instance_level AS [Build version],
       'Online' AS [Service status],
       'FALSE' AS [Is database highly available],
       'FALSE' AS [Is part of distributed availability group],
       [Database size (in MB)],
       [Data file size (in MB)],
       [Log file size (in MB)],
       database_compatibility_level AS [Compatibility level],
       '' AS [Availability group ID],
       '' AS [Availability group name],
       '' AS [Availability group type],
       '' AS [Availability replica ID],
       '' AS [Availability replica name],
       '' AS [Commit mode],
       '' AS [Replica type],
       '' AS [Replica state],
       'FALSE' AS [Is AG multi subnet],
       '' AS [AG replica sync status],
       '' AS [AG replica seed mode],
       '' AS [AG replica read mode]
FROM ServerCores, Config, UserCount, DbSize, TempDbSize, Logins, Properties p
LEFT JOIN EachDbSize s ON p.database_name = s.database_name
ORDER BY p.database_name

Collate the output of all SQL Server instances of interest into a single file.

Step-2: Validate and upload the CSV file

To validate and upload the generated CSV file with SQL Server discovery data:

1) Go to the Azure Migrate experience link you received from Microsoft after you signed up for the private preview of this feature. Click on "Create Project" to create a new project (if not done already).
2) Fill in the required fields in the create project wizard.
3) Click on the created project and select the "Using custom Import" option under the "Start Discovery" button.
4) In the Discover page, select "File Type" as "SQL Server inventory (CSV)".
5) (Optional) Click on the "Download" button under Step 1: Download template to view and understand the template expected by Azure Migrate. You can skip this step, as the CSV file generated by the SQL script shared above conforms to the template.
6) Browse and select the inventory file in Step 2 and click on Validate to validate the uploaded SQL inventory data.
7) If the uploaded file doesn't contain any warnings or error messages, validation completes successfully with 0 errors and warnings. If there are any errors or warnings, Azure Migrate shows a count of warning and error messages. To view and resolve them, click on Download ErrorDetails.csv, which lists the error/warning messages for each database along with the remedial action. Please note that warnings are non-blocking: you can still go ahead and complete the import operation. Errors, however, are blocking: you will not be allowed to import until the error messages are resolved.
8) Once the validation phase is completed, click on the Import button to complete the import operation.

Step-3: Create assessments using imported SQL discovery data

SQL Server instances and databases uploaded via the CSV import route are treated on par with SQL Server instances discovered by the Azure Migrate appliance. You can view SQL Server instance properties such as version details, number of databases, support status, and all other properties uploaded via the CSV file. The quality of the property details and of the assessments generated is directly dependent on the quality of the data uploaded via CSV.

To create SQL assessments for the uploaded SQL Server instances, go to the Azure Migrate project's resource page and click on the Create Assessments button. Give the assessment a name and click on Add workloads to add the imported and appliance-discovered workloads. On the following page, select the SQL Server instances that have the value "Import" in the Discovery Source column. You can also create assessments on a combination of imported and appliance-discovered workloads. Click the Add button to add the workloads, then click on Review + Create Assessments to create the assessment.

For imported SQL Server instances, the quality of assessments is only as good as the data uploaded via CSV, so please ensure that the uploaded data accurately depicts the metadata of the SQL Server instance. As with regular SQL assessments, you will have all the assessment configuration options, such as sizing criteria (performance-based vs. as on-premises), migration strategy (optimizing cost vs. migration to PaaS), comfort factor, etc. By default, Azure Migrate will try to create a performance-based assessment; if performance metrics are not available, it defaults to an as-on-premises assessment. The assessment created on the imported SQL Servers will help you know which SQL Server instances are ready for migration to Azure SQL and the recommended Azure SQL target type.

Linking and Rediscovery

Linking

If the host where the imported SQL instance is currently hosted is discovered via import of VM servers or via the appliance, the SQL Server instance will be automatically associated with that host and rolled up under that VM server in the discovery view. The SQL Server instance is uniquely identified based on the FQDN and the SQL Server instance name; this combination of attributes is also used for conflict resolution.

Rediscovery and precedence rules

1) If the SQL Server instance is originally imported via CSV and is re-uploaded via CSV, the upload operation will rewrite all the properties.
2) If the SQL Server instance is first imported via CSV and then discovered via the Azure Migrate appliance, and the user has provided credentials for deep discovery via the appliance, the appliance-discovered data will override the data imported via CSV.
3) If the SQL Server instance is first imported via CSV and then discovered via the Azure Migrate appliance, and the user has not provided SQL Server credentials for deep discovery, the appliance-discovered data will only update the additional data that the appliance discovered.
4) If the SQL Server instance is first discovered via the Azure Migrate appliance and is deep discovered (correct SQL Server credentials are provided), and the SQL Server is then imported via CSV, the data imported via CSV will be discarded.
5) If the SQL Server instance is first discovered via the Azure Migrate appliance and is only inventoried (correct SQL Server credentials are not provided), and the SQL Server is later imported via CSV, the upload operation will completely overwrite the appliance-discovered details.

How to onboard to the Private Preview

To onboard to the preview of SQL inventory import as CSV, please fill in the sign-up form for the private preview.

What's Next

1) In the next release, we will ship the capability to create a business case for SQL Servers imported via CSV.
2) We will also add support for Availability Groups and Failover Cluster Instances for SQL Server instances imported via CSV.