Backup
SQL Backup using Azure File Share
SQL Server limits the maximum backup size to 1 TB when using a page blob, and to approximately 200 GB when using block blobs (50,000 blocks * 4 MB MAXTRANSFERSIZE). To get past these limits, for example when a backup is larger than 12 TiB, you can back up to an Azure file share instead.

Standard file shares can span up to 100 TiB, but this feature is not enabled by default; if you need a file share larger than 5 TiB, you must enable the large file share feature on your storage account. Premium file shares can span up to 100 TiB without any special setting, but premium file shares are provisioned rather than pay-as-you-go like standard file shares, so provisioning a file share much larger than what you need will increase the total cost of storage. In locally redundant and zone-redundant storage accounts, Azure file shares can span up to 100 TiB; in geo-redundant and geo-zone-redundant storage accounts, they can span only up to 5 TiB.

Prerequisites

A storage account
The Azure PowerShell storage module: https://www.powershellgallery.com/packages/Az.Storage/3.7.0

Steps

1. Create a storage account with a premium file share, or enable the large file share feature on a standard account.

2. Register for the SMB Multichannel preview with the following commands:

Connect-AzAccount
# Set your active subscription to the one you want to register for the preview.
# Replace the <subscription-id> placeholder with your subscription id.
$context = Get-AzSubscription -SubscriptionId <subscription-id>
Set-AzContext $context
Register-AzProviderFeature -FeatureName AllowSMBMultichannel -ProviderNamespace Microsoft.Storage
Register-AzResourceProvider -ProviderNamespace Microsoft.Storage

You can verify that the feature registration is complete:

Get-AzProviderFeature -FeatureName AllowSMBMultichannel -ProviderNamespace Microsoft.Storage

3.
Enable SMB Multichannel. Once you have created a FileStorage account, follow the instructions to update the SMB Multichannel settings for your storage account.

Note: If the SMB Multichannel option is not visible under File share settings, or you get a "failed to update setting" error while updating the configuration, make sure that your subscription is registered and that your account is in one of the supported regions, with a supported account type and replication.

4. Create a file share. You can set the maximum capacity up to 100 TiB.

5. Connect to the file share from Windows, using either Active Directory or the storage account key. Connecting to a share with the storage account key is appropriate only for admin access; mounting the Azure file share with the Active Directory identity of the user is preferred.

$connectTestResult = Test-NetConnection -ComputerName testsmbfileshare.file.core.windows.net -Port 445
if ($connectTestResult.TcpTestSucceeded) {
    # Save the password so the drive will persist on reboot
    cmd.exe /C "cmdkey /add:`"testsmbfileshare.file.core.windows.net`" /user:`"localhost\testsmbfileshare`" /pass:`"SxbRsNuwc1*******/8lk1TyUkqC+2+************==`""
    # Mount the drive
    New-PSDrive -Name Z -PSProvider FileSystem -Root "\\testsmbfileshare.file.core.windows.net\sqlbackup" -Persist
} else {
    Write-Error -Message "Unable to reach the Azure storage account via port 445. Check to make sure your organization or ISP is not blocking port 445, or use Azure P2S VPN, Azure S2S VPN, or Express Route to tunnel SMB traffic over a different port."
}

Copy this script and run it in PowerShell to map the share as a network drive locally. The script checks whether the storage account is accessible over TCP port 445, the port SMB uses; if port 445 is reachable, your Azure file share is mounted persistently.
Note: The script only works on Windows Server 2012 and above. Once the script has executed, the Z drive appears as a network drive in This PC.

6. On the SQL Server, first enable xp_cmdshell so backups can be configured to this file share. Enable the advanced SQL configuration:

EXECUTE sp_configure 'show advanced options', 1;
GO
-- Update the currently configured value for advanced options.
RECONFIGURE;
GO
-- Enable the feature.
EXECUTE sp_configure 'xp_cmdshell', 1;
GO
-- Update the currently configured value for this feature.
RECONFIGURE;
GO

We must also mount the Z drive within SQL Server so it is available for backups, mapping it with the script below (get the storage account name, username, and access key from step 5):

xp_cmdshell 'net use Z: \\testsmbfileshare.file.core.windows.net\sqlbackup /u:localhost\testsmbfileshare SxbRsNuwc1*******/8lk1TyUkqC+2+************=='

7. Back up the database to the file share using the command below:

BACKUP DATABASE [AdventureWorks2019] TO DISK = 'Z:\AdventureWorks2019.bak' WITH STATS = 5

References (optional)
SMB file shares in Azure Files | Microsoft Docs
Create an Azure file share - Azure Files | Microsoft Docs
https://technet.microsoft.com/en-us/library/dn435916(v=sql.120).aspx#limitations

Tips and Tricks in using mysqldump and mysql restore to Azure Database for MySQL
While importing data into Azure Database for MySQL, errors may occur. This post walks through common issues you may face and how to resolve them.

Access denied; you need (at least one of) the SUPER privilege(s) for this operation

Error:

ERROR 1227 (42000) at line 101: Access denied; you need (at least one of) the SUPER privilege(s) for this operation
Operation failed with exitcode 1

Issue: Importing a dump file that contains definers results in the above error. Only super users can create definers in other schemas, and because Azure Database for MySQL is a managed PaaS solution, the SUPER privilege is restricted.

Solution: Either replace the definers with the name of the admin user that is running the import process, or remove them. The admin user can grant privileges to create or execute procedures by running GRANT commands like the following:

GRANT CREATE ROUTINE ON mydb.* TO 'someuser'@'somehost';
GRANT EXECUTE ON PROCEDURE mydb.myproc TO 'someuser'@'somehost';

Example:

Before:
DELIMITER ;;
/*!50003 CREATE*/ /*!50017 DEFINER=`root`@`127.0.0.1`*/ /*!50003 ……
DELIMITER ;

After:
DELIMITER ;;
/*!50003 CREATE*/ /*!50017 DEFINER=`AdminUserName`@`ServerName`*/ /*!50003 ……
DELIMITER ;

Importing triggers while binary logging is enabled

Error:

ERROR 1419 (HY000) at line 101: You do not have the SUPER privilege and binary logging is enabled (you *might* want to use the less safe log_bin_trust_function_creators variable)
Operation failed with exitcode 1

Issue: Importing a dump file that contains triggers results in the above error if binary logging is enabled.

Solution: Enable the "log_bin_trust_function_creators" parameter from the Azure portal server parameters blade.

Storage engine not supported

Error:

ERROR 1030 (HY000) at line 114: Got error 1 from storage engine
Operation failed with exitcode 1

Issue: You will see the above error when you use a storage engine other than InnoDB or MEMORY.
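Fixes like the definer replacement above (and the storage engine swap covered next) are plain textual edits to the dump file, so they can be scripted rather than made by hand. The sketch below uses sed; the file names, the sample dump line, and the admin account are illustrative stand-ins, not part of the original post.

```shell
# Create a tiny stand-in dump file; in practice this would come from mysqldump.
printf '/*!50017 DEFINER=`root`@`127.0.0.1`*/ CREATE TABLE t (id int) ENGINE=MyISAM;\n' > dump.sql

# Rewrite any DEFINER clause to the Azure admin user, and swap MyISAM for InnoDB.
sed -E -e 's/DEFINER=`[^`]*`@`[^`]*`/DEFINER=`AdminUserName`@`ServerName`/g' \
       -e 's/ENGINE=MyISAM/ENGINE=InnoDB/g' dump.sql > dump_fixed.sql

# Show the rewritten dump line.
cat dump_fixed.sql
```

Run a pass like this over the dump before importing it, so the import does not trip over SUPER-privilege or engine errors one object at a time.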
Read more on supported engine types here: Storage engine support

Solution: Before importing, make sure that you are using a supported engine type; InnoDB and MEMORY are the only supported engine types in Azure Database for MySQL. If you dumped the data from a different engine type, edit the file and replace the storage engine, for example exchanging ENGINE=MyISAM with ENGINE=InnoDB.

Note: You can always dump the schema first using the mysqldump --no-data option, and then dump the data using the mysqldump --no-create-info option.

Example:

Before:
CREATE TABLE `MyTable` (
  `ID` bigint(20) NOT NULL AUTO_INCREMENT,
  `DeviceID` varchar(50) NOT NULL,
  PRIMARY KEY (`ID`)
) ENGINE=MyISAM DEFAULT CHARSET=utf8;

After:
CREATE TABLE `MyTable` (
  `ID` bigint(20) NOT NULL AUTO_INCREMENT,
  `DeviceID` varchar(50) NOT NULL,
  PRIMARY KEY (`ID`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8;

InnoDB storage engine row format

Error:

ERROR 1031 (HY000) at line 114: Table storage engine for 'mytable' doesn't have this option
Operation failed with exitcode 1

Issue: In Azure Database for MySQL, the DYNAMIC, COMPACT, and REDUNDANT row formats are supported; the COMPRESSED row format is supported only under certain conditions.

Solution: The compressed format is supported on the General Purpose and Memory Optimized tiers. You need to enable the "innodb_file_per_table" parameter from the Azure portal server parameters blade, and key_block_size must be 8 or greater (the default is 8).

Please visit the Performance considerations guide for best practices while migrating to Azure Database for MySQL. Thank you!

Free Microsoft webinars on Operations Management Suite
My MVP colleague Marc Kean and I, with the help of Microsoft, have released a series of four webinars covering the basics of Azure OMS.

Part 1: https://info.microsoft.com/AP-OMS-WBNR-FY17-11Nov-15-HowwillyoumanageyourHybridITenvironmentLogAnalytics-269151.html

In part 1 of this new webinar series, learn how to separate the signal from the noise by gaining deeper insights into applications and workloads with Log Analytics: collect, correlate, and act on log data from multiple systems in a single view, map and understand application dependencies, and address incidents quickly.

Part 2: https://info.microsoft.com/AP-OMS-WBNR-FY17-11Nov-22-HowwillyoumanageyourhybridITEnvironment-272590_02OnDemandRegistration-ForminHero.html

In part 2 of this new webinar series, learn how to take advantage of the economies of cloud technology to get enterprise-class backup and DR with OMS Availability: restore applications and services quickly with disaster recovery as a service (DRaaS), orchestrate replication, create customised recovery plans, and easily test failover. Extend on-premises data protection tools with purpose-built enterprise backup services, and remove the cost and complexity of tape backup.
Part 3: https://info.microsoft.com/AP-OMS-WBNR-FY17-11Nov-29-HowwillyoumanageyourHybridITenvironment-272784_02OnDemandRegistration-ForminHero.html

In part 3 of this new webinar series, learn how to prevent, detect, and respond to threats with the increased visibility that OMS Security and Compliance provides: visualise outbound malicious IP traffic and malicious threat types, understand the security posture of your entire environment regardless of platform, and capture all security log and event data required for a compliance audit. Save time and resources when producing a security audit with a searchable and exportable log and event data set.

Part 4: https://info.microsoft.com/AP-HYBD-WBNR-FY17-12Dec-06-HowwillyoumanageyourHybridITenvironmentPart4Automation275348_02OnDemandRegistration-ForminHero.html

In part 4 of this new webinar series, learn how to use Automation to take control of your environment by remediating operational issues and get time back to focus on work that adds business value: consistently deploy, monitor, and automatically update your IT resources, with change management and patching solutions to enable environmental compliance. Use familiar PowerShell skills to automate complex and repetitive tasks, and extend cloud automation capabilities to your on-premises datacentre without human intervention.
Export and import MySQL users and privileges to Azure Database for MySQL
In Azure Database for MySQL, the "mysql.user" table is read-only. To migrate your mysql.user table, you can script out its contents before migration as follows.

Generate CREATE USER statements

Before generating the scripts, check the secure_file_priv variable on your source database, which limits the directories you can load files from or write files to:

mysql> SHOW VARIABLES LIKE "secure_file_priv";
+------------------+-----------------------+
| Variable_name    | Value                 |
+------------------+-----------------------+
| secure_file_priv | /var/lib/mysql-files/ |
+------------------+-----------------------+

After checking the path, you can generate the CREATE USER statements from your database.

Option 1:

mysql> SELECT CONCAT('create user ''',user,'''@''',host,''' identified by ''','YourPassword''',';') FROM mysql.user WHERE user NOT IN ('root','mysql.sys','mysql.session') INTO OUTFILE '/var/lib/mysql-files/create_users.sql';

The generated file contents will look like:

create user 'myuser1'@'%' identified by 'YourPassword';
create user 'myuser2'@'%' identified by 'YourPassword';
create user 'myuser3'@'%' identified by 'YourPassword';
create user 'myuser4'@'%' identified by 'YourPassword';
create user 'replication_user'@'%' identified by 'YourPassword';

Modify the passwords above based on your requirements, then run the generated CREATE USER statements on Azure Database for MySQL.
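The same statement generation can also be done client-side when writing to a server-side file is not possible. A minimal sketch with awk; the input file stands in for the output of a query such as mysql -Ns -e "SELECT user, host FROM mysql.user", and the account list and password are illustrative placeholders.

```shell
# Stand-in for real mysql.user rows (tab-separated user and host).
printf 'myuser1\t%%\nroot\tlocalhost\nreplication_user\t%%\n' > users.tsv

# Emit CREATE USER statements, skipping the built-in accounts,
# mirroring the CONCAT query above. The password is a placeholder.
awk -F'\t' '$1 != "root" && $1 != "mysql.sys" && $1 != "mysql.session" {
  printf "create user '\''%s'\''@'\''%s'\'' identified by '\''YourPassword'\'';\n", $1, $2
}' users.tsv > create_users.sql

cat create_users.sql
```

The resulting create_users.sql can then be run against the Azure server just like the INTO OUTFILE version.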
Option 2:

If you do not know the passwords in your source database and want the passwords changed on the client side, you can generate a user creation script with a temporary, expired password:

SELECT CONCAT('create user ''',user,'''@''',host,''' identified by ''','Temp@123''', ' password expire',';') FROM mysql.user WHERE user NOT IN ('root','mysql.sys','mysql.session') INTO OUTFILE '/var/lib/mysql-files/create_users.sql';

The generated file contents will look like:

create user 'myuser1'@'%' identified by 'Temp@123' password expire;
create user 'myuser2'@'%' identified by 'Temp@123' password expire;
create user 'myuser3'@'%' identified by 'Temp@123' password expire;
create user 'myuser4'@'%' identified by 'Temp@123' password expire;
create user 'replication_user'@'%' identified by 'Temp@123' password expire;

Run the generated CREATE USER statements on Azure Database for MySQL. After the user logs in with the temporary password and tries to run any query, they will be notified that the password must be changed first; the user can then reset it with a "SET PASSWORD = 'newpassword'" statement:

mysql> select 1;
ERROR 1820 (HY000): You must reset your password using ALTER USER statement before executing this statement.
mysql> set password = 'Mysql@1234';
Query OK, 0 rows affected (0.01 sec)

mysql> select 1;
+---+
| 1 |
+---+
| 1 |
+---+
1 row in set (0.00 sec)

Export users' privileges

To export the user privileges, run the query below:

SELECT CONCAT('SHOW GRANTS FOR ''',user,'''@''',host,''';') FROM mysql.user WHERE user NOT IN ('root','mysql.sys','mysql.session') INTO OUTFILE '/var/lib/mysql-files/user_grants.sql';

The generated file will look like:

SHOW GRANTS FOR 'myuser1'@'%';
SHOW GRANTS FOR 'myuser2'@'%';
SHOW GRANTS FOR 'myuser3'@'%';
SHOW GRANTS FOR 'myuser4'@'%';
SHOW GRANTS FOR 'myuser5'@'%';
SHOW GRANTS FOR 'replication_user'@'%';

You can source the generated file to produce the required GRANT commands. Before sourcing it, connect to your local database with the -Ns options to remove the headers from the output:

mysql -Ns -u root -p

mysql> source /var/lib/mysql-files/user_grants.sql
GRANT USAGE ON *.* TO 'myuser1'@'%'
GRANT USAGE ON *.* TO 'myuser2'@'%'
GRANT USAGE ON *.* TO 'myuser3'@'%'
GRANT USAGE ON *.* TO 'myuser4'@'%'
GRANT USAGE ON *.* TO 'myuser5'@'%'
GRANT REPLICATION SLAVE, REPLICATION CLIENT ON *.* TO 'replication_user'@'%'

Connect to your Azure Database for MySQL server and run the generated commands:

mysql -h ServerName.mysql.database.azure.com -u username@servername -p

mysql> GRANT USAGE ON *.* TO 'myuser1'@'%';
Query OK, 0 rows affected (0.03 sec)
mysql> GRANT USAGE ON *.* TO 'myuser2'@'%';
Query OK, 0 rows affected (0.02 sec)
mysql> GRANT USAGE ON *.* TO 'myuser3'@'%';
Query OK, 0 rows affected (0.02 sec)
mysql> GRANT USAGE ON *.* TO 'myuser4'@'%';
Query OK, 0 rows affected (0.04 sec)
mysql> GRANT USAGE ON *.* TO 'myuser5'@'%';
Query OK, 0 rows affected (0.02 sec)
mysql> GRANT REPLICATION SLAVE, REPLICATION CLIENT
ON *.* TO 'replication_user'@'%';
Query OK, 0 rows affected (0.02 sec)

Please note that SELECT ... INTO OUTFILE is not supported in Azure Database for MySQL. Instead, run the queries below from Workbench and copy the output.

Create users:

SELECT CONCAT('create user ''',user,'''@''',host,''' identified by ''','YourPassword''',';') FROM mysql.user WHERE user NOT IN ('root','mysql.sys','mysql.session');

Create privileges:

SELECT CONCAT('SHOW GRANTS FOR ''',user,'''@''',host,''';') FROM mysql.user WHERE user NOT IN ('root','mysql.sys','mysql.session');

And run the generated commands:

SHOW GRANTS FOR 'myuser1'@'%';
SHOW GRANTS FOR 'myuser2'@'%';
SHOW GRANTS FOR 'myuser3'@'%';
SHOW GRANTS FOR 'myuser4'@'%';
SHOW GRANTS FOR 'myuser5'@'%';
SHOW GRANTS FOR 'replication_user'@'%';

Please visit the Performance considerations guide for best practices while migrating to Azure Database for MySQL. Thank you!

OneDrive Webinar: Q1 Roadmap Roundup
As people adjust to the new normal, whether that's working in a hybrid environment or going back to an office that's undergone some changes since pre-pandemic times, we're continuing to innovate new features and services in OneDrive that make storing, syncing, and sharing files easier, no matter where people are working. The past quarter has been loaded with OneDrive releases for every role, from better navigation for end users to reporting enhancements for IT admins. Join us for a free webinar to learn more about these exciting features.

Date: April 12, 2022
Time: 9 AM PT
Registration link: What's new in OneDrive: Q1 roadmap roundup

Thanks,
Ankita

Throttle copying our data
Good day community,

We are about to switch over so that our users use OneDrive as their home drives, and we also need to copy a lot of large files to Azure Blob. We thought we had everything ready during our testing, but our network became saturated by the bandwidth consumed. We opened a ticket with Microsoft Premier Support and our assigned PFE engineer to see how we could throttle down the copying speed to Azure Blob and to OneDrive. Microsoft said there was no way to throttle, and our Google searches came up with nothing.

We are looking for software that is "reliable" but can throttle copying data to OneDrive and Azure Blob. (To give you an idea, the Azure Blob copy is consuming 80% of our internet connection from just one server.) While budget can be an issue, we do not care at this point as long as we can get something reliable. We do not want something that relies on blackout hours, as our operation cannot be interrupted at night. I am hoping we are not the first ones to run into this situation.

Notice: my question was posted on the other communities, and I still have no dependable answer!

Thanks

Backing up Azure file storage to Azure Backup
Hi Team,

We are already using Azure File Storage to copy files directly to Azure Storage (file shares); a virtual drive has been mapped for this storage. Since Azure File Storage does not support recovery/restore points (i.e. restoring a file that is a week old) or point-in-time copies, we need to use Azure Backup, which has these point-in-time/recovery-point capabilities and would allow us to restore files, say, two days old, in case a file has been deleted or overwritten.

Per our research, backing up Azure File Storage directly to Azure Backup is not possible (yet). Backup of Azure Storage is only possible for blob storage, not Azure File Storage. See the links below.

https://docs.microsoft.com/en-us/azure/backup/backup-introduction-to-azure-backup
https://github.com/levibotelho/azure-blob-backup
https://docs.microsoft.com/en-us/azure/backup/backup-try-azure-backup-in-10-mins

Is there any possibility to achieve this with current capabilities, or if not, has this been added to the timeline for future deliveries? Any feedback is really appreciated!

Thank you,
Manoj Karunarathne
manojviduranga@hotmail.com

Introducing Backup Pre-Checks for Backup of Azure VMs
Azure Backup takes a cloud-first approach to backing up critical enterprise data in Azure. Enterprise production environments in Azure are becoming increasingly dynamic and are characterized by frequent VM configuration changes (such as network- or platform-related updates) that can adversely impact backup. Today, we are taking a step to enable customers to monitor the impact of configuration changes and take steps to ensure the continuity of successful backup operations.

We are excited to announce the preview of Backup Pre-Checks for Azure Virtual Machines. Backup Pre-Checks, as the name suggests, check your VMs' configuration for issues that can adversely affect backups, aggregate this information so you can view it directly from the Recovery Services vault dashboard, and provide recommendations for corrective measures to ensure successful file-consistent or application-consistent backups, wherever applicable. All this without any infrastructure and at no additional cost.

Read more on the Azure blog.

Preventing and recovering from accidental deletion of an Azure Database for MySQL flexible server
Accidental deletion of critical Azure resources, such as Azure Database for MySQL flexible servers, can disrupt operations. To help avoid such accidental deletions, you can use a couple of options, including Azure Resource Locks and Azure Policy. This post explains how to implement these mechanisms, and how to revive a dropped MySQL flexible server by using the Azure CLI.
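As a concrete sketch of the resource-lock option, the Azure CLI can place a CanNotDelete lock on a flexible server. The resource names below are placeholders, and since the commands need an authenticated Azure session, the helper here only echoes them as a dry run; swap the helper body to execute them for real.

```shell
# Placeholder names; substitute your own resource group and server.
RG="my-rg"
SERVER="my-mysql-flex"

# Dry-run helper: prints each command instead of running it.
# To execute for real, redefine as:  run() { "$@"; }
run() { echo "+ $*"; }

# Place a CanNotDelete lock so attempts to delete the server fail
# until the lock is removed.
run az lock create \
  --name protect-mysql \
  --lock-type CanNotDelete \
  --resource-group "$RG" \
  --resource-name "$SERVER" \
  --resource-type Microsoft.DBforMySQL/flexibleServers

# Remove the lock only when the server genuinely needs to be deleted.
run az lock delete \
  --name protect-mysql \
  --resource-group "$RG" \
  --resource-name "$SERVER" \
  --resource-type Microsoft.DBforMySQL/flexibleServers
```

A lock like this blocks deletion for all users and roles until it is explicitly removed, which complements the Azure Policy approach mentioned above.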