Azure SQL Database
Azure Monitor SQL Insights – Preview
Comprehensive and reliable monitoring is a top priority for all SQL customers. The Azure SQL and Azure Monitor teams are proud to announce the preview of Azure Monitor SQL insights: a new, Azure-native monitoring experience for almost any SQL deployment in Azure.

Public Preview: Shrink for Azure SQL Database Hyperscale
Update: On 29 January 2025 we announced the general availability of shrink in Hyperscale. For more details, please read the GA announcement.

If you are using Hyperscale in Azure SQL Database, you know that it is a powerful tier that lets you rapidly scale up and scale out your database according to your needs, along with autoscaling storage. However, there can be situations where storage is scaled up automatically and then, due to business needs, a significant amount of data is removed or purged, leaving a lot of free space within the database.

Today, we are pleased to announce that database and data file shrink is available in the Hyperscale tier in preview. You can now reduce the allocated size of a Hyperscale database using the same DBCC SHRINK* commands that you might be familiar with. This allows you to reduce the size of your databases and free up unused space to save storage costs.

How to use shrink in Hyperscale?

Using shrink is easy and straightforward: you use the same set of commands that you might have used in other tiers of Azure SQL Database or in SQL Server. First, identify a Hyperscale database with substantial allocated but unused storage space. For definitions of allocated and used storage space, see Azure SQL Database file space management - Azure SQL Database | Microsoft Docs. The Azure portal also provides this information. You can also capture the current used, allocated, and unused space in each database file by executing the following query in the database.
DECLARE @NumPagesPerGB float = 128 * 1024;

SELECT file_id AS FileId,
       size / @NumPagesPerGB AS AllocatedSpaceGB,
       ROUND(CAST(FILEPROPERTY(name, 'SpaceUsed') AS float) / @NumPagesPerGB, 3) AS UsedSpaceGB,
       ROUND((size - CAST(FILEPROPERTY(name, 'SpaceUsed') AS float)) / @NumPagesPerGB, 3) AS FreeSpaceGB,
       ROUND(max_size / @NumPagesPerGB, 3) AS MaxSizeGB,
       ROUND(CAST(size - FILEPROPERTY(name, 'SpaceUsed') AS float) * 100 / size, 3) AS UnusedSpacePercent
FROM sys.database_files
WHERE type_desc = 'ROWS'
ORDER BY file_id;

A shrink operation can be initiated using either the DBCC SHRINKDATABASE command to shrink the entire database, or the DBCC SHRINKFILE command for individual data files. We recommend using DBCC SHRINKFILE, because you can run it in parallel on multiple sessions, targeting different data files. DBCC SHRINKDATABASE is a simpler choice because you only need to run one command, but it will shrink one file at a time, which can be time-consuming for larger databases. If the shrink operation fails with an error or is canceled, the progress it has made so far is retained, and the same shrink command can simply be executed again to continue.

Once shrink for all data files has completed, rerun the earlier query (or check in the Azure portal) to determine the resulting reduction in the allocated storage size. If there is still a large difference between used space and allocated space, you can rebuild indexes to reduce the total number of used data pages. This may temporarily increase allocated space further; however, shrinking files again after rebuilding indexes should result in a higher reduction in allocated space. For more details about Azure SQL Database space management, see the following documentation article: Database file space management - Azure SQL Database | Microsoft Learn

Known behaviors / limitations

Database shrink is a long-running operation. For larger databases, it may span multiple days.
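For reference, here is a minimal sketch of the per-file approach discussed above. The file_id (1) and the target size are hypothetical values, chosen to match the sizing guidance in this section (a 55 GB target for a file with roughly 50 GB used); DBCC SHRINKFILE takes its target size in MB.

```sql
-- Hypothetical example: shrink data file 1 to a 55 GB target
-- (target size is specified in MB: 55 * 1024 = 56320).
-- Run one DBCC SHRINKFILE per session, each targeting a different
-- data file, to shrink multiple files in parallel.
DBCC SHRINKFILE (1, 56320);
```

If the operation is interrupted, re-running the same command continues from the progress already made.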
To avoid shrink getting interrupted, we recommend using a client that is unlikely to get disconnected from the database.

While shrink is running, used and allocated space for the database might not be reported in the Azure portal.

Running SHRINKFILE with a target size slightly higher than the used space in the file tends to have a higher success rate compared to setting it to the exact used space. For instance, if a file is 128 GB in total size with 50 GB used and 78 GB free, setting the target size to 55 GB results in a better space reduction compared to using 50 GB.

When executing DBCC SHRINKFILE concurrently on multiple files, you may encounter occasional blocking between the sessions. This is expected and does not impact the outcome of shrink.

Shrinking the transaction log file in the Hyperscale tier is not required, as it does not contribute to the allocated data size and cost. Executing DBCC SHRINKFILE(2) has no effect on the transaction log size.

Shrink is currently in preview and has the following limitations:

Shrink is not allowed on unencrypted databases. Any such attempt raises the following error:

Msg 49532, Level 16, State 1, Line 1
DBCC SHRINKFILE for data files is not supported in a Hyperscale database when the database is not encrypted. Enable transparent data encryption and try again.

To find the encryption state of the database, execute the following query:

SELECT DB_NAME(database_id) AS database_name,
       encryption_state_desc
FROM sys.dm_database_encryption_keys
WHERE database_id = DB_ID();

If the encryption state is anything other than ENCRYPTED, shrink will not start.

Conclusion

We hope that you will find shrink useful and beneficial for your Hyperscale databases. We welcome your feedback and suggestions on how to improve it. You can contact us by commenting on this blog post and we'll be happy to get back to you. Alternatively, you can also email us at sqlhsfeedback AT microsoft DOT com.
We are eager to hear from you all!

Microsoft Copilot in Azure extends capabilities to Azure SQL Database (Public Preview)
We are excited to announce the introduction of new Copilot skills for Azure SQL Database! Microsoft Copilot in Azure can now help you streamline the design, operation, optimization, and health of Azure SQL Database-driven applications. It improves productivity in the Azure portal by offering self-help for database administration and natural language to SQL conversion.

Announcing Azure Data Tech Groups v2.0: Big News for User Groups and Members
Two years ago, Microsoft publicly launched the new Azure Data Community webpage and created Azure Data Tech Groups utilizing Meetup. Prior to this, many user groups used PASS resources to host their websites, hold virtual meetings, manage members, and more. Over the following months, we onboarded qualified user groups to the new Azure Data Meetup network and began providing Microsoft Teams access to help groups meet virtually through the pandemic. All of our community efforts were aimed at empowering the community: providing you with resources to help you continue to connect with one another, learn and develop your skills, and grow in your careers.

V1.0 of Azure Data Tech Groups had one great big limitation: groups that didn't use Meetup had to start using it, or we weren't able to include them. This had the greatest impact on the community in LATAM, the Middle East, and APAC, as groups there are less likely to utilize Meetup. This latest initiative will expand access to our network to more groups globally: groups that use their own website, LinkedIn, HDX, or other ways to connect with members.

Announcing General Availability of Change Data Capture (CDC) on Azure SQL Database
We are excited to announce the general availability (GA) of change data capture (CDC) in Azure SQL Database.

What is change data capture (CDC)?

CDC provides historical change information for a user table by capturing both the fact that Data Manipulation Language (DML) changes (insert / update / delete) were made and the changed data itself. Changes are captured by a capture process that reads changes from the transaction log and places them in corresponding change tables. These change tables provide a historical view of the changes made over time to source tables. CDC functions enable the change data to be consumed easily and systematically.

Why use change data capture (CDC)?

CDC is widely used by a broad range of customers for a variety of purposes, including but not limited to:

- Tracking data changes for auditing
- Recording data changes on a source database and integrating with other services (e.g. Azure Data Factory) to stream those changes to other destinations
- Performing analytics on change data

Change data capture (CDC) on Azure SQL Database

CDC is now generally available on Azure SQL databases, enabling customers to track insert / update / delete data changes on their Azure SQL Database tables. On Azure SQL Database, CDC offers similar functionality to SQL Server and Azure SQL Managed Instance, providing a scheduler which automatically runs change capture and cleanup processes on the change tables. These capture and cleanup processes used to run as SQL Server Agent jobs on SQL Server on premises and on Azure SQL Managed Instance, but now they run automatically through the scheduler in Azure SQL databases. Customers can still run scans and cleanup manually on demand. CDC on Azure SQL Database has been one of the most requested features through Azure feedback mechanisms. Multiple customers have successfully used CDC during public preview, such as Rubrik, a major cloud data management company.
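In practice, enabling CDC uses the same documented system stored procedures as SQL Server. The following is a minimal sketch; the dbo.Orders table is a hypothetical example, and @role_name = NULL (no gating role) is an illustrative choice:

```sql
-- Enable CDC at the database level (documented procedure).
EXEC sys.sp_cdc_enable_db;

-- Enable CDC for a hypothetical table. @role_name = NULL means access
-- to the change data is not restricted to a database role.
EXEC sys.sp_cdc_enable_table
    @source_schema = N'dbo',
    @source_name   = N'Orders',
    @role_name     = NULL;
```

Change rows can then be read between two log sequence numbers with the generated function for the capture instance, e.g. cdc.fn_cdc_get_all_changes_dbo_Orders.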
After using CDC, they highlighted: "Flawless implementation. An extremely useful addition to Azure SQL Databases that enabled us to track and capture changes from them meticulously."

Enabling change data capture (CDC) on an Azure SQL database

Customers can use CDC on Azure SQL databases at the S3 (Standard) tier or higher, or the vCore equivalent. More specifically, CDC cannot be enabled on subcore databases. Enabling CDC on an Azure SQL database is similar to enabling CDC on SQL Server or Azure SQL Managed Instance. Learn more here: Enable CDC

Sending CDC data changes to other destinations

Multiple Microsoft technologies such as Azure Data Factory can be used to move CDC change data to other destinations (e.g. other databases or data warehouses). Third-party services also offer streaming capabilities for changed data from CDC. For instance, Striim and Qlik offer integration, processing, delivery, analysis, or visualization capabilities for CDC changes.

"Real-time information is vital to the health of the enterprises," says Codin Pora, VP of Technology and partnership at Striim. "Striim is excited to support the new change data capture (CDC) capabilities of Azure SQL Database and help companies drive their digital transformation by bringing together data, people, and processes. Striim, through its Azure SQL Database CDC pipelines, provides real-time data for analytics and intelligence workloads, operational reporting, ML/AI implementations and many other use cases, creating value as well as competitive advantage in a digital-first world.
Striim builds continuous streaming data pipelines with minimal overhead on the source Azure SQL Database systems, while moving database operations (inserts, updates, and deletes) in real time with security, reliability, and transactional integrity."

"Joint customers are excited about the potential of leveraging Qlik Data Integration alongside CDC in Azure SQL DB and CDC for SQL MI to securely access more of their valuable data for analytics in the cloud," said Kathy Hickey, Vice President, Product Management at Qlik. "We are happy to announce that in addition to support for Azure SQL MI as a source, the newly available MS-CDC capabilities will also allow us to support Azure SQL DB sources via our Early Access Program. We look forward to partnering with Microsoft on helping customers leverage these capabilities to confidently create new insights from their Azure managed data sources."

More Information

To learn more about change data capture (CDC), check out the documentation. Learn module: Data replication on Azure SQL Databases - Learn | Microsoft Docs

Introducing the Enhanced Azure SQL Database Free Offer: Now Generally Available
We are thrilled to announce the general availability of our new Azure SQL Database free offer. Now, each Azure subscription includes not just one, but 10 serverless databases. Each database comes with a complimentary allocation of 100,000 vCore seconds of compute, 32 GB of data storage, and 32 GB of backup storage every month, for the lifetime of your subscription. This enhanced free offer is ideal for new Azure customers looking to develop for free, learn SQL, or create a proof of concept, as well as existing Azure customers who are considering adding another database.

Get started today

To learn more about the offer, see the Azure SQL Database documentation. If you already have an Azure account, you can head straight to the Azure SQL Database provisioning page and select the Apply offer button in the banner at the top of the page. This offer is now live in all regions!

Offer details

The Azure SQL Database free offer provides access to the full capabilities of the General Purpose tier, renowned for its versatility and ability to support a wide range of workloads. Whether you're handling routine operations or running high-performance tasks, the free offer empowers you to configure your database from 0.5 vCores to 4 vCores, scaling up or down based on your needs. You can also benefit from the serverless option, which automatically pauses the database when idle and resumes it as soon as activity starts, reducing compute costs while maximizing your free allocation.

With this free offer, every Azure subscription, whether it's pay-as-you-go, under an enterprise agreement, or part of the Microsoft Partner Network, can now include up to 10 serverless databases. The offer is designed to last for the lifetime of your subscription and refreshes monthly, giving each database a generous allocation of 100,000 vCore seconds, 32 GB of data storage, and 32 GB of backup storage. That's a total of 1 million vCore seconds of free compute every month per subscription.
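To put the 100,000 vCore-second allocation in perspective, here is a back-of-the-envelope calculation. This is illustrative arithmetic only, not an official sizing query; actual serverless billing also factors in memory usage and auto-pause behavior.

```sql
-- Roughly how many hours of continuous compute the monthly 100,000
-- vCore-second allocation covers per database at several vCore levels.
SELECT v.vcores,
       ROUND(100000.0 / v.vcores / 3600, 1) AS free_hours_per_month
FROM (VALUES (0.5), (1.0), (2.0), (4.0)) AS v(vcores);
```

At 0.5 vCores that works out to roughly 55 hours of continuous compute per month; with serverless auto-pause, intermittent workloads can stretch the allocation much further.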
Once a region is selected for the free database under a subscription, the same region applies to all 10 free databases in that subscription and cannot be changed.

If you consume your monthly free resources before the end of the cycle, you have two flexible options:

- Auto-pause: Allow the database to pause itself and resume usage automatically when the next month's free allocation begins.
- Continue usage mode: Keep the database running and pay for additional usage. In this mode, you'll continue to receive the free monthly allocation while unlocking premium capabilities, such as scaling up to 80 vCores of compute and 4 terabytes of storage. This makes it easy to start small and scale seamlessly as your business grows.

A standout benefit of this free offer is its seamless transition. You can move from free usage to paid usage without any disruption: your data, schema, settings, and connections remain intact. There's no need to migrate or reconfigure, making it effortless to grow your database as your needs evolve. Additionally, the Azure portal includes 'Free amount remaining' metrics, enabling you to monitor your consumption and manage your costs with ease. This makes the Azure SQL Database free offer an exceptional choice for developers, learners, and enterprises alike, whether you're just starting out or preparing to scale.

Develop for free

The Azure SQL Database free offer is tailored for a wide range of users, from students and individual developers to small businesses and large enterprises, looking to develop and scale their SQL workloads at no initial cost. This offer enables you to launch up to 10 fully featured Azure SQL databases, optimizing both your financial resources and developmental ambitions.

Potential Use Cases:

- Application Development: Initiate the development of applications or websites with SQL as the core backend database.
- Skill Enhancement: Engage in hands-on SQL learning or refine your existing skills through practical experience and targeted tutorials.
- Prototyping: Craft proofs of concept or prototypes for innovative projects and ideas.
- Expansion and Testing: Integrate additional databases into your Azure subscription to facilitate testing or experimentation.
- Migration Projects: Migrate applications from on-premises setups or other cloud environments to Azure, leveraging its robust capabilities.

Leveraging AI with the Azure SQL Database Free Offer:

- Predictive Analytics: Use AI to forecast user behaviors and business outcomes, enhancing decision-making processes across various applications.
- Personalization Engines: Develop sophisticated personalization algorithms to enhance user experience by tailoring content and recommendations in real time.
- Anomaly Detection: Implement AI to detect unusual patterns or potential threats, ensuring the security and integrity of your data.
- Automated Data Management: Utilize AI to automate data cleaning, transformation, and integration tasks, reducing manual overhead and increasing efficiency.

Key Benefits of the Azure SQL Database Free Offer

With the Azure SQL Database free offer, you gain the advantages of a cloud-native relational database service known for its high availability, stellar performance, security, scalability, and compatibility with SQL Server. Furthermore, you can harness the comprehensive suite of tools and services integrated with Azure SQL Database, such as Azure Data Studio, Visual Studio Code, Azure Data Factory, Azure Synapse Analytics, and Power BI. This rich ecosystem enhances your database's functionality and seamlessly connects with your existing workflows.

Learn more

The Azure SQL Database free offer is available now, and you can start using it today. Don't miss this opportunity to get up to 10 free Azure SQL databases that can help you achieve your goals and grow your business.
To learn more about this offer, visit the Azure SQL Database documentation. To get started, create your free database from the Azure portal by creating an Azure SQL Database resource. We hope you enjoy this new offer, and we look forward to hearing your feedback and seeing what you build with Azure SQL Database.

Your max log rate on SQL Managed Instance Business Critical is now doubled
We are excited to announce that the transaction log write throughput (referred to as "log rate") limit is now doubled for Azure SQL Managed Instance Business Critical with Premium-series hardware. The new limit of 192 MiB/s significantly improves overall performance for write-intensive scenarios such as data ingestion and index maintenance, and allows your instances to handle more workload for the same price!

What is log rate?

Log rate in SQL Managed Instance refers to the rate at which transaction log records are generated and written to the transaction log file. Log rate is directly correlated with the number of transactions processed by the database, because every executed transaction generates a certain amount of data that is written to the transaction log. Log rate is important because it directly impacts the performance and stability of your database operations, especially for write-intensive workloads.

The log rate limit is now 192 MiB/s

In SQL Managed Instance, the log rate limit is based on the number of vCores and the service tier. For the Business Critical tier, the log rate is limited to 4.5 MiB/s per vCore, with a new maximum of 192 MiB/s per instance and no per-database cap (i.e. a single database on a Business Critical instance can use the whole 192 MiB/s log rate quota). You can benefit from the new log rate limit by choosing Business Critical Premium-series hardware for your SQL Managed Instance.

| Business Critical hardware      | Log write throughput limit, before              | Log write throughput limit, now                   |
|---------------------------------|-------------------------------------------------|---------------------------------------------------|
| Standard series                 | 4 MiB/s per vCore, up to 96 MiB/s per instance  | 4.5 MiB/s per vCore, up to 96 MiB/s per instance  |
| Premium series                  | 4 MiB/s per vCore, up to 96 MiB/s per instance  | 4.5 MiB/s per vCore, up to 192 MiB/s per instance |
| Premium series memory optimized | 4 MiB/s per vCore, up to 96 MiB/s per instance  | 4.5 MiB/s per vCore, up to 192 MiB/s per instance |

Why is the log rate limit important?
A higher log rate limit ensures that your instance can manage more transactions when needed. The maximum log rate therefore matters in scenarios that demand higher log rates, such as:

- High-volume transaction processing workloads (OLTP): If your application has a high volume of data modifications (INSERT, UPDATE, DELETE), a higher log rate can help accommodate the increased transaction log generation and improve performance.
- Bulk data operations: When performing bulk data operations, such as bulk inserts or large-scale data imports, a higher log rate can help process these operations more efficiently and reduce the time it takes to complete them.
- Database replication or synchronization: If you are using features like transactional replication or other synchronization mechanisms that rely on transaction log shipping, a higher log rate can ensure timely data synchronization between primary and secondary instances.

How to monitor the log rate?

Monitoring and managing the log rate is essential to ensure optimal performance and avoid potential bottlenecks in your SQL Managed Instance. You can use the QPI library to monitor the log rate and take appropriate actions if needed.

STEP 1: Take a snapshot of the current values in the sys.dm_io_virtual_file_stats DMV.

EXEC qpi.snapshot_file_stats;

STEP 2: Find information about each file (size, throughput, read and write rates in MB/s).

SELECT * FROM qpi.file_stats;

We are interested in the information about log files:

SELECT * FROM qpi.file_stats WHERE file_name = 'log';

Note: The tempdb log file, templog, is not governed; that is, templog does not count toward the log rate of a SQL MI.

Note: If you renamed any log file such that its file_name was no longer 'log', you would need to adjust these queries to sum all log files (except templog) to get the log rate of the MI.

STEP 3: Find the sum of the write_mbps column values for all log files on the instance to get the log rate.
SELECT SUM(write_mbps) AS lograte
FROM qpi.file_stats
WHERE file_name = 'log';

This "lograte" is now limited to 192 MiB/s!

More good news: this is free of charge!

This improvement is a fantastic value for our customers, since there are no associated pricing changes; we have doubled your log rate limit on SQL MI Business Critical at no extra charge. These changes were automatically enabled for all existing and future Business Critical Azure SQL Managed Instances that use Premium-series hardware as of the beginning of August 2023.

Conclusion

This improvement makes Azure SQL Managed Instance an excellent choice for your performance-hungry database workloads and allows you to migrate more of your larger SQL workloads to Azure SQL MI. If you're still new to Azure SQL Managed Instance, now is a great time to get started and take Azure SQL Managed Instance for a spin!