Azure Database Support Blog

When and How to Update Statistics in Azure SQL Database

Mohamed_Baioumy_MSFT
Jan 06, 2026

Accurate statistics are a cornerstone of good query performance in Azure SQL Database. While the platform automatically manages statistics in most scenarios, there are real‑world cases where manual intervention is not only recommended but essential. This article explains when, why, and how to update statistics in Azure SQL Database, with practical samples and real customer scenarios drawn from production support cases.

Microsoft Learn reference (overview):

https://learn.microsoft.com/sql/relational-databases/statistics/statistics 

Why Statistics Matter

SQL Server’s query optimizer relies on statistics to estimate row counts, choose join strategies, allocate memory grants, and decide whether to run operations in parallel. When statistics are stale or inaccurate, even well‑indexed queries can suddenly degrade.

In Azure SQL Database:

  • AUTO_CREATE_STATISTICS is enabled and managed by the platform
  • AUTO_UPDATE_STATISTICS is enabled; when the AUTO_UPDATE_STATISTICS_ASYNC option is also turned on, updates run in the background instead of blocking the triggering query

When updates run asynchronously, queries may continue executing with outdated cardinality estimates until the statistics refresh completes.
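To confirm how a specific database is configured, you can read the auto‑statistics settings directly. A minimal sketch, assuming it is run in the target user database:

-- Check how statistics are managed for the current database
SELECT name,
       is_auto_create_stats_on,
       is_auto_update_stats_on,
       is_auto_update_stats_async_on
FROM sys.databases
WHERE name = DB_NAME();

-- Optional: switch to asynchronous updates if that suits the workload
-- ALTER DATABASE CURRENT SET AUTO_UPDATE_STATISTICS_ASYNC ON;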

https://learn.microsoft.com/en-us/sql/relational-databases/query-processing-architecture-guide 

When Manual Statistics Updates Are Required

1. After Large Data Changes (ETL / Batch Operations)

Customer scenario

A nightly ETL job bulk‑inserts millions of rows into a fact table. The following morning, reports time out and logical reads spike.

Why it happens

Auto‑update thresholds are based on row‑count changes and may not trigger immediately—especially for append‑only or skewed data.
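Before forcing an update, it can help to confirm how much the table has drifted since the last refresh. A minimal sketch using sys.dm_db_stats_properties; dbo.FactSales is the illustrative table from this scenario:

-- Rows modified since each statistic on dbo.FactSales was last updated
SELECT s.name AS stats_name,
       sp.last_updated,
       sp.rows,
       sp.modification_counter
FROM sys.stats AS s
CROSS APPLY sys.dm_db_stats_properties(s.object_id, s.stats_id) AS sp
WHERE s.object_id = OBJECT_ID('dbo.FactSales')
ORDER BY sp.modification_counter DESC;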

Recommended action

UPDATE STATISTICS dbo.FactSales;

Target only the critical statistic if known:

UPDATE STATISTICS dbo.FactSales (IX_FactSales_CreatedDate);

2. Query Plan Regression Without Schema Changes

Customer scenario

A stable query suddenly switches from a Nested Loops join to a Hash Join, increasing CPU usage and Buffer IO waits.

Root cause

Statistics no longer reflect current data distribution.
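One way to confirm this is to compare the statistic's histogram against the values the query actually filters on. A hedged sketch using sys.dm_db_stats_histogram; the table and statistic names are placeholders:

-- Inspect the histogram behind a suspect statistic
SELECT h.step_number,
       h.range_high_key,
       h.range_rows,
       h.equal_rows,
       h.average_range_rows
FROM sys.stats AS s
CROSS APPLY sys.dm_db_stats_histogram(s.object_id, s.stats_id) AS h
WHERE s.object_id = OBJECT_ID('dbo.Customer')
  AND s.name = 'IX_Customer_Region'   -- replace with the statistic under investigation
ORDER BY h.step_number;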

Recommended action

UPDATE STATISTICS dbo.Customer WITH FULLSCAN;

Learn more:

https://learn.microsoft.com/sql/relational-databases/statistics/update-statistics

3. After Restore Operations (PITR / Geo‑Restore / Database Copy)

Customer scenario

After a Point‑in‑Time Restore (PITR) on a Hyperscale database, queries run slower despite healthy platform telemetry.

Why it happens

Statistics are restored as‑is, but workload patterns often change after the restore point. Auto‑update statistics may lag behind.

Recommended action

EXEC sp_updatestats;

Prioritize heavily accessed tables first on large databases.
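One way to prioritize is to let recent usage drive the order. A hedged sketch that generates UPDATE STATISTICS commands for the most‑read user tables; review the output before executing it, and note that usage counters only reflect activity since the database came online:

-- Build UPDATE STATISTICS commands for the busiest tables first
SELECT TOP (20)
       'UPDATE STATISTICS '
           + QUOTENAME(OBJECT_SCHEMA_NAME(ius.object_id)) + '.'
           + QUOTENAME(OBJECT_NAME(ius.object_id)) + ';' AS update_command,
       SUM(ius.user_seeks + ius.user_scans + ius.user_lookups) AS total_reads
FROM sys.dm_db_index_usage_stats AS ius
WHERE ius.database_id = DB_ID()
  AND OBJECTPROPERTY(ius.object_id, 'IsUserTable') = 1
GROUP BY ius.object_id
ORDER BY total_reads DESC;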

Learn more:

https://learn.microsoft.com/azure/azure-sql/database/recovery-using-backups

Query Store Comparison: Before vs After Updating Statistics

One of the most effective ways to validate the impact of statistics updates is Query Store.

Before update (typical signs):

  • Sudden plan change for the same query text
  • Increased logical reads and CPU time
  • Change in join strategy or memory grant

After statistics update:

  • Optimizer selects a more efficient plan
  • Logical reads reduced
  • CPU and duration stabilize

Example workflow

-- Capture runtime stats for the query before the change
-- (sys.query_store_runtime_stats exposes plan_id, so join to the plan catalog to filter by query)
SELECT rs.*
FROM sys.query_store_runtime_stats AS rs
JOIN sys.query_store_plan AS p
    ON p.plan_id = rs.plan_id
WHERE p.query_id = <QueryID>;

-- Update statistics on the affected table
UPDATE STATISTICS dbo.Orders;

-- Force cached plans that reference the table to recompile
EXEC sp_recompile 'dbo.Orders';
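To quantify the before/after difference, the same query can then be compared plan by plan. A minimal sketch against the Query Store catalog views, reusing the <QueryID> placeholder from the workflow above (interval averages are averaged again here, so treat the numbers as indicative):

-- Compare runtime metrics per plan for one query
SELECT p.plan_id,
       SUM(rs.count_executions)     AS executions,
       AVG(rs.avg_duration)         AS avg_duration_us,
       AVG(rs.avg_logical_io_reads) AS avg_logical_reads,
       MAX(rs.last_execution_time)  AS last_seen
FROM sys.query_store_plan AS p
JOIN sys.query_store_runtime_stats AS rs
    ON rs.plan_id = p.plan_id
WHERE p.query_id = <QueryID>
GROUP BY p.plan_id
ORDER BY last_seen DESC;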

Query Store reference:

https://learn.microsoft.com/en-us/sql/relational-databases/performance/monitoring-performance-by-using-the-query-store 

Decision Flow: When Should I Update Statistics?

Performance regression observed?
    → Yes: Query plan changed without schema change?
        → Yes: Recent data change / restore / ETL?
            → Yes: Update targeted statistics

If NO at any step, rely on automatic statistics and continue monitoring.

What NOT to Do

❌ Do not run blanket WITH FULLSCAN on all tables

FULLSCAN is CPU and IO expensive, especially on large or Hyperscale databases.
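If a large table genuinely needs a refresh, a sampled update is usually a much lighter alternative. A hedged sketch; the table name and sampling rate are illustrative:

-- Refresh statistics from a sample rather than the whole table
UPDATE STATISTICS dbo.FactSales WITH SAMPLE 10 PERCENT;

-- Or reuse the sampling rate from the previous update
UPDATE STATISTICS dbo.FactSales WITH RESAMPLE;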

❌ Do not schedule frequent database‑wide sp_updatestats jobs

This can introduce unnecessary workload and plan churn.

❌ Do not update statistics blindly without investigation

Always validate plan regression or stale estimates using Query Store or execution plans.
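A quick way to surface candidates for that investigation is to look for queries that have recently produced more than one plan. A minimal sketch against the Query Store catalog views:

-- Queries with multiple plans are the usual plan-regression suspects
SELECT q.query_id,
       COUNT(DISTINCT p.plan_id) AS plan_count
FROM sys.query_store_query AS q
JOIN sys.query_store_plan AS p
    ON p.query_id = q.query_id
GROUP BY q.query_id
HAVING COUNT(DISTINCT p.plan_id) > 1
ORDER BY plan_count DESC;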

Checking Statistics Freshness

SELECT OBJECT_NAME(s.object_id) AS table_name,
       s.name                   AS stats_name,
       sp.last_updated,
       sp.rows,
       sp.rows_sampled
FROM sys.stats AS s
CROSS APPLY sys.dm_db_stats_properties(s.object_id, s.stats_id) AS sp
ORDER BY sp.last_updated;

DMV reference:

https://learn.microsoft.com/en-us/sql/relational-databases/system-dynamic-management-views/sys-dm-db-stats-properties-transact-sql 

Best Practices Summary

✅ Prefer targeted statistics updates

✅ Update stats after bulk data changes or restores

✅ Validate results using Query Store

✅ Avoid unnecessary FULLSCAN operations

✅ Use stats updates as a diagnostic and remediation step, not routine maintenance

Conclusion

Although Azure SQL Database manages statistics automatically, asynchronous updates and changing workload patterns can result in sub‑optimal query plans. Manually updating statistics after significant data changes, restore operations, or observed plan regressions is a safe and effective best practice to restore optimal query performance.
