Performance
127 Topics

Effectively troubleshoot latency in SQL Server Transactional replication: Part 1
Are you struggling with latency issues in SQL Server Transactional replication? This article provides clear, step-by-step instructions to effectively troubleshoot and resolve these challenges. Dive into proven techniques and best practices that will help you enhance your SQL Server's performance and ensure seamless data replication. Don't let latency slow you down—master the art of SQL Server troubleshooting today! Thanks to Collin Benkler, Senior Escalation Engineer at Microsoft for SQL Server, for his valuable feedback.

OS Hang or Out of Memory due to SQL Ser... No Wait, it's SQL Analysis Services (SSAS)
First published on MSDN on Jan 12, 2018. Recently, we have observed a number of cases where DBAs or application developers complain about out-of-memory errors, or even the machine not responding (hangs), despite the fact that there is plenty of available memory on the system.

Update Stats Sample Rate does not work
Create/Update statistics allows users to specify the sample rate. However, the sample rate may not work as you expect in some scenarios.

1. Tables with fewer than 1024 pages

I'm going to use the table Production.Product in AdventureWorks2019 to demonstrate:

use AdventureWorks2019
go
create statistics IProductID on Production.Product(ProductID) with sample 20 percent
go
dbcc show_statistics('Production.Product','IProductID')

In this script, the sample rate is set to 20%. However, 'DBCC show_statistics' shows that 'Rows' equals 'Rows Sampled', which means the table is 100% sampled. Why? Because for a table with fewer than 1024 pages in the clustered index (for a heap, count index id 0), SQL Server ignores the specified sample rate and always uses 100%. In this case, Production.Product has only 15 pages, so it is always 100% sampled. Please note that sample 0 is an exception: if you specify 0, SQL Server does not create a histogram.

2. Tables with more than 1024 pages

SQL Server guarantees that at least 1024 pages will be sampled. If the specified sample rate amounts to fewer than 1024 pages, SQL Server replaces it with 1024 pages, i.e. an effective sample rate of 1024/TotalPages. If it amounts to more than 1024 pages, SQL Server uses the rate you specified.

3. What if the sample rate is not specified?

If the table has more than 1024 pages, SQL Server picks the smaller of the following two values as the number of pages to sample:

TotalPages
(15*power(Rows,0.55)/TotalRows*TotalPages)+1024

So for a smaller table, the rate is TotalPages/TotalPages = 100%. For a big table, the rate is ((15*power(Rows,0.55)/TotalRows*TotalPages)+1024)/TotalPages = 15*power(Rows,0.55)/TotalRows + 1024/TotalPages. For a very large table, the rows in the extra 1024 pages can be ignored, leaving roughly 15*power(Rows,0.55)/TotalRows.
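The arithmetic above can be sketched in Python. This is an illustration of the rules as described, not SQL Server's actual implementation; the helper names are hypothetical, and the default-rate helper keeps only the 15*rows^0.55 term since, as noted, the 1024-page contribution is negligible for large tables:

```python
def effective_sample_pages(requested_percent, total_pages):
    """Pages actually sampled for an explicit WITH SAMPLE n PERCENT (sketch)."""
    if total_pages < 1024:
        # Small tables: the specified rate is ignored, always 100% sampled
        return total_pages
    requested = total_pages * requested_percent / 100.0
    # At least 1024 pages are sampled, never more than the whole table
    return min(total_pages, max(requested, 1024))

def default_sample_rows(total_rows):
    """Approximate rows sampled for a large table when no rate is specified."""
    return round(15 * total_rows ** 0.55)

print(effective_sample_pages(20, 15))    # 15: under 1024 pages, fully sampled
print(effective_sample_pages(1, 50_000)) # 1% = 500 pages, floored to 1024
print(default_sample_rows(1_000_000))    # 29929 rows, about 2.9%
```

This makes it easy to see why a 20 percent sample on Production.Product (15 pages) still reads every row.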
Here are some samples:

1. If a table has 1,000,000 rows, then 15*power(1000000.0,0.55) = 29929 rows will be sampled, roughly 29929/1000000 = 2.9%.
2. If a table has 10,000,000 rows, then 15*power(10000000.0,0.55) = 106192 rows will be sampled, roughly 106192/10000000 = 1.06%.

Memory Grants: The mysterious SQL Server memory consumer with Many Names
First published on MSDN on Jan 01, 2013. The Memory Consumer with Many Names

Have you ever wondered what Memory grants are? What about QE Reservations? And Query Execution Memory? Workspace memory? How about Memory Reservations? As with most things in life, complex concepts often reduce to a simple one: all these names refer to the same memory consumer in SQL Server: memory allocated during query execution for Sort and Hash operations (bulk copy and index creation fall into the same category but are much less common).