Recently, we have observed a number of cases where DBAs or application developers complain about out-of-memory errors, or even a machine that stops responding (hangs), despite plenty of available memory on the system. Frequently this happens on systems running SQL Server, after increasing the 'max server memory' sp_configure option, again with plenty of memory still available. For example, on a system with 64 GB of RAM, SQL Server's 'max server memory' is set to 54 GB and you start getting OS errors such as 1453 or 1450, or other memory-related issues. In some cases the OS stops responding. The key symptom was that when the DBA reduced 'max server memory' below 50 GB (below about 80% of the RAM on the machine), the problem stopped happening. So, logically, you would think there is a problem with SQL Server.
In one such case, the machine where SQL Server was running was experiencing "hangs" (lack of response). One of our colleagues from the Windows team (kudos Syed Yusuf) analyzed a manually generated kernel memory dump.
ResAvail Pages reflects the physical pages that remain available after accounting for the minimum working sets that processes reserve when they start. Here the value of ResAvail was 4.73 MB, which is far too low. Even though overall available memory was 8.6 GB, some process had set its minimum working set to a huge value. Pages reserved for that minimum working set belong to that process and cannot be allocated to anything else.
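As a rough mental model (a deliberate simplification for this post, not the actual Windows Memory Manager implementation), you can think of ResAvail as what remains of "available" memory after subtracting every process's pinned minimum working set. The figures below are illustrative, not taken from the dump:

```python
GB = 1024 ** 3
MB = 1024 ** 2

def res_avail(available_bytes, min_working_sets):
    """Pages pledged to processes' minimum working sets are reserved
    for those processes and cannot be handed out to anyone else, even
    though they still count toward 'available' memory."""
    return available_bytes - sum(min_working_sets)

# ~8.6 GB shows as available, but one process has pinned a minimum
# working set of almost the same size, so almost nothing is left
# for new allocations.
remaining = res_avail(8.6 * GB, [8.59 * GB])
print(remaining / MB)  # only a handful of MB are actually allocatable
```

This is why allocations fail and the machine hangs while Task Manager still reports gigabytes of available memory.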
********** Running out of physical memory **********
VM         Peak       Work Set   Commit Size   PP Quota    NPP Quota
=========  =========  =========  ===========   =========   =========
852.57 MB  1.84 GB    57.73 MB   117.86 MB     571.65 KB   64.37 KB
User    Kernel   Total
=====   ======   =====
328ms   298ms    626ms
The same problem, but not involving SQL Server or Analysis Services, was described in this blog.
The big surprise (gotcha) was that Performance Monitor counters showed plenty of available physical memory, and the working-set sizes of all processes looked quite reasonable. So don't expect to find this in Perfmon; the symptom simply isn't visible there.
Update (Nov 2018):
Another team of engineers analyzed the issue and proposed a much simpler, non-intrusive method, using Sysinternals Process Explorer, to diagnose the Working Set memory for the Analysis Services process (thanks Venu Cherukupali).
Start Process Explorer under an Administrator account (right-click on procexp64.exe, "Run as Administrator")
From top-level menu choose View -> Select Columns... -> Process Memory tab
Check Minimum Working Set and Maximum Working Set check-boxes
Locate the MSMDSRV.EXE process in the list
Examine the Minimum Working Set values
Here is an example of the Minimum Working Set value of SSAS on a 64-GB RAM system.
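As an alternative to the Process Explorer UI, the same two values can be queried programmatically through the Win32 GetProcessWorkingSetSize API. The following Python sketch is an illustration written for this post, not part of the original troubleshooting; it is Windows-only and assumes you have sufficient rights to open the target process:

```python
import ctypes
import sys

def get_working_set_limits(pid):
    """Return (min_bytes, max_bytes) for a process's working-set limits,
    or None on non-Windows systems or if the process cannot be opened."""
    if sys.platform != "win32":
        return None
    kernel32 = ctypes.windll.kernel32
    PROCESS_QUERY_INFORMATION = 0x0400
    handle = kernel32.OpenProcess(PROCESS_QUERY_INFORMATION, False, pid)
    if not handle:
        return None  # access denied or bad PID
    min_ws = ctypes.c_size_t()
    max_ws = ctypes.c_size_t()
    try:
        ok = kernel32.GetProcessWorkingSetSize(
            handle, ctypes.byref(min_ws), ctypes.byref(max_ws))
        if not ok:
            return None
        return (min_ws.value, max_ws.value)
    finally:
        kernel32.CloseHandle(handle)
```

For msmdsrv.exe you would pass the PID shown in Task Manager or Process Explorer; on other platforms, or when the process cannot be opened, the function simply returns None.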
The Analysis Services experts joined the troubleshooting (thank you, Yinn Wong), and it turns out that, by default, a Tabular SSAS instance sets its minimum working set to 20% of the physical memory on the system. The instance on this system was a POWERPIVOT SSAS installation.
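That 20% default explains the original symptom. The arithmetic below uses the figures from this case (64 GB of RAM, 'max server memory' at 54 GB) to show the over-commit:

```python
GB = 1024 ** 3

total_ram = 64 * GB
sql_max_server_memory = 54 * GB
ssas_min_working_set = 0.20 * total_ram  # Tabular SSAS default: 20% of RAM

print(ssas_min_working_set / GB)                 # 12.8 GB pinned by SSAS
print((total_ram - sql_max_server_memory) / GB)  # 10.0 GB left over

# SSAS's reserved minimum exceeds what SQL Server leaves behind, so the
# OS and other processes are starved: errors 1450/1453, hangs.
print(ssas_min_working_set > total_ram - sql_max_server_memory)  # True
```

This also matches the observed fix: capping 'max server memory' at 50 GB leaves 14 GB of headroom, just above the 12.8-GB reservation.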
Resolution: in most cases it turned out SSAS was not even needed or used on that system. Therefore, in those cases the service was stopped or, better yet, uninstalled.
Of course, the rationale behind Tabular SSAS reserving a minimum working set up front is that it requires a lot of memory to begin with. Therefore, if SSAS is installed on the same system where SQL Server is running, that is probably not the best resource-allocation decision; SSAS would be much better served on a dedicated system with plenty of RAM.