First published on MSDN on Jan 12, 2018

Recently, we have observed a number of cases where DBAs or application developers complain about out-of-memory errors, or even about the machine not responding (hanging), despite there being plenty of available memory on the system. Frequently this happens on systems where SQL Server is running, and the issue occurs after increasing the 'max server memory' sp_configure option, again with plenty of memory available. For example, on a system with 64 GB of RAM, SQL Server's 'max server memory' is set to 54 GB and you start getting OS errors such as 1450 or 1453, or some other memory-related failure. In some cases the OS stops responding altogether. The key symptom was that when the DBA reduced SQL Server's max server memory below 50 GB (below roughly 80% of the RAM on the machine), the problem stopped happening. So, logically, you would think there is a problem with SQL Server.

Physical Memory:   16596487 (66385948 Kb) (63.31 GB)
Available Pages:    2254327 ( 9017308 Kb) ( 8.60 GB)   <---- available memory looks good
ResAvail Pages:        1118 (    4472 Kb) ( 4.37 MB)   <---- far too low

ResAvail Pages accounts for the minimum working sets that processes reserve/set while starting. The value of ResAvail here is only 4.37 MB, which is far too low. Even though overall available memory is 8.6 GB, some process has set its minimum working set to a huge value. That makes those pages reserved for that process, so they cannot be allocated to anything else.
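To sanity-check the !vm numbers above, here is a small sketch that converts the debugger's page counts into KB and GB, assuming the standard 4 KB small-page size on x86/x64:

```python
# Convert !vm page counts to KB/GB, assuming the standard 4 KB page size.
PAGE_KB = 4  # x86/x64 small-page size in KB

def pages_to_kb(pages):
    return pages * PAGE_KB

def pages_to_gb(pages):
    return pages * PAGE_KB / (1024 * 1024)

# Page counts taken from the !vm output above
physical_pages  = 16596487
available_pages = 2254327
resavail_pages  = 1118

print(f"Physical:  {pages_to_kb(physical_pages)} Kb ({pages_to_gb(physical_pages):.2f} GB)")
print(f"Available: {pages_to_kb(available_pages)} Kb ({pages_to_gb(available_pages):.2f} GB)")
print(f"ResAvail:  {pages_to_kb(resavail_pages)} Kb ({pages_to_kb(resavail_pages) / 1024:.2f} MB)")
```

The conversion reproduces the figures in the output: 63.31 GB physical, 8.60 GB available, but only 4.37 MB of resident-available memory.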

 User     Kernel    Total
 =====    ======    =====
 328ms    298ms     626ms

The same problem, but not involving SQL Server or Analysis Services, was described in this blog.

The big gotcha came from the fact that the Performance Monitor counters showed plenty of available physical memory, and the working set sizes for all processes looked quite reasonable. So don't expect to find this in Perfmon; it just isn't there.

Update (Nov 2018):

Another team of engineers analyzed the issue and proposed a much simpler, non-intrusive method to diagnose the Working Set memory for the Analysis Services process (thanks Venu Cherukupali).

Here is an example of the Minimum Working Set value of SSAS on a 64-GB RAM system.
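As a back-of-the-envelope sketch (using the 20% default covered under Solution below), the minimum working set an SSAS Tabular instance would set on a 64-GB system works out to roughly 12.8 GB:

```python
# Sketch: the default minimum working set of an SSAS Tabular instance,
# which (per the Solution section) is 20% of physical RAM by default.
physical_ram_gb = 64
ssas_min_ws_fraction = 0.20  # SSAS Tabular default

min_working_set_gb = physical_ram_gb * ssas_min_ws_fraction
print(f"SSAS minimum working set: {min_working_set_gb:.1f} GB")  # 12.8 GB
```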

Solution

The Analysis Services experts joined the troubleshooting (thank you, Yinn Wong), and it turns out that, by default, SSAS Tabular instances set the minimum working set to 20% of the physical memory on the system. The system in question was running a POWERPIVOT SSAS instance.

In most cases it turned out that SSAS was not even needed or used on the system. Therefore, in those cases the service was stopped or, better yet, uninstalled.

Of course, the idea behind Tabular SSAS setting a minimum working set size right off the bat is that it requires a lot of memory to begin with. Therefore, if SSAS is indeed installed on the same system where SQL Server is running, that is probably not the best resource-allocation decision; SSAS would be much better served on a dedicated system with lots of RAM.
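Putting the numbers from the opening example together shows why reducing 'max server memory' below ~50 GB made the errors stop. This is a rough budget sketch, not an exact accounting of OS and other process usage:

```python
# Memory budget on the 64-GB example system: SQL Server's 'max server memory'
# plus the SSAS Tabular default minimum working set (20% of RAM),
# measured against total physical memory.
total_ram_gb = 64
sql_max_server_memory_gb = 54
ssas_min_ws_gb = total_ram_gb * 0.20   # 12.8 GB reserved by SSAS Tabular

committed_gb = sql_max_server_memory_gb + ssas_min_ws_gb
print(f"SQL + SSAS reservation: {committed_gb:.1f} GB of {total_ram_gb} GB")  # 66.8 GB: over-committed

# Once SSAS's reservation is honored, only this much remains for everything
# else, including the OS -- hence dropping max server memory below ~50 GB
# relieved the pressure.
headroom_gb = total_ram_gb - ssas_min_ws_gb
print(f"Headroom after SSAS reservation: {headroom_gb:.1f} GB")  # 51.2 GB
```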