Monday Morning Mistakes: Not Setting Memory Limits

Welcome back to another edition of the Monday Morning Mistakes series. Today’s issue is one I run into quite often with clients, and it’s an important topic to know about as a database administrator. Without further ado, let’s get to our issue.


You have the SQL Server database engine installed on a system alongside other services such as Analysis Services, Reporting Services, and/or Integration Services, and you constantly seem to run out of memory. Restarting the service fixes the issue temporarily, but some time later the same problem returns.


Always, ALWAYS set the max memory options for your SQL Server-related services! Setting a hard maximum keeps your services from “running away” with memory and causing unexpected performance issues in your environment. This becomes especially important when you’re running multiple services on the same server. In a nutshell, here is a breakdown of the different SQL Server services and how they utilize memory by default (read: running with default settings in relation to memory). The two biggest problem children, with regard to memory configuration, are the database engine service and Analysis Services. Although those two are the most commonly misconfigured, I’ve outlined all four services below.

Database Engine (aka SQL Server service)

By default, the Max Server Memory (MB) setting is set to 2147483647. This is one of the first things you want to change on a new install of SQL Server! In layman’s terms, this default tells SQL Server it can take essentially all of the physical memory on the server for the SQL Server buffer pool. Notice I said buffer pool and not SQL Server in total? Prior to SQL Server 2012, this setting really did cap only the buffer pool, so folks came across instances where they set the max memory setting and yet SQL Server showed it was actually using more memory than that. Starting with SQL Server 2012, this setting dictates how much memory SQL Server as a whole (buffer pool plus everything else) can use, so it’s less confusing. See this post by Jonathan Kehayias (Blog | Twitter) for more details on what the max memory setting truly means.

Best practice suggests setting this value to 80% of physical memory on a server that runs only the database engine. You will need to use a smaller percentage if the box is sharing resources with other services. Please note this 80% rule is flexible; on systems with larger amounts of memory you can increase that percentage. In the figure below you can see an example where I’ve set the max memory for a system with 8 GB of RAM running only the SQL Server database engine. For a great guide on setting max server memory for the engine service, see Glenn Berry’s (Blog | Twitter) post on the matter.
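As a sketch, the same change shown in the figure can be scripted with sp_configure; the 6553 MB value below is simply 80% of the 8 GB in the example, an illustrative assumption rather than a universal recommendation:

```sql
-- Sketch: cap max server memory at roughly 80% of an 8 GB box (6553 MB).
-- Adjust the value for your own server's physical memory and workload.
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;

EXEC sp_configure 'max server memory (MB)', 6553;
RECONFIGURE;
```

The same setting is also available in SSMS under Server Properties > Memory.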


Analysis Services (SSAS)

This one is really interesting, as many folks install SQL Server Analysis Services (SSAS) without realizing what the configurations involved do. In a default installation, the service’s LowMemoryLimit is set to 65% of physical memory. Granted, the service does not grab this much at startup (startup allocation is controlled by the PreAllocate property), but if you use Analysis Services while running the engine on the same box, Analysis Services will not start freeing memory until this limit is reached. Up until that point, any memory used by Analysis Services is exclusive to it. If you’re installing multiple services on the same server, you’ll want to set not only this lower limit but also the TotalMemoryLimit and HardMemoryLimit. The HardMemoryLimit is really the important one to configure, as that is the percentage at which SSAS will start denying user and system requests due to memory pressure (essentially an out-of-memory error). For administrators, a must-read guide for Analysis Services is the SQL Server 2008 R2 Analysis Services Operations Guide. It’s lengthy at 108 pages, but you can jump to section 2.3 (Memory Configuration) for the full details on these settings and how they function.
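For illustration, these three limits live in the Memory section of msmdsrv.ini (they can also be changed through server properties in SSMS). The percentages below are assumed values for a box sharing memory with the database engine, not recommendations:

```xml
<!-- Illustrative excerpt from msmdsrv.ini. Values between 1 and 100
     are interpreted as percentages of total physical memory. -->
<Memory>
  <LowMemoryLimit>20</LowMemoryLimit>
  <TotalMemoryLimit>40</TotalMemoryLimit>
  <HardMemoryLimit>50</HardMemoryLimit>
</Memory>
```

The SSAS service needs a restart for msmdsrv.ini changes to take effect.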

Reporting Services (SSRS)

SQL Server Reporting Services (SSRS) memory settings can be configured, but it’s not as straightforward as with the other services. To configure SSRS you need to modify an XML configuration file (rsreportserver.config). In a default installation, this file is located at C:\Program Files\Microsoft SQL Server\MSRS10_50.MSSQLSERVER\Reporting Services\ReportServer. Please note that path can change depending on which version of SQL Server you’re running, whether it’s a 64-bit or 32-bit installation, and what drive/folder path you’ve installed your services to. If you have trouble locating it, simply search your file system for the rsreportserver.config file.


SSRS, like Integration Services, is typically benign with regard to memory usage. However, if you have an environment where it gets utilized heavily, especially on a system sharing resources with multiple services, then you may want to tweak these settings. For best practices regarding Reporting Services configurations, I suggest the SQLCAT team’s whitepaper on Scale-Out Deployments for Reporting Services Best Practices.
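As a rough sketch, a memory cap goes in the Service section of rsreportserver.config. The WorkingSet elements are not present in the file by default and the kilobyte values below (about a 2 GB cap) are illustrative assumptions for a shared box:

```xml
<!-- Illustrative excerpt from rsreportserver.config.
     WorkingSet values are specified in kilobytes. -->
<Service>
  <MemorySafetyMargin>80</MemorySafetyMargin>
  <MemoryThreshold>90</MemoryThreshold>
  <WorkingSetMaximum>2097152</WorkingSetMaximum>
  <WorkingSetMinimum>1048576</WorkingSetMinimum>
</Service>
```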

Integration Services (SSIS)

Unfortunately, this service’s memory usage can’t be configured like the other services’. Instead, optimization needs to occur at the package level (for example, by tuning data flow properties such as DefaultBufferSize and DefaultBufferMaxRows). Having the SSIS service installed alongside the database engine is quite common and usually doesn’t cause too many issues so long as all the other services are configured optimally. You can read more about SSIS Design and Performance Tuning or watch a free webinar from Pragmatic Works on SSIS Performance Tuning.


Remember, this post is meant to help those who have multiple services (or even all of them) running on the same server. Best practice dictates that for best performance you’d want to segregate each service onto its own server, but make sure you do what makes sense for your environment. Best practices aren’t necessarily one-size-fits-all*, so make sure you do your homework!

*Unless that best practice dictates not to auto-shrink your databases. Seriously, don’t shrink your databases, please think of the kittens.

12 Replies to “Monday Morning Mistakes: Not Setting Memory Limits”

  1. Any recommendations if you have the database engine, SSAS, and SSRS all on the same VM (for TFS) with 4 GB configured?

    1. Sandra, that’s a bit of an “it depends” answer (I know, nobody likes to hear it). Is that an isolated VM/server, or is it all running on a laptop? Also, 4 GB seems quite small; are you running a 32-bit OS? Since 32-bit is quite limited, I’d be pretty conservative with allocations. Maybe 1-1.5 GB for the engine, SSAS low limit at 0 and upper limit at 20-30% (again, it depends: how often are you processing/reporting from cubes?). Also, a fun fact from the SSAS Operations Guide:

      Analysis Services does not use the AWE API to allocate memory, which means
      that on 32-bit systems you are restricted to either 2 GB or 3 GB of memory
      for Analysis Services.


      Basically, allocate as your needs dictate. If you’re doing more engine work, allocate a bit more there. But also remember to keep enough memory available so that you don’t starve the OS. And finally, if you’re on 32-bit, please PLEASE start moving your systems to 64-bit 😉

      1. The current VM is a 64-bit server on Win2008 SP2.  It was originally a “dev” server where the developer installed TFS.  So, SQL, SSIS, SSRS, SSAS and some SharePoint were all installed on the 40 GB C: drive.  Zoom forward a few months and it finally comes to my attention since they want to call it production now.  This TFS install supports about 4 full-time programmers and 2 occasional programmers.  The DBs are fairly small.  I’m guessing the engine has the most work.

        I’m hoping to get them to migrate the necessary components to a real production server and utilize our existing SSRS server.  It’s sort of a crazy mess.  Thanks for info.

        1. Ouch! That sounds like a slightly different problem. The question is, why did they try to cram all that crap (SharePoint, SSIS, SSRS, SSAS) onto a single VM with only 4 GB of memory? And why cap at 4 GB when you can go higher on 64-bit? The other fun tidbit is that SharePoint has very specific settings/configurations you need to set up in order to be supported by Microsoft. Check out Microsoft’s capacity planning and configuration guide.

          1. Yup.  It’s a lot of different problems. 🙂  But, at minimum, I wanted an idea of how to configure memory when a number of components were on the same machine.  If we move to a new production server, it’ll probably be the engine and SSAS at least on one server.  So, thanks for the thoughts on this.

  2. How about a Saturday night mistake: one of our processes is set to run at 02:10. It was skipped because that time didn’t exist. Daylight saving time.

  3. Nice article. It’s so true about DBAs not setting the memory settings and just using the defaults. I’ve written an article/script on SQL Server Central that can be used to determine the max memory setting. Just execute the script and it will display your current memory and recommended max memory settings. Just another way of trying to make things easier for DBAs everywhere.


