February 20, 2008 at 2:20 am
Dear all,
Some people argue that we should not use SQL Server 2005's (on Windows 2003) default max server memory setting, because the default allows SQL Server to take all available memory and we should leave some memory for the server's Windows OS.
Is that true?
Microsoft says we should leave some of a SQL Server's physical memory to other applications. Do these "other applications" include the Windows OS itself?
All of our company's SQL Servers have the entire server to themselves, meaning no applications other than SQL Server and the Windows OS run on them.
Or, if I keep the default value (all available memory), will it undermine SQL Server's performance?
February 20, 2008 at 5:21 am
Hi,
Unless you are running other applications or named SQL Server instances on the same server, you should be fine with the default memory configuration, which works dynamically, with SQL Server grabbing and releasing memory as required. Although you are allowing SQL Server to potentially utilize all server memory, it will automatically leave enough memory for the operating system to work. If you do have other applications and/or named instances running on the same server, you will need to configure memory via the min/max settings to ensure that both your other applications and SQL Server have enough memory to operate.
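For example, something along these lines sets an upper limit via sp_configure (the 12 GB figure is purely illustrative; pick whatever leaves enough headroom for the OS and any other applications):

-- Allow changes to advanced options (min/max server memory are advanced options)
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;

-- Cap the memory SQL Server will use; value is in MB (12288 MB = 12 GB, illustrative only)
EXEC sp_configure 'max server memory (MB)', 12288;
RECONFIGURE;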
Thanks,
Phillip Cox
February 20, 2008 at 5:51 am
I'm not the admin expert who configures our servers, but I do know the guy who does sets the memory allocation to a fixed amount. Dynamic memory is primarily there so that SQL Server will be a good citizen on a server that has lots of different apps running. But most people realize pretty quickly that making SQL Server share with other applications ensures that none of them operate very well. Our guy fixes the memory at a preset limit, leaving some space over for the OS. He also sets the minimum allocation equal to the maximum so that no dynamic allocation takes place. I could press him for details if they're required.
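If it helps, here's a rough sketch of what that kind of fixed allocation looks like (the 12288 MB value is just an example, not the figure he actually uses):

-- Allow changes to advanced options
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;

-- Set min equal to max so the allocation is effectively fixed (values illustrative)
EXEC sp_configure 'max server memory (MB)', 12288;
EXEC sp_configure 'min server memory (MB)', 12288;
RECONFIGURE;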
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
- Theodore Roosevelt
Author of:
SQL Server Execution Plans
SQL Server Query Performance Tuning
February 20, 2008 at 6:05 am
I can attest that leaving the default settings in place, without reserving memory for non-SQL functions, causes problems even on a server dedicated solely to SQL. We had issues recently with SQL Server not leaving our tape backup system enough memory to work with, even though it only needed somewhere in the realm of 64 MB. Memory is cheap; buy more than you think you need, because you WILL end up needing it. Dedicate all but a portion to SQL, but definitely don't let SQL have it all. I'll add that my issue with trying to give up 64 MB of memory was on a SQL Server 2005 Enterprise 64-bit clustered node with 16 GB of RAM.
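For anyone who wants to see what a box is currently set to before changing anything, a quick read-only check against the catalog views (SQL Server 2005 and later) looks like this:

-- Compare the configured values with the values actually in use
SELECT name, value, value_in_use
FROM sys.configurations
WHERE name IN ('min server memory (MB)', 'max server memory (MB)');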
- Tim Ford, SQL Server MVP
http://www.sqlcruise.com
http://www.thesqlagentman.com
http://www.linkedin.com/in/timothyford