June 2, 2004 at 10:25 am
Our system administrator says he was told in a class (I think it was a SQL Server class he took) that we should fix SQL Server's memory usage so it can't use more than a specified amount. His recommendation is that we set SQL Server's memory to 512 MB less than the actual memory on the machine. I'm trying to verify whether this recommendation is correct, but I can find no facts to back up his suggestion. I always thought it was best to let SQL Server take as much memory as it wants, and not constrain SQL Server's memory needs by fixing the memory usage. I'd like to make sure the right decision is made here, so I'm looking for opinions. Our SQL Server machine has 2 gigabytes of memory, has two processors, and is used only for SQL Server. Does it make sense to fix SQL Server's memory utilization? If you could provide URLs to documentation that backs up your suggestion, that would be great, since I could then point our sysadmin to those sites for information.
Gregory A. Larsen, MVP
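For context, the cap the sysadmin is describing would be set through sp_configure; a minimal sketch, assuming SQL Server 2000 on the 2 GB machine above (2048 MB minus the 512 MB reserve = 1536 MB):

```sql
-- Sketch: cap SQL Server at 1536 MB (2 GB physical minus a 512 MB reserve).
-- 'max server memory (MB)' is an advanced option, so expose those first.
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'max server memory (MB)', 1536;
RECONFIGURE;
```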
June 2, 2004 at 11:03 am
Hey Greg -
I found this at
http://www.sql-server-performance.com/sql_server_configuration_settings.asp, about halfway down the page. I never configure specific server memory settings unless I am on a system with 8 GB or more and can use SQL Server Enterprise Edition with AWE enabled.
"For best SQL Server performance, you want to dedicate your SQL Servers to only running SQL Server, not other applications. And in most cases, the settings for the "maximum server memory" and the "minimum server memory" should be left to their default values. This is because the default values allow SQL Server to dynamically allocate memory in the server for the best overall optimum performance. If you "hard code" a minimum or maximum memory setting, you risk hurting SQL Server's performance.
On the other hand, if SQL Server cannot be dedicated to its own physical server (other applications run on the same physical server along with SQL Server) you might want to consider changing either the minimum or maximum memory values, although this is generally not required, and I don't recommend it."
Michelle
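A quick way to check whether a server is still on the defaults the quoted article describes (a sketch; option names as they appear in SQL Server 2000):

```sql
-- Sketch: inspect current memory settings before changing anything.
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'min server memory (MB)';  -- default is 0
EXEC sp_configure 'max server memory (MB)';  -- default is 2147483647, i.e. dynamic
```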
June 2, 2004 at 11:14 am
Thank you for the link. It suggests not fixing SQL Server's memory, and it looks like you're recommending not fixing it either. Does anyone know of other links that argue either for or against fixing memory? I'm looking for a well-rounded point of view, not just a single source. Then again, I now have three people's opinions, and they all say we should not fix the memory in our situation.
Gregory A. Larsen, MVP
June 3, 2004 at 5:44 am
We have about 20 servers here running SQL Server.... On some I have found that it's better to let it manage memory dynamically, and on others I have capped how much memory SQL Server can take. For instance, a server with 1 GB of memory I cap at around 700 MB, and that is a dedicated SQL Server box.... Some SQL Server applications seem to run better with the memory capped somewhat below what is physically in the server. But for the most part I start with, and like to leave it to, dynamic memory management.
June 3, 2004 at 8:46 am
Markus, I was wondering if you might elaborate on what kinds of applications seem to run better with the memory capped, and what types of applications are better left to dynamic memory allocation. This might help me decide what is best for our situation. Also, what kind of performance benchmark method did you use to determine that performance improved when you capped the memory usage?
Just to let you know, we have a very mixed suite of applications, from online transactional applications with inserts and updates to purely read-only decision-support applications, on the same server.
Your thoughts would be very informative.
Gregory A. Larsen, MVP
June 3, 2004 at 9:25 am
Our production server runs SQL 7 only, on Win2K with 2 GB of memory, dynamically allocated. However, I noticed that the server was paging at a high rate much of the time. Some of the smaller non-SQL processes were not getting the memory they needed, and SQL Server was not giving any up.
I fixed the max server memory at 1.5 GB and things have been running fine since: no more paging alerts and no noticeable decline in performance.
June 3, 2004 at 9:43 am
RandyB - Which specific performance objects and counters were you monitoring?
Gregory A. Larsen, MVP
June 3, 2004 at 10:22 am
I was not real scientific.
One purchased application was on a dual-CPU server with 512 MB of memory. They were complaining that running Crystal Reports was extremely slow. After a server reboot I noticed SQL Server taking all but 50 MB of memory and not releasing it while reports were running. I capped it at 400 MB, and after that, no complaints.
Second, I don't really remember the details of the situation, but I noticed that SQL Server was not releasing memory for something else.... it consumed all but about 30 MB... after a cap of 700 MB of the 1 GB, everything appeared to run better.
I had heard that SQL 7 was never good about releasing memory when other things called for it, but I thought SQL 2000 was supposed to manage this better.
June 3, 2004 at 10:34 am
Our experience...
Just like some of the others have reported, it did not hurt performance to lower the memory to maybe 5 to 10% less than the maximum. It has added more stability to the server (it fixed some weird problems we were facing while running month-end processing).
June 3, 2004 at 10:41 am
Maybe I should add:
The issues I listed above were a few years ago in SQL2000 SP1.... maybe they don't apply as much anymore.
Now that all of our servers also run NetIQ OnePoint monitoring and McAfee, though, maybe that's no longer the case.
I have seen less of an issue if the server is a dedicated SQL Server only... no software or anything else installed on the server... just a database server.
June 3, 2004 at 10:42 am
<< RandyB - Which specific performance objects and counters were you monitoring?
We use Spotlight on SQL Server by Quest Software. I believe it uses Page Faults/sec, which is under the Memory object in Performance Monitor.
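Alongside the OS-level Memory\Page Faults/sec counter, SQL Server's own view of its memory can be dumped from a query window; a sketch (the output format is undocumented and varies by version):

```sql
-- Sketch: dump SQL Server's internal memory breakdown (buffer pool,
-- procedure cache, etc.); useful when deciding whether a cap is warranted.
DBCC MEMORYSTATUS;
```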
June 4, 2004 at 12:07 am
We don't use fixed or max memory settings on most of our SQL 2000 servers. Only when we see SQL Server paging do we put a max memory cap on one.
I believe the recommendation to leave 512 MB free is to keep a bit spare for use by SQL Agent, MSDTC, ...
I would keep it in mind, but only set it once proven guilty.
Johan
June 5, 2004 at 2:21 am
Set max server memory to 75-80% of physical memory.
Cheers
Achot
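The two rules of thumb in this thread, a fixed reserve (the sysadmin's 512 MB) and a fixed percentage (Achot's 75-80%), happen to agree on a 2 GB box; a small sketch of the arithmetic (the function names are my own, purely illustrative):

```python
def cap_by_reserve(total_mb: int, reserve_mb: int = 512) -> int:
    """Sysadmin's rule: leave a fixed reserve for the OS and other processes."""
    return total_mb - reserve_mb

def cap_by_percent(total_mb: int, fraction: float = 0.75) -> int:
    """Achot's rule: give SQL Server a fixed fraction of physical memory."""
    return int(total_mb * fraction)

total = 2048  # the 2 GB server from the original question
print(cap_by_reserve(total))        # 1536
print(cap_by_percent(total))        # 1536
print(cap_by_percent(total, 0.80))  # 1638
```

On this particular machine the two rules land within about 100 MB of each other, which may be why both camps in the thread report acceptable results.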