September 30, 2002 at 8:15 am
Some brain box in the IT support dept here installed SQL Server on the main Application/File Server before I started this job.
"Well, it's the database that's serving the database applications" was their excuse.
Because of this, the Application Server used to slow down a lot whenever someone using one of the databases (MAILsweeper/Royal Blue) ran a big and probably badly written query.
I advised them that the databases should be moved to a dedicated database server. That wasn't an option, so instead I capped the memory available to SQL Server at 300 MB.
Is this a good idea? I'm thinking that the only overhead would be slowing down any queries or processing that required more memory?
Andy.
September 30, 2002 at 8:39 am
First the general stuff:
Where I work we isolate SQL Server from other applications wherever possible, but sometimes we can't get away with it either.
Capping the memory will ensure SQL Server doesn't contend with the applications beyond the cap limit. Without a cap, the way SQL Server allocates and deallocates memory is a heavy hit on performance: as other applications suddenly demand more memory, SQL Server will clear data out of its memory cache and, if necessary, write it to disk first. So if you know SQL Server and the other applications are going to conflict, capping the memory is probably a good idea.
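For reference, a cap like the 300 MB one described above can be set with `sp_configure`. This is only a sketch, assuming SQL Server 2000 and sysadmin rights; the 300 figure is the original poster's, not a recommendation, and the 128 MB floor is an illustrative assumption:

```sql
-- Sketch: cap SQL Server's memory (SQL Server 2000, run as sysadmin)
EXEC sp_configure 'show advanced options', 1
RECONFIGURE
GO
-- 300 MB is the figure from the post above, not a recommendation
EXEC sp_configure 'max server memory (MB)', 300
RECONFIGURE
GO
-- Optionally set a floor too, so SQL Server keeps at least this much
-- once it has grown to it (128 is an illustrative value)
EXEC sp_configure 'min server memory (MB)', 128
RECONFIGURE
GO
```

Both settings take effect immediately; no service restart is needed for them.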
As to where to cap the memory, that's a totally different question. One thing you'll probably want to do is monitor memory usage and how well the buffers are able to respond to queries. Compare total server memory against the memory SQL Server is actually using, and watch the available bytes as well. Perhaps you can give SQL Server more memory if necessary.
Watching the buffer counters such as the buffer cache hit ratio will give you a good idea if your SQL Server needs more memory. You would want a buffer cache hit ratio in the upper 90% range.
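Both checks (server memory usage and the buffer cache hit ratio) can also be read from T-SQL rather than PerfMon. A sketch against `master.dbo.sysperfinfo`, which exposes the same counters on SQL Server 2000; the counter-name patterns are assumptions and should be checked against what your instance reports:

```sql
-- How much memory SQL Server is using vs. how much it wants (values in KB)
SELECT counter_name, cntr_value
FROM master.dbo.sysperfinfo
WHERE counter_name LIKE 'Total Server Memory%'
   OR counter_name LIKE 'Target Server Memory%'

-- Buffer cache hit ratio: the raw counter is only meaningful
-- when divided by its matching base counter
SELECT 100.0 * r.cntr_value / b.cntr_value AS [buffer cache hit ratio %]
FROM master.dbo.sysperfinfo r
JOIN master.dbo.sysperfinfo b ON b.object_name = r.object_name
WHERE r.counter_name = 'Buffer cache hit ratio'
  AND b.counter_name = 'Buffer cache hit ratio base'
```

If the ratio sits below the upper 90% range under normal load, that's a hint the cap may be starving the buffer cache.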
Now to a specific point you mention: if the application server hit the wall while someone was using one of the databases, you may still have performance issues. Capping the memory will ensure SQL Server doesn't go into that memory-swapping routine as SQL Server and the applications contend for memory, but the problem is you may not have enough memory to begin with. The only way to know is to watch what's going on using PerfMon and Profiler. See if you can catch one of those big bad queries you suspect are going through using Profiler, and try to correlate it with the counters in PerfMon. Perhaps you can optimize the query or, barring that, the back end (indexes and the like).
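If running the Profiler GUI against a busy box is awkward, the same capture can be scripted as a server-side trace. A rough sketch for SQL Server 2000; the file path, file size, and 5000 ms threshold are illustrative assumptions, and the event/column/operator IDs should be verified against Books Online:

```sql
-- Server-side trace to catch long-running batches (SQL:BatchCompleted),
-- roughly what you'd watch for in Profiler. SQL Server 2000 sketch.
DECLARE @traceid int, @maxsize bigint
SET @maxsize = 50                        -- max trace file size in MB

-- Assumed path; the directory must exist, and .trc is appended automatically
EXEC sp_trace_create @traceid OUTPUT, 0, N'C:\traces\slow_queries', @maxsize

-- Event 12 = SQL:BatchCompleted; column 1 = TextData, column 13 = Duration
EXEC sp_trace_setevent @traceid, 12, 1, 1
EXEC sp_trace_setevent @traceid, 12, 13, 1

-- Only keep batches with Duration greater than (operator 2) 5000 ms
EXEC sp_trace_setfilter @traceid, 13, 0, 2, 5000

-- Start the trace; stop it later with sp_trace_setstatus @traceid, 0
EXEC sp_trace_setstatus @traceid, 1
SELECT @traceid AS traceid
```

The resulting .trc file can be opened in Profiler afterwards to correlate the captured batches with the PerfMon counters.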
K. Brian Kelley
http://www.truthsolutions.com/
Author: Start to Finish Guide to SQL Server Performance Monitoring
http://www.netimpress.com/shop/product.asp?ProductID=NI-SQL1
September 30, 2002 at 9:56 am
Dedicated server - no capping (unless you have more than 2 GB and AWE is not enabled; MS says don't try to let it run above 2 GB in that case).
Shared server - you probably need to cap it, otherwise the memory allocation/deallocation wastes cycles. Where to cap it? Follow Brian's advice above.
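To expand on the AWE point: on SQL Server 2000 Enterprise Edition, going past the usual 2 GB means enabling AWE and then setting an explicit cap, since AWE memory is not released dynamically. A hedged sketch; the 3072 MB figure is purely illustrative, and AWE also requires the Windows "Lock Pages in Memory" right plus an instance restart:

```sql
EXEC sp_configure 'show advanced options', 1
RECONFIGURE
GO
-- Requires SQL Server 2000 Enterprise Edition and a service restart
EXEC sp_configure 'awe enabled', 1
RECONFIGURE
GO
-- With AWE on, always set an explicit cap, because AWE-mapped memory
-- isn't given back dynamically; 3072 MB is an illustrative figure only
EXEC sp_configure 'max server memory (MB)', 3072
RECONFIGURE
GO
```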
Steve Jones
October 2, 2002 at 3:13 am
The server's got 2 GB of memory; that's why I thought it would be OK to do it.
I'm going to put a list of recommendations together, including some dedicated database servers...