October 10, 2011 at 10:54 am
I am running SQL Server 2008 Web Edition x64 on Windows Server 2008 R2 with 8 GB of RAM. The hardware supports up to 16 GB. I have already read many posts and I completely understand that there are many factors to take into account, and that performance monitoring needs to be done to determine memory utilization and needs. What I am looking for is validation of the following general idea: "You can't go wrong with more memory. Get as much as you can afford. If SQL Server is hitting the max memory limit, add more memory." Again, I know you need to analyze, optimize, etc. All other things being equal, is the generalization accurate?
October 10, 2011 at 10:56 am
The generalization is fine - more memory is generally good. That said, before throwing more money and hardware at a problem, optimizing first is the better rule of thumb.
Jason...AKA CirqueDeSQLeil
_______________________________________________
I have given a name to my pain...MCM SQL Server, MVP
SQL RNNR
Posting Performance Based Questions - Gail Shaw
Learn Extended Events
October 10, 2011 at 11:48 am
The generalization is accurate, but it all comes down to price. Is a week or two of your time worth more or less than the upgrade?
Memory is usually the default go-to for a few reasons. It lowers disk I/O (and SAN storage is expensive) because Page Life Expectancy stays higher, and it gives badly performing queries that build heavy mid-plan worktables more room to work in. It's also usually cheaper to add memory to a box than to buy another license for a second box and split the instance up.
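If you want to see where Page Life Expectancy currently sits, a quick check against the buffer manager counters (a sketch - the object name carries the instance name on named instances, hence the LIKE):

```sql
-- Current Page Life Expectancy in seconds (per NUMA node on newer builds)
SELECT [object_name],
       counter_name,
       cntr_value AS ple_seconds
FROM sys.dm_os_performance_counters
WHERE counter_name = 'Page life expectancy'
  AND [object_name] LIKE '%Buffer Manager%';
```

Watch the trend over time rather than a single reading; a PLE that keeps sagging under load is the symptom worth acting on.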
That said, a quick review of your 20 worst performing queries will probably gain you as much as, or more than, doubling the memory. Doing *both* is usually the best choice.
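One rough way to pull that "worst 20" list straight from the plan cache, ordered by total logical reads (a sketch - it only sees queries whose plans are still cached, so run it after the server has been up under normal load):

```sql
-- Top 20 cached statements by total logical reads
SELECT TOP (20)
    qs.total_logical_reads,
    qs.execution_count,
    qs.total_logical_reads / qs.execution_count AS avg_logical_reads,
    SUBSTRING(st.text,
              (qs.statement_start_offset / 2) + 1,
              ((CASE qs.statement_end_offset
                    WHEN -1 THEN DATALENGTH(st.text)
                    ELSE qs.statement_end_offset
                END - qs.statement_start_offset) / 2) + 1) AS statement_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY qs.total_logical_reads DESC;
```

Sorting by reads (rather than duration) tends to surface exactly the memory-hungry queries that extra RAM would otherwise paper over.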
Never stop learning, even if it hurts. Ego bruises are practically mandatory as you learn unless you've never risked enough to make a mistake.
For better assistance in answering your questions | Forum Netiquette
For index/tuning help, follow these directions. | Tally Tables
Twitter: @AnyWayDBA
October 10, 2011 at 12:00 pm
As others have mentioned, adding more memory can mask inefficient queries - you won't notice them until the server comes under memory pressure again.
When you look at an inefficient query in isolation, it may take 10 milliseconds to run, so it seems like a waste of time to optimize it - but multiply that by thousands of executions a day and it adds up.
-------------------------------------------------
Will C,
MCITP 2008 Database Admin, Developer
October 11, 2011 at 6:16 am
If the database load is mainly OLTP, you may see gains from adding memory until you have more RAM than the total size of the databases - beyond that you have probably reached the maximum benefit from memory.
OLAP-style loads may behave differently, though (memory grants for 'heavy' queries, etc.).
As with all generalisations, treat it as a rule of thumb, not gospel. ;-)
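A rough way to compare total data-file size against what SQL Server is allowed to use (a sketch - assumes 8 KB pages, and ignores compression and tempdb churn):

```sql
-- Total size of all data files across databases, in MB
SELECT SUM(size) * 8 / 1024 AS total_data_mb
FROM sys.master_files
WHERE type_desc = 'ROWS';

-- Configured memory cap for the instance
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'max server memory (MB)';
```

If the first number comfortably fits inside the second, most of the working set can already live in the buffer pool, and extra RAM buys progressively less.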
Cheers,
JohnA
MCM: SQL2008