July 15, 2008 at 8:48 am
I have performed an analysis of my current SQL Server 2005 instance. I see that the buffer cache hit ratio drops below 90% at times, and the process working set is larger than the memory taken up by SQL Server, so there is certainly memory pressure on the system.
What's the best way to estimate the memory requirement for the foreseeable future? And how do you incorporate the potential migration of other databases onto this instance, and their impact on memory usage?
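For reference, the buffer cache hit ratio mentioned above can be read straight from the DMVs rather than PerfMon. A minimal sketch, assuming SQL Server 2005 or later; the raw counter is meaningless on its own and has to be divided by its matching "base" counter to get a percentage:

-- Buffer cache hit ratio from sys.dm_os_performance_counters.
-- The LIKE handles named instances (MSSQL$InstanceName:Buffer Manager).
SELECT 100.0 * a.cntr_value / NULLIF(b.cntr_value, 0) AS buffer_cache_hit_ratio_pct
FROM sys.dm_os_performance_counters AS a
JOIN sys.dm_os_performance_counters AS b
  ON b.[object_name] = a.[object_name]
WHERE a.counter_name = 'Buffer cache hit ratio'
  AND b.counter_name = 'Buffer cache hit ratio base'
  AND a.[object_name] LIKE '%Buffer Manager%';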
July 15, 2008 at 9:11 am
It is really a dynamic process. You need to look at a number of metrics to understand what is happening:
[SQLServer:Buffer Manager][Page life expectancy]
[SQLServer:Buffer Manager][Page lookups/sec]
[SQLServer:Buffer Manager][Page reads/sec]
[SQLServer:Buffer Manager][Page writes/sec]
If Page life expectancy stays high and Page reads/sec stays low, you may not see much benefit from adding memory.
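As a hedged sketch, all four counters listed above can be snapshotted in one pass from sys.dm_os_performance_counters on 2005 and later. Note that the /sec counters are cumulative totals, so getting an actual rate means sampling twice and dividing the difference by the elapsed seconds:

-- One-pass snapshot of the Buffer Manager counters listed above.
-- On a named instance the object_name is MSSQL$InstanceName:Buffer Manager,
-- hence the LIKE instead of an exact match.
SELECT counter_name, cntr_value
FROM sys.dm_os_performance_counters
WHERE [object_name] LIKE '%Buffer Manager%'
  AND counter_name IN ('Page life expectancy',
                       'Page lookups/sec',
                       'Page reads/sec',
                       'Page writes/sec');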
Adding memory is not the only possible solution. For example, you might be able to reduce the memory requirement by adding appropriate indexes that prevent table scans. I would look into this before adding memory, because better indexing will still benefit the application even after you add memory. The Database Engine Tuning Advisor can help you with this.
July 15, 2008 at 9:32 am
Well, average page life expectancy has been going down steadily; at this point it's near 800. My databases are fully indexed, so again, back to the same question: how do I estimate future requirements?
July 15, 2008 at 10:07 am
shahab (7/15/2008)
Well, average page life expectancy has been going down steadily; at this point it's near 800. My databases are fully indexed, so again, back to the same question: how do I estimate future requirements?
There must be large table scans in your system eating up the buffer. Check for missing indexes, especially on large tables.
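One way to check is the missing-index DMVs that shipped with SQL Server 2005. A sketch, not a prescription: the rough_benefit expression below is just an illustrative weighting, and the suggestions should be reviewed rather than created blindly:

-- Missing-index suggestions recorded by the optimizer since the last
-- service restart (SQL Server 2005 and later).
SELECT TOP (20)
       mid.[statement] AS table_name,
       mid.equality_columns,
       mid.inequality_columns,
       mid.included_columns,
       migs.user_seeks,
       migs.avg_total_user_cost * migs.avg_user_impact
           * (migs.user_seeks + migs.user_scans) AS rough_benefit
FROM sys.dm_db_missing_index_group_stats AS migs
JOIN sys.dm_db_missing_index_groups AS mig
  ON mig.index_group_handle = migs.group_handle
JOIN sys.dm_db_missing_index_details AS mid
  ON mid.index_handle = mig.index_handle
ORDER BY rough_benefit DESC;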
July 15, 2008 at 10:57 am
Not just missing indexes. It can also be indexes that are so fragmented, or whose statistics are so badly out of date, that the engine doesn't use them. It can also be over-indexing: if a table has a couple of dozen indexes on it, odds are SQL Server will only use a few of them and ignore the rest.
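A hedged sketch covering all three checks, run in the context of the database under investigation on 2005 or later; the 30% fragmentation cutoff is an arbitrary illustrative threshold:

-- 1. Fragmentation per index (LIMITED mode keeps the scan cheap).
SELECT OBJECT_NAME(ips.[object_id]) AS table_name,
       i.name AS index_name,
       ips.avg_fragmentation_in_percent
FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') AS ips
JOIN sys.indexes AS i
  ON i.[object_id] = ips.[object_id] AND i.index_id = ips.index_id
WHERE ips.avg_fragmentation_in_percent > 30;

-- 2. Age of each index's statistics.
SELECT OBJECT_NAME(i.[object_id]) AS table_name,
       i.name AS index_name,
       STATS_DATE(i.[object_id], i.index_id) AS stats_last_updated
FROM sys.indexes AS i
WHERE OBJECTPROPERTY(i.[object_id], 'IsUserTable') = 1
  AND i.index_id > 0;

-- 3. Possible over-indexing: indexes written to far more than read.
SELECT OBJECT_NAME(ius.[object_id]) AS table_name,
       i.name AS index_name,
       ius.user_seeks + ius.user_scans + ius.user_lookups AS reads,
       ius.user_updates AS writes
FROM sys.dm_db_index_usage_stats AS ius
JOIN sys.indexes AS i
  ON i.[object_id] = ius.[object_id] AND i.index_id = ius.index_id
WHERE ius.database_id = DB_ID()
ORDER BY writes DESC;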
- Gus "GSquared", RSVP, OODA, MAP, NMVP, FAQ, SAT, SQL, DNA, RNA, UOI, IOU, AM, PM, AD, BC, BCE, USA, UN, CF, ROFL, LOL, ETC
Property of The Thread
"Nobody knows the age of the human race, but everyone agrees it's old enough to know better." - Anon
July 15, 2008 at 11:25 am
Well, at some point I will have to add more memory, as I am bringing more and more databases online, some with big user populations and large amounts of data. So what criteria do you use to estimate future use?
July 15, 2008 at 11:33 am
How much data do you expect to need to cache, and how many execution plans?
It's really not a "rule of thumb" kind of thing. It depends too much on how big your databases are, how much caching they will benefit from, etc.
Generally, there's no such thing as too much. But even that isn't totally true: a small business with a 2 GB database and 16 GB of RAM on a dedicated database server is pretty thorough overkill. On the other hand, RAM is cheap enough that overkill may not be an issue.
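One concrete input for that estimate is how much of each database is actually being cached today. A sketch against sys.dm_os_buffer_descriptors (2005 and later); tracking this as each new database comes online gives a per-database baseline to project from:

-- Current buffer pool usage per database (each buffered page is 8 KB).
-- database_id 32767 is the internal resource database.
SELECT CASE WHEN database_id = 32767 THEN 'ResourceDb'
            ELSE DB_NAME(database_id) END AS database_name,
       COUNT(*) * 8 / 1024 AS buffered_mb
FROM sys.dm_os_buffer_descriptors
GROUP BY database_id
ORDER BY buffered_mb DESC;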