June 1, 2012 at 12:51 pm
So I have some basic stored procedures I can run that show me seeks vs. scans on the indexes and tables, and I also run index reorganizes MWF throughout the week and an update stats on Sunday morning that runs for approximately 24 hours. Being in the financial services industry, the database is highly transactional and has times of increased throughput, etc. It's approximately 250GB spanned across 3 filegroups (on various disk arrays) and 28 disks between internal disks in the server and the fiber channel SAN.
In an effort to speed things up this year I had budgeted to max out memory in the new SQL Server we were building. That meant 288GB of RAM across 18x 16GB sticks in an HP DL380 G7 running Server 2k8 R2 Enterprise. This would hopefully turn into a good investment, as the memory was cheaper than disks and we've already got a lot of disks on the server. According to Resource Monitor, the server uses all but about 8GB of RAM when it's running, but what I don't know is how to tell whether the most recent and most useful data is being served from RAM rather than from disk. I want to figure out what I should be looking at so I can get the most out of our investment.
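One way to answer the "what's actually in RAM" question is the sys.dm_os_buffer_descriptors DMV. A sketch (note: scanning this DMV on a large buffer pool can take a while, so run it off-peak first):

```sql
-- Approximate buffer pool contents per database.
-- Each row in sys.dm_os_buffer_descriptors is one cached 8 KB page,
-- so COUNT(*) * 8 / 1024 gives cached megabytes.
SELECT
    CASE database_id
        WHEN 32767 THEN 'ResourceDb'
        ELSE DB_NAME(database_id)
    END AS database_name,
    COUNT(*) * 8 / 1024 AS cached_mb
FROM sys.dm_os_buffer_descriptors
GROUP BY database_id
ORDER BY cached_mb DESC;
```

If your 250GB database shows most of its hot objects cached here, reads are being served from memory rather than disk.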
I am an accidental "DBA" and just need a little better understanding of how to do in-depth things regarding memory management and query execution time. I've run SQL Profiler for tuning on the database before and applied some recommendations in the past, but beyond that, what I've read the past few years is "load the server down with as much RAM as possible." Now I need to know how to use it.
Thanks in advance for any guidance 😀
June 1, 2012 at 1:18 pm
June 1, 2012 at 1:31 pm
Homework FTW (I think it's time I invested in a tablet)
June 1, 2012 at 1:36 pm
No way around it. Nice post but you asked for an ocean of information. Buying a tablet last year increased my casual and technical reading bandwidth tenfold.
There are no special teachers of virtue, because virtue is taught by the whole community.
--Plato
June 1, 2012 at 1:42 pm
June 1, 2012 at 1:51 pm
Jayanth_Kurup (6/1/2012)
You could monitor the counters for buffer cache hit ratio and page life expectancy. Also keep an eye on your compiles and recompiles per sec, and on sys.dm_os_memory_brokers.
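A quick sketch of how you might snapshot the memory brokers and those counters from inside the engine (the */sec counters are cumulative since startup, so sample twice and diff to get a real rate):

```sql
-- How the engine is dividing memory among its internal consumers.
SELECT memory_broker_type, allocations_kb, predicted_allocations_kb
FROM sys.dm_os_memory_brokers;

-- The cache and compilation counters mentioned above.
SELECT counter_name, cntr_value
FROM sys.dm_os_performance_counters
WHERE counter_name IN ('Page life expectancy',
                       'SQL Compilations/sec',
                       'SQL Re-Compilations/sec');
```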
250GB Disk and 288GB RAM hmmmmm
Buffer cache hit ratio is all but useless. Page 150 of the book I linked to explains why.
There are no special teachers of virtue, because virtue is taught by the whole community.
--Plato
June 1, 2012 at 2:09 pm
Great article on the real meaning of BCHR... I totally underestimated the role of read-ahead optimization.
Well, I guess in this case it won't matter because the entire DB can be hosted in memory. I wonder if it would be easier to simply use an in-memory DB and reduce the RAM size instead. Also looks like StreamInsight could help here.
June 1, 2012 at 2:26 pm
Well, I guess in this case it won't matter because the entire DB can be hosted in memory. I wonder if it would be easier to simply use an in-memory DB and reduce the RAM size instead.
Maybe, as long as no one ever rebuilds a large index or runs checkdb.
Also looks like StreamInsight could help here.
My understanding of StreamInsight is that it is meant to help us react to a high volume of events as close to real-time as possible, all in memory. How would you leverage it here?
There are no special teachers of virtue, because virtue is taught by the whole community.
--Plato
June 1, 2012 at 2:29 pm
CPSPR (6/1/2012)
Homework FTW (I think it's time I invested in a tablet)
I suggest reading the entire book (not because I worked on it, it's just a good book for someone like you)
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
June 1, 2012 at 2:31 pm
Jayanth_Kurup (6/1/2012)
Great article on the real meaning of BCHR... I totally underestimated the role of read-ahead optimization. Well, I guess in this case it won't matter because the entire DB can be hosted in memory. I wonder if it would be easier to simply use an in-memory DB and reduce the RAM size instead. Also looks like StreamInsight could help here.
My rather meager understanding of StreamInsight is that it is meant to accomplish CEP (Complex Event Processing). That is the extent of what I know about it at this time.
June 1, 2012 at 2:44 pm
GilaMonster (6/1/2012)
CPSPR (6/1/2012)
Homework FTW (I think it's time I invested in a tablet)
I suggest reading the entire book (not because I worked on it, it's just a good book for someone like you)
It's a good book, period, no pre-qualifications. I think even seasoned DB professionals have a high chance of gaining something from reading it. I refer to it often. Thanks.
There are no special teachers of virtue, because virtue is taught by the whole community.
--Plato
June 6, 2012 at 9:06 am
Thanks for the read, it looks helpful and I now know what counters I should be looking at. Since today is a rather heavy day, I set up a perfmon session looking at some of the memory-related counters covered starting on page 150. I also saw that Lock Pages in Memory is now available in editions other than Enterprise (which is out of the question due to cost). I'm on 10.0.5500, which I believe is SQL Server 2k8 SP3, so I should just need to use the trace flag to enable it if it will be beneficial. First I'm going to see if I can detect anything in the memory counters for this server.
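For checking whether locked pages are actually in use once the trace flag is on, a sketch (assumes SQL Server 2008 or later, where this DMV exists):

```sql
-- Non-zero here means the buffer pool is using locked (AWE-style)
-- page allocations, i.e. Lock Pages in Memory is in effect.
SELECT locked_page_allocations_kb
FROM sys.dm_os_process_memory;
```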
Currently I'm pulling these in, what others will give me more variance and insight?
Processor % User Time
Buffer cache hit ratio
Page Life Expectancy
Free Pages
Free list stalls/sec
Memory Grants Outstanding
Memory Grants Pending
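The SQL Server counters in that list can also be sampled from inside the engine, which is handy for logging alongside a workload. A sketch:

```sql
-- Point-in-time values of the memory counters listed above.
-- Note: 'Buffer cache hit ratio' is a ratio counter; divide it by its
-- 'Buffer cache hit ratio base' companion row to get a percentage.
SELECT [object_name], counter_name, cntr_value
FROM sys.dm_os_performance_counters
WHERE counter_name IN ('Buffer cache hit ratio',
                       'Page life expectancy',
                       'Free pages',
                       'Free list stalls/sec',
                       'Memory Grants Outstanding',
                       'Memory Grants Pending');
```

(Processor % User Time still has to come from perfmon, since it's an OS counter, not a SQL Server one.)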