September 5, 2005 at 6:14 am
I am monitoring a server because there are some performance problems. I was looking at the buffer cache hit ratio counter and saw that it fluctuates a lot. There are times when it drops from 98-99% down to 80-85% and then climbs back up to 98-99%. This happens within about 15 minutes.
I read that the buffer cache hit ratio is the average ratio since the SQL Server service was last restarted. I looked in the SQL Server logs and saw that the last restart was over two weeks ago.
How is it possible that the ratio fluctuates that much in 15 minutes if it is an average over two weeks? Am I missing something?
September 5, 2005 at 12:58 pm
Does it happen at regular intervals or at specific times?
Perhaps some program is inserting or retrieving lots of fresh data, or running queries that aren't part of the everyday workload?
The buffer cache hit ratio is related to memory.
September 6, 2005 at 2:20 am
I wouldn't focus too much on the buffer cache hit ratio; it depends heavily on the nature of the application working against the database, and on that application's usage patterns.
What you could do is turn on SQL Profiler and check whether any queries or stored procedures show abnormal behavior.
Perhaps you could post the actual performance problem you are facing?
September 6, 2005 at 2:37 am
It doesn't happen at regular intervals, and steps are already being taken to upgrade the server, so it's not an acute problem. But I'm really curious.
If the ratio is, let's say, 90% after two weeks, and it climbs to 98% in 10 minutes, then the server would have to do four times more cache lookups in those 10 minutes than in the entire previous two weeks (if my formulas are right). And this happens a few times a day.
Is the buffer cache hit ratio really computed since the last restart? Or are there internal resets we don't know about? Or something else?
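The "four times more lookups" figure above checks out arithmetically. As a sketch (assuming the counter really is cumulative since service start, with a hypothetical two-week lookup total), solving (hits + n) / (lookups + n) = 0.98 for the number n of additional all-hit lookups gives exactly 4× the accumulated total:

```python
# Sketch: how many extra lookups (all of them cache hits) would be needed
# to pull a cumulative hit ratio from 90% up to 98%?
# Assumption: the counter is cumulative since the last service restart,
# and the two-week lookup total below is a made-up illustrative number.

lookups = 1_000_000_000          # hypothetical total lookups over two weeks
hits = int(lookups * 0.90)       # cumulative ratio so far: 90%

# Solve (hits + n) / (lookups + n) = target for n:
#   hits + n = target * (lookups + n)
#   n * (1 - target) = target * lookups - hits
target = 0.98
n = (target * lookups - hits) / (1 - target)

print(f"extra all-hit lookups needed: {n:,.0f}")
print(f"ratio to the two-week total:  {n / lookups:.1f}x")
```

The result is independent of the absolute lookup count: raising a cumulative 90% average to 98% always needs four times the previously accumulated lookups, which is why swings like this in minutes suggest the counter is not a simple since-restart average.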