October 24, 2005 at 8:17 am
Hi all
When viewing queries in Profiler, it sometimes happens that the duration of a query is 2000 while the CPU is only 300, or something like that. What is the difference? I assumed it was the network and the time it takes to return the data, but that is not always the case; there is not always a lot of data returned. At those times, could it just be that the network is very busy?
thanks
October 24, 2005 at 1:13 pm
They are two different measurements: CPU is the milliseconds of processor time used, and Duration is the total elapsed time of the event. It's not necessarily the network; it could be parallelism and a bunch of other variables.
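For example, here is a quick sketch you could run in any test database while tracing it with Profiler: the batch spends its time waiting rather than working, so Duration comes out around two seconds while CPU stays close to zero.

-- Duration will show roughly 2000 (ms in the Profiler GUI), CPU near 0,
-- because the session is waiting, not using the processor.
WAITFOR DELAY '00:00:02';
SELECT GETDATE() AS finished_at;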
October 24, 2005 at 3:20 pm
As has been said, they are completely different. Duration is the time from start to finish and includes everything that takes time, including CPU activity. Very often there will be some lock blocking the process for a short while; other times there is file or network IO.
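One rough way to see this while it is happening is to look at master..sysprocesses (just a sketch for SQL Server 2000/2005; the spid > 50 filter is only there to skip system sessions):

-- Sessions that are currently blocked or waiting, with their CPU and IO counters.
SELECT spid, blocked, lastwaittype, waittime, cpu, physical_io, status
FROM   master..sysprocesses
WHERE  spid > 50
  AND  (blocked <> 0 OR waittime > 0);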
October 25, 2005 at 1:41 am
Hi
thanks for the responses, that makes sense. Do you have any suggestions on how I can track what takes up that time? i.e. what events do I set up in Profiler once I know a specific stored procedure has a good CPU time but a long duration, and I want to trap what is causing that?
thanks again