January 13, 2009 at 7:52 am
I have two SQL Server 2000 boxes with a dozen busy databases, and I get periodic slowdowns where clients call complaining about performance. I've solved some problems by identifying and fixing queries that were causing blocking (table scans), but now I need to dig a bit deeper and see what's hogging resources from time to time even when it isn't causing blocking.
In my previous job I used Idera's Diagnostic Manager, which showed me long-running queries. I haven't used Profiler much, but I'd like to set up a low-impact trace to find out what's running during the day that might be impacting the interactive users.
Any suggestions for a Profiler newbie?
January 13, 2009 at 7:59 am
Funny, as I just posted a link to this in another post earlier today, but here it is again.
Side note, I haven't tried this with 2000 but my guess is that you will be able to come close. 🙂
David
@SQLTentmaker
“He is no fool who gives what he cannot keep to gain that which he cannot lose” - Jim Elliot
January 13, 2009 at 10:13 am
Haven't read the other guys' post - it's probably more comprehensive - but I wanted to chime in and suggest that you put a filter on the trace. I typically filter on CPU > 100 (msec) or something similar. That leaves you with just the longer-running queries and keeps Profiler's impact on the server low. Otherwise, on a busy server, you can end up with a million records...
January 13, 2009 at 11:11 am
Don't use Profiler, at least not the GUI. Use the server-side trace procedures (sp_trace_*) and write the output to a file on a fast disk. That's the lowest-impact way of tracing a server that there is.
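A minimal sketch of such a server-side trace, incorporating the CPU filter suggested earlier in the thread. The file path, max file size, and CPU threshold are example values - adjust them for your environment. The event and column IDs (12 = SQL:BatchCompleted, 10 = RPC:Completed; 1 = TextData, 13 = Duration, 18 = CPU, etc.) are the documented sp_trace_setevent constants:

```sql
DECLARE @TraceID int, @maxfilesize bigint, @on bit, @cpu int
SET @maxfilesize = 100   -- MB per file before rollover
SET @on = 1
SET @cpu = 100           -- minimum CPU in msec, as suggested above

-- Create the trace with file rollover (option 2); ".trc" is appended automatically.
-- D:\Traces\DailyLoad is an example path - use a fast local disk.
EXEC sp_trace_create @TraceID OUTPUT, 2, N'D:\Traces\DailyLoad', @maxfilesize, NULL

-- Capture completed batches (event 12) and RPCs (event 10)
EXEC sp_trace_setevent @TraceID, 12, 1,  @on  -- TextData
EXEC sp_trace_setevent @TraceID, 12, 13, @on  -- Duration (msec in SQL 2000)
EXEC sp_trace_setevent @TraceID, 12, 18, @on  -- CPU
EXEC sp_trace_setevent @TraceID, 12, 16, @on  -- Reads
EXEC sp_trace_setevent @TraceID, 12, 11, @on  -- LoginName
EXEC sp_trace_setevent @TraceID, 12, 10, @on  -- ApplicationName
EXEC sp_trace_setevent @TraceID, 10, 1,  @on
EXEC sp_trace_setevent @TraceID, 10, 13, @on
EXEC sp_trace_setevent @TraceID, 10, 18, @on
EXEC sp_trace_setevent @TraceID, 10, 16, @on
EXEC sp_trace_setevent @TraceID, 10, 11, @on
EXEC sp_trace_setevent @TraceID, 10, 10, @on

-- Keep only statements using >= @cpu msec of CPU (column 18, operator 4 = >=)
EXEC sp_trace_setfilter @TraceID, 18, 0, 4, @cpu

-- Start the trace and note the ID so you can stop it later
EXEC sp_trace_setstatus @TraceID, 1
SELECT @TraceID AS TraceID
-- To stop:  EXEC sp_trace_setstatus @TraceID, 0
-- To close: EXEC sp_trace_setstatus @TraceID, 2
```

Once you've stopped the trace, you can load the file back into a table for analysis with `SELECT * FROM ::fn_trace_gettable(N'D:\Traces\DailyLoad.trc', default)`.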
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability