Strategy for identifying slow-running queries

  • I am interested in finding slow-running queries over the span of two weeks. My questions are below; I would like other DBAs to suggest how they would approach the same task.

    i) Should I run the job that captures slow-running queries once every two weeks, or how often?

    ii) It would be nice if you could also share a script to do this.

    Thanks

  • I prefer to think of it as "long-running queries". You can either use SQL Profiler with a duration filter, or you can use DMVs with T-SQL. If you go the DMV approach, please see this article, which I think was referenced on SQLServerCentral at some point:

    http://www.databasejournal.com/features/mssql/article.php/3802936/Finding-the-Worst-Performing-T-SQL-Statements-on-an-Instance.htm
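
    If you go the DMV route, a minimal sketch might look like the following (the TOP value and the millisecond conversions are my own choices, not from the article; adjust to taste):

    ```sql
    -- Top 20 statements by average elapsed time since the plan was cached.
    -- Note: sys.dm_exec_query_stats only covers plans still in cache, so
    -- capture results periodically if you want a two-week picture.
    SELECT TOP 20
        qs.execution_count,
        qs.total_elapsed_time / 1000 AS total_elapsed_ms,
        qs.total_elapsed_time / qs.execution_count / 1000 AS avg_elapsed_ms,
        SUBSTRING(st.text,
                  (qs.statement_start_offset / 2) + 1,
                  ((CASE qs.statement_end_offset
                        WHEN -1 THEN DATALENGTH(st.text)
                        ELSE qs.statement_end_offset
                    END - qs.statement_start_offset) / 2) + 1) AS statement_text
    FROM sys.dm_exec_query_stats AS qs
    CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
    ORDER BY avg_elapsed_ms DESC;
    ```

    Keep in mind the stats are cumulative per cached plan and are lost on a restart or when a plan is evicted from cache.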

  • I would suggest reading these articles, especially those by Gail Shaw, for guidance.

    Finding the Causes of Poor Performance in SQL Server, Part 1 by Gail Shaw

    http://www.simple-talk.com/sql/performance/finding-the-causes-of-poor-performance-in-sql-server,-part-1/

    Finding the causes of poor performance in SQL Server, Part 2 by Gail Shaw

    http://www.simple-talk.com/sql/performance/finding-the-causes-of-poor-performance-in-sql-server,-part-2/

    Performance Tuning by Brent Ozar

    http://www.brentozar.com/sql/sql-server-performance-tuning/

    Troubleshooting Performance Problems in SQL Server 2005 in MSDN


    If everything seems to be going well, you have obviously overlooked something.

    Ron

    Please help us, help you - before posting a question please read
    Before posting a performance problem please read

  • Thanks, all, but how often do I do it? I was thinking of doing it once every two weeks, on a typical business day, for 8 hours with intervals of 15 mins. What do you say?

  • Tracing for 8 hours gives you far more data than you can manage. If you read the articles I wrote, you'll see I recommend 30-60 minutes at a time. How often? Well, up to you. I used to do it once a week when I did this full time, then spend the rest of the week fixing what was found.

    There's no point in tracing repeatedly without fixing the problems that the trace finds.

    Gail Shaw
    Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
    SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability

    We walk in the dark places no others will enter
    We stand on the bridge and no one may pass
  • You can use DMVs or server-side traces, and Gail's articles are a great place to start. If you're running 2008, you can also look at using the Data Collection utility. Third-party products, like Idera Diagnostic Manager or Confio Ignite, will also get you what you need.

    As to frequency, it depends. We have systems where we gather performance metrics 24/7 and process the data collected into a datamart, where we generate daily and weekly reports that show trend lines for procedure execution times, so we know what's running slower as early as possible. But we also have systems where we just occasionally collect some metrics and then compare them to previous metric sets. We also have systems where we almost never collect metrics. It just depends on the needs of your systems, the demands of the business, and just how proactive you want to be.
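
    One way to build that kind of history, as a hedged sketch (the table name, columns, and schedule here are illustrative, not from any particular product), is to snapshot the DMV into a table from a scheduled Agent job and diff consecutive captures later:

    ```sql
    -- Illustrative history table; run the INSERT from an Agent job on a schedule.
    IF OBJECT_ID('dbo.QueryStatsHistory') IS NULL
        CREATE TABLE dbo.QueryStatsHistory (
            capture_time       datetime      NOT NULL,
            sql_handle         varbinary(64) NOT NULL,
            execution_count    bigint        NOT NULL,
            total_elapsed_time bigint        NOT NULL
        );

    -- Snapshot the cumulative counters; trend lines come from comparing
    -- rows with the same sql_handle across capture_times.
    INSERT INTO dbo.QueryStatsHistory
        (capture_time, sql_handle, execution_count, total_elapsed_time)
    SELECT GETDATE(), qs.sql_handle, qs.execution_count, qs.total_elapsed_time
    FROM sys.dm_exec_query_stats AS qs;
    ```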

    "The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
    - Theodore Roosevelt

    Author of:
    SQL Server Execution Plans
    SQL Server Query Performance Tuning

  • GilaMonster (9/21/2010)


    Tracing for 8 hours gives you far more data than you can manage. If you read the articles I wrote, you'll see I recommend 30-60 minutes at a time. How often? Well, up to you. I used to do it once a week when I did this full time, then spend the rest of the week fixing what was found.

    There's no point in tracing repeatedly without fixing the problems that the trace finds.

    Thanks Gail. Let's say I run the trace for 30-60 mins in one day; how would I capture the queries that run after those 60 mins? My assumption was to capture the entire load on a typical business day, with the job running every 30 secs.

  • You can run it for 24 hours, you just have to be prepared for the fact that you're going to get a whole lot of data. Plan accordingly, run a couple of short tests and do some extrapolation so you're sure you've got enough space to handle the process. Then go for it.

    "The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
    - Theodore Roosevelt

    Author of:
    SQL Server Execution Plans
    SQL Server Query Performance Tuning

  • iqtedar (9/22/2010)


    Thanks Gail. Let's say I run the trace for 30-60 mins in one day; how would I capture the queries that run after those 60 mins?

    In most common business situations, the majority of the queries that a system has will be getting run in any (busy) hour or so period. Pick a busy time of the day and trace then.

    If you know that certain jobs/application functions are only performed at a specific time of the day, then trace again there.

    You don't need to capture every single query that has ever run in order to do tuning. You need the most commonly run and most problematic ones. A proc that runs once a day for 5 minutes is far less of a performance problem than a proc that runs once a minute for 5 seconds.

    My assumption was to capture the entire load on a typical business day, with the job running every 30 secs.

    Good luck analysing that much data. Personally I've found that beyond 3-4 hours of trace data you don't really get much more insight. Capturing at the right times is more important than capturing for long times.

    Remember a trace runs continually, so there's no 'every 30 seconds' here. You start the trace at a certain time, end it at a later time and it captures every event that it's looking for in that period.

    Gail Shaw
    Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
    SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability

    We walk in the dark places no others will enter
    We stand on the bridge and no one may pass
  • Thanks, Gail, but it's really hard to pinpoint a time where there is heavy usage. I will still try and let you know. Thanks.

  • Grant Fritchey (9/22/2010)


    You can run it for 24 hours, you just have to be prepared for the fact that you're going to get a whole lot of data. Plan accordingly, run a couple of short tests and do some extrapolation so you're sure you've got enough space to handle the process. Then go for it.

    Write the data to a flat file, not a SQL table. Then, when you look at the data in Profiler, set a duration filter to find the long-running queries.
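
    For the server-side trace itself, a sketch along these lines should work (the event and column IDs follow the sp_trace_setevent documentation; the file path and the 5-second duration threshold are placeholders to replace with your own):

    ```sql
    DECLARE @TraceID int, @maxfilesize bigint, @on bit, @duration bigint;
    SET @maxfilesize = 512;  -- MB per trace file
    SET @on = 1;
    SET @duration = 5000000; -- 5 seconds, in microseconds (SQL 2005+)

    -- Create the trace; option 2 = TRACE_FILE_ROLLOVER. SQL Server appends .trc.
    EXEC sp_trace_create @TraceID OUTPUT, 2, N'D:\Traces\LongQueries', @maxfilesize;

    -- Capture RPC:Completed (10) and SQL:BatchCompleted (12) with
    -- TextData (1), Duration (13), and StartTime (14).
    EXEC sp_trace_setevent @TraceID, 10, 1,  @on;
    EXEC sp_trace_setevent @TraceID, 10, 13, @on;
    EXEC sp_trace_setevent @TraceID, 10, 14, @on;
    EXEC sp_trace_setevent @TraceID, 12, 1,  @on;
    EXEC sp_trace_setevent @TraceID, 12, 13, @on;
    EXEC sp_trace_setevent @TraceID, 12, 14, @on;

    -- Only keep events with Duration >= @duration (4 = greater than or equal).
    EXEC sp_trace_setfilter @TraceID, 13, 0, 4, @duration;

    -- Start the trace; stop it later with sp_trace_setstatus @TraceID, 0
    -- and close/delete the definition with sp_trace_setstatus @TraceID, 2.
    EXEC sp_trace_setstatus @TraceID, 1;
    ```

    Filtering at capture time like this keeps the trace files much smaller than filtering afterwards in Profiler.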
