Cache

  • We are running Win2K with SQL2k sp3.

    We are experiencing sporadic slowdowns on our production server.

    We believe the issue is with a query the users are running with too many wildcards.

    The developers are in the process of rewriting that proc.

    In the past, when the slowdown occurred, the previous DBA ran 'sp_flush_db_proc', and they say that fixed the problem.

    I've been tasked with determining if the problem lies with the buffer pool cache and the stored proc cache. I have read the articles in BOL, but still don't have the whole picture. Anyone have any good articles or ideas on where else to look?
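    For what it's worth, the closest I've gotten to looking inside the caches so far is something like the sketch below (SQL Server 2000 exposes the plan cache through the master.dbo.syscacheobjects virtual table, and the buffer pool counters through master.dbo.sysperfinfo):

    -- Peek at the plans cached for the current database
    select cacheobjtype, objtype, usecounts, pagesused,
        left(sql, 100) as sql_snippet
    from master.dbo.syscacheobjects
    where dbid = db_id()
    order by pagesused desc

    -- Buffer pool health: cache hit ratio from the perf counters
    select object_name, counter_name, cntr_value
    from master.dbo.sysperfinfo
    where counter_name like 'Buffer cache hit ratio%'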

  • quote:


    In the past, when the slowdown occurred, the previous DBA ran 'sp_flush_db_proc', and they say that fixed the problem.


    Can you post this stored procedure?

  • The proc really runs a dbcc flushprocindb

    Here's the code..

    create procedure sp_flush_db_procs
        @dbname sysname = NULL
    --
    -- Flush the Procedure cache for a database.
    --
    -- If no database name is supplied then flush
    -- the current database.
    --
    -- WARNING: This SP uses an undocumented MS SQL
    -- DBCC command to do its business. So
    -- it might not be supported beyond the
    -- current version of SQL, Version 7.
    --
    -- D. Winters 2/8/2002
    --
    as
    declare @dbident int
    set nocount on
    begin
        if @dbname is null
            select @dbident = db_id()
        else
        begin
            select @dbident = db_id(@dbname)
            if @dbident is NULL
            begin
                RAISERROR ('Invalid DB name specified.', 16, 1)
                RETURN(1)
            end
        end --if

        dbcc flushprocindb(@dbident)
        select 'Flushed procedure cache for: ', db_name(@dbident)
    end
    --------END
    GO

  • It really forces a recompile of all the stored procedures in a database.

    I would suggest you examine the particular query to see whether the correct indexes are being used, and enable automatic statistics updates.
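    For example, something like this (a sketch only; 'MyDatabase' stands in for your database name):

    -- Check the current settings (SQL Server 2000)
    exec sp_dboption 'MyDatabase', 'auto update statistics'
    exec sp_dboption 'MyDatabase', 'auto create statistics'

    -- Turn them on if they are off
    alter database MyDatabase set auto_update_statistics on
    alter database MyDatabase set auto_create_statistics on

    -- Then capture the plan of the suspect query to check index usage
    set showplan_text on
    go
    -- run the wildcard query here
    go
    set showplan_text off
    go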

    Right, they are revisiting the query and plan on revising it. However, I just don't understand how recompiling the SPs would fix the slowdown issue. My understanding is that recompiles are a performance hit as it is. Can the query be forcing the stored procs out of the cache? Does the dbcc used actually release everything in the cache?
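    From what I can piece together in BOL so far (so take this as my reading, not gospel; 'MyDatabase' is just a placeholder), the commands seem to be scoped quite differently:

    -- Undocumented: drops cached plans for one database only
    declare @dbid int
    set @dbid = db_id('MyDatabase')
    dbcc flushprocindb(@dbid)

    -- Documented: drops every plan in the procedure cache, server-wide
    dbcc freeproccache

    -- Neither touches the buffer pool (data pages); that would be
    -- dbcc dropcleanbuffers, which you would only run on a test box
    dbcc dropcleanbuffers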

  • It really depends on your data as to whether recompiles are good or not.

    For example: Say you have a table with a million rows in it and an index on column x. You add 100,000 rows to it, but all with the same x value.

    When you go back to access the data, you have a stored plan that accessed the data in the most optimal way before the rows were added. It is still using the same plan, but because the data has changed so much, it is no longer the best plan. By recompiling, it will look at the statistics again and hopefully develop the best plan.
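    A more targeted route than flushing the whole cache might look like this (just a sketch; 'BigTable' is a made-up name for the table that changed):

    -- Refresh the optimizer's statistics on the table that grew
    update statistics BigTable

    -- Mark only the procs that reference it for recompile on next run
    exec sp_recompile 'BigTable'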

    Hope that isn't too confusing.

    Tom

    I thought having the 'auto update statistics' option On would have already updated the stats and the plan. I thought that when the stats are updated automatically, the SPs would in turn be marked for recompile.

    Is this true?

    If there are changes such as adding indexes, changing data in indexed columns, or altering the table structure, the stored procedure should be recompiled for better performance.

    SQL Server's auto statistics update performs a sampled scan on the table automatically, and it may or may not trigger a recompilation.
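    If you suspect the sampled scan is missing the skew in your data, you could try something along these lines (a sketch; the table and index names are placeholders):

    -- Force a full scan instead of the default sample
    update statistics BigTable with fullscan

    -- See what the optimizer currently believes about the index
    dbcc show_statistics ('BigTable', 'IX_BigTable_x')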


  • Hi,

    Have you tried to recreate the procedure WITH RECOMPILE?
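    Something along these lines (only a sketch; the proc and table names are made up for illustration):

    -- Recreate the proc so every execution compiles a fresh plan
    create procedure usp_wildcard_search
        @pattern varchar(100)
    with recompile
    as
        select CustomerName
        from Customers
        where CustomerName like @pattern
    go

    -- Or leave the proc alone and recompile just a single call
    exec usp_wildcard_search @pattern = '%smith%' with recompile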

    Ionel
