January 15, 2003 at 8:05 am
I have a complex series of stored procs that are fired in sequence by a main procedure. The "child" procs make use of some temp tables, but all the transaction control is handled in the main procedure. Usually the main proc is called by a front-end application on a record-by-record basis, but I've recently set up a new stored proc to run thousands of records through it in batch mode.
Here's the problem: over time the process seems to slow down. When I start it, it processes about 1 record every half second, but after several hours (6-8) it starts to slow down. Usually by the next morning it has slowed to about 1 record every 2-3 seconds, and I believe it will continue to slow even more over time. If I restart the proc, it immediately runs fast again.
Here's what I've already checked:
-Replication load
-Other processes running
-Possibly uncommitted transactions
-lack of RAM
I'm not sure where to check next... Could it have something to do with temp tables being dropped and re-created over and over from the same SPID?
The main control proc that handles looping through the batch uses a table variable, not a cursor.
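For context, the control loop looks roughly like this (the table and proc names here are made up for illustration, not the real ones):

DECLARE @batch TABLE (RowID int IDENTITY(1,1) PRIMARY KEY, RecordID int)

-- Load the keys to process (SourceTable is a placeholder name)
INSERT INTO @batch (RecordID)
SELECT RecordID FROM SourceTable

DECLARE @i int, @max int, @RecordID int
SELECT @i = 1, @max = COUNT(*) FROM @batch

WHILE @i <= @max
BEGIN
    SELECT @RecordID = RecordID FROM @batch WHERE RowID = @i
    -- usp_MainProc stands in for the real main procedure
    EXEC dbo.usp_MainProc @RecordID
    SET @i = @i + 1
END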
Any ideas would be appreciated.
Thanks
-Dan
January 15, 2003 at 1:11 pm
Your problem is an interesting one.
How did you verify uncommitted transactions and lack of RAM? What were the values?
Besides using temp tables, what else are the SPs doing? By the way, I believe the temp tables will cause the SPs to recompile every time they're executed.
Perhaps to further diagnose your problem you could run DBCC SQLPERF(WAITSTATS). I would clear the waitstats first, then run it periodically as the performance gets worse. Maybe something will pop up there?
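Something along these lines (this is the SQL Server 2000 syntax; adjust for your version):

-- Reset the accumulated wait statistics
DBCC SQLPERF(WAITSTATS, CLEAR)

-- ...then, while the batch runs, snapshot the waits periodically
DBCC SQLPERF(WAITSTATS)

Compare the snapshots over time; whichever wait type grows fastest as the slowdown sets in is your best lead.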
January 15, 2003 at 2:41 pm
Looks like I found the problem... buried deep inside one of the sub procs was a global temp table (##). Gotta hate legacy code.
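For anyone who runs into the same thing, the change amounts to swapping the global temp table for a local one (##Results/#Results are made-up names, not the actual table):

-- Before: global temp table, visible to every session and kept alive
-- until the creating session ends and all other references close
CREATE TABLE ##Results (RecordID int, Status varchar(20))

-- After: local temp table, private to this session/procedure and
-- dropped automatically when the proc finishes
CREATE TABLE #Results (RecordID int, Status varchar(20))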
-Dan