June 14, 2018 at 8:38 am
Hi all, first post, so be gentle.
I have a few pieces of code which update from temp tables and loop around to perform multiple updates. It does this because each pass requires the previous set of data to build up the next set (it's pushing data around a road network). After the 10th loop it dumps the results into another temp table, then attempts to update the main final table. I've also tried a cut-down version of the code which excludes the 10-loop part, but I run into the same problem, just after it has processed more days' worth of data. The behaviour is as if a log file (not the transaction log) is filling up and causing a bottleneck.
This works fine until it hits a fairly predictable point where a full daily run goes from roughly 45 seconds to 9-10 minutes, and it stays that way. The only way I've been able to resolve this is to drop the data into another 'middle' table, wait until the 'bug' appears, stop the query, transfer the data to the final table and truncate the middle table. Once it's truncated, the query runs at normal speed against the middle table again. Needless to say, this is stupid and annoying.
I'm not a DBA, and access to actual execution plans is restricted by corporate security policy. So I'm looking to see if anyone has seen this kind of behaviour before and knows the answer. The simpler code is 350 lines long, which I've attached as a text file. It might look like it doesn't need the looping if you're better at SQL than me; the longer version (1,300-ish lines, not attached) has two levels of loops as it draws in more sources.
Thanks.
June 14, 2018 at 8:42 am
I suspect you need an index on one of the tables to improve the query performance.
You really need to work out which queries within the set are taking a long time to execute. You could do this by adding PRINT statements after each query reporting the time it took. Just wrap each statement like this:
DECLARE @DateTime datetime; -- at the top of the script
SET @DateTime = GETDATE();
-- SQL statement here...
PRINT CONCAT('Update 1 took ', DATEDIFF(ss, @DateTime, GETDATE()), ' seconds');
SET @DateTime = GETDATE(); -- reset to now, ready for the next statement
Another option would be to log the query times to a table.
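For example, something along these lines would do it; this is only a sketch, and the table and column names are placeholders, not anything from your script:

```sql
-- Hypothetical timing-log table; adjust names and schema to suit.
CREATE TABLE dbo.QueryTimings (
    StepName    varchar(100),
    StartTime   datetime,
    DurationSec int
);

DECLARE @DateTime datetime = GETDATE();
-- SQL statement here...
INSERT INTO dbo.QueryTimings (StepName, StartTime, DurationSec)
VALUES ('Update 1', @DateTime, DATEDIFF(ss, @DateTime, GETDATE()));
SET @DateTime = GETDATE(); -- reset for the next statement
```

The advantage over PRINT is that the timings survive across runs, so you can compare the "fast" runs against the "slow" ones and see exactly which step degrades.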
Once you've found the offending queries it should be easy to fix the problem.
June 14, 2018 at 8:54 am
Hi. Thanks. There is an index, and I tried adding index-rebuild code in there too to resolve it. The process draws on tables with 2.5bn records each, and the final table it's building will be 1.8bn records. I don't think that's the issue though, as the slowdown occurs on the smaller middle tables too. Once the data is in the main large table it performs fine when reading, though I don't know if/when I'll hit some update limit on the main table either.
June 14, 2018 at 9:00 am
First, you should probably be working with your DBA to figure out what is going on. Not much we can do over the internet since we don't have access to your system.
Second, whether or not a loop is needed in this code is hard to say as, again, we don't have access to your system, so we can't see what you see. Is it possible that a loop is not needed? Quite possibly, but without the DDL for the tables, some sample data (not production data) representative of the problem domain, and the expected results based on that sample data, it is hard to tell.
If a loop is needed, I personally would be using a fire hose cursor (fast-forward, read-only) for the loop, not a temporary table from which I am selecting the data.
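A fire hose cursor in T-SQL looks something like the sketch below; the table and column names are hypothetical, just to show the shape:

```sql
-- FAST_FORWARD declares the cursor as forward-only and read-only,
-- which is the cheapest cursor type SQL Server offers.
DECLARE @RoadId int;

DECLARE cur CURSOR LOCAL FAST_FORWARD FOR
    SELECT RoadId FROM dbo.SourceTable;  -- hypothetical source

OPEN cur;
FETCH NEXT FROM cur INTO @RoadId;
WHILE @@FETCH_STATUS = 0
BEGIN
    -- per-row work goes here...
    FETCH NEXT FROM cur INTO @RoadId;
END;
CLOSE cur;
DEALLOCATE cur;
```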
Also, I would not be dropping and recreating the temporary tables with each iteration of the loop. Create the tables and indexes once, outside the loop; populate and then truncate the tables inside the loop.
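In outline, that pattern is something like this (again, names and columns are placeholders, not your actual schema):

```sql
-- Create the temp table and its index once, before the loop.
CREATE TABLE #Work (RoadId int, Volume decimal(18, 2));
CREATE CLUSTERED INDEX ix_Work ON #Work (RoadId);

DECLARE @i int = 1;
WHILE @i <= 10
BEGIN
    INSERT INTO #Work (RoadId, Volume)
    SELECT RoadId, Volume
    FROM dbo.PreviousResults;   -- hypothetical: previous iteration's output

    -- ...use #Work to build the next set of data...

    TRUNCATE TABLE #Work;       -- empty it, but keep the table and index
    SET @i += 1;
END;
```

TRUNCATE is minimally logged and keeps the table definition and indexes in place, so each iteration avoids the cost of re-creating objects in tempdb.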
Seeing the execution plan would also help, but you have already said that isn't possible. Again, this is why you should be working with your DBA.
If I have time I will try to look at your code in greater depth to see if a set-based process can be developed, but with nothing to test it against there are no promises.