May 6, 2008 at 8:07 pm
Hi Jeff,
No, not this piece of the code. Another sample I ran a query execution plan (QEP) for has a 9.1-million-row table as an input into a select, which the QEP flagged with the parallelism symbol.
This is another concern for me: whether to turn parallelism off or not.
Can you please list what is wrong with just this piece of the code?
I have DDL statements at the beginning. I have actually started thinking that they should redevelop this using SSIS.
Use table variables instead of temp tables
Use sp_executesql instead of EXEC ... (a sketch of the difference follows this list)
Did I miss anything else?
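For context, a minimal sketch of the difference (dbo.Orders, OrderID, and OrderDate are hypothetical names, just for illustration):

-- EXEC builds the value into the string, so every distinct value
-- compiles a fresh ad hoc plan (and invites SQL injection)
DECLARE @id varchar(10);
SET @id = '42';
EXEC ('SELECT OrderDate FROM dbo.Orders WHERE OrderID = ' + @id);

-- sp_executesql passes the value as a real parameter, so one plan
-- is compiled and then reused across values
DECLARE @sql nvarchar(200);
SET @sql = N'SELECT OrderDate FROM dbo.Orders WHERE OrderID = @OrderID';
EXEC sp_executesql @sql, N'@OrderID int', @OrderID = 42;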
I want to go over this with the onsite DBA; then I think both of us will need to go through each piece of the DTS package and pick out the problem areas for the third party to rethink.
Also, last night I captured a trace to identify the number of recompiles, and I added the Auto Stats event as well. Does an auto-update of statistics on a column trigger a recompile?
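For anyone trying the same thing, a server-side trace along these lines captures both events (the file path is a placeholder; event 37 is SP:Recompile and event 58 is Auto Stats):

DECLARE @traceid int, @maxsize bigint, @on bit;
SET @maxsize = 50;   -- max trace file size in MB
SET @on = 1;         -- sp_trace_setevent requires a bit, not an int literal

-- create the trace; SQL Server appends .trc to the file name
EXEC sp_trace_create @traceid OUTPUT, 0, N'C:\traces\recompiles', @maxsize;

-- capture TextData (1), SPID (12), and StartTime (14) for both events
EXEC sp_trace_setevent @traceid, 37, 1, @on;    -- SP:Recompile
EXEC sp_trace_setevent @traceid, 37, 12, @on;
EXEC sp_trace_setevent @traceid, 37, 14, @on;
EXEC sp_trace_setevent @traceid, 58, 1, @on;    -- Auto Stats
EXEC sp_trace_setevent @traceid, 58, 12, @on;
EXEC sp_trace_setevent @traceid, 58, 14, @on;

-- start the trace
EXEC sp_trace_setstatus @traceid, 1;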
thanks
May 7, 2008 at 5:47 am
Don't substitute table variables for temp tables unless your data sets are very small. Table variables don't have statistics generated against them (except when they have a primary key, and even then only the key itself has statistics), so the optimizer treats them as one-row tables. If you have more than a few tens of rows, table variables will radically mess up your queries.
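You can see this for yourself by comparing estimated row counts in the two plans below (dbo.Numbers is a hypothetical tally table; all names are made up):

-- load the same 100,000 rows into a temp table and a table variable
CREATE TABLE #t (id int NOT NULL);
DECLARE @t TABLE (id int NOT NULL);

INSERT INTO #t (id)
SELECT number FROM dbo.Numbers WHERE number <= 100000;

INSERT INTO @t (id)
SELECT number FROM dbo.Numbers WHERE number <= 100000;

-- the temp table has statistics, so the optimizer estimates ~50,000 rows;
-- the table variable has none, so it is costed as a one-row table
SELECT * FROM #t WHERE id < 50000;
SELECT * FROM @t WHERE id < 50000;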
I agree with your assessment that this may be better handled by SSIS.
In addition to the other things mentioned, do you really need all those DISTINCT statements? They add tons of overhead and are usually indicative of incorrect joins or inadequate WHERE clauses. Structurally, you don't want to load data into a table and then delete the data you don't want; you want to filter the insert up front so that you only store the data you need. Also, inserts followed by updates generally (not always, I'll grant you) should be done as part of the insert as well. Joining to derived tables as part of the initial insert lets you filter out the unwanted data, set the appropriate columns to meaningful values, and do it all in a single step (see the sketch below).
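A contrived sketch of the pattern, with every table and column name invented for illustration:

CREATE TABLE #stage (cust_id int, region varchar(10), total money);

-- anti-pattern: load everything, then prune and patch
INSERT INTO #stage (cust_id, region, total)
SELECT cust_id, region, 0 FROM dbo.Sales;

DELETE FROM #stage WHERE region = 'N/A';

UPDATE s
SET total = t.amt
FROM #stage AS s
JOIN (SELECT cust_id, SUM(amount) AS amt
      FROM dbo.SaleLines GROUP BY cust_id) AS t
  ON t.cust_id = s.cust_id;

-- single-step alternative: filter up front and join to the derived
-- table so the columns land correct on the initial insert
INSERT INTO #stage (cust_id, region, total)
SELECT s.cust_id, s.region, t.amt
FROM dbo.Sales AS s
JOIN (SELECT cust_id, SUM(amount) AS amt
      FROM dbo.SaleLines GROUP BY cust_id) AS t
  ON t.cust_id = s.cust_id
WHERE s.region <> 'N/A';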
That said, this is a pretty complex set of processes. Without taking it apart statement by statement, it's hard to say if it's salvageable as a query or should just get chucked and turned into an SSIS package.
You might also look at the TRY/CATCH style of error handling. It'll make the code a bit less busy.
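A minimal sketch of the shape (SQL Server 2005 and up; the statement inside is a placeholder):

BEGIN TRY
    BEGIN TRANSACTION;
    -- the real work goes here, e.g. the staged inserts
    UPDATE dbo.SomeTable SET SomeCol = 1 WHERE SomeKey = 42;
    COMMIT TRANSACTION;
END TRY
BEGIN CATCH
    IF @@TRANCOUNT > 0
        ROLLBACK TRANSACTION;
    -- one handler replaces an @@ERROR check after every statement
    SELECT ERROR_NUMBER() AS ErrorNumber, ERROR_MESSAGE() AS ErrorMessage;
END CATCH;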
Just a few things to watch for.
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
- Theodore Roosevelt
Author of:
SQL Server Execution Plans
SQL Server Query Performance Tuning
May 7, 2008 at 2:07 pm
Hi
Agree with George. You will have to look at the sysprocesses table and check the wait stats and wait resources while the blocking is happening to see why it is blocking. You will be surprised to see what is really going on. If there is a CXPACKET wait (MAXDOP related), it might not be all that bad, but then make sure you run DBCC SQLPERF(UMSSTATS) to see whether too many parallel operations are starving your UMS pool, leaving you with threads waiting and nothing available.
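Something along these lines while the blocking is happening is a reasonable starting point (just a sketch):

-- who is blocked, what they are waiting on, and which spid holds them up
SELECT spid, blocked, waittype, waittime, lastwaittype, waitresource, cmd
FROM master..sysprocesses
WHERE blocked <> 0 OR waittype <> 0x0000;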
Yes, run a trace and a PerfMon capture as well, and aggregate the data.
Also, if you want in-depth tracing, use PSSDIAG and configure the blocker script. Very useful.
May 7, 2008 at 2:15 pm
Thanks very much for all the input; I appreciate it all.
You have given me much more to think about.
Now I just have to figure out the best way forward.
regards
May 8, 2008 at 1:07 am
john pan (5/6/2008)
Hi Jeff,
No, not this piece of the code. Another sample I ran a query execution plan (QEP) for has a 9.1-million-row table as an input into a select, which the QEP flagged with the parallelism symbol.
This is another concern for me: whether to turn parallelism off or not.
Can you please list what is wrong with just this piece of the code?
I have DDL statements at the beginning. I have actually started thinking that they should redevelop this using SSIS.
Use table variables instead of temp tables
Use sp_executesql instead of EXEC ...
Did I miss anything else?
I want to go over this with the onsite DBA; then I think both of us will need to go through each piece of the DTS package and pick out the problem areas for the third party to rethink.
Also, last night I captured a trace to identify the number of recompiles, and I added the Auto Stats event as well. Does an auto-update of statistics on a column trigger a recompile?
thanks
Using Table variables instead of Temp tables would probably be a serious mistake here. See the following URL... particularly Q3/A3 and Q4/A4...
http://support.microsoft.com/default.aspx?scid=kb;en-us;305977&Product=sql2k
Also... not trying to be a smart guy here, but it would probably take longer to list everything that's wrong with this code and the underlying views than a good healthy rewrite would take. 😀
--Jeff Moden
Change is inevitable... Change for the better is not.
May 8, 2008 at 1:26 pm
Thanks Jeff
When I said table variable, I meant when it's a small table. In my other job we use a table variable when we need a small table to help reduce a select against a larger one.
So I understand its use is limited.
I'll have a look at the link, and I understand about the list. I think I have enough to go on, and I'm sure the onsite DBA will have identified more.
regards