February 3, 2014 at 12:48 pm
Dear All,
I'm using INSERT..SELECT to gather data from a complex query into a holding table; this happens right at the start of an SP. The data is then used by different code segments, depending on the parameters passed to the SP and the output format needed.
I notice that the loading of this table is very slow (OK, a relative term) and uses only 1 CPU core, which equates to 3% of total power. I guess it's a serial process, therefore single-threaded, and that's that?
So I can't see a way to capitalise on the remaining CPU power in the box to speed things up. Perhaps there is another approach? All of the bulk options I can see that ostensibly support parallelism are file-related.
Any ideas gratefully received...
Greg.
February 3, 2014 at 12:53 pm
greg.bull (2/3/2014)
Dear All, I'm using INSERT..SELECT to gather data from a complex query into a holding table, this happens right at the start of an SP. The data is then used by different code segments depending on the parms passed to the SP and the output format needed.
I notice that the loading of this table is very slow (OK, a relative term) and using 1 CPU core only. This equates to 3% of total power. I guess it's a serial process, therefore single-threaded and that's that?
So I can't see a way to capitalise on the remaining CPU power in the box to speed things up. Perhaps there is another approach? All of the bulk options I can see that ostensibly support parallelism are file related.
Any ideas gratefully received...
Greg.
My guess is that the query you are using is the problem, not a lack of CPU. Can you post the actual execution plan?
_______________________________________________________________
Need help? Help us help you.
Read the article at http://www.sqlservercentral.com/articles/Best+Practices/61537/ for best practices on asking questions.
Need to split a string? Try Jeff Moden's splitter http://www.sqlservercentral.com/articles/Tally+Table/72993/.
Cross Tabs and Pivots, Part 1 – Converting Rows to Columns - http://www.sqlservercentral.com/articles/T-SQL/63681/
Cross Tabs and Pivots, Part 2 - Dynamic Cross Tabs - http://www.sqlservercentral.com/articles/Crosstab/65048/
Understanding and Using APPLY (Part 1) - http://www.sqlservercentral.com/articles/APPLY/69953/
Understanding and Using APPLY (Part 2) - http://www.sqlservercentral.com/articles/APPLY/69954/
February 3, 2014 at 1:06 pm
greg.bull (2/3/2014)
Dear All, I'm using INSERT..SELECT to gather data from a complex query into a holding table, this happens right at the start of an SP. The data is then used by different code segments depending on the parms passed to the SP and the output format needed.
I notice that the loading of this table is very slow (OK, a relative term) and using 1 CPU core only. This equates to 3% of total power. I guess it's a serial process, therefore single-threaded and that's that?
So I can't see a way to capitalise on the remaining CPU power in the box to speed things up. Perhaps there is another approach? All of the bulk options I can see that ostensibly support parallelism are file related.
Any ideas gratefully received...
Greg.
More detail: is the ENTIRE query plan single-threaded? If so, then check for a) things that prevent parallelism (scalar UDFs, perhaps?), b) the cost estimate of the query compared to your SQL Server's Cost Threshold for Parallelism setting (a bad estimate, or someone has monkeyed with CTFP), and c) your server's MAXDOP setting (someone may have set it to 1). Affinity settings could have been played with too.
Do note that the final insert itself WILL BE single threaded.
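As a quick illustration of checking (b) and (c) above, the instance-level settings can be read from sys.configurations (a sketch; interpreting the values still requires knowing your workload):

```sql
-- Check the instance-level settings that can keep plans serial.
-- MAXDOP = 1 forces serial plans; a Cost Threshold for Parallelism
-- higher than the query's estimated cost has the same effect.
SELECT name, value_in_use
FROM sys.configurations
WHERE name IN (N'max degree of parallelism',
               N'cost threshold for parallelism');
```

On SQL Server 2012 and later, the actual execution plan XML also carries a NonParallelPlanReason attribute when a specific plan was forced serial, which points at culprits such as scalar UDFs.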
Best,
Kevin G. Boles
SQL Server Consultant
SQL MVP 2007-2012
TheSQLGuru on googles mail service
February 4, 2014 at 1:33 am
Thanks all. Strikes me this isn't very good. I think it's the same for UPDATE and DELETE too. I guess it's complex/very difficult to chunk up the work and thread it. Just seems a waste...
I could see how inserts would be tough, but deletes and updates could be doable...
Greg.
February 4, 2014 at 7:00 am
greg.bull (2/4/2014)
Thanks all. Strikes me this isn't very good. I think it's the same for update and delete too. I guess it's complex/very difficult to chunk up the work and thread it. Just seems a waste... Could see how inserts would be tough, but dels and updates could be doable...
Greg.
Nope - it isn't a waste. It is just F'N HARD!! I would much rather the man-hours it would take to handle multi-threaded concurrent DML be spent on a better optimizer and engine. Most of the time, using 64 threads to GENERATE the 1 or 50 or 50000 rows that will be the DML is quite acceptable anyway. And for the scenarios where you actually DO need to INSERT 1.5M rows quickly, it is an EDW ETL scenario and you have other options that can be parallel. I have also found innumerable scenarios with clients where they put crap into a temp table for absolutely no reason at all (often with a useless index too!!), and removing the temp object gives a way-faster process. I note that the opposite is also true, especially for monster queries.
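On the chunking idea raised earlier in the thread: while the engine won't parallelise a single DELETE or UPDATE, a common manual workaround is batching the work yourself. A sketch, with made-up table and column names:

```sql
-- Delete in batches so each transaction stays small. Each individual
-- DELETE is still executed serially by the engine, but batching keeps
-- the transaction log manageable and reduces blocking.
DECLARE @rows int = 1;
WHILE @rows > 0
BEGIN
    DELETE TOP (50000)
    FROM dbo.HoldingTable
    WHERE LoadDate < DATEADD(DAY, -30, GETDATE());

    SET @rows = @@ROWCOUNT;
END;
```

This doesn't add CPU parallelism, but it is usually the practical fix when a single monolithic DML statement is the bottleneck.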
Oh, do note that SELECT ... INTO is going to be multi-threaded in SQL 2014!!
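To make the distinction concrete (table names illustrative, and parallel insert requires that the plan qualify under SQL Server 2014+ with database compatibility level 110 or higher):

```sql
-- INSERT ... SELECT: the SELECT side of the plan can run in parallel,
-- but the insert operator itself is serial.
INSERT INTO dbo.Holding (col1, col2)
SELECT col1, col2
FROM dbo.ComplexSource
WHERE LoadDate >= DATEADD(DAY, -30, GETDATE());

-- SELECT ... INTO: from SQL Server 2014 onward the insert itself can
-- also be parallelised. The target table is created by the statement.
SELECT col1, col2
INTO #Holding
FROM dbo.ComplexSource
WHERE LoadDate >= DATEADD(DAY, -30, GETDATE());
```

The trade-off is that SELECT ... INTO creates the table each time, so it only fits Greg's scenario if the holding table doesn't need a fixed, pre-built schema.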
Best,
Kevin G. Boles
SQL Server Consultant
SQL MVP 2007-2012
TheSQLGuru on googles mail service