December 18, 2008 at 4:01 am
Hi,
I need some suggestions to reduce the execution time of the insert statement below.
INSERT INTO New_Table (CD, PD, PED, Itm)
SELECT m.CD, t1.PD, t1.PED, i.Itm -- record count of this query is 95,000,000
FROM Table1 t1
JOIN View_V m ON m.RN = t1.CD
JOIN Table2 i ON i.Mne = t1.Itm
UNION
SELECT m.CD, t2.PD, t2.PED, i.Itm -- record count of this query is 190,000,000
FROM Table1 t2
JOIN View_V m ON m.RN = t2.CD
JOIN Table2 i ON i.Mne = t2.Itm
On "New_Table" the 4 columns((CD,PD,PED,Itm)) are Primary key. There are other columns as well in the table. On Primary key there is a clustered index.It takes 50 mins to insert these records into the New_table.
Thanks
PS
December 18, 2008 at 5:33 am
Some things to try -
If there is no possibility that the two queries (either side of the union) can return duplicates, you could make it "union all", which may save a little time since it skips the duplicate-removing step (see the sketch below).
Apart from that, check whether the time is in the selects (run them independently) or the insert.
If the selects - look at indexes on the source tables.
If the insert - look at basics like disk config: log on a different disk to the data, log NOT on RAID 5, etc.
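For illustration only, this is roughly what the "union all" version would look like, using the names from the original post; turning on the statistics options first (or running the select on its own, without the insert) is one way to see which half is slow:

SET STATISTICS TIME ON;
SET STATISTICS IO ON;

-- UNION ALL skips the duplicate-removing sort that UNION performs across
-- roughly 285 million rows; only valid if the two branches cannot overlap.
INSERT INTO New_Table (CD, PD, PED, Itm)
SELECT m.CD, t1.PD, t1.PED, i.Itm
FROM Table1 t1
JOIN View_V m ON m.RN = t1.CD
JOIN Table2 i ON i.Mne = t1.Itm
UNION ALL
SELECT m.CD, t2.PD, t2.PED, i.Itm
FROM Table1 t2
JOIN View_V m ON m.RN = t2.CD
JOIN Table2 i ON i.Mne = t2.Itm;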
Mike John
December 18, 2008 at 6:50 am
Is it the query that's slow or the insert? Can you post an execution plan?
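If it helps, one way to get the plan as XML straight from a query window is sketched below; note that SET SHOWPLAN_XML must be the only statement in its batch:

SET SHOWPLAN_XML ON;
GO
-- Put the INSERT ... SELECT here; it is not executed,
-- only its estimated plan is returned as XML.
GO
SET SHOWPLAN_XML OFF;
GO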
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
- Theodore Roosevelt
Author of:
SQL Server Execution Plans
SQL Server Query Performance Tuning
December 22, 2010 at 8:34 am
Drop the primary key and its index on the target table. Add the keys back after the data has been loaded.
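A rough sketch of that approach, assuming the primary key constraint is named PK_New_Table (substitute the real constraint name):

-- Drop the clustered primary key (and any nonclustered indexes) before the load
ALTER TABLE New_Table DROP CONSTRAINT PK_New_Table;

-- ... run the big INSERT ... SELECT here ...

-- Recreate the key afterwards; building the clustered index becomes a one-off sort
ALTER TABLE New_Table ADD CONSTRAINT PK_New_Table
    PRIMARY KEY CLUSTERED (CD, PD, PED, Itm);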
December 22, 2010 at 9:20 am
Given nearly 300 million records being inserted, it is going to take a bit of time no matter how you slice it. Retrieving the data could be a bottleneck, but there is a lot of disk I/O with that much data.
_______________________________________________________________
Need help? Help us help you.
Read the article at http://www.sqlservercentral.com/articles/Best+Practices/61537/ for best practices on asking questions.
Need to split a string? Try Jeff Moden's splitter http://www.sqlservercentral.com/articles/Tally+Table/72993/.
Cross Tabs and Pivots, Part 1 – Converting Rows to Columns - http://www.sqlservercentral.com/articles/T-SQL/63681/
Cross Tabs and Pivots, Part 2 - Dynamic Cross Tabs - http://www.sqlservercentral.com/articles/Crosstab/65048/
Understanding and Using APPLY (Part 1) - http://www.sqlservercentral.com/articles/APPLY/69953/
Understanding and Using APPLY (Part 2) - http://www.sqlservercentral.com/articles/APPLY/69954/
December 22, 2010 at 9:26 am
How many records does it usually insert?
As Grant asked, which part is slow, the query or the insert?
CEWII
December 22, 2010 at 9:31 am
The OP has record counts as comments in the query. 95 million and 190 million.