July 11, 2013 at 7:41 am
Dear All,
I have a script with 20-25 small, single-line queries.
The objective of the script is to select all the data from Server-A/table-x and copy it to Server-B/Table-x.
The overall script takes almost 6-7 hours to complete, as most of the tables have more than a billion records.
The individual queries have nothing to optimize, since they are all single-line SELECT queries.
What should I do to reduce this time, or should I completely change the strategy?
Any help will be highly appreciated.
Thanks.
July 11, 2013 at 7:46 am
Don't use single-line code (like INSERT ... SELECT ... FROM) for such large tables. Use an ETL tool instead. You can use SSIS (the import/export wizard) or the bcp command-line utility.
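As a rough sketch of the bcp approach (the server, database, table, and file names below are placeholders for your own; -T uses a trusted connection, -n exports in native format, -b sets the batch size, and -h "TABLOCK" requests a table lock for faster bulk loading):

    bcp SourceDB.dbo.TableX out D:\export\tablex.dat -S Server-A -T -n
    bcp TargetDB.dbo.TableX in D:\export\tablex.dat -S Server-B -T -n -b 100000 -h "TABLOCK"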
July 11, 2013 at 10:31 am
HanShi hit the nail on the head. SSIS, bcp and BULK INSERT use a bulk-load mechanism for inserting data that is nothing like a row-by-row INSERT statement. The first time you see the performance, you won't believe the data was moved - it is through the roof.
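If the exported file is already available on the target server, BULK INSERT can load it from T-SQL. A minimal sketch, assuming a native-format file produced by bcp (the names are examples only; TABLOCK and BATCHSIZE help keep locking and logging overhead down):

    BULK INSERT TargetDB.dbo.TableX
    FROM 'D:\export\tablex.dat'
    WITH (DATAFILETYPE = 'native', TABLOCK, BATCHSIZE = 100000);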
July 12, 2013 at 3:49 am
Thanks.
If I go with SSIS, will it make any difference if the source and target tables are in the same database?
July 12, 2013 at 3:57 am
It won't make any difference to SSIS. I think the best option is to use SSIS (a data flow task).
July 12, 2013 at 5:57 am
Some of the SELECT queries have complicated WHERE clauses. I hope that will be supported.
July 13, 2013 at 2:30 pm
SSIS is capable of doing very extensive filtering and transformations. Try it for yourself, and if you run into trouble or need some help, SSC has a special section of the forum dedicated to Integration Services (http://www.sqlservercentral.com/Forums/Forum364-1.aspx).
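For what it's worth, a filtered SELECT can serve directly as the extract: in an SSIS OLE DB Source you can paste it as the SQL command text, and bcp accepts one through queryout. A hypothetical example (the columns, table, and filter values are made up):

    bcp "SELECT col1, col2 FROM SourceDB.dbo.TableX WHERE status = 'A'" queryout D:\export\tablex_filtered.dat -S Server-A -T -n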