December 21, 2006 at 12:43 am
Hi!
I've inherited a stored procedure that copies 7m rows in one table from server 1 to server 2. Indexes are dropped on the destination table, all rows are copied using insert/select, then the indexes are recreated.
If possible, I need to speed up this process. Bearing in mind that we need to copy all the records, the only idea I have come up with is to reindex, rather than drop/create. Are there any other techniques in SQL2000 that will be quicker?
Thanks for any help.
Alan
December 21, 2006 at 4:11 am
The point many overlook with this type of question is whether or not the underlying disk subsystem is a bottleneck.
I assume server 1 and server 2 are physically separated by a network and the query runs slowly? So what do you mean by slow, and why do you need to speed it up ( working on the "if it ain't broke don't fix it" principle )
Full recovery at the target will slow things down, so simple recovery ( or maybe even bulk logged ) will probably help. Generally batching works quicker than a "big bang" approach. Make sure the tran log and data drive are on separate physical arrays and the target arrays are not raid 5 ( which is horrible for writes )
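A minimal T-SQL sketch of the batched approach under simple recovery. All database, server, table, and column names here are hypothetical stand-ins for the poster's actual objects, and SET ROWCOUNT is used because SQL 2000 has no TOP-with-variable:

```sql
-- Assumption: simple recovery on the target keeps logging to a minimum.
ALTER DATABASE TargetDb SET RECOVERY SIMPLE

-- Copy in batches of 100k rows instead of one 7m-row transaction,
-- so each insert (and the log growth it causes) stays small.
SET ROWCOUNT 100000
WHILE 1 = 1
BEGIN
    INSERT INTO TargetDb.dbo.BigTable (Id, Col1, Col2)
    SELECT s.Id, s.Col1, s.Col2
    FROM Server1.SourceDb.dbo.BigTable s          -- linked-server source (hypothetical)
    WHERE NOT EXISTS (SELECT 1
                      FROM TargetDb.dbo.BigTable t
                      WHERE t.Id = s.Id)          -- skip rows already copied

    IF @@ROWCOUNT = 0 BREAK                        -- nothing left to copy
END
SET ROWCOUNT 0                                     -- restore normal behaviour
```

The NOT EXISTS restart check assumes a unique key (Id here); without one you would batch on a range predicate instead.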
[font="Comic Sans MS"]The GrumpyOldDBA[/font]
www.grumpyolddba.co.uk
http://sqlblogcasts.com/blogs/grumpyolddba/
December 21, 2006 at 5:02 am
Try bcp'ing the data out to an intermediate text file, then BULK INSERT it back in.
Do it right and the performance gains can be massive.
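A sketch of that approach, assuming a file share both servers can reach; the share path and database/table names are hypothetical:

```sql
-- 1. On server 1, export with bcp in native format (run from a command prompt):
--    bcp SourceDb.dbo.BigTable out \\fileshare\BigTable.dat -n -S SERVER1 -T
--    (-n = native format, -T = trusted connection)

-- 2. On server 2, with the destination indexes already dropped, load the file.
--    TABLOCK plus simple or bulk-logged recovery allows a minimally logged load.
BULK INSERT TargetDb.dbo.BigTable
FROM '\\fileshare\BigTable.dat'
WITH (DATAFILETYPE = 'native', TABLOCK)
```

Native format avoids the character-conversion cost of a plain text file when both ends are SQL Server.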
December 21, 2006 at 2:31 pm
It is generally faster to drop the indexes, copy the data, and then recreate the indexes than to do mass inserts with the indexes intact.
December 21, 2006 at 3:46 pm
Drop the indexes and bcp the data out and back in, or drop the table and do a SELECT * INTO.
Both can be non-logged operations, and that is where you get your speed from, if the disks aren't just plain slow to begin with.
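For the SELECT * INTO variant, SQL 2000 only makes it minimally logged when the 'select into/bulkcopy' option (or simple/bulk-logged recovery) is enabled on the target. A sketch with hypothetical database, server, and table names:

```sql
-- Allow minimally logged SELECT INTO on the target (SQL 2000-era option).
EXEC sp_dboption 'TargetDb', 'select into/bulkcopy', 'true'

-- SELECT INTO creates the table, so drop the old copy first,
-- then pull all rows across the linked server in one operation.
DROP TABLE TargetDb.dbo.BigTable
SELECT *
INTO TargetDb.dbo.BigTable
FROM Server1.SourceDb.dbo.BigTable

-- Turn the option back off when done.
EXEC sp_dboption 'TargetDb', 'select into/bulkcopy', 'false'
```

Note that the new table comes back with no indexes or constraints, so those still have to be recreated afterwards.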
Wes
December 22, 2006 at 3:09 pm
I think that first you need to identify which part(s) of the process are slow. Is it the select of the data, the insert of the data, or the re-indexing?
When dealing with large data sets, breaking the work into smaller tasks can speed things up.
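One way to find the slow phase is to timestamp each step of the procedure. A sketch, with hypothetical table, column, and index names standing in for the real ones:

```sql
DECLARE @t DATETIME
SET @t = GETDATE()

-- Phase 1: the insert/select copy
INSERT INTO TargetDb.dbo.BigTable (Id, Col1)
SELECT Id, Col1 FROM Server1.SourceDb.dbo.BigTable
PRINT 'Insert took ' + CAST(DATEDIFF(ss, @t, GETDATE()) AS VARCHAR(10)) + ' seconds'

-- Phase 2: the index rebuild
SET @t = GETDATE()
CREATE CLUSTERED INDEX IX_BigTable_Id ON TargetDb.dbo.BigTable (Id)
PRINT 'Reindex took ' + CAST(DATEDIFF(ss, @t, GETDATE()) AS VARCHAR(10)) + ' seconds'
```

Once you know which phase dominates, you can target the fix (batching the insert, or tuning the index build) instead of guessing.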
December 23, 2006 at 1:09 am
Thanks guys! Some good advice here.
As Robert suggests, I'm going to break it down to see which part is slowest.
Alan