July 3, 2006 at 10:47 am
Hi All,
I have got my BCP command set up as shown below, but the rate at which it transfers data is quite slow: around 1 hour for only 5 million records.
I want to specify the ROWS_PER_BATCH option to see if that would boost things, but I am not quite sure where it would go in my code.
SET @Dynamic_SQL = 'bcp ' + @My_Database + '..' + @TabletobeOutput + ' IN "' + @MyDumpDirectory + @File + '" -q -t, -r\n -T -c '
Can someone recommend where this should go?
July 3, 2006 at 12:34 pm
How many indexes are on the table? Try dropping the indexes during the bulk load and rebuilding them afterwards.
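For example, on SQL Server 2005 something like the following should work (a rough sketch only; the table and index names are placeholders):

-- Disable nonclustered indexes before the bulk load (placeholder names)
ALTER INDEX IX_MyTable_Col1 ON dbo.MyTable DISABLE;

-- ... run the bcp import here ...

-- Rebuild the index once the load has finished
ALTER INDEX IX_MyTable_Col1 ON dbo.MyTable REBUILD;

On SQL Server 2000 you would DROP INDEX and then re-CREATE the index instead.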
Tom
July 4, 2006 at 5:06 am
Try the -b switch for rows per batch.
July 4, 2006 at 9:00 am
John,
I replied to this in your original posting ...
Use the "-b" switch anywhere amongst your other switch options.
e.g. To set a batch size of 1000 rows ...
SET @Dynamic_SQL = 'bcp ' + @My_Database + '..' + @TabletobeOutput + ' IN "' + @MyDumpDirectory + @File + '" -q -t, -r\n -T -b1000 -c '
I'd be surprised if this made much difference, though.
Does your target table have any indexes on it? Can they be dropped for the duration of the copy and re-created afterwards?
If the target table has an index, then every insertion into the table is logged. If you can drop the indexes and put the database into the "Bulk-Logged" or "Simple" recovery model, you should see a huge improvement.
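As a rough sketch (using the My_Database name from your command; adjust to suit), the recovery model switch on SQL Server 2005 looks like this:

ALTER DATABASE My_Database SET RECOVERY BULK_LOGGED;
-- ... drop/disable the indexes and run the bcp load here ...
ALTER DATABASE My_Database SET RECOVERY FULL;

Take a log or full backup once you switch back. On SQL Server 2000 the rough equivalent is sp_dboption 'My_Database', 'select into/bulkcopy', 'true'.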