March 30, 2022 at 7:29 am
Hi All,
I have a huge table with 200 GB of data space and 450 GB of index space; the compression type is columnstore.
Currently, I am using a PowerShell script with a SqlBulkCopy object to copy the data with a batch size of 10,000.
As the target table grows, the copy is getting slower and slower; it currently takes almost 3 hours to copy approx. 6,000,000 rows.
Could you please suggest better options, if any?
Thank you.
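For reference, here is a trimmed-down sketch of the script. Server, database, and table names below are placeholders, not the real ones:

```powershell
# Minimal sketch only -- connection strings and table names are placeholders.
# Uses System.Data.SqlClient, which Windows PowerShell loads by default.
$srcConnStr = "Server=SourceServer;Database=SourceDb;Integrated Security=True"
$dstConnStr = "Server=TargetServer;Database=TargetDb;Integrated Security=True"

$srcConn = New-Object System.Data.SqlClient.SqlConnection($srcConnStr)
$srcConn.Open()

$cmd = $srcConn.CreateCommand()
$cmd.CommandText    = "SELECT * FROM dbo.BigTable"  # placeholder source table
$cmd.CommandTimeout = 0                             # no timeout on the long read
$reader = $cmd.ExecuteReader()

$bulkCopy = New-Object System.Data.SqlClient.SqlBulkCopy($dstConnStr)
$bulkCopy.DestinationTableName = "dbo.BigTable"
$bulkCopy.BatchSize            = 10000   # current batch size
$bulkCopy.BulkCopyTimeout      = 0       # no timeout on the copy

$bulkCopy.WriteToServer($reader)

$reader.Close()
$srcConn.Close()
```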
March 30, 2022 at 1:33 pm
Try batching at 102,400 rows. The reason I say that is that a columnstore index routes smaller batches through the delta store; only batches of at least 102,400 rows get compressed directly into rowgroups. You can read more about it here.
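Roughly, the change looks like this against a script like the one above (untested sketch; names are still placeholders, and the rowgroup DMV needs SQL Server 2016 or later):

```powershell
# Batches of 102,400+ rows bypass the delta store and compress immediately.
$bulkCopy.BatchSize       = 102400
$bulkCopy.EnableStreaming = $true   # stream rows from the reader instead of buffering

# After the load, check rowgroup states on the target (placeholder table name).
# OPEN rowgroups are delta store; COMPRESSED rowgroups took the direct path.
$checkSql = @"
SELECT state_desc, COUNT(*) AS rowgroup_count, SUM(total_rows) AS total_rows
FROM sys.dm_db_column_store_row_group_physical_stats
WHERE object_id = OBJECT_ID(N'dbo.BigTable')
GROUP BY state_desc;
"@
# Assumes the SqlServer module's Invoke-Sqlcmd is available.
Invoke-Sqlcmd -ServerInstance "TargetServer" -Database "TargetDb" -Query $checkSql
```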
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
- Theodore Roosevelt
Author of:
SQL Server Execution Plans
SQL Server Query Performance Tuning
April 1, 2022 at 4:16 pm
How many indexes, and of what kind, do you have on the target table? Also, what is the total number of rows in the source table?
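Something like this would pull those numbers (again with placeholder names, and assuming the SqlServer module's Invoke-Sqlcmd is available):

```powershell
# Untested sketch -- server, database, and table names are placeholders.

# Indexes on the target table.
$indexSql = @"
SELECT i.name, i.type_desc
FROM sys.indexes AS i
WHERE i.object_id = OBJECT_ID(N'dbo.BigTable');
"@
Invoke-Sqlcmd -ServerInstance "TargetServer" -Database "TargetDb" -Query $indexSql

# Total rows in the source table.
$countSql = "SELECT COUNT_BIG(*) AS total_rows FROM dbo.BigTable;"
Invoke-Sqlcmd -ServerInstance "SourceServer" -Database "SourceDb" -Query $countSql
```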
--Jeff Moden
Change is inevitable... Change for the better is not.
April 7, 2022 at 3:25 pm
Heh.... must still be waiting for the first transfer to complete, eh? 😀
--Jeff Moden
Change is inevitable... Change for the better is not.