July 4, 2017 at 4:59 pm
I have a fairly simple data transfer from one table to another to de-duplicate about 3 billion rows of data. Because of how long it runs, I would like to have a recovery point in this process, for instance once I'm done with a year of data (that's the easiest breakpoint I have found). I would also like the ability to stop it and then restart it at a different time of day (off-peak hours). My best estimate is that it will take about 36 straight hours, and I would rather not use a cursor to accomplish this task.
Any thoughts or direction someone could offer?
July 5, 2017 at 12:50 pm
Is it possible to break this process up into batches? You could run one batch at a time to control how much data is processed in each run, and each completed batch becomes a natural restart point.
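A minimal sketch of how that might look, assuming SQL Server 2012 or later (for DATEFROMPARTS). All of the object names here (dbo.SourceRows, dbo.TargetRows, dbo.DedupeCheckpoint, DateCol) are placeholders for your own schema. Each year is committed in its own transaction, and the checkpoint row is updated inside that same transaction, so you can kill the job between batches and restart it during off-peak hours without losing any completed work:

```sql
-- One-time setup for the checkpoint table (hypothetical names):
-- CREATE TABLE dbo.DedupeCheckpoint (LastYearDone int NOT NULL);
-- INSERT INTO dbo.DedupeCheckpoint (LastYearDone) VALUES (2009); -- year before your first year of data

DECLARE @Year int;

-- Resume from the year after the last one that completed.
SELECT @Year = LastYearDone + 1
FROM dbo.DedupeCheckpoint;

WHILE @Year <= YEAR(GETDATE())
BEGIN
    BEGIN TRANSACTION;

    -- Copy one year's worth of de-duplicated rows.
    INSERT INTO dbo.TargetRows (Col1, Col2, DateCol)
    SELECT DISTINCT Col1, Col2, DateCol
    FROM dbo.SourceRows
    WHERE DateCol >= DATEFROMPARTS(@Year, 1, 1)
      AND DateCol <  DATEFROMPARTS(@Year + 1, 1, 1);

    -- Record the recovery point in the same transaction as the copy,
    -- so the checkpoint and the data can never get out of sync.
    UPDATE dbo.DedupeCheckpoint
    SET LastYearDone = @Year;

    COMMIT TRANSACTION;

    SET @Year += 1;
END;
```

You could also drop the WHILE loop and process a single year per execution, then schedule the script as an Agent job step that only runs during your off-peak window.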
Tim Mitchell
TimMitchell.net | @tmitch.net | Tyleris.com
ETL Best Practices