May 29, 2018 at 2:30 pm
I have 6 raw tables with around 200 million records each. I have to:
1. Aggregate data older than 6 months and send it to an archive table.
2. Delete the data after 2 years (see the batched-delete sketch below).
I have created the aggregate table script and the aggregation script. Please suggest the next steps.
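A minimal sketch of the two-year purge, assuming a hypothetical dbo.Archive table with a TransactionDate column; deleting in batches like this keeps each transaction small so the log does not balloon:

-- Batched-purge sketch. Table name, date column, and batch size are
-- assumptions; adjust to the real schema.
DECLARE @rows INT = 1;

WHILE @rows > 0
BEGIN
    DELETE TOP (50000)
    FROM dbo.Archive
    WHERE TransactionDate < DATEADD(YEAR, -2, GETDATE());

    SET @rows = @@ROWCOUNT;
END;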
May 29, 2018 at 2:39 pm
saptek9 - Tuesday, May 29, 2018 2:30 PM
I want to use SSIS to do this so that I can keep track of the inserted timestamp.
May 29, 2018 at 6:17 pm
saptek9 - Tuesday, May 29, 2018 2:30 PM: Please suggest the next steps.
Post more information so we can actually help. The table structure in the form of a CREATE TABLE statement would be a good help. Adding an "Inserted Date" column to the archive tables is a total waste of drive and memory space. The data in the tables to be archived should already contain a temporal column that makes it obvious why the rows were archived and approximately when.
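For example, a single DELETE with an OUTPUT clause can move the old rows in one atomic statement, using that existing temporal column as the archiving criterion. This is only a sketch; dbo.RawSales, dbo.RawSales_Archive, and TransactionDate are assumed names, and the archive table must match the source's column layout for DELETED.* to work:

-- Atomic "move" based on an existing temporal column. All object and
-- column names here are placeholder assumptions.
DELETE src
OUTPUT DELETED.*
    INTO dbo.RawSales_Archive
FROM dbo.RawSales AS src
WHERE src.TransactionDate < DATEADD(MONTH, -6, GETDATE());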
200 million rows means nothing. What we need to know is how many bytes are contained by the table and the related indexes.
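Something along these lines will report rows and reserved space (data plus indexes) per table from the DMVs; the names in the WHERE clause are placeholders for the six raw tables:

-- Rows and reserved space per table, including all indexes.
SELECT  t.name AS TableName,
        SUM(CASE WHEN ps.index_id IN (0, 1) THEN ps.row_count ELSE 0 END) AS TotalRows,
        SUM(ps.reserved_page_count) * 8 / 1024 AS ReservedMB
FROM    sys.dm_db_partition_stats AS ps
JOIN    sys.tables AS t
        ON t.object_id = ps.object_id
WHERE   t.name IN (N'RawTable1', N'RawTable2')  -- placeholder names
GROUP BY t.name
ORDER BY ReservedMB DESC;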
In the absence of any other data, my suggestion would be to look into partitioning the data along with a heavy dose of compression. I personally prefer "Partitioned Views" over "Partitioned Tables" for many reasons. Both have some serious advantages over monolithic archive structures, but I think "Partitioned Views" extend themselves a bit more effectively than "Partitioned Tables" do if they're done right.
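As a rough illustration of the partitioned-view shape (all table names, columns, and years below are made up): one table per year, CHECK-constrained on the date column so the optimizer can eliminate the yearly tables a query doesn't need, page compression on the closed-off year, and a UNION ALL view on top:

-- Older, static year: page compressed.
CREATE TABLE dbo.Sales_2017
(
    SaleID          INT           NOT NULL,
    TransactionDate DATE          NOT NULL
        CONSTRAINT CK_Sales_2017_Date
        CHECK (TransactionDate >= '20170101' AND TransactionDate < '20180101'),
    Amount          DECIMAL(19,4) NOT NULL,
    CONSTRAINT PK_Sales_2017 PRIMARY KEY CLUSTERED (SaleID, TransactionDate)
)
WITH (DATA_COMPRESSION = PAGE);
GO
-- Current year: left uncompressed while it's still being written to.
CREATE TABLE dbo.Sales_2018
(
    SaleID          INT           NOT NULL,
    TransactionDate DATE          NOT NULL
        CONSTRAINT CK_Sales_2018_Date
        CHECK (TransactionDate >= '20180101' AND TransactionDate < '20190101'),
    Amount          DECIMAL(19,4) NOT NULL,
    CONSTRAINT PK_Sales_2018 PRIMARY KEY CLUSTERED (SaleID, TransactionDate)
);
GO
-- The partitioned view itself; the CHECK constraints let queries that
-- filter on TransactionDate touch only the relevant yearly table.
CREATE VIEW dbo.Sales
AS
SELECT SaleID, TransactionDate, Amount FROM dbo.Sales_2017
UNION ALL
SELECT SaleID, TransactionDate, Amount FROM dbo.Sales_2018;
GO

Note that the partitioning column (TransactionDate here) has to be part of the primary key and carry the CHECK constraint for the view to be updatable and for partition elimination to work.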
And, no... I just can't imagine using SSIS for this.
--Jeff Moden
Change is inevitable... Change for the better is not.