December 16, 2016 at 7:13 am
Hi all,
How can I copy a table with a specific amount of data?
Example:
myTable has 5 million rows.
I will create myNewTable as a copy of myTable but with only 1 million rows.
myNewTable will replace myTable using a table rename, and myTable will be kept as history data for research.
The table needs to be available all the time.
How can I achieve this with minimal downtime?
Thanks in advance.
December 16, 2016 at 7:18 am
How active is your table going to be? Is it going to be in constant use? Are rows likely to be added to your existing dataset while you're trying to do this?
If so, would it not be a better idea to insert the entire dataset into another table, and then delete the rows that are no longer applicable from your source table?
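Something along these lines, perhaps (a rough sketch only; the CreatedDate column, the cutoff date, and the myHistoryTable name are assumptions, since we haven't seen your DDL):

-- Copy the full dataset off to a separate table for history.
SELECT *
INTO   dbo.myHistoryTable            -- hypothetical archive name
FROM   dbo.myTable;

-- Then remove the rows that are no longer applicable from the live table.
DELETE FROM dbo.myTable
WHERE  CreatedDate < '2016-01-01';   -- hypothetical cutoff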
Thom~
Excuse my typos and sometimes awful grammar. My fingers work faster than my brain does.
Larnu.uk
December 16, 2016 at 7:26 am
It has to be active all the time. We are doing the deletion right now and it is taking forever, which affects performance, logs, mirroring, etc.
December 16, 2016 at 1:04 pm
paul.s.vidal (12/16/2016)
It has to be active all the time. We are doing the deletion right now and it is taking forever, which affects performance, logs, mirroring, etc.
Is the DELETE able to make use of an index? You might also want to perform it in batches of, say, anywhere from 100 rows to perhaps 10,000 rows, depending on just how much of the delete can happen without impacting performance. You could set up a loop that keeps going in batches of N deletes and waits some number of seconds or minutes between each batch. A sketch follows below.
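A minimal sketch of that kind of loop, assuming a CreatedDate column and cutoff date (both hypothetical, since we haven't seen the table) drive which rows get removed:

DECLARE @BatchSize   INT = 5000;   -- tune anywhere from ~100 to ~10,000
DECLARE @RowsDeleted INT = 1;

WHILE @RowsDeleted > 0
BEGIN
    DELETE TOP (@BatchSize)
    FROM   dbo.myTable
    WHERE  CreatedDate < '2016-01-01';   -- hypothetical cutoff

    SET @RowsDeleted = @@ROWCOUNT;

    -- Breathe between batches so the log and mirroring can keep up.
    WAITFOR DELAY '00:00:05';
END;

An index on whatever column the WHERE clause uses is what keeps each batch cheap.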
Steve (aka sgmunson) 🙂 🙂 🙂
Rent Servers for Income (picks and shovels strategy)
December 16, 2016 at 2:41 pm
paul.s.vidal (12/16/2016)
Hi all, how can I copy a table with a specific amount of data?
Example:
myTable has 5 million rows.
I will create myNewTable as a copy of myTable but with only 1 million rows.
myNewTable will replace myTable using a table rename, and myTable will be kept as history data for research.
The table needs to be available all the time.
How can I achieve this with minimal downtime?
Thanks in advance.
Could you post the CREATE TABLE statement and the Constraints/Indexes for this table? I think I might be able to help a bit after work tonight or maybe tomorrow.
--Jeff Moden
Change is inevitable... Change for the better is not.
December 16, 2016 at 9:26 pm
Paul, I think you've outlined the steps pretty well. Just don't forget to rename the original 5M-row table to its new name before renaming the new table to the original table name. As Jeff asked, the next step is to look at the DDL for the base table.
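For the rename itself, a minimal sketch (the myTable_History name is just an example), assuming myNewTable already holds the 1 million rows you want to keep:

BEGIN TRANSACTION;

    -- Move the original 5M-row table out of the way first...
    EXEC sp_rename 'dbo.myTable', 'myTable_History';

    -- ...then give the trimmed copy the original name.
    EXEC sp_rename 'dbo.myNewTable', 'myTable';

COMMIT TRANSACTION;

The only real downtime is the moment the renames take their schema locks.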
December 18, 2016 at 9:53 am
I had a similar problem and used an archive table for records that aren't currently active.
On a regular basis you can copy the rows from MyTable into ArchiveTable and then delete the appropriate rows from MyTable.
Remember to use small batches for the copy and delete actions to prevent unnecessary performance loss.
As a side note, don't forget a cleanup process on the archive table to prevent it bloating beyond usefulness.
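If it helps, one way to sketch the copy-and-delete as a single pass is a DELETE with an OUTPUT clause (the ArchiveTable structure and the CreatedDate cutoff are assumptions here):

DECLARE @BatchSize INT = 2000;

WHILE 1 = 1
BEGIN
    DELETE TOP (@BatchSize)
    FROM   dbo.MyTable
    OUTPUT deleted.*              -- the removed rows...
    INTO   dbo.ArchiveTable       -- ...go straight into the archive
    WHERE  CreatedDate < '2016-01-01';

    IF @@ROWCOUNT = 0 BREAK;      -- nothing left to move

    WAITFOR DELAY '00:00:02';     -- small pause between batches
END;

That way a row can't be deleted without also landing in the archive.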
December 19, 2016 at 1:46 pm
This is exactly what we are doing with a stored procedure: deleting in batches of 1,500 rows, looping 5 times, and then doing it again when traffic is slower, and so on.
December 20, 2016 at 12:24 am
Why can't you create a new table, insert all records into it, and delete the unwanted data from the original table? That will make sure the table is available all the time. Is there any trigger or FK relationship limiting data deletion from the parent table?
December 20, 2016 at 1:01 am
Rechana Rajan (12/20/2016)
Why can't you create a new table, insert all records into it, and delete the unwanted data from the original table? That will make sure the table is available all the time. Is there any trigger or FK relationship limiting data deletion from the parent table?
This is a solution that has been suggested here several times...