April 7, 2005 at 9:46 am
Scenario: a large data mart used for reporting needs to be updated nightly while minimizing downtime. The data comes from one massive extract as a text file via FTP. SQL Server 2000.
The question is: what is the best way to replace the old data with the new data with minimal downtime? If there is any downtime, it is preferred that the effect be a delayed query execution rather than an error such as “table does not exist.”
Some ideas are:
1. Create a copy of the existing table, fill the copy with the new data, drop the original table, then rename the copy (a sketch of this is below).
2. Fill a copy of the main table with the new data, identify any rows that are different in the main table, then update, insert, or delete as needed.
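For option 1, I picture something along these lines, assuming a live reporting table named dbo.FactSales and a staging copy dbo.FactSales_New that has already been loaded with the extract (the names are just placeholders):

-- Swap the freshly loaded copy in for the live table by renaming.
-- Queries arriving during the swap should generally block briefly rather than fail.
BEGIN TRANSACTION
    EXEC sp_rename 'dbo.FactSales', 'FactSales_Old'
    EXEC sp_rename 'dbo.FactSales_New', 'FactSales'
COMMIT TRANSACTION
-- Once the swap has committed, yesterday's data can be discarded.
DROP TABLE dbo.FactSales_Old

One catch is that permissions are attached to the object rather than the name, so grants on the old table have to be re-applied to the new one, which is part of what makes rename-based scripts complex.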
What are some other options? What is the best option?
Thank you,
Mike Chabot
April 8, 2005 at 7:49 am
I have done the following to decrease downtime while performing daily catalog rebuilds.
-Build duplicate temp table(s) matching the original live data table(s).
-Insert all of the new data into the temp table(s).
-Truncate the live data table(s).
-Insert the temp data into the live data table(s).
-Truncate the temp data table(s).
This way you will not have to worry about constraints, indexes, or dependencies getting broken, as you would if you dropped your live data table and renamed the new table. If your live tables contain a large number of indexes, it is often faster to drop the indexes, perform the insert, and then rebuild the indexes after the mass insert is completed.
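In rough T-SQL, assuming a live table named dbo.Catalog and a matching temp table dbo.Catalog_Temp (substitute your own table names), the steps look something like this:

-- Load the nightly extract into the temp table (via DTS, BULK INSERT, etc.)
-- before the swap window begins.

BEGIN TRANSACTION
    -- Empty the live table, then reload it from the staged data.
    TRUNCATE TABLE dbo.Catalog
    INSERT INTO dbo.Catalog SELECT * FROM dbo.Catalog_Temp
COMMIT TRANSACTION

-- Clear the temp table until the next run.
TRUNCATE TABLE dbo.Catalog_Temp

If the live table carries many indexes, the nonclustered indexes can be dropped before the INSERT and rebuilt afterward, as noted above.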
-Mike Gercevich
April 8, 2005 at 11:40 am
I personally like the idea of truncating the table and then copying over the data. It is clean, but for large tables it is much slower than renaming a table. It can take up to a minute to copy the data from one large table to another using DTS.
The scripts currently in use follow the table-renaming approach. Unfortunately, the scripts are very complex because of all the views, permissions, and indexes that have to be maintained. The person maintaining the scripts creates a brand new table, populates it, drops the old table, then renames the new table. I am looking for a method of copying data that is very fast and does not involve a lot of scripting of permissions and views.
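One thing I may look into, since the extract arrives as a text file, is loading the staging table directly with BULK INSERT instead of copying table-to-table through DTS. A rough sketch, assuming a tab-delimited extract at a made-up path and an empty staging table named dbo.FactSales_Staging (the real names and path would differ):

-- File path, table name, and terminators are hypothetical; match them to the real extract.
BULK INSERT dbo.FactSales_Staging
FROM 'D:\ftp\nightly_extract.txt'
WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n', TABLOCK)

Loading into an empty, unindexed staging table with TABLOCK under the bulk-logged or simple recovery model keeps logging to a minimum, and because only the staging table is touched, none of the live table's views or permissions need to be scripted until the final swap or reload.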