June 11, 2009 at 9:19 am
I have inherited a system where a website's tables occasionally get updated with new (or changed) data from a remote server.
It uses DTS to DELETE FROM each table on the destination (SQL Server on the web host) and then copy the columns over from the source (an office server on the other side of the internet). This creates a window in which users of the website get errors while the table data is still crawling across the internet.
Bad situation.
So I wonder if there is a good DTS solution where a temp table is fully uploaded to the web host (destination) before the live table data gets replaced. I thought that caching all the tables with server-side code might be OK (since the database isn't very large), but that isn't necessarily the best long-term solution. I am already caching the most commonly used data with server-side code. I just wondered if there is a simple DTS way to seamlessly update the live tables.
In case it matters, the source database is SQL Server 2000 Enterprise, while the destination is SQL Server 2005 Express.
June 11, 2009 at 10:12 am
This is commonly done by importing data from the remote source into a staging table (we have a staging database containing permanent tables) in the destination instance, then truncating or deleting from the destination table and inserting the new data from the staging table.
Most of the packages I've created that do this have the following steps/tasks (a rough SQL sketch follows the list):
1. Execute SQL task truncates staging table
2. Transform Data task inserts source data into staging table
3. Execute SQL task truncates destination table
4. Transform Data task inserts data from staging table to destination table
5. Execute SQL task truncates staging table
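A minimal T-SQL sketch of those steps, using hypothetical table names (WebSite.dbo.Orders for the live table, Staging.dbo.Orders for the staging table); in the actual package, the data movement in steps 2 and 4 would be Transform Data tasks rather than plain statements:

-- Step 1: clear the staging table before the load
TRUNCATE TABLE Staging.dbo.Orders;

-- Step 2: Transform Data task copies the source rows across the
-- internet into Staging.dbo.Orders (the slow part happens here,
-- out of sight of the website's users).

-- Steps 3 and 4: replace the live data locally; wrapping both in one
-- transaction keeps the window where users could see an empty table
-- as small as possible.
BEGIN TRANSACTION;
    TRUNCATE TABLE WebSite.dbo.Orders;
    INSERT INTO WebSite.dbo.Orders (OrderID, CustomerID, OrderDate)
        SELECT OrderID, CustomerID, OrderDate
        FROM Staging.dbo.Orders;
COMMIT TRANSACTION;

-- Step 5: clear the staging table again so it is not left holding
-- stale data between runs
TRUNCATE TABLE Staging.dbo.Orders;

Note that TRUNCATE TABLE won't work on a table referenced by a foreign key; in that case DELETE FROM is the fallback, as mentioned above.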
Greg
June 12, 2009 at 11:20 am
Thank you very much; I'll be sure to do that.