November 18, 2003 at 7:41 am
Hi,
We have 3 servers called A,B,C
We need to migrate data from A to B, and from B to C. There are no primary keys on the tables, so I have to delete the data and then re-import it. Currently I have a DTS package (on server B) that does the following:
1. Delete from all tables on B
2. Import data from A to B
3. Delete from all tables on C
4. Export data from B to C
It is working fine, but it takes a long time and is hard to maintain.
I want to do it in one transaction. Is there a better way? Any ideas, please.
Thanks.
November 18, 2003 at 8:44 am
Try creating a DTS package with two Copy SQL Server Objects tasks:
#1 to copy from A to B
#2 to copy from B to C
You can specify in the tasks to drop objects before moving.
November 18, 2003 at 9:12 am
Server A is not a SQL Server; B and C are SQL 2000.
I access A from B via a DSN.
C is linked on B.
Thanks
November 18, 2003 at 4:03 pm
Can you export the data from Server A to flat files? If so, copying the files across to Server B and using BULK INSERT to load the data will give you the best performance.
Using the Copy SQL Server Objects task to transfer the data to Server C is also a good option. However, if the data on Server C is an identical copy of the data loaded onto Server B, you can use BULK INSERT to load it from the same files. This also allows you to load the data in parallel.
eg.
load table1 on B
load table2 on B and table1 on C
load table3 on B and table2 on C
etc...
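A minimal sketch of the BULK INSERT step, assuming a delimited extract; the file path, table name, and terminators are placeholders and must match however the data actually gets exported:

```sql
-- Load a delimited flat-file extract into table1 on Server B.
-- \\fileshare\extracts\table1.txt is a hypothetical path to the file
-- produced from Server A; the terminators must match that file's format.
BULK INSERT dbo.table1
FROM '\\fileshare\extracts\table1.txt'
WITH (
    FIELDTERMINATOR = '|',   -- column delimiter in the extract
    ROWTERMINATOR   = '\n',  -- one row per line
    TABLOCK                  -- table lock allows a faster, minimally logged load
)
```

The same file can then be loaded on Server C as well, which is what makes the staggered parallel schedule above possible.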
Hope this helps
Phill Carter
--------------------
Colt 45 - the original point and click interface
November 19, 2003 at 7:40 am
Phill,
I can't export the data to flat files because server A is not a SQL box, and I only have access through a DSN.
Also, Server B and Server C are on two different networks.
Thanks.
November 19, 2003 at 1:31 pm
One reason BCP is so fast is that there is no transaction overhead. If dumping the data from server A to a flat file for importing via BCP is not an option, then perhaps you could look at your DTS packages and tinker with how they use transactions. In a closed environment (no users to block), wrapping a process with 5 million plus data actions in a single transaction can speed it up by a large amount: far fewer log flushes. While I have practical experience with this in SQL, I have never touched the transaction settings in DTS; it may already be doing all it can.
What is the number and size of the rows you are dealing with? Just a general idea would give me a better feel for how much of the time involved is not necessary.
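In plain T-SQL terms, the single-transaction idea looks something like this; the table names and the linked-server path `LinkedA.SourceDb.dbo` are placeholders, and DTS would need its own transaction settings to get the same effect:

```sql
-- Reload two tables as a single unit of work: if anything fails,
-- everything rolls back and the targets are never left half-deleted.
DECLARE @err int
SET @err = 0

BEGIN TRANSACTION

DELETE FROM dbo.table1
SET @err = @err + @@ERROR          -- @@ERROR resets after every statement, so capture it immediately

INSERT INTO dbo.table1 SELECT * FROM LinkedA.SourceDb.dbo.table1
SET @err = @err + @@ERROR

DELETE FROM dbo.table2
SET @err = @err + @@ERROR

INSERT INTO dbo.table2 SELECT * FROM LinkedA.SourceDb.dbo.table2
SET @err = @err + @@ERROR

IF @err = 0
    COMMIT TRANSACTION
ELSE
    ROLLBACK TRANSACTION
```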
November 19, 2003 at 1:39 pm
You've probably already thought of it, but: INSERT INTO is slower than SELECT INTO, TRUNCATE is faster than DELETE, and bulk inserting into tables with indexes already defined is slow, especially if you have run this process against the same objects over and over and the fragmentation/page splits are way up there.
I have a process that generates a large amount of data every 30 minutes. When I insert the initial data I remove all the indexes (in some cases I drop the tables entirely), then add them back when it's done. Ugly, but it helps.
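A sketch of that drop-then-recreate pattern, with a hypothetical table and index name (SQL 2000 `DROP INDEX` syntax):

```sql
-- Drop the nonclustered index before the bulk load so each insert
-- doesn't have to maintain it, then rebuild it once the data is in.
DROP INDEX dbo.orders.IX_orders_custid

-- ... perform the bulk load into dbo.orders here ...

CREATE NONCLUSTERED INDEX IX_orders_custid
    ON dbo.orders (customer_id)
```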
November 19, 2003 at 3:15 pm
Hi cmore,
Q. What is the number and size of the rows you are dealing with? Just a general idea would give me a better feel for how much of the time involved is not necessary.
A. There are 25 tables, each with around 2,000 records.
What's happening is that sometimes, after the records have been deleted, the import from A fails. That's why I want to do it in one transaction.
Thanks.
November 19, 2003 at 4:03 pm
Are these 25 tables all being imported in one DTS package?
What is the DBMS on Server A?
Hope this helps
Phill Carter
--------------------
Colt 45 - the original point and click interface
November 19, 2003 at 4:21 pm
Yes, all 25 tables are imported in the DTS package. The DBMS on A is Informix.
November 19, 2003 at 4:23 pm
Is the import ONE DTS package, or multiple packages?
Hope this helps
Phill Carter
--------------------
Colt 45 - the original point and click interface
November 20, 2003 at 7:28 am
One DTS package.
November 20, 2003 at 3:14 pm
To make it more manageable, break your package up into discrete sets of tables, i.e. in each package only include tables that are dependent on other tables in that package. Once these packages are set up, you can run them from a master package that executes all the others. This way you can manage the flow of data into the database a bit better and have the overall process complete more quickly. For example, if you have a table containing data for product orders and another table with data for tracking website visits, both tables can be loaded in parallel because they don't relate to each other.
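One way to sketch the master-package idea, assuming the per-group sub-packages already exist on Server B; the package names here are hypothetical, and `/E` tells DTSRun to use a trusted connection:

```sql
-- Kick off each group's package from a controlling script or job step.
-- In a real master DTS package you would use Execute Package tasks instead;
-- this shows the equivalent via the DTSRun command-line utility.
EXEC master.dbo.xp_cmdshell 'DTSRun /S ServerB /N "Load_Group1_Orders" /E'
EXEC master.dbo.xp_cmdshell 'DTSRun /S ServerB /N "Load_Group2_WebVisits" /E'
```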
Hope this helps
Phill Carter
--------------------
Colt 45 - the original point and click interface
November 21, 2003 at 8:00 am
Can I do something like this?
SET QUOTED_IDENTIFIER ON
GO
SET ANSI_NULLS ON
GO
CREATE PROCEDURE EXPORT_IMPORT_DATA (@direction VARCHAR(20))
AS
DECLARE @Table_Name sysname
DECLARE @sql VARCHAR(500)
DECLARE @err int

DECLARE tablename_cursor CURSOR FOR
    SELECT name FROM sysobjects WHERE xtype = 'U'
OPEN tablename_cursor
FETCH NEXT FROM tablename_cursor INTO @Table_Name
WHILE @@FETCH_STATUS = 0
BEGIN
    SET @err = 0
    BEGIN TRANSACTION
    IF @direction = 'IMPORT'
    BEGIN
        -- TRUNCATE requires that no foreign keys reference the table;
        -- fall back to DELETE FROM if that is not the case
        SET @sql = 'TRUNCATE TABLE ' + @Table_Name
        EXEC (@sql)
        SET @err = @err + @@ERROR
        -- leading space before SELECT keeps the concatenated SQL valid
        SET @sql = 'INSERT INTO ' + @Table_Name + ' SELECT * FROM <linkserver>..' + @Table_Name
        EXEC (@sql)
        SET @err = @err + @@ERROR
    END
    ELSE
    BEGIN
        -- TRUNCATE TABLE cannot target a linked-server table,
        -- so the remote side is cleared with DELETE instead
        SET @sql = 'DELETE FROM <linkserver>..' + @Table_Name
        EXEC (@sql)
        SET @err = @err + @@ERROR
        SET @sql = 'INSERT INTO <linkserver>..' + @Table_Name + ' SELECT * FROM ' + @Table_Name
        EXEC (@sql)
        SET @err = @err + @@ERROR
    END
    -- @@ERROR resets after every statement, so it is captured
    -- above after each EXEC rather than checked only once here
    IF @err > 0
        ROLLBACK TRANSACTION
    ELSE
        COMMIT TRANSACTION
    FETCH NEXT FROM tablename_cursor INTO @Table_Name
END
CLOSE tablename_cursor
DEALLOCATE tablename_cursor
GO
SET QUOTED_IDENTIFIER OFF
GO
SET ANSI_NULLS ON
GO
November 23, 2003 at 3:27 pm
Yes, you can. It won't perform as quickly as a Copy SQL Server Objects task in DTS, and you are also tied to transferring one table at a time. If you group tables together as I mentioned in my previous post, you can load them in parallel.
Hope this helps
Phill Carter
--------------------
Colt 45 - the original point and click interface