August 18, 2010 at 8:04 am
We are looking for a low-cost tool which will compare the data contents of a set of tables between two servers, and generate a SQL script which will bring the target database into line with the source database.
Please note that we are looking to sync the DATA, not DDL.
I know that there's a team edition of Visual Studio which will do this, but the cost of this solution is prohibitively high for us. Can anyone recommend a reasonably-priced alternative?
August 18, 2010 at 8:22 am
TABLEDIFF - Comes free with SQL Server. Check it out in Books Online.
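For a single table the call looks roughly like this (the server, database, table, and path names here are placeholders, and the parameter names should be double-checked in Books Online):

"C:\Program Files\Microsoft SQL Server\90\COM\tablediff.exe" -sourceserver SRCSERVER -sourcedatabase SourceDB -sourceschema dbo -sourcetable YourTable -destinationserver DESTSERVER -destinationdatabase DestDB -destinationschema dbo -destinationtable YourTable -f C:\Sync\YourTable_fix.sql

The -f switch is what writes out a T-SQL script you can run against the destination to bring it into line with the source.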
David
@SQLTentmaker
"He is no fool who gives what he cannot keep to gain that which he cannot lose" - Jim Elliot
August 18, 2010 at 8:28 am
You could create something yourself based on the sys tables
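Just to illustrate the idea, something like this (a rough sketch only - it assumes you run the generated statements against both databases and compare the output, and checksum-based comparisons can occasionally miss a change, so treat it as a first-pass filter):

-- Generate one checksum statement per user table from sys.tables
SELECT 'SELECT ''' + t.name + ''' AS table_name, '
     + 'CHECKSUM_AGG(BINARY_CHECKSUM(*)) AS table_checksum FROM '
     + QUOTENAME(SCHEMA_NAME(t.schema_id)) + '.' + QUOTENAME(t.name) AS compare_sql
FROM sys.tables AS t
ORDER BY t.name;
-- Run the generated statements on source and target and compare the checksums;
-- only tables whose checksums differ need a row-level comparison and sync script.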
August 18, 2010 at 8:32 am
vishal.gamji (8/18/2010)
You could create something yourself based on the sys tables
Sounds painful. There are companies out there that make good money doing this, so I'm not sure I'd want to spend my time recreating something that already exists and works well.
Red Gate, Idera, and ApexSQL all have tools for this, at pretty reasonable prices given how robust they are.
TableDiff, as I mentioned earlier, is free and strictly command-line based, but it works great for the price. Put a little bit of batch file around it and you have a tool that can compare multiple tables pretty quickly.
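Something along these lines, for instance (a sketch only - the server names, database names, table list, and install path are all placeholders, and the parameter names should be verified against Books Online):

@echo off
rem Compare a list of tables and write one T-SQL fix script per table
set TD="C:\Program Files\Microsoft SQL Server\90\COM\tablediff.exe"
for %%T in (CUSTOMER ORDERS ORDER_LINE) do (
  %TD% -sourceserver SRCSERVER -sourcedatabase SourceDB -sourceschema dbo -sourcetable %%T -destinationserver DESTSERVER -destinationdatabase DestDB -destinationschema dbo -destinationtable %%T -f C:\Sync\Fix_%%T.sql
)

Each Fix_*.sql file then contains the statements needed to bring that table on the destination into line with the source.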
David
@SQLTentmaker
"He is no fool who gives what he cannot keep to gain that which he cannot lose" - Jim Elliot
August 18, 2010 at 8:51 am
What volume of data are you talking about...
number of tables / rows?
________________________________________________________________
you can lead a user to data....but you cannot make them think
and remember....every day is a school day
August 18, 2010 at 9:00 am
I agree with David that TableDiff can work. It's a little work on your side, but not a crazy amount. The Red Gate and ApexSQL tools are low cost, well tested, and make it easy to compare data between two tables.
August 18, 2010 at 10:48 am
Thanks for the replies. We tried Tablediff but got an error in the generated update script which we couldn't understand (see below). The table in question had no "large data types" - only a varchar(500) column - see the full table definition at the bottom of this message.
-- Host: oursourceserver
-- Database: [our_database]
-- Table: [dbo].[SYSTEM_DM_DOCUMENT]
SET IDENTITY_INSERT [dbo].[SYSTEM_DM_DOCUMENT] ON
-- unable to create UPDATE for record with only large data types
-- [DOCUMENT_ID] = 127
INSERT INTO [dbo].[SYSTEM_DM_DOCUMENT] ([DOCUMENT_DESCRIPTION],[DOCUMENT_HISTORY],[DOCUMENT_ID],[DOCUMENT_NAME],[DOCUMENT_VERSION_ID],[EFFECTIVE_DATE],[EXPIRY_DATE]) VALUES ('PromotionTest',0,211,'PromotionTest',1,'2010-08-17 12:14:00.000','2010-08-18 14:24:00.000')
SET IDENTITY_INSERT [dbo].[SYSTEM_DM_DOCUMENT] OFF
The table definition:
CREATE TABLE [dbo].[SYSTEM_DM_DOCUMENT](
[DOCUMENT_ID] [int] IDENTITY(1,1) NOT NULL,
[DOCUMENT_DESCRIPTION] [varchar](500) NOT NULL CONSTRAINT [DF_SYSTEM_DM_DOCUMENT_DOC_DEBU] DEFAULT (' '),
[DOCUMENT_VERSION_ID] [int] NOT NULL CONSTRAINT [DF_SYSTEM_DM_DOC_DOC_VERS_ID] DEFAULT (1),
[EFFECTIVE_DATE] [datetime] NOT NULL CONSTRAINT [DF_SYSTEM_DM_DOC_EFFE_DATE] DEFAULT (1 / 1 / 1900),
[EXPIRY_DATE] [datetime] NOT NULL CONSTRAINT [DF_SYSTEM_DM_DOC_EXPI_DATE] DEFAULT (1 / 1 / 1900),
[DOCUMENT_NAME] [varchar](200) NOT NULL CONSTRAINT [DF_SYSTEM_DM_DOCUMENT_DOC_NAME] DEFAULT (' '),
[DOCUMENT_HISTORY] [bit] NOT NULL CONSTRAINT [DF_SYSTEM_DM_DOCUMENT_DOC_HIST] DEFAULT (0),
CONSTRAINT [PK_SDD] PRIMARY KEY CLUSTERED
(
[DOCUMENT_ID] ASC,
[DOCUMENT_VERSION_ID] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON, FILLFACTOR = 90) ON [PRIMARY]
) ON [PRIMARY]
August 18, 2010 at 10:48 am
...oh and we're talking about a few hundred rows (max) in each table...
August 18, 2010 at 10:49 am
So you have one source table and one destination table, and you just want to make sure the destination matches the source?
If that's all you need, then there are a ton of ways to do this.
If you have a primary key/unique field, you could just do something like:
INSERT INTO table2
SELECT * FROM table1
WHERE t1_pk NOT IN (SELECT t2_pk FROM table2);
Or you could just truncate the destination table and insert all the rows from the source again (not the best idea if you're working with hundreds of millions of rows, or over a WAN/VPN link). A similar approach is to use a Copy Objects task in DTS or SSIS and just copy the whole table.
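In its simplest form that is just the following (table names are placeholders; if the destination has an identity column you'll need an explicit column list wrapped in SET IDENTITY_INSERT ON/OFF, and TRUNCATE won't work while foreign keys reference the table):

-- Brute-force reload: wipe the destination and copy everything across again
TRUNCATE TABLE table2;
INSERT INTO table2
SELECT * FROM table1;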
Do you need to check for column changes, or just row additions/deletes?
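If changed rows do need to be picked up as well, a plain T-SQL pass handles the whole job on 2005 without any extra tooling. A rough sketch, using made-up names (src is the source table, dest the destination, pk the key column, col1/col2 the data columns; across servers you would reference src through a linked server):

-- 1) Delete rows that no longer exist in the source
DELETE d
FROM dest AS d
WHERE NOT EXISTS (SELECT 1 FROM src AS s WHERE s.pk = d.pk);

-- 2) Update rows whose non-key columns differ
--    (nullable columns need extra IS NULL handling in the comparison)
UPDATE d
SET d.col1 = s.col1,
    d.col2 = s.col2
FROM dest AS d
INNER JOIN src AS s ON s.pk = d.pk
WHERE d.col1 <> s.col1 OR d.col2 <> s.col2;

-- 3) Insert rows missing from the destination
--    (wrap in SET IDENTITY_INSERT dest ON/OFF if pk is an identity column)
INSERT INTO dest (pk, col1, col2)
SELECT s.pk, s.col1, s.col2
FROM src AS s
WHERE NOT EXISTS (SELECT 1 FROM dest AS d WHERE d.pk = s.pk);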
Edit: just saw your new post - honestly, if this were given to me with only a few hundred rows and a few tables, I would just set up a quick SSIS/DTS Copy Objects task.
August 18, 2010 at 11:02 am
sqlservercentral-606110 (8/18/2010)
Thanks for the replies. We tried Tablediff but got an error in the generated update script which we couldn't understand (see below). The table in question had no "large data types" - only a varchar(500) column - see the full table definition at the bottom of this message.
If you are using the 2008 tools, you might want to try TableDiff from the 2005 tools, as I have run into the same issue.
The table sizes you describe certainly sound very reasonable for this tool.
Side note - you could also use replication to keep these in sync. Just wondering what your thoughts are on that approach.
David
@SQLTentmaker
"He is no fool who gives what he cannot keep to gain that which he cannot lose" - Jim Elliot
August 18, 2010 at 12:40 pm
Actually we're using 2005 tools (on a 2005 server environment).
I hadn't really considered replication, as the changes we want to propagate are the outcome of minor development updates that we want to roll out to a number of different test platforms as well as to our live server. While we could set up replication subscriptions on each of the target servers, it would be far simpler to have a script we could run against each one.