September 7, 2008 at 8:50 pm
Hi
I am new and still learning.
I am trying to dump out all the data from every table in one database and use that data to replace the contents of another database.
I used a SQL dumper to create a bunch of INSERT scripts for every table in DB-1, and I'm trying to apply them to DB-2.
The issue is that, in order to apply the data to DB-2, I'll need to delete all the existing data in DB-2 first.
The question is: how can I add a DELETE or TRUNCATE statement at the front of each script, before the inserts?
I am looking for something that can add it automatically, so I don't have to open each script and add the line one by one.
Is there an easy way to perform this task?
Thanks in advance
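One simple way to handle the question above, sketched here rather than prescribed: instead of editing each script, generate one delete statement per user table from the database's own metadata and paste the result set at the top of the combined script. This assumes SQL Server's standard INFORMATION_SCHEMA views.

```sql
-- Generate a DELETE statement for every user table in DB-2.
-- Run this in DB-2, copy the result set, and paste it ahead of the
-- insert scripts. DELETE is used rather than TRUNCATE because tables
-- referenced by foreign keys cannot be truncated; with FKs present,
-- the generated statements may also need reordering (children first).
SELECT 'DELETE FROM [' + TABLE_SCHEMA + '].[' + TABLE_NAME + '];'
FROM INFORMATION_SCHEMA.TABLES
WHERE TABLE_TYPE = 'BASE TABLE'
ORDER BY TABLE_SCHEMA, TABLE_NAME;
```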
September 7, 2008 at 10:20 pm
Hi,
I think you can use DTS here (the Copy SQL Server Objects task). There is an option either to drop the objects first or to append the data, among others.
Renuka__
September 7, 2008 at 11:46 pm
Thanks for the advice.
I forgot to mention that after I create the scripts, I'll also need to add 3 columns at the end of each of these tables.
I haven't used DTS before. Does DTS provide this feature?
September 7, 2008 at 11:55 pm
If you are on SQL 2005, then you can also use SSIS, which will make this task much easier and gain some performance too, since it all happens in memory. You can search for an introductory SSIS tutorial on the web; it's not that bad to follow. You can define your data sources and destinations there.
September 8, 2008 at 9:27 am
I have to ask what goal you are aiming for.
To what end are you copying data from one database to another?
The reason for it will help me suggest the best method. For example, if this is a one-time data import that you will never do again, and all you're trying to do is make a couple of tables match, that calls for one solution. If this is something you will be doing repeatedly, with complex rules, then that calls for another solution.
Can you clarify the purpose of the move?
Also, how much data are you moving and from how many tables? (Don't need exact numbers, just "millions of rows" and "hundreds of tables" is different from "a couple of hundred rows" and "two tables".)
- Gus "GSquared", RSVP, OODA, MAP, NMVP, FAQ, SAT, SQL, DNA, RNA, UOI, IOU, AM, PM, AD, BC, BCE, USA, UN, CF, ROFL, LOL, ETC
Property of The Thread
"Nobody knows the age of the human race, but everyone agrees it's old enough to know better." - Anon
September 8, 2008 at 5:51 pm
GSquared (9/8/2008)
I have to ask what goal you are aiming for. To what end are you copying data from one database to another?
I am trying to copy data from one DB to another DB on a different server.
The reason for it will help me suggest the best method. For example, if this is a one-time data import that you will never do again, and all you're trying to do is make a couple of tables match, that calls for one solution. If this is something you will be doing repeatedly, with complex rules, then that calls for another solution.
Can you clarify the purpose of the move?
For temporary testing purposes, I'll need to do this on a weekly basis.
Also, how much data are you moving and from how many tables? (Don't need exact numbers, just "millions of rows" and "hundreds of tables" is different from "a couple of hundred rows" and "two tables".)
There are fewer than 100 tables, with hundreds of records in each table.
Thanks for everyone's help
September 8, 2008 at 5:54 pm
bdba (9/7/2008)
If you are on SQL 2005, then you can also use SSIS, which will make this task much easier and gain some performance too, since it all happens in memory. You can search for an introductory SSIS tutorial on the web; it's not that bad to follow. You can define your data sources and destinations there.
Yes, I am on SQL 2005 and am looking at SSIS to see if it meets my needs. Thanks!
September 9, 2008 at 9:10 am
Correct me if I'm not understanding this correctly:
What you seem to be doing is moving data from a production environment to a testing environment in order to test new/modified code on it. Is that correct?
If so, the method I use for this is to keep a script of all the changes I'm testing, then I restore the production database onto the test server, run the script, and then test the code.
I have a script for copying the latest backups from production to test (or dev), restoring them (including renaming if desired). I run that, then run my modifications script, which creates new procs, adds new columns to tables, alters existing procs, etc., as needed.
This accomplishes two things. First, it makes it very easy for me to test new code on current data. Second, it allows me to have a single script that, when it's finalized, can be run on the production database (after QA, testing, etc.) that will take all new code and modifications live, and the script has been tested in the exact way that it will end up being used.
Would a plan like that do what you need? It's very easy to set up, if so.
September 9, 2008 at 11:39 pm
Thanks GSquared.
You are right. Maybe I should have made this clear in the first post.
Here is what I am trying to do:
We have a testing database, DB-1, which we continually modify per our project requests. Once a change has been tested, we apply it to DB-2. Since several different people test and update DB-1, it's hard to keep DB-1 and DB-2 in sync. So we decided to dump all the data from DB-1 into DB-2 regularly.
DB-2 contains a subset of DB-1's tables, and the difference between the two databases is that all tables in DB-2 have a few extra columns at the end. The purpose of those extra columns is to build up routing directions based on our hierarchy structure; they indicate which record will be replicated to which production server. So the values in those extra columns may vary.
The goal I am trying to achieve is to compare each table in both databases on the first few columns, excluding the last 3 (say table 1 has 10 columns in DB-1 and 13 columns in DB-2, i.e. 3 extra columns). If a record matches, I keep it unchanged. If not, I want to update the first columns of that record BUT keep the last 3 columns unchanged. In some cases, I need to add new records to the tables as well.
Right now I am doing this in an awkward way: dumping the data out of DB-1, manually adding values for the last 3 columns, and then applying it to DB-2.
Is there any better way to do this?
Please advise.
BTW, if you don't mind, can you share your script with me? I would like to learn from it.
Not sure if I have made it clear now 😉
Thank you so much
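The compare-and-update logic described above can be sketched in T-SQL roughly as follows. This is only a sketch: the table name, key column, data columns, and the `DB1Server` linked-server name are hypothetical stand-ins for the real schema, and the comparison with `<>` will not catch rows that differ only by NULLs.

```sql
-- Update rows whose shared columns differ, leaving DB-2's 3 routing
-- columns untouched. All object names here are hypothetical examples.
UPDATE t2
SET    t2.Col1 = t1.Col1,
       t2.Col2 = t1.Col2,
       t2.Col3 = t1.Col3
FROM   DB2.dbo.SomeTable AS t2
       INNER JOIN DB1Server.DB1.dbo.SomeTable AS t1
               ON t1.KeyCol = t2.KeyCol
WHERE  t2.Col1 <> t1.Col1
    OR t2.Col2 <> t1.Col2
    OR t2.Col3 <> t1.Col3;

-- Insert rows that exist in DB-1 but not yet in DB-2; the routing
-- columns are left NULL (or a default) to be filled in afterward.
INSERT INTO DB2.dbo.SomeTable (KeyCol, Col1, Col2, Col3)
SELECT t1.KeyCol, t1.Col1, t1.Col2, t1.Col3
FROM   DB1Server.DB1.dbo.SomeTable AS t1
WHERE  NOT EXISTS (SELECT 1
                   FROM   DB2.dbo.SomeTable AS t2
                   WHERE  t2.KeyCol = t1.KeyCol);
```

One statement pair like this would be needed per table, which is why the replies below point at SSIS or a data-comparison tool for anything approaching 100 tables.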
September 11, 2008 at 7:03 am
For what you're doing, I would definitely look into either SSIS, or a third-party app like Red Gate's Data Compare or ApexSQL's Diff. Both of those apps are designed for data synchronization between databases.
SSIS will give you the most options, but will require the most work to set up.
Here's the proc I use to transfer databases from server to server. I've taken out my file paths and some business-specific rules on users to add and such, but it should be easy enough to modify to fit other needs.
create proc [dbo].[DatabaseXfer]
(@SourceDB_in nvarchar(125),
@TargetDB_in nvarchar(125))
as
/*
Copies a database from one server to another, using point-in-time from
full, diff and log backups.
*/
set nocount on
create table #Files (
FName nvarchar(1000))
declare @Cmd nvarchar(max), @Err int
-- Drop current copy of target database, if exists
if exists
(select *
from sys.databases
where name = @targetdb_in)
begin
exec msdb.dbo.sp_delete_database_backuphistory @database_name = @targetdb_in
select @cmd = 'drop database ' + @targetdb_in
begin try
exec (@cmd)
end try
begin catch
raiserror('Error dropping target database', 16, 1)
return
end catch
end
declare @Full nvarchar(1000), @FullDT datetime,
@TargetFile nvarchar(1000), @TargetLog nvarchar(1000),
@Diff nvarchar(1000), @DiffDT datetime
-- Details on full backup
select @full = replace(physical_device_name, '[source server backup directory]', '[UNC to source server backup directory]'),
@fulldt = backup_start_date
from [source server].msdb.dbo.backupset
inner join [source server].msdb.dbo.backupmediafamily
on backupset.media_set_id = backupmediafamily.media_set_id
where database_name = @sourcedb_in
and backup_start_date > dateadd(day, datediff(day, 0, getdate()), 0)
and type = 'd'
and backup_start_date =
(select max(backup_start_date)
from [source server].msdb.dbo.backupset
where database_name = @sourcedb_in
and type = 'd')
-- If last full database doesn't fit certain criteria, proc must exit
if @full is null or @fulldt is null
begin
raiserror('No full backup qualifies for source database', 16, 1)
return
end
-- Check for diff backup since full was done
-- Picks most recent diff backup
select @diff = replace(physical_device_name, '[source server diff backup directory]', '[UNC to source server diff backup directory]'),
@diffdt = backup_start_date
from [source server].msdb.dbo.backupset
inner join [source server].msdb.dbo.backupmediafamily
on backupset.media_set_id = backupmediafamily.media_set_id
where database_name = @sourcedb_in
and backup_start_date > @fulldt
and type = 'i'
and backup_start_date =
(select max(backup_start_date)
from [source server].msdb.dbo.backupset
where database_name = @sourcedb_in
and backup_start_date > @fulldt
and type = 'i')
-- Has to cycle through one or more log backups, if they exist
declare Logs cursor local fast_forward
for select name, media_set_id
from [source server].msdb.dbo.backupset
where database_name = @sourcedb_in
and backup_start_date > coalesce(@diffdt, @fulldt)
and type = 'l'
order by backup_start_date
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE
EXEC sp_configure 'xp_cmdshell', 1;
RECONFIGURE
insert into #Files (fname)
exec master.dbo.xp_cmdshell 'dir d:\*.bak /b'
if not exists
(select *
from #files
where @full like '%' + fname + '%')
begin
-- Copy backup file across the network
select @cmd = 'exec master.dbo.xp_cmdshell ''copy "' + @full + '" d:\'', no_output'
--print @cmd
--print @full
begin try
exec (@cmd)
end try
begin catch
raiserror('Error copying full backup file', 16, 1)
return
end catch
end
select @full = replace(@full, '[source UNC]' + @sourcedb_in, 'd:')
-- Copy diff backup file across the network
if @diff is not null
and not exists
(select *
from #files
where @diff like '%' + fname + '%')
begin
select @cmd = 'exec master.dbo.xp_cmdshell ''copy "' + @diff + '" d:\'', no_output'
--print @cmd
begin try
exec (@cmd)
end try
begin catch
raiserror('Error copying diff backup file', 16, 1)
return
end catch
end
select @diff = replace(@diff, '[source UNC for diff]' + @sourcedb_in, 'd:')
-- Restore full backup
select @cmd = 'restore database [' + @targetdb_in + '] FROM ' +
'DISK = N''' + @full + ''' ' +
'WITH FILE = 1, MOVE N''' + @sourcedb_in + ''' TO N''D:\SQLData\' + @targetdb_in + '.mdf'', ' +
'MOVE N''' +@sourcedb_in +'_log'' TO N''D:\SQLData\' + @targetdb_in + '_1.ldf'', ' +
'NOUNLOAD, NORECOVERY'
--print @cmd
begin try
exec (@cmd)
end try
begin catch
raiserror('Error restoring full backup file', 16, 1)
return
end catch
-- Restore diff backup (if exists)
if @diff is not null
begin
select @cmd = 'restore database [' + @targetdb_in + '] FROM ' +
'DISK = N''' + @diff + ''' ' +
'WITH FILE = 1, MOVE N''' + @sourcedb_in + ''' TO N''D:\SQLData\' + @targetdb_in + '.mdf'', ' +
'MOVE N''' +@sourcedb_in +'_log'' TO N''D:\SQLData\' + @targetdb_in + '_1.ldf'', ' +
'NOUNLOAD, NORECOVERY'
--print @cmd
begin try
exec (@cmd)
end try
begin catch
raiserror('Error restoring diff backup file', 16, 1)
return
end catch
end
declare @Name nvarchar(125), @MediaID int, @File nvarchar(125)
open Logs
fetch next from Logs into @Name, @MediaID
-- Log Backups
while @@fetch_status = 0
begin
select @File = replace(physical_device_name, '[source log backup path]', '[source log backup UNC]')
from [source server].msdb.dbo.backupmediafamily
where media_set_id = @MediaID
select @cmd = 'exec master.dbo.xp_cmdshell ''copy "' + @file + '" d:\'', no_output'
--print @cmd
begin try
exec (@cmd)
end try
begin catch
raiserror('Error copying log backup file', 16, 1)
return
end catch
select @File = replace(@file, '[source log backup UNC]' + @sourcedb_in, 'd:')
select @cmd = 'restore log [' + @targetdb_in + '] from ' +
'disk = N''' + @file + ''' ' +
'with file = 1, NOUNLOAD, NORECOVERY'
--print @cmd
begin try
exec (@cmd)
end try
begin catch
raiserror('Error restoring log backup file', 16, 1)
return
end catch
fetch next from Logs into @Name, @MediaID
end
close Logs
deallocate Logs
select @cmd = 'restore database [' + @targetdb_in + '] with recovery'
--print @cmd
begin try
exec (@cmd)
end try
begin catch
raiserror('Error on final restore', 16, 1)
return
end catch
EXEC sp_configure 'xp_cmdshell', 0;
RECONFIGURE
EXEC sp_configure 'show advanced options', 0;
RECONFIGURE
-- Add needed accounts
select @cmd = 'Use ' + @targetdb_in + ' CREATE USER [User Name] FOR LOGIN [Login Name] WITH DEFAULT_SCHEMA=[dbo]'
exec (@cmd)
select @cmd = 'Use ' + @targetdb_in + ' EXEC sp_addrolemember N''db_datareader'', N''User Name'''
exec (@cmd)
select @cmd = 'Use ' + @targetdb_in + ' EXEC sp_addrolemember N''db_datawriter'', N''User Name'''
exec (@cmd)
September 11, 2008 at 2:28 pm
GSquared (9/11/2008)
For what you're doing, I would definitely look into either SSIS, or a third-party app like Red Gate's Data Compare or ApexSQL's Diff. Both of those apps are designed for data synchronization between databases. SSIS will give you the most options, but will require the most work to set up.
Thank you very much GSquared.
I'll definitely check out SSIS and the apps you recommended.
And thanks for your scripts!
September 12, 2008 at 8:15 am
You're welcome.