December 21, 2007 at 12:26 pm
Hi there.
I am tasked with creating a clone of an existing database, let's call it "maindb".
Somebody will have an application, enter the name of the DB, let's call it "newdb", and click Go.
My task is to write a script in SQL Server, that will take "maindb" and clone it to create "newdb". Clone the tables, stored procs, views, etc.
ALSO, update all jobs, alerts, maintenance plans, etc. to include "newdb", if "maindb" is part of the job/alert.
This "newdb" may be located on another server.
Any ideas?
Thanks!!
December 21, 2007 at 12:36 pm
This is possible.
If you have a backup of MainDB and scripts for all the jobs, it is quite possible.
You can restore the database onto the new machine with a different name. (Use dynamic SQL to pass in the new DB name.)
Then run the scripted jobs on the new server.
-Roy
December 21, 2007 at 12:40 pm
Roy Ernest (12/21/2007)
This is possible.If you have the Back up of the MainDB and script for all Jobs it is quite possible.
You can restore the DB into the new machine with a different name. (Use Dynamic SQL to pass the New DB name)
Run the Scripted Jobs to the new server.
I understand what you mean about creating a Backup (.bak) file.
How do I get the script for all Jobs?
How do I use Dynamic SQL to restore the .bak file to a new machine...or to a different database on the same machine?
And how do I add the new database to the jobs script - dynamically, no Notepad or anything manual?
The request is to have a text field, a "Go" button, and within 5 (or 50) seconds the new database will be created & all jobs will be updated.
thx
December 21, 2007 at 1:12 pm
You can have an App that takes the Text as DB Name and pass it to the Stored Proc.
The stored proc should have the Restore command in Dynamic SQL.
You could try something like this:
DECLARE @sql NVARCHAR(500)
SET @sql = N'RESTORE DATABASE ' + QUOTENAME(@dbName) + N' FROM DISK = N''C:\AdventureWorks.bak'''
EXEC (@sql)
(Substitute your own path. If the backup file is in a central place, you can provide a mapped drive or UNC path instead of C:.)
Make another stored proc that creates the scripted jobs.
You can script a job by right-clicking it in SSMS and choosing Script Job as > CREATE To.
You can call this stored proc next. But there are some issues you have to think about: what will the database file names be? Which drive will they go on? What is the new server's name? And so on.
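As a rough illustration of what such a "scripted jobs" proc would contain: the CREATE script SSMS generates boils down to calls against msdb like the ones below. The job name, step, and command here are placeholders, not anything from this thread.

```sql
-- Hypothetical example: register a job with one T-SQL step against the new DB.
-- Job name, step name, and command are placeholders.
DECLARE @newDb sysname = N'newdb';

EXEC msdb.dbo.sp_add_job
     @job_name = N'Nightly maintenance - newdb';

EXEC msdb.dbo.sp_add_jobstep
     @job_name      = N'Nightly maintenance - newdb',
     @step_name     = N'Update stats',
     @subsystem     = N'TSQL',
     @database_name = @newDb,      -- the step's database context
     @command       = N'EXEC sp_updatestats;';

-- Bind the job to the local server so the Agent will run it.
EXEC msdb.dbo.sp_add_jobserver
     @job_name = N'Nightly maintenance - newdb';
```

Because @database_name is a parameter, the same proc can take the new DB name as input and build the job for it.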
-Roy
December 26, 2007 at 9:23 am
So I'm looking at something like this:
CREATE PROCEDURE spCopyDB (@NewServer varchar(100), @NewDBName varchar(100))
AS
BEGIN
-- Backup existing DB to C:\AdventureWorks.bak
-- Restore bak to \\@NewServer\@NewDBName
-- If @NewServer = ThisServer
---- Update jobs to include @NewDBName
-- Else
---- Migrate all jobs from ThisServer to @NewServer
---- Update jobs on @NewServer to include @NewDBName
END
Looks simple enough....now how do I do these things? 🙂
The location of the new database will be dynamic. It can be the same server, or a new server. The name will be different every time the SP is run.
So can I use RESTORE DATABASE to restore a .bak to a new server, AND a new database name?
And can I automate the Script Jobs, change it to include the new DB, and execute it...all in my one Stored Proc?
Thanks.
Jason
December 26, 2007 at 9:41 am
The migrate jobs part is not easy and probably can't happen in a stored procedure. The backup could work, but it would need to have access to the remote sql server. Remember that a stored procedure runs inside SQL Server, like a method call in your code. It isn't running on the Windows box and can't easily see resources on another server.
The hard part is that jobs could be running that aren't easily determined as affecting the old database. Jobs run on the server. A database runs on the server too, they're not inside one another.
The way this is usually done is manually. There aren't easy ways to programmatically get to jobs from outside SQL Server and not really any at all within SQL Server. If you knew the job names, you could conceivably muck with the MSDB tables, but that wouldn't necessarily work for jobs that called packages or external commands.
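For what it's worth, the MSDB tables mentioned above can at least identify candidate jobs. A sketch of that query (it only catches T-SQL steps whose database context or command text mentions the name; jobs that call packages or external commands slip through, as noted):

```sql
-- Find job steps whose database context or command text mentions 'maindb'.
SELECT j.name AS job_name, s.step_id, s.step_name, s.database_name
FROM msdb.dbo.sysjobs AS j
JOIN msdb.dbo.sysjobsteps AS s
  ON s.job_id = j.job_id
WHERE s.database_name = N'maindb'
   OR s.command LIKE N'%maindb%';
```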
How often does this need to be done, and what is the business rule? Maybe we can help you find another way around this.
December 26, 2007 at 10:12 am
Steve Jones - Editor (12/26/2007)
How often does this need to be done and what is the business rule. Maybe we can help you find another way around this.
Thank you for your input.
This needs to be run on-the-fly, by non-technical end users, maybe 5-10 times per day.
The newly-created database needs to be dropped after the routine is finished.
We can probably get away with leaving out "jobs" support, however there are performance alerts and we'd prefer to have those transferred if possible.
The most important thing is copying data tables, stored procedures, views, functions, etc. to a dynamically-entered server name and a dynamically-entered database name.
The source database and server will ALWAYS be the same value. The new database will sometimes reside on the same server.
Thanks again!
Jason
December 26, 2007 at 10:39 am
If the old and new are on the same server, the backup and restore will be easy. I'd recommend you use EXECUTE AS (look it up) to allow people without sysadmin rights to restore the database.
Create proc MyHelper
@dbname varchar(30)
as
RESTORE command
-- Script alerts
return
I'd get the restore code by performing the restore you want in SSMS, making sure you get the variables in for the new name, and include the WITH MOVE clause. You might need to make this dynamic SQL, since I'm not sure how many RESTORE options can be variables. That would be easy, though: just declare a variable as varchar(8000) and build a string
select @cmd = 'restore database ' + @dbname + ' from disk = ''c:\xxxx.bak'' with move ''xxx'' to ''c:\xxxx' + @newpath + '''' -- plus any other options
Then exec(@cmd) to run it.
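Putting those pieces together, a minimal sketch of the helper proc might look like this. The backup path and the logical file names maindb_Data / maindb_Log are assumptions; run RESTORE FILELISTONLY against the .bak to get the real logical names for your database.

```sql
CREATE PROCEDURE MyHelper
    @dbname sysname
AS
BEGIN
    DECLARE @cmd nvarchar(4000);

    -- Assumed backup location and logical file names; adjust for your server.
    -- The WITH MOVE clauses relocate the data/log files under the new DB name.
    SET @cmd = N'RESTORE DATABASE ' + QUOTENAME(@dbname)
             + N' FROM DISK = N''C:\Backups\maindb.bak'''
             + N' WITH MOVE N''maindb_Data'' TO N''C:\Data\' + @dbname + N'.mdf'','
             + N' MOVE N''maindb_Log'' TO N''C:\Data\' + @dbname + N'_log.ldf'';';

    EXEC (@cmd);

    -- Scripted alerts (pasted from SSMS, with variables where needed) go here.
END
```

Note that RESTORE is a server-level permission, so granting it to non-sysadmin callers via EXECUTE AS may additionally require signing the module; test that part on your own security setup.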
I'd also script out the alerts you need manually. Do this in SSMS, get the code, and then put that code into your stored proc, using variables where needed. That way you know what's being included. If new alerts are added, you'd have to manually fix your proc.
If there are GOs in the script, they won't work in the stored proc. In that case I'd build a separate proc for each batch (between GOs), the same way, and call it from your main proc. That way it's controlled from one spot.
Hope that helps, feel free to ask more questions
December 27, 2007 at 8:01 am
Where is replication in this picture?
December 27, 2007 at 8:43 am
This needs to be run on-the-fly, by non-technical end users, maybe 5-10 times per day.
The newly-created database needs to be dropped after the routine is finished.
Why is this being done? What is the purpose for creating 5 to 10 clones per day and then dropping them?
--Jeff Moden
Change is inevitable... Change for the better is not.
December 27, 2007 at 9:44 am
Jeff Moden (12/27/2007)
Why is this being done? What is the purpose for creating 5 to 10 clones per day and then dropping them?
Good question. This client has some very complicated laboratory research testing going on - tables and many layers of views are dynamically created for operations - this was all done pre-me.
When things get hectic, I run sp_lock and I see about 8 exclusive locks bringing the whole database to a crawl. What normally takes 30 seconds takes over an hour.
Performance Monitor shows no heavy server load - very low index seeks, table scans, no disk paging, low locks/sec. But high CPU processor activity.
What I've been tasked with is copying the database to another database, even another server, to separate out these dynamic view-generating options and hopefully this will eliminate all that cross-logjam.
I'd like to launch a full code investigation, but I'm only commissioned for 12 hours for now, and I haven't thought of anything better than to implement their plan.
Thanks for listening. 🙂 any thoughts?
Jason
December 27, 2007 at 10:01 am
Jason
This can be easily done with SSIS.
Just use
Transfer Database Task with
Transfer Jobs Task and
Transfer Logins Task
December 27, 2007 at 10:01 am
I still do not understand: why 5+ times a day?
Aren't you better off using snapshots instead?
That is *a lot* faster than the whole thing. Of course, snapshots can only be taken on the same server. I believe you will spend a long time creating these "clones", and if that is the only solution you will still have unavailability at restore time 🙁
You should rethink that solution a bit more. I would rather spend the time on fixing the root cause (the locks).
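For reference, a database snapshot (SQL 2005 and later) is created like this; the logical file name and paths below are placeholders (check sys.master_files for the real logical name):

```sql
-- Hypothetical: a read-only, point-in-time snapshot of maindb on the same server.
CREATE DATABASE maindb_snap
ON ( NAME = maindb_Data,                          -- logical name of maindb's data file
     FILENAME = 'C:\Snapshots\maindb_snap.ss' )   -- sparse file backing the snapshot
AS SNAPSHOT OF maindb;
```

Creating one is nearly instant, and reporting queries can run against the snapshot without touching the source database's locks.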
* Noel
December 27, 2007 at 11:16 am
Heh... you beat me to it, Noel... I'd rather fix the problem than make a patch, and I'm thinking that the problem is just bad code to start with. If locks are involved, I'd bet there's a wad of cursors being used that don't have the FAST_FORWARD option invoked. Now, THAT would be a simple fix... still not as good as replacing the cursors with proper set-based code, but it would at least stop the blocking so the databases don't need to be cloned.
--Jeff Moden
Change is inevitable... Change for the better is not.
December 27, 2007 at 12:58 pm
noeld (12/27/2007)
Aren't you better off using Snapshots instead ?
What are Snapshots? In Oracle, Snapshots were a precursor to materialized views, similar (in theory) to SQL 2000's indexed view...which never quite worked.
If, instead of creating several layers of views, we could build a "physical" compilation of several tables joined together, that would REALLY help.
(Let me guess #tmp tables, right?)
I'll look at the T-SQL tonight. Not sure yet if there are a wad of cursors. You're saying that would explain the X locks?