December 3, 2008 at 5:58 pm
Hi All
I'm going to have multiple databases with the same table structure. I want to create a set of 'global' SPs/UDFs in SQL Server where I can simply pass in the name of the database the SP/UDF should execute against. I hope something like this (pseudocode) isn't my only option:
-- pseudocode: build the whole call by string concatenation
-- input: @dbName
SET @sql = N'SELECT * FROM ' + @dbName + N'.dbo.myFunc(''some str'')'
EXEC (@sql)
I have several strings that I'm passing as parameters, so escaping and joining them like that is going to be a real pain 🙁
The other question is: where do I store these SPs/UDFs? Do you create some sort of 'scripts only' database and put all the scripts there? I don't think there's a 'global' placeholder for such queries... is there?
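For what it's worth, a minimal sketch of how the concatenation pain can be limited to the database name alone, with the string values passed as typed parameters through sp_executesql (dbo.myFunc and the parameter names are hypothetical stand-ins):
-- Only the validated database name is built into the string; the string
-- values travel as real parameters, so no quote-escaping is needed.
DECLARE @dbName sysname, @someStr varchar(100), @sql nvarchar(max)
SET @dbName = N'SomeTestDb'
SET @someStr = 'some str'
IF DB_ID(@dbName) IS NOT NULL
BEGIN
    SET @sql = N'SELECT * FROM ' + QUOTENAME(@dbName) + N'.dbo.myFunc(@p1)'
    EXEC sp_executesql @sql, N'@p1 varchar(100)', @p1 = @someStr
END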
December 3, 2008 at 6:35 pm
Why not create one set of stored procedures that are then duplicated in every database?
If you don't want to use dynamic SQL, then this is probably your best choice.
[font="Times New Roman"]-- RBarryYoung[/font], [font="Times New Roman"] (302)375-0451[/font] blog: MovingSQL.com, Twitter: @RBarryYoung[font="Arial Black"]
Proactive Performance Solutions, Inc. [/font][font="Verdana"] "Performance is our middle name."[/font]
December 3, 2008 at 7:19 pm
That'll mean maintaining multiple copies of the SQL, eh? If we need to make a slight change to a script, or add a new one, it has to be done once per database... so I'm thinking there should be some other way to do this?
December 3, 2008 at 7:56 pm
thusi (12/3/2008)
That'll mean maintaining multiple copies of the SQL, eh? If we need to make a slight change to a script, or add a new one, it has to be done once per database... so I'm thinking there should be some other way to do this?
This is easy to handle.
There are several ways to do it, but I would probably use a "GOLD_CODE" database that is a scripted copy of the actual databases (but has no data). Then add a CREATE/ALTER/DROP PROCEDURE DDL trigger to that database. Whenever any change is made to a stored procedure there, the DDL trigger re-executes the command against all of the target databases.
For management purposes, you would probably want the DDL trigger to test the name or schema and exclude your own local procs before doing the duplication; a sketch of that test follows below.
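A hedged sketch of that exclusion test, to go inside the trigger body (the 'local' schema and 'local_' prefix are invented conventions, purely for illustration):
-- Read the schema and object name of the proc that fired the trigger,
-- and bail out before duplicating if it follows the local-only convention.
DECLARE @schemaName nvarchar(128), @objectName nvarchar(128)
SELECT @schemaName = EVENTDATA().value('(/EVENT_INSTANCE/SchemaName)[1]','nvarchar(128)')
     , @objectName = EVENTDATA().value('(/EVENT_INSTANCE/ObjectName)[1]','nvarchar(128)')
IF @schemaName = N'local' OR @objectName LIKE N'local[_]%'
    RETURN -- local-only proc: skip the duplication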
[font="Times New Roman"]-- RBarryYoung[/font], [font="Times New Roman"] (302)375-0451[/font] blog: MovingSQL.com, Twitter: @RBarryYoung[font="Arial Black"]
Proactive Performance Solutions, Inc. [/font][font="Verdana"] "Performance is our middle name."[/font]
December 3, 2008 at 8:05 pm
Hmm... that definitely sounds like a neat approach! Unfortunately I don't have any experience with triggers or the other advanced techniques you've suggested, but I'll certainly do a bit of reading on how to do this. Thanks for the suggestion 🙂
December 3, 2008 at 8:31 pm
Well... I think I'm going to end up using scripting -
http://www.eggheadcafe.com/software/aspnet/29255866/copy-database-with-only-s.aspx
where you simply create a copy of the database structure. Not the ideal solution, since it'll probably be a pain to maintain later on, but it should be an OK solution I think (though not as good as yours). Thanks for your help, Barry.
December 3, 2008 at 10:36 pm
Here's something to get you started:
-- Drop any old copy of the trigger before recreating it
IF EXISTS (SELECT * FROM sys.triggers
           WHERE name = N'dbtrDuplicateProcs' AND parent_class_desc = N'DATABASE')
    DROP TRIGGER [dbtrDuplicateProcs] ON DATABASE
go
CREATE TRIGGER [dbtrDuplicateProcs]
ON DATABASE
FOR CREATE_PROCEDURE, ALTER_PROCEDURE, DROP_PROCEDURE
AS
SET NOCOUNT ON
Declare @Type as NVarchar(32)
 , @sql as NVarchar(MAX)
-- Determine the event type (CREATE/ALTER/DROP) and start the dynamic batch
SELECT @Type = EVENTDATA().value('(/EVENT_INSTANCE/EventType)[1]','nvarchar(max)')
 , @sql = N'Declare @cmd as NVarchar(MAX);
'
PRINT @Type + N' being duplicated to all Test databases.'
-- Hold the executed command in a temp table (it stays visible to the
-- child batch run by EXEC below)
SELECT EVENTDATA().value('(/EVENT_INSTANCE/TSQLCommand/CommandText)[1]','nvarchar(max)') cmd
Into #tmpCmd
-- Append one USE/EXEC block to the dynamic batch for every target database
SELECT @sql = @sql + N'
Print '' ''
USE [' +name+ '];
Print ''Using database [' +name+ ']''
Select @cmd = cmd From #tmpCmd
Begin Transaction
BEGIN TRY
 Print @cmd
 EXEC(@cmd)
 Commit
END TRY
BEGIN CATCH
 Print ERROR_MESSAGE()
 Rollback
END CATCH
' From master.sys.databases
Where name like '%test%'
-- Re-execute the captured command in every matching database
EXEC(@sql)
GO
ENABLE TRIGGER [dbtrDuplicateProcs] ON DATABASE
go
[font="Times New Roman"]-- RBarryYoung[/font], [font="Times New Roman"] (302)375-0451[/font] blog: MovingSQL.com, Twitter: @RBarryYoung[font="Arial Black"]
Proactive Performance Solutions, Inc. [/font][font="Verdana"] "Performance is our middle name."[/font]
December 3, 2008 at 10:40 pm
thusi (12/3/2008)
Hi all, I'm going to have multiple databases with the same table structure. I want to create a set of 'global' SPs/UDFs in SQL Server where I can simply pass in the name of the database the SP/UDF should execute against. I hope something like this (pseudocode) isn't my only option:
SET @sql = N'SELECT * FROM ' + @dbName + N'.dbo.myFunc(''some str'')'
EXEC (@sql)
I have several strings that I'm passing as parameters, so escaping and joining them like that is going to be a real pain 🙁
The other question is: where do I store these SPs/UDFs? Do you create some sort of 'scripts only' database and put all the scripts there? I don't think there's a 'global' placeholder for such queries... is there?
If you have to maintain lots of global objects, why not create a new database, say dbGlobals, in which you can create all of the global objects?
kshitij kumar
kshitij@krayknot.com
www.krayknot.com
December 4, 2008 at 5:39 am
Personally, I create all the databases out of T-SQL code and check the code into a source control management system (Visual SourceSafe, Team Foundation Server; there are others). Then you build a deployment process, either with a commercial tool that can work out of source control (DBGhost, Red Gate SQL Compare, Visual Studio Team System Database Edition) or with your own process that gets the code from source control.
There are a number of reasons to do things this way, but the number one is that you can identify a moment in time for your system by referring to labelled versions of the code. If you want to know what the database looked like on 3/15/2005, you go to source control, pull the labelled version, and script out the entire database.
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
- Theodore Roosevelt
Author of:
SQL Server Execution Plans
SQL Server Query Performance Tuning
December 4, 2008 at 6:32 am
Grant Fritchey (12/4/2008)
Personally, I create all the databases out of T-SQL code and check the code into a source control management system (Visual SourceSafe, Team Foundation Server; there are others). Then you build a deployment process, either with a commercial tool that can work out of source control (DBGhost, Red Gate SQL Compare, Visual Studio Team System Database Edition) or with your own process that gets the code from source control.
You know what I'd like to see, Grant, is a "Getting Started" article on VSTS Database Edition for DBAs. I attended your talk at PASS on DB Pro (which is the same thing?) and it was great, filled with all kinds of things that I didn't know you could do. However, it started out somewhat beyond what I already knew.
I've used VS before (a whole lot) and BIDS (just a little), but not VSTS. So from a DBA perspective, I think a lot of folks would like to know how they can get started with this, as well as what they can do with it. Especially now that the GDR is (almost?) out.
[font="Times New Roman"]-- RBarryYoung[/font], [font="Times New Roman"] (302)375-0451[/font] blog: MovingSQL.com, Twitter: @RBarryYoung[font="Arial Black"]
Proactive Performance Solutions, Inc. [/font][font="Verdana"] "Performance is our middle name."[/font]
December 4, 2008 at 6:41 am
Hey Barry,
Yeah, the GDR got released last Tuesday. FINALLY. And yeah, DB Pro and Visual Studio Team System Database Edition are the same thing.
That's a good idea. I'm in the middle of a writing project (as I'm constantly whining about, I'm sure you've noticed), but once I finish it up, I guess it's time to write an article on just that topic.
BTW, the new help file that comes with the GDR is written up like a series of how-to articles rather than straight reference documentation. I find it a bit of a pain in the tuchus, but it's likely to be more helpful for those just getting started.
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
- Theodore Roosevelt
Author of:
SQL Server Execution Plans
SQL Server Query Performance Tuning
December 4, 2008 at 1:08 pm
Thanks for your suggestions, guys.
Personally, I create all the databases out of T-SQL code and check the code into a source control management system
That's pretty much what that link suggests, so I used the SSMS option to generate a script that creates the entire database. I do need to look at using a source control management system, though. We're not in a production-type environment and the data is used for research purposes, but having version control will certainly be handy. Thanks, all.
December 5, 2008 at 5:23 am
Just a wrinkle on top of what you're doing: generate individual scripts for each object, not one script for the entire database. It's more difficult to recreate the database from individual object scripts (which is why third-party tools such as Red Gate SQL Compare or Datadude are helpful), but versioning individual objects is so much easier than trying to maintain versions of the entire database in a single script.
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
- Theodore Roosevelt
Author of:
SQL Server Execution Plans
SQL Server Query Performance Tuning
December 5, 2008 at 12:57 pm
I'm with Grant.
We're a SaaS environment with a separate database for each customer, all running the same code.
We store all of our code (one file per object) in the source control system, then build, from a recipe, a release package (a T-SQL script) that performs an ALTER (not a drop and re-create) on every modified object (UDFs, procedures, etc.) for the release; a sketch of the pattern is below.
The script is run against each of the databases using SQLCMD during the release roll-out. It only takes a few seconds or minutes per database, and it is safe, predictable, and testable by QA prior to the release.
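For illustration, a minimal sketch of one fragment of such a release script, with dbo.MyProc and its parameter as hypothetical stand-ins; the stub-then-ALTER shape keeps the script re-runnable and, because the object is never dropped, preserves any permissions granted on it:
-- Create a stub only if the procedure is missing, so the ALTER below
-- always has something to alter and the script can be run repeatedly.
IF OBJECT_ID(N'dbo.MyProc', N'P') IS NULL
    EXEC(N'CREATE PROCEDURE dbo.MyProc AS SELECT 1;')
GO
ALTER PROCEDURE dbo.MyProc
    @CustomerId int
AS
BEGIN
    SET NOCOUNT ON
    -- the real body of the procedure goes here
    SELECT @CustomerId AS CustomerId
END
GO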