January 14, 2002 at 6:24 pm
Need help on an overall design strategy.
We have several client databases (SQL 2000) that use some of the same stored procedures. Right now we store the procs in each database, but how/where can stored procedures be created so that multiple databases can use them, with the data returned specific to the calling database?
We have been able to call stored procs in other databases, but we cannot get EXEC('USE ' + @dbname) to work. And is there a performance hit to this strategy?
How else do other people accomplish this? Extended stored procs?
Any help is much appreciated.
January 14, 2002 at 9:05 pm
If you use a three-part naming convention within the structure of your stored procedure, it'll hit the right table. For instance:
CREATE PROC usp_GetCustomers
@City nvarchar(15)
AS
SELECT CustomerID, ContactName, ContactTitle
FROM Northwind.dbo.Customers
WHERE City = @City
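Wherever the proc is created, a call to it reads the same table. For example (ClientDb1 here is just a stand-in for whichever database holds the proc):
-- Works from any database context; always reads Northwind.dbo.Customers
EXEC ClientDb1.dbo.usp_GetCustomers @City = N'London'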
Extended stored procedures are extensions built in Visual C++ and linked into SQL Server for use. Microsoft provides the appropriate header files, etc., to allow you to build them. xp_cmdshell and xp_sendmail are two examples.
K. Brian Kelley
http://www.sqlservercentral.com/columnists/bkelley/
January 15, 2002 at 11:16 am
The three-part naming convention works fine to get at data in another database. But what if you wanted this to be the variable part of the stored proc?
So 'FROM Northwind.dbo.Customers' would be replaced with something like '@mydb.customers'. This would allow the proc to be used by many databases that call it with the database name as a parameter. Is this a sound strategy? How else could this variable database access be accomplished?
Or perhaps it is best to have the same stored proc in all the necessary databases and a script to update all of them if/when the proc changes?
Thank you for your help.
January 15, 2002 at 11:48 am
This can be done using Dynamic SQL and EXECUTE or sp_executesql. Something along the lines of:
CREATE PROC usp_Dynamic
@dbName nvarchar(25),
@tableName nvarchar(25)
AS
DECLARE @SQLString nvarchar(500)
-- Build the three-part name at run time; QUOTENAME guards against odd object names
SET @SQLString = N'SELECT CustomerID, ContactName, ContactTitle FROM '
    + QUOTENAME(@dbName) + N'.dbo.' + QUOTENAME(@tableName)
EXEC sp_executesql @SQLString
GO
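A quick test call, assuming the Northwind sample database is installed:
EXEC usp_Dynamic @dbName = N'Northwind', @tableName = N'Customers'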
It's a hard call to tell you which way is best, because one of the things you have to keep in mind is how many databases you are supporting with similar structures. I have some environments with the same issue. In our particular case it is a warehousing solution, so users are mostly running ad hoc queries, but we have talked about building stored procedures and how we would handle that across 20+ databases that are structurally identical (the data can't be kept together for security and auditing reasons), and we never came up with a solution we felt very satisfied with.
I think in the end we decided we'd script the stored procedure creation across all the databases, but we've not put anything into place as of yet, so I can't offer any "real world" experience on this one.
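For what it's worth, here is a rough sketch of what that scripting could look like: a cursor over master..sysdatabases, using the EXEC @proc trick to run sp_executesql in each database's context. The LIKE filter and the usp_GetCustomers body are placeholders you'd swap for your own:
DECLARE @db sysname, @proc nvarchar(300)
DECLARE dbs CURSOR FOR
    SELECT name FROM master..sysdatabases
    WHERE name LIKE N'Client%'  -- placeholder filter for the client databases
OPEN dbs
FETCH NEXT FROM dbs INTO @db
WHILE @@FETCH_STATUS = 0
BEGIN
    -- Calling db..sp_executesql runs the batch in that database's context
    SET @proc = QUOTENAME(@db) + N'..sp_executesql'
    EXEC @proc N'IF OBJECT_ID(N''dbo.usp_GetCustomers'') IS NOT NULL DROP PROCEDURE dbo.usp_GetCustomers'
    EXEC @proc N'CREATE PROCEDURE dbo.usp_GetCustomers @City nvarchar(15) AS SELECT CustomerID, ContactName, ContactTitle FROM dbo.Customers WHERE City = @City'
    FETCH NEXT FROM dbs INTO @db
END
CLOSE dbs
DEALLOCATE dbs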
I know Andy manages a ton of databases, but I don't know if any of them are similar or identical. Perhaps he can shed some light on this.
K. Brian Kelley
http://www.sqlservercentral.com/columnists/bkelley/
January 15, 2002 at 3:37 pm
I'm not sure I understand your question completely, but here goes.
You can place a generic stored procedure in the master database with an 'sp_' prefix; you can then call it from any database and it will run in the context of the calling database. I use this at times for administration, for instance:
CREATE PROCEDURE sp_tablenames
AS
SELECT name FROM sysobjects WHERE xtype = 'U'
Then if I want to know the tables within the context of my connection I can execute that stored procedure.
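For example, assuming the Northwind sample database is present:
USE Northwind
-- sp_-prefixed procs in master resolve system-table references in the current database,
-- so this returns Northwind's user tables, not master's
EXEC sp_tablenames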
This is assuming the tables exist in each db and the structure is the same. It gives you one central location to change the stored procedure.
Hope this is clear enough.
Tom Goltl
January 15, 2002 at 5:38 pm
I avoid the dynamic SQL approach as much as possible. We run an app that partitions client data into separate databases, each of which requires 'basically' the same procs, views, etc. I could try to change them to one size fits all, but there doesn't seem to me to be much to gain. The biggest challenge, of course, is that instead of updating one proc, you're updating one per db... maybe it's 10, or the 200+ I have. As long as you apply the changes using a tool, what's the difference?
Maybe the hardest case I have is one proc with a 3-character constant embedded in it that is used only for that db. We used to modify each one by hand as the db was created (model actually has everything in it), but eventually I built a small app that did the drudge work for me.
Dynamic SQL isn't evil; sometimes it's even necessary. I probably fall into the category of avoiding it 'too much', but so far I'm getting by without it!
Andy
January 16, 2002 at 7:06 pm
Thank you everybody for the input.
Recap and thoughts or questions:
You can have a user-defined stored proc in the master database that will return data specific to the calling database. This means one version of the proc to change if you need to, but is the downside performance, due to more load on master and a multi-database operation?
The other option is to continue to store the procs in the individual databases and have a tool update them if needed. This sounds fairly straightforward, but any suggestions on how the tool would accomplish this? Query the system tables in each db, or something else?
Thank you again for all the help.
January 17, 2002 at 5:19 am
As far as a tool goes, all you really need is a mechanism for applying a standard "alter script". Basically a loop over the databases, whether you use a SQL cursor or just code it in a language like VB.
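On the "query the system tables" part: to see whether a given database already has a proc (and when it was created), you can check sysobjects in that database. A small sketch; usp_GetCustomers is just the example name from earlier:
-- Run in each database's context; xtype = 'P' filters to stored procedures
SELECT name, crdate FROM sysobjects WHERE name = N'usp_GetCustomers' AND xtype = 'P'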
Andy