August 1, 2006 at 11:13 am
Hi,
I need to develop and maintain generic stored procedures for a server with about 20 databases on it. The stored procedures need to work on any of the 20 databases (they all have the same basic set of tables), but I need each procedure to exist in only one place, so that I don't have to maintain 20 copies of it whenever a change comes along (which is quite often). The trick is that the stored procedures won't know what database they're working on until run time.
The way I'm doing this now is by writing stored procedures that build their syntax at run time, which is a pain to work with, and really, really ugly. The following example shows what I currently have to go through in order to make a simple update to a table:
create proc MyProc
@dbName varchar(50), -- The database to run against
@newVal varchar(20), -- New value to apply to records
@limit varchar(20) -- Limit to use in the WHERE clause
as
declare @stmt varchar(1000) -- This will hold the generic statement
set @stmt = ''
set @stmt = @stmt + 'update ' + @dbName + '..MyTable '
set @stmt = @stmt + 'set someCol = ''' + @newVal + ''' '
set @stmt = @stmt + 'where someOtherCol = ''' + @limit + ''' '
exec(@stmt)
This will build and execute the following statement (assuming it's given 'XX', 'YY' and 'ZZ' as the 3 arguments):
update XX..MyTable set someCol = 'YY' where someOtherCol = 'ZZ'
... which would have been a small, simple stored procedure, if I didn't have to deal with the fact that I don't know the name of the database until run time.
As you can see, this gets ugly and out of hand in a hurry. Not to mention that trying to get any kind of status information from the query executed via exec() is a pain.
So, does anyone have any ideas for a better way to accomplish this? Any suggestions are welcome!
Thanks!
August 1, 2006 at 11:40 am
Ken Henderson does something like this in his book "The Guru's Guide to Transact-SQL" by creating the objects in the master database and giving them the prefix "sp_". That way you can call them from any database (an "sp_" prefix causes the query processor to search the master db catalog before the current database). However, he's pretty much only querying common system tables. I just ran a little test and it seems to fail for insert/update functionality. And I usually try to avoid creating objects in the master database.
You might consider just creating the sp in each of the databases and calling it by its full name, e.g. exec mydatabase.dbo.sp_update_something.
Also, it might be simpler to look into using sp_executesql if you have to use dynamic SQL. It has a bit more of a learning curve than EXEC(), but overall it's better, IMHO.
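To illustrate, here is a rough sketch of the original sample proc rewritten with sp_executesql (same parameter names as the sample; QUOTENAME on the database name is my addition). Only the database name still has to be concatenated in; the values travel as real parameters:

```sql
create proc MyProc
    @dbName varchar(50), -- The database to run against
    @newVal varchar(20), -- New value to apply to records
    @limit  varchar(20)  -- Limit to use in the WHERE clause
as
declare @stmt nvarchar(1000)
-- Only the object name is concatenated; @newVal and @limit are
-- passed as genuine parameters, so there's no quote-doubling and
-- no injection risk for the values.
set @stmt = N'update ' + quotename(@dbName) + N'..MyTable '
          + N'set someCol = @newVal where someOtherCol = @limit'
exec sp_executesql @stmt,
    N'@newVal varchar(20), @limit varchar(20)',
    @newVal, @limit
```

You also get better status information back, since sp_executesql returns the return code of the batch and the parameters are sniffed properly for plan reuse.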
August 2, 2006 at 7:03 am
Thanks. The above was just a sample; my actual procedures are very lengthy and use temp tables and cursors. I'm trying to avoid dynamic SQL, and I can't put my procedures in the master database because I wouldn't have permission to do that (company policy).
August 2, 2006 at 10:44 am
You have a choice:
avoid dynamic SQL, or avoid putting the SP in the master database.
If you cannot put things in the master database, then you will need dynamic SQL, period. If you avoid dynamic SQL, the SP has to go into the master database in order to set the 'execution context' (like a USE database) for the SP.
Regards,
Rudy Komacsar
Senior Database Administrator
"Ave Caesar! - Morituri te salutamus."
August 2, 2006 at 11:45 am
Just on second thought, I was thinking of keeping all the procedures in one central database that all the developers would work on; any change to a procedure would then be copied out to all the databases by a trigger on the syscomments table. That way I wouldn't have to put anything in the master database, and I could avoid writing dynamic SQL.
I tried to create a trigger on the syscomments table that would copy the procedure to the other databases, but I was unable to do so.
Does anyone have a SQL script that can script out all the procedures and execute them against multiple databases?
The latter part can be achieved by using exec('use db1 ' + @createprocstmt), but how do I get @createprocstmt?
Any ideas?
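One wrinkle worth noting, sketched here with db1 as a placeholder name: CREATE PROCEDURE must be the first statement in its batch, so exec('use db1 ' + @createprocstmt) fails for procedure creation. Calling the target database's own copy of sp_executesql runs the batch in that database's context instead:

```sql
declare @createprocstmt nvarchar(4000)
-- Placeholder procedure body for illustration only.
set @createprocstmt = N'create proc dbo.MyProc as select 1'
-- Invoking db1's sp_executesql executes the batch in db1's
-- context, and the batch contains nothing but the CREATE PROC,
-- so the first-statement-in-batch rule is satisfied.
exec db1.dbo.sp_executesql @createprocstmt
```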
August 2, 2006 at 11:52 am
You can add a trigger to syscomments, but it is highly frowned upon for a number of reasons, only one of which I will state: modifications to system tables will void your support with MS. Besides that minor point, do you know for sure that everything will work? As for scripting all procedures, well, Enterprise Manager (EM) does that just fine. Executing the resultant script against a number of databases can be accomplished with a FOR loop using isql/osql in a DOS cmd file.
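A rough sketch of that kind of cmd file (server name, database list, and script file name are all placeholders; -S is the server, -E uses a trusted connection, -d sets the database, -i names the input script):

```bat
@echo off
rem Run the generated script against each database in turn.
rem MYSERVER, the database names, and procs.sql are placeholders.
for %%d in (db1 db2 db3) do (
    osql -S MYSERVER -E -d %%d -i procs.sql
)
```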
Regards,
Rudy Komacsar
Senior Database Administrator
"Ave Caesar! - Morituri te salutamus."
August 7, 2006 at 10:10 am
After lots of thinking, and considering the comments above, I decided to convert the SQL statements to dynamic SQL. I've converted most of them, but some procedures are far too lengthy to fit within the nvarchar(4000) limit. Since I need to pass parameters, I have to use sp_executesql, because it supports parameters much better than the EXEC command.
How does one construct and run sp_executesql @sqlstmt, @paramList, @param1, @param2... where @sqlstmt is going to be more than 4000 characters?
I've tried using varchar, but sp_executesql accepts only ntext/nchar/nvarchar.
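For what it's worth, one common SQL Server 2000 workaround (a sketch only, reusing the variable names from the first post): EXEC() concatenates its arguments before executing, so the combined batch can exceed 4000 characters, whereas sp_executesql only accepts a single variable as its first argument. The trade-off is that with EXEC() you are back to embedding the values with doubled quotes; on SQL Server 2005 and later, nvarchar(max) removes the 4000-character limit and lets you keep sp_executesql's parameters.

```sql
-- Sketch: split the long statement across several variables and
-- let EXEC() concatenate them. Parameters cannot be passed this
-- way, so values go back to being quoted into the string.
declare @s1 nvarchar(4000), @s2 nvarchar(4000)
set @s1 = N'update ' + quotename(@dbName) + N'..MyTable '
set @s2 = N'set someCol = ''' + @newVal
        + N''' where someOtherCol = ''' + @limit + N''''
exec(@s1 + @s2)
```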