September 3, 2009 at 7:46 am
I can make this process work in several different ways; I am looking for guidance on the best way of doing it. I have done a number of searches on the topic but, apart from learning other good stuff, was not able to find any 'best practice' guidelines.
I have a legacy system (originally DOS/Btrieve) that is slowly being migrated to a more modern technology and platform. Meanwhile I am upgrading from SQL 2000 to SQL 2005 and taking the opportunity to clean up the stored procedures.
I have 15 databases with identical structures. I need to batch run identical T-SQL stored procedures (I have around 100 of them) against these at varying frequencies. Currently (in SQL 2000) I begin my batch by copying all the relevant SPs from my control database to each production database, and then they are run as required. The SPs change regularly (2 or 3 changes a week) and the changes may be made by different people.
I would like to have a single copy of each SP and run it using the database name as a parameter. I can make the process work, but I am looking for best practices on doing it. The batch runs are system level, so there are currently no special security requirements (e.g. no login/password).
I would prefer not to put these stored procedures in master, and they may contain references to other shared databases where codes, logs, etc., are maintained. The SPs currently only call one or two functions, but in order to reduce the number of SPs I may need to be able to call another SP from the same location to work on the current database.
e.g.
The new SPs have code in them to check the parameters (e.g. that the working database exists!) and they always return a success/failure value, which is checked and logged.
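A rough sketch of the idea (the procedure, table and column names here are only placeholders):

-- One central copy of the proc, taking the target database name as a parameter.
-- It checks the database exists and returns 0 for success, 1 for failure.
CREATE PROCEDURE dbo.usp_PurgeOldLogs
    @DatabaseName sysname
AS
BEGIN
    SET NOCOUNT ON

    -- Fail fast if the working database does not exist
    IF DB_ID(@DatabaseName) IS NULL
        RETURN 1

    DECLARE @sql nvarchar(max)
    SET @sql = N'DELETE FROM ' + QUOTENAME(@DatabaseName)
             + N'.dbo.EventLog WHERE LogDate < DATEADD(day, -90, GETDATE())'

    BEGIN TRY
        EXEC sp_executesql @sql
        RETURN 0
    END TRY
    BEGIN CATCH
        RETURN 1   -- the batch checks and logs this
    END CATCH
END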
Any guidance or pointers to references would be appreciated.
Thank you
SteveD
September 3, 2009 at 8:45 am
The easiest way, I think, to do this is to use something like SQL Compare (or any compare tool) and have it sync the procs in all the databases on a regular basis.
The whole point of databases is that they are containers for all of their objects. The server isn't built to run a stored procedure across multiple databases.
I think I see your issue, but my guess is that the best way to handle this is to build a better deployment process. Make sure that these procedures are pushed out to all the databases whenever they change. That would be the easiest thing.
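As a rough sketch of that kind of deployment step (dbo.ManagedDatabases and the script variable are assumed names, not a specific tool), you can hold the CREATE script once and run it in the context of each database by calling that database's sp_executesql:

-- Sketch only: push one procedure script into every managed database.
-- dbo.ManagedDatabases is an assumed control table listing the target databases;
-- in practice the script would DROP/ALTER the existing proc first.
DECLARE @db sysname
       ,@CreateSql nvarchar(max)
       ,@Target nvarchar(300)

SET @CreateSql = N'CREATE PROCEDURE dbo.usp_Example AS SELECT 1'   -- placeholder script

DECLARE dbs CURSOR LOCAL FAST_FORWARD FOR
    SELECT DatabaseName FROM dbo.ManagedDatabases

OPEN dbs
FETCH NEXT FROM dbs INTO @db

WHILE @@FETCH_STATUS = 0
BEGIN
    -- Calling <database>.sys.sp_executesql runs the batch in that database's context
    SET @Target = QUOTENAME(@db) + N'.sys.sp_executesql'
    EXEC @Target @CreateSql

    FETCH NEXT FROM dbs INTO @db
END

CLOSE dbs
DEALLOCATE dbs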
September 3, 2009 at 9:06 am
You could use osql wrapped in a stored procedure to run a script stored on the file system against all the instances held in a table somewhere:
SET QUOTED_IDENTIFIER OFF
SET NOCOUNT ON
SET CONCAT_NULL_YIELDS_NULL OFF

DECLARE @Count int
       ,@MaxCount int
       ,@SQL varchar(1000)
       ,@ServerName varchar(128)

-- Load the list of servers/instances to run against
CREATE TABLE #Temp
    (ID int IDENTITY(1,1),
     ServerName varchar(50))

INSERT INTO #Temp (ServerName)
SELECT ServerName
FROM   ServerList

SET @Count = 1
SET @MaxCount = (SELECT MAX(ID) FROM #Temp)

-- Run the script file against each server via osql called through xp_cmdshell
WHILE @Count <= @MaxCount
BEGIN
    SET @ServerName = (SELECT ServerName
                       FROM   #Temp
                       WHERE  ID = @Count)

    -- Build the osql command: trusted connection, input script, output log
    SET @SQL = 'osql -E -S ' + @ServerName + ' -d master -l 100 -i "D:\someproc.sql" -o "D:\somelog.log"'
    SET @SQL = 'EXEC master..xp_cmdshell ''' + @SQL + ''''

    PRINT @SQL
    EXEC (@SQL)

    SET @Count = @Count + 1
END

DROP TABLE #Temp
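One thing to watch: xp_cmdshell is disabled by default on SQL 2005, so the wrapper will only work once it has been enabled, for example:

-- xp_cmdshell is off by default in SQL 2005
EXEC sp_configure 'show advanced options', 1
RECONFIGURE
EXEC sp_configure 'xp_cmdshell', 1
RECONFIGURE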
September 3, 2009 at 12:04 pm
Carolyn and Steve,
Thanks for the responses.
"Best practice" here appears to be NOT to use a single copy of a stored procedure working on remote databases but to continue, in some way, to run the stored procedures from within each database.
I am looking into the Red Gate Toolsets to assist with the replication process. Obviously it is more efficient to have only the changes copied.
Not a problem - and thank you again.
SteveD