May 6, 2010 at 9:02 am
That's certainly one way you can get a script started if you didn't save it from the start. We have a stored procedure template (that we access via the Template Explorer) so that all of our SPs have the same heading at the top, along with documentation about the SP, who the original author was, and a change log.
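For anyone who wants to roll their own, here is a rough sketch of what such a template can look like (the header fields and the procedure name are illustrative, not our actual template):
-- =============================================
-- Procedure:   dbo.usp_Example (illustrative name)
-- Author:      <original author>
-- Created:     <date>
-- Description: <what the procedure does and why>
-- Change log:
--   <date>   <who>   <what changed and why>
-- =============================================
CREATE PROCEDURE dbo.usp_Example
    @SomeParam int
AS
BEGIN
    SET NOCOUNT ON;
    -- procedure body goes here
    SELECT @SomeParam AS EchoedValue;
END
GO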
For my Reporting Services projects, all database reads are done via stored procedures. To help manage them I create solutions/projects within SSMS. Once you have this set up you can just right-click Queries in Solution Explorer and select New Query; the new file is automatically added to the solution, and you are prompted to save it when you are finished. I save all the files on my hard drive and then commit the projects into Subversion. That way my developers and I can all share the projects via Subversion and stay in sync. When we move code from dev to test to production, we deploy from the scripts saved in Subversion.
May 6, 2010 at 9:03 am
jacroberts (5/6/2010)
bphipps-931675 (5/6/2010)
But SSMS inherently gives you a save option. All you have to do is save them like any other file to your hard drive. I really don't understand why you have to go through all these extra steps to do that.
If you have over 600 procedures to save, that can be quite a chore. Also, this adds a header to the file and scripts it so that a procedure is only created if it does not already exist. In our case this is important: if we were to drop and recreate the procedure, we would lose any rights users have been granted on the procedures.
My database schema has 566 stored procedures, 50 functions, etc. Once checked in, we're not modifying all 500+ of them all the time.
In our company we work from the code that is checked into the source code control system, not the code that is currently in the database (as it could be wrong, old, temporarily hacked by a developer for debugging, etc.). Each procedure script is an ALTER rather than a CREATE. See my earlier posting where I describe how Oracle's "CREATE OR REPLACE" syntax is ideal.
When I first joined the company I had the task of "slicing and dicing" the entire schema (reverse engineered via SSMS) into individual files. Yes, a bit of a project (I wrote a bunch of editor (KEDIT) macros to help in the effort), but once done it has paid dividends over and over -- reduced development effort, more stable code, AND we know exactly what the code is.
May 6, 2010 at 9:05 am
I'm really surprised that no one has mentioned using "Database Projects" in Visual Studio (2003 - 2008). Unfortunately, Microsoft removed this project template in VS 2010+, but I plan to revive it and make some add-ins to make it more robust. Microsoft does give us the new "SQL Projects", but they are horrible; the project type forces you into a specific folder structure and method of scripting.
For example, I have been using Database Projects since 2002, integrating them with SourceSafe, TFS, and now SVN. It's nice because we always had developers going in and mucking around in a "DEV" database, and there were always times when one dev would overwrite another's changes. DB Projects eliminated this because either only one dev could check out a script at a time, or they would at least get "merge" capabilities. It was a nice template that could be used for any database (not just SQL Server) and allowed you to run the scripts right from Visual Studio (so you could test them). After we had everything in place, I just used a batch file to run the scripts; however, you could probably do something more robust these days with SMO and PowerShell.
May 6, 2010 at 9:09 am
For several of our applications, I implemented VCS functionality in SQL using these steps:
1. Create a database trigger to capture DDL events using EventData() in xml format. I include other metadata such as login, date/time, database name, etc.
2. Insert that XML into another database table on the same server. (Encrypt necessary fields using EncryptByPassPhrase if needed).
3. Periodically off-load records to a centralized VC database. Synchronize these dumps with DML dumps, backups, or other change capture data so you'd have a version of data and the code to go with it.
The above method captures all create, alter, and drop events for all objects: tables, views, SPs, functions, synonyms, etc. (adjust the FOR clause below to customize). Then it's only a matter of a simple select and filter to find the history of changes and so forth. I have also implemented a couple of "undo" and "redo" commands to restore recent changes to individual objects.
I use this DDL changes table to search for when a field was first implemented, in which table, by whom, and so on. Works great for quick searches when developing.
If your security needs do not allow off-loading code to an external VCS, then this might be an option to consider.
Here's the main trigger:
CREATE trigger [DDLChanges]
on database
for create_procedure, alter_procedure, drop_procedure,
    create_table, alter_table, drop_table,
    create_function, alter_function, drop_function,
    create_view, alter_view, drop_view,
    create_trigger, alter_trigger, drop_trigger,
    create_user, alter_user, drop_user,
    create_role, alter_role, drop_role,
    create_schema, alter_schema, drop_schema
as
set nocount on
-- EVENTDATA() describes the DDL event that fired the trigger as XML
declare @data xml
select @data = EVENTDATA()
insert into DDLChangesDB.dbo.LoggedEvents
    (databasename, eventtype, objectname, objecttype, sqlcommand, loginname)
values (
    @data.value('(/EVENT_INSTANCE/DatabaseName)[1]', 'varchar(256)'),
    @data.value('(/EVENT_INSTANCE/EventType)[1]', 'varchar(50)'),
    @data.value('(/EVENT_INSTANCE/ObjectName)[1]', 'varchar(256)'),
    @data.value('(/EVENT_INSTANCE/ObjectType)[1]', 'varchar(25)'),
    @data.value('(/EVENT_INSTANCE/TSQLCommand)[1]', 'varchar(max)'),
    -- the login that issued the DDL, taken from the event XML
    @data.value('(/EVENT_INSTANCE/LoginName)[1]', 'varchar(256)'))
go
Note: I copied and pasted the trigger code here from a working implementation; test it before using it.
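For completeness, here's a rough sketch of the logging table the trigger writes to. The column names come straight from the INSERT above; the id/date columns and the data types are my assumptions, so adjust to your needs:
CREATE TABLE DDLChangesDB.dbo.LoggedEvents (
    eventid      int identity(1,1) primary key,        -- assumed surrogate key
    eventdate    datetime not null default getdate(),  -- assumed audit column
    databasename varchar(256),
    eventtype    varchar(50),
    objectname   varchar(256),
    objecttype   varchar(25),
    sqlcommand   varchar(max),
    loginname    varchar(256)
)
GO
And a typical quick search against it:
-- When was a column first added, and by whom? (column name is hypothetical)
SELECT eventdate, eventtype, objectname, loginname
FROM DDLChangesDB.dbo.LoggedEvents
WHERE sqlcommand LIKE '%CustomerEmail%'
ORDER BY eventdate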
May 6, 2010 at 9:09 am
I guess this is where having a plan in place for how you are going to manage your scripts, before you create 600 of them, is important.
We had the same issue, and it was painful to go back and put a management system in place, but I prefer using the built-in tools of SSMS rather than reinventing the wheel as this article suggests.
As for security, our scripts include grants at the bottom, and we generally only do alters so that security is not lost when we make updates.
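A rough sketch of the shape of one of those scripts (the procedure and role names are made up for illustration):
ALTER PROCEDURE dbo.usp_GetOrders   -- hypothetical procedure
    @CustomerID int
AS
BEGIN
    SET NOCOUNT ON;
    SELECT OrderID, OrderDate
    FROM dbo.Orders
    WHERE CustomerID = @CustomerID;
END
GO

-- Grants live at the bottom of every script; since the procedure is
-- altered rather than dropped, existing permissions are preserved anyway.
GRANT EXECUTE ON dbo.usp_GetOrders TO ReportingRole;  -- hypothetical role
GO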
May 6, 2010 at 9:14 am
Caveat - we had very few people modifying stored procedures so YMMV.
We used the old-fashioned "follow the procedure" method of doing version control on stored procedures. It wasn't perfect, but it did bring source control into everyone's general awareness.
Any time a stored procedure was created it was saved as a SQL file on the developer's local machine. Once it passed QA on the development machine it was uploaded to source control and the local copy was deleted. Yes, this is something the developer has to remember to do.
Then any time a stored procedure needed to be modified, it had to be checked out of source control and NOT copied from the server. I was working mostly with junior developers, and management was firmly behind the need for source control, so it wasn't a hard idea to get across.
For legacy stored procedures that weren't in source control yet it was pretty much the same. Grab the server copy, modify it, test it, then check into source control and follow the steps above going forward.
All stored procedures were deployed out of source control so if it was going to get into production it had to be there.
It's a very manual process and depends on developers following the procedures, but as with almost anything, after two months it stops being a change and becomes "the way it always was".
Overall it worked pretty well, especially with the requirement that anything to be deployed had to come out of source control. That last hurdle kept people somewhat on track. Deleting the local copy is another thing that helps remind people to go to the correct place to get the information.
I developed this process because I felt that using source control is a management issue, not a technical one. It's the same as doing documentation: it's just what you do if that's the company's policy, regardless of personal feelings. You just do it.
May 6, 2010 at 9:14 am
bphipps-931675 (5/6/2010)
That's certainly one way you can get a script started if you didn't save it from the start. We have a stored procedure template (that we access via the Template Explorer) so that all of our SPs have the same heading at the top, along with documentation about the SP, who the original author was, and a change log.
For my Reporting Services projects, all database reads are done via stored procedures. To help manage them I create solutions/projects within SSMS. Once you have this set up you can just right-click Queries in Solution Explorer and select New Query; the new file is automatically added to the solution, and you are prompted to save it when you are finished. I save all the files on my hard drive and then commit the projects into Subversion. That way my developers and I can all share the projects via Subversion and stay in sync. When we move code from dev to test to production, we deploy from the scripts saved in Subversion.
Initially I took the copy from our UAT database, which had had carefully controlled releases and therefore the correct versions of the procedures. The files generated from the procedure and VBScript in the article were then added to version control so all developers could access them. When we do a release now, we deploy only the procedures that have changed, concatenating all of their scripts into one file. We then have only one script to run to implement all the sproc changes.
May 6, 2010 at 9:21 am
bphipps-931675 (5/6/2010)
I guess this is where having a plan in place for how you are going to manage your scripts, before you create 600 of them, is important.
It is also possible to end up being assigned to a project where this hasn't been planned and there are already 600 stored procedures not in source control as separate modules.
bphipps-931675 (5/6/2010)
As for security, our scripts include grants at the bottom, and we generally only do alters so that security is not lost when we make updates.
When there is a new stored procedure you have to create it, as it's not there to alter. Do you run a separate script for the initial install versus subsequent updates?
May 6, 2010 at 9:27 am
True about walking in late to that issue, but I would still want to logically arrange the various scripts into projects to make them easier to find and keep track of...
As for our script creation process, we save them as either creates or alters. When they get to production, the DBA simply changes a script to a create for the initial run, if necessary, but typically leaves them as alters.
When we first started we made them drop/create scripts, but losing security was a problem, so we switched to just using alters.
May 6, 2010 at 9:37 am
Grasshopper,
Got a few questions about your SSMS approach -
-Is there any way of categorising scripts into different sorts (subfolders?), or do they all just end up under 'Queries'?
-How do you reconcile what is actually in the database with your project? e.g. a mistake in rolling out (wrong version), or perhaps a new object added accidentally by the deployment team?
-Also, how do you handle static/reference data?
May 6, 2010 at 9:42 am
bphipps-931675 (5/6/2010)
True about walking in late to that issue, but I would still want to logically arrange the various scripts into projects to make them easier to find and keep track of...
As for our script creation process, we save them as either creates or alters. When they get to production, the DBA simply changes a script to a create for the initial run, if necessary, but typically leaves them as alters.
When we first started we made them drop/create scripts, but losing security was a problem, so we switched to just using alters.
If you used the method this script generation uses, no manual editing would be necessary, as it creates a tiny skeleton procedure if one does not already exist and then alters it to the correct version.
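For anyone who hasn't read the article, the generated scripts follow a pattern along these lines (a minimal sketch with a hypothetical procedure name; check the article's generator for the exact output):
-- Create a stub only if the procedure doesn't exist yet, so a
-- procedure is never dropped and its permissions are never lost.
IF OBJECT_ID('dbo.usp_GetOrders', 'P') IS NULL
    EXEC ('CREATE PROCEDURE dbo.usp_GetOrders AS SELECT 1');
GO

-- Then ALTER the stub (or the existing procedure) to the real definition.
ALTER PROCEDURE dbo.usp_GetOrders
    @CustomerID int
AS
BEGIN
    SET NOCOUNT ON;
    SELECT OrderID, OrderDate
    FROM dbo.Orders
    WHERE CustomerID = @CustomerID;
END
GO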
May 6, 2010 at 9:47 am
Newbie,
-Is there any way of categorising scripts into different sorts (subfolders?), or do they all just end up under 'Queries'?
This is where having multiple projects comes into play. I create different projects for different sets of scripts. You can have multiple projects within one solution, or if things are completely different you can have a separate solution altogether.
For instance, I have a solution for Accounts Payable, with just one project in it.
For HR, I have a Payroll project, a separate Payroll Register project, and then a final project for employee stuff, all in one solution.
-How do you reconcile what is actually in the database with your project? e.g. a mistake in rolling out (wrong version), or perhaps a new object added accidentally by the deployment team?
We use Subversion. In production, the DBA only deploys scripts from tags that I (the team lead) create in Subversion. No one else has authority to deploy to production, so I know that my tags are in sync with production. The dev and test environments are a little more lax, but the rule in the office is that what is in Subversion rules. Anything else is considered a test and can be blown away by what comes out of Subversion. It's up to us to ensure that we commit our changes into Subversion. As MissPost stated earlier, this is a management issue.
-Also, how do you handle static/reference data?
Generally, we create a table and store it in there, but this is something we haven't had to address much. Our current process for this needs additional work. In the future, one of our projects is going to be to create an application so that this kind of data can be owned and maintained by the users instead of Application Development/Operations.
May 6, 2010 at 9:50 am
But you lose the benefits of a change log that manual management of your procedures provides.
May 6, 2010 at 9:51 am
tymberwyld (5/6/2010)
I'm really surprised that no one has mentioned using "Database Projects" in Visual Studio (2003 - 2008). Unfortunately, Microsoft removed this project template in VS 2010+, but I plan to revive it and make some add-ins to make it more robust. Microsoft does give us the new "SQL Projects", but they are horrible; the project type forces you into a specific folder structure and method of scripting.
For example, I have been using Database Projects since 2002, integrating them with SourceSafe, TFS, and now SVN. It's nice because we always had developers going in and mucking around in a "DEV" database, and there were always times when one dev would overwrite another's changes. DB Projects eliminated this because either only one dev could check out a script at a time, or they would at least get "merge" capabilities. It was a nice template that could be used for any database (not just SQL Server) and allowed you to run the scripts right from Visual Studio (so you could test them). After we had everything in place, I just used a batch file to run the scripts; however, you could probably do something more robust these days with SMO and PowerShell.
If you're referring to DBPro, yeah, we've been using it for years, quite successfully too. It allows us to move our databases from source control to dev/qa/staging & out to prod. Great stuff. I'm actively evaluating 2010 now.
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
- Theodore Roosevelt
Author of:
SQL Server Execution Plans
SQL Server Query Performance Tuning
May 6, 2010 at 9:53 am
The way we did it was to generate a script per file for procedures and UDFs using the Generate Scripts functionality in SQL 2008. Those scripts didn't include DROP FUNCTION or DROP PROCEDURE statements, so we added them with a command script that went through each file and added a drop statement at the top.
Then we used an SSMS solution in Management Studio, connected it to our TFS repository where all the files live, and worked on procs from Management Studio going forward.
There is one cool thing we do (something I never did at my previous companies): with every build we drop and create all functions and procs. For this we use a batch script that creates one file with all the procs and UDFs; basically it copies the text from all the SP and UDF files into a single file to execute. This way our database is always in sync with what was in the build.
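If anyone wants to avoid raw text concatenation, an alternative sketch is a master script built from sqlcmd's :r include directive (the file names here are hypothetical):
-- build_all.sql: run via sqlcmd, e.g.
--   sqlcmd -S MyServer -d MyDatabase -E -i build_all.sql
-- Each included file drops and recreates one proc or UDF.
:r .\Functions\ufn_OrderTotal.sql
:r .\Procedures\usp_GetOrders.sql
:r .\Procedures\usp_SaveOrder.sql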