May 7, 2002 at 12:00 am
Comments posted to this topic are about the content posted at http://www.sqlservercentral.com/columnists/awarren/versioncontrolforstoredprocedures.asp
May 8, 2002 at 11:50 pm
Good article! I was "forced" to start doing this when I needed to modify the parameters (interface) on procs for a new version of a UI. Without a staging server for new app versions, this has allowed me to do final "production" tests of apps without disrupting usage of the existing version.
May 9, 2002 at 5:08 am
The mention of grouped procedures is a plus in your article. However, I do not use them for version control. For me these are queries that I relate to one another to avoid dynamic SQL; the primary or ;1 procedure is the controlling factor, which takes all inputs and routes to the correct ;x item. They are quite useful and do store their own query plans. In addition, when I need to drop them from my system I just call DROP with the name of the procedure group rather than the numbered value, unless I need to drop only one. Another method I have seen, though, is to use the 0.0.0 standard noted inside the code itself (Major.Minor.Revision), but we have not run into a need to do so yet. Good article.
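For anyone who hasn't used them, here is a minimal sketch of that routing pattern; the procedure, table, and column names are made up for illustration:
-- ;1 is the controlling procedure that routes to the other numbered items
create procedure dbo.GetOrders;1 @Mode int, @CustomerId int
as
if @Mode = 1
    exec dbo.GetOrders;2 @CustomerId
else
    exec dbo.GetOrders;3 @CustomerId
go
create procedure dbo.GetOrders;2 @CustomerId int
as
select * from dbo.Orders where CustomerId = @CustomerId
go
create procedure dbo.GetOrders;3 @CustomerId int
as
select * from dbo.Orders where CustomerId = @CustomerId and ShippedDate is null
go
Callers only ever run exec dbo.GetOrders @Mode = 1, @CustomerId = 42, and the ;1 item does the branching.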
"Don't roll your eyes at me. I will tape them in place." (Teacher on Boston Public)
May 9, 2002 at 8:59 am
I like this article and have started implementing the suggested approach in my SQL development.
May 9, 2002 at 9:53 am
A much needed topic indeed. Did you have any problems with recompiling the code that calls the procs? It seems that by renaming the proc, you now have to go out and find all the places that use it and update the name there too. As brbolte mentioned, this may be perfect when you don't want to break existing code, but for the stuff that we do, I would then have to modify dlls and redistribute them to the team members - a lot of work - and it seems to me it reduces one of the great qualities of SPs: the fact that you can encapsulate logic and just update the db instead of redistributing executables.
I will share my method, which is also some work, but perhaps less depending on the environment you are in. We use CS-RCS and just use the scripting tools in Query Analyzer to save off a copy to be versioned. I think this would be more difficult given the hectic nature of the project you were on, but it works well, especially for tracking changes as they migrate to a centralized testing db, or even just to production. The nice thing about using a revision control system is we can quickly see history and perform diffs on the files to see what is going on.
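If you prefer the command line to the Query Analyzer GUI, something like this works as a rough starting point (the database and proc names are just examples, and sp_helptext output can wrap long lines, so adjust the width to taste):
osql -E -d Northwind -w 1024 -h-1 -Q "exec sp_helptext 'dbo.GetCustomerInfo'" -o GetCustomerInfo.sql
The resulting .sql file is what gets checked into CS-RCS or VSS.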
my 2 cents.
May 9, 2002 at 1:00 pm
quote:
In addition when I need to drop them from my system I just call by the name of the procedure group and not by the numbered value unless I need to drop only one.
It's not possible to drop an individual stored procedure from a numbered group; they can only be dropped as a group.
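In other words (with a made-up group name):
-- removes aaa;1, aaa;2, aaa;3, ... all at once
drop procedure aaa
There is no syntax to drop only, say, aaa;2 on its own.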
May 9, 2002 at 1:16 pm
Interesting article, but as someone else mentioned, it has the disadvantage of needing to modify every occurrence of the stored procedure name wherever it is used.
An alternative is something like :-
create procedure aaa
as
exec aaa;2
go
create procedure aaa;2
as
select 'first version'
go
exec aaa returns 'first version'
exec aaa will always run the latest version of the stored procedure
if we want to modify the stored procedure, but keep the old version for reference, we can do this :-
create procedure aaa;3
as
select 'second version'
go
alter procedure aaa;1
as
exec aaa;3
go
exec aaa now returns 'second version'
Obviously this will only work if you don't need to change the parameters to the stored procedure between versions.
You could just as easily use names with a '_<number>' suffix instead of grouped stored procedures, which would allow you to drop older versions individually if they build up.
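A quick sketch of that naming scheme, with made-up names (the plain aaa wrapper is optional, but it keeps callers pointed at the latest version):
create procedure dbo.aaa_1 as select 'first version'
go
create procedure dbo.aaa_2 as select 'second version'
go
create procedure dbo.aaa as exec dbo.aaa_2   -- thin wrapper that always calls the latest version
go
drop procedure dbo.aaa_1   -- unlike a numbered group, an old version can be dropped on its own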
May 9, 2002 at 2:11 pm
Good point about finding instances of it to change the name. For EVERY build we sent out a new dll and a new exe, even if there were no changes to one or the other, and used an MSI package for the distribution. We did have to change the proc name in code, but we're pretty hard core about code reuse; normally a proc will only get called in one place, so it's up to the developer who changes the proc to update the source code at the same time (a good find/replace add-in is handy). For the most part the procs were stand-alone, but we had a few instances where a call to proc1 would branch to one of 5-6 other procs. If you versioned one of the sub procs you had to version the main proc as well.
Thanks to everyone for the comments so far - I figured I'd catch hell for such a low-tech technique!
Andy
May 9, 2002 at 4:40 pm
Interesting idea. Not sure I like this over my method :), but it's worth a look. It's a nice way to obfuscate things if you are delivering code to a customer as well.
Steve Jones
May 9, 2002 at 8:36 pm
I've begun using the free version of SQL Programmer 2001 from BMC Software. It took a while to figure out how to get VSS going but now I have full version control over my stored procs. What I cannot do is prevent someone from changing stored procs in the database on the fly. But I can refresh using VSS to make sure the latest known version is in the database. I love this product and for the price it cannot be beaten.
May 10, 2002 at 6:58 am
From an app developer standpoint, the idea of having to reference the same named proc with a version suffix for what would probably be subtle differences sounds like a headache and a potential source of bugs. Using a proc with a single name allows the developer a 'black box' approach: the dba can change some logic, and so long as the input and output params stay the same, there is no need to recompile. It also sounds like extreme integration between all members of the dev team would be necessary in order to know when to use this_proc1 versus this_proc2. This could only work with a small team. If a second proc is needed then perhaps a new name altogether is in order.
As a low tech solution, it probably works fine in most small environments. There are some SQL IDEs out there that integrate quite well with some of the popular version control systems. Check out embarcadero.com. They have a tool called RapidSQL that integrates with VSS. Good luck.
Mike
May 12, 2002 at 1:42 pm
Glad to see someone trying to address this. I don't agree with changing the name of the stored procs for the same reasons others have stated above; I think the stored proc should maintain a constant interface to developers wherever it can.
I have given my thoughts on what needs to be done to enable a low tech version control system below, to contribute to the discussion.
From your article I can see 3 distinct areas that need attention: 1) the ability to roll back to a previously good version, 2) developer check-in/check-out, and 3) testing phases.
1) The only version of a stored proc that is of any use is the current version, except when the current version breaks and it is necessary to roll back to a previously good version of the SP. Therefore there is no need for developers to be working on multiple versions of the sp. Always drop and recreate the sp on a new version release, so everyone is using the latest version. If someone reports their bit no longer works, then you roll back.
2) Multiple developers may work on an SP, but NOT at the same time! (NB: this means SP developers, not application developers, since the latter will always see the same interface.) A developer should check out the SP so that he/she has exclusive rights to alter it and then save a new version on check-in.
3) A testing phase encompasses a complete move of the latest versions of all SPs to a second db/server. This is where your professional testers get to work; they are only testing the latest released build. If you like, there are two types of testing and two versions under scrutiny: 1) the testing of the latest released build and 2) ad hoc testing on the development server of the next potential build, where 1 is more comprehensive, but less timely, than 2. The results of both feed into the next development build.
I use WordPad to store the code of each version (it doesn't get more low tech than that!). I save each version with a reverse date/daily build number, e.g. 20020510-1.txt, although you could use the notation mentioned by Antares686 above. I also put this as a comment, --version 20020510-1, in the SP code, so that anyone looking at the SP can check the version number. Additionally, a log of changes is created, also as a text file, that shows a version history. This lets you more easily home in on the affected code when you need to fix a problem.
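Something along these lines (the procedure and table are invented purely to show where the comment sits):
create procedure dbo.GetCustomerSummary @CustomerId int
as
-- version 20020510-1 (reverse date, then daily build number)
select CustomerId, CompanyName from dbo.Customers where CustomerId = @CustomerId
go
Since the comment is part of the stored text, exec sp_helptext 'dbo.GetCustomerSummary' shows the version number straight from the database.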
Check-in/check-out really needs to be enforced if you have more than a couple of developers, and tools are good for this. Alternatively, in keeping with the low tech theme, you could use NTFS directory permissions. A developer wishing to work on an SP must first ask permission of me. If no one else is working on the SP I put a copy in his working directory. When finished he notifies me that he is done, and I rename it to the new version number and cut/paste it to the central directory on which only I have permissions.
May 12, 2002 at 2:19 pm
In this case I'll have to disagree that the current version is the only good version. Lots of times we'd have a couple of people testing v.05 that used sp_test1 (even if what they were testing didn't exactly include the call to that proc) while others might be testing v.06 or v.07 that used sp_test2.
Extreme integration? Maybe. Only three developers, but the way we partitioned the project there was almost zero overlap. That notwithstanding, our standing process was that if you modified the proc, you modified the call to it in the source code. A quick email would suffice if we had to cross the fence to work on a part not directly related to the one we were tasked with. It's not as good as true source code control, of course.
One other point, in some cases we were "refactoring" and leaving the interface as it was, in other cases we were changing the interface entirely. Once in production it IS nice to be able to tweak the proc without changing the name and having to deploy a new build. Whether we do so now depends on the nature/risk/scope of the change. My preference is that any change get tested by our internal testers, then a limited deployment to real end users, followed by full roll out.
Andy
May 12, 2002 at 3:26 pm
Please allow me to play devil's advocate with you on this one.
Maybe I missed the gist of the article, however if one group is testing v.05 and others are working on v.06 and v.07, wouldn't it be best to have 3 completely different databases and three different code bases?
Another scenario: say we have an sp called GetCustomerInfo and we decide to add an additional input parameter to it that allows us to retrieve info from some new tables we've just created. Under the guidelines you've proposed, we would call it something like GetCustomerInfo_2. Maybe a better way would be to create the third parameter, leave it optional, and if the param is specified then the sp provides the additional info? This way old code works the same and the new code benefits from the change.
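Something like this sketch, where the parameter and table names are invented for illustration:
alter procedure dbo.GetCustomerInfo
    @CustomerId int,
    @IncludePreferences bit = 0   -- new optional param; existing callers are unaffected
as
select CustomerId, CompanyName from dbo.Customers where CustomerId = @CustomerId
if @IncludePreferences = 1
    select PreferenceName, PreferenceValue from dbo.CustomerPreferences where CustomerId = @CustomerId
go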
So now using the same scenario, we decide we want to return a few additional columns in the result set, so we create GetCustomerInfo_3 and return the extra columns. We decide to go out on a limb and work on a GetCustomerInfo_4, and for one reason or another we can't get around to finishing it due to time constraints. Now let's introduce another developer who is unfamiliar with what we have done. We sit him or her down and ask them to get started on a new customer interface. They come to us and ask which copy of sp GetCustomerInfo they should be using. What is our reply? All of them? Take your pick? Or worse, if they don't come to us at all they might incorrectly presume that GetCustomerInfo_4 is the right version to use.
During each iteration of the sp, we would have to notify everybody in the dev group since we may not know who is actually using GetCustomerInfo in their code. These 'broadcast' messages would quickly become an annoyance in a group larger than 4-5 developers. So let's say somebody didn't change their reference as requested by us, and we decide to change the schema of the table used by the sp. Since we don't think versions 1 and 2 are being used, we don't bother to update those procs. Now any code incorrectly referencing those procs will be broken.
Also, the size of the project sounds like it is fairly small. In many projects, there ends up being hundreds of stored procs. So now, for each proc we are going to have more than one copy, which leads to an unruly list of procs.
Now, I'm not going to just sit here and take shots at your idea without having an arguably better solution in mind. Since you are already using VSS, how about using SQL Enterprise Manager to generate scripts for your entire database (tables and everything) into files? There is an option during script generation to save each object into a separate file. Now check all these files into VSS. If you need to cross-reference some older code to see what version of a certain proc it was using, we can just use file dates and compare to VSS. If we are going to break the interface for an sp, then we either create the interface to support both param signatures by setting default values, or we create a new unique name that better reflects the new functionality.
Anyway, these are just some thoughts. I look forward to your reply. 😉
May 12, 2002 at 7:37 pm
Well, it was all about time. We had one primary db for development and initial testing and a second db for more substantial testing. At times that seemed like too many; it added the step of having to do a diff between the two to see what needed to be added to the other db. The VB code base was versioned in VSS, so it was only procs (and the occasional schema change!). Creating more db's would have created more admin work... that maybe we could have automated, maybe not. Not sure that it would have helped us.
For the most part, yes, if the change consisted of adding more params, you could add to it. In my mind that's just a different form of versioning, since based on the presence of a value you might then branch to a different bit of code. In some cases it would make sense, but not all - what about when you want to drop params or change the data types? We wanted one simple system with as few exceptions as possible.
On the modifying part, we rarely had to broadcast anything because normally we were modifying code that was in our "segment". In the cases where we needed to cross the line, a quick email (or just a yell, since we had three offices in a row) did the job. One nice part is that we only altered non-deployed procs, so if the worst happened and all three of us tried to version the same proc, two would get errors saying the proc already existed - a clue that something was wrong! The rule was if you modify the calling syntax of the proc, you search and fix ALL references to it, in code or in other procs.
Yep, we've got a few versions. At some point I'll clean them up, but for now what does it hurt? The worst case so far is one proc with 10 versions, two of which never got deployed.
Which is the current version? Normal practice would be that the highest number would be live, unless it had an 'xx' appended denoting it was created but never deployed. With a small team that worked well. If you had a doubt, a quick search of the source code would show which was called (and if none, that would indicate it was a sub proc, though the comment should have told you that).
Your idea of the files isn't much different than what Steve Jones does, and possibly it's a better solution. I certainly find VSS to be totally workable for VB source code; there, if I want to change something I do, knowing that I have the ability to roll back to any point. I do use VSS for some special procs (outside of this project).
I guess finally, keep in mind two things. One is that we had a team of three developers of about equal ability and all reasonably detail minded. The other is that we were essentially working on a development server, not production (though we lived on the wild side and had everything on one box).
I don't disagree with any of your points; a full source code control implementation would be the right approach. This was just a hack to carry us through. While we definitely want to look at "better" approaches for future projects, for the short term I'd use this again in a minute :-)
I appreciate you taking the time to add your comments - it's pretty important for anyone reading the article to have a very clear idea of the pros and cons of this method, and a good discussion is a great way to do that!
Andy