May 24, 2004 at 2:04 pm
Comments posted to this topic are about the content posted at http://www.sqlservercentral.com/columnists/d
May 27, 2004 at 4:24 am
About the only thing I liked about this was the fact that the problem of change management is recognised. Within this "Process" there is so much room for error that, in any environment other than the simplest, you would still have problems.
I live and breathe change management for databases. If you're interested in reading what I live by, drop me an email and I'll send you an article which I think does this subject justice.
regards,
Mark Baekdal
May 27, 2004 at 4:47 am
I am all for introducing control into the SQL change process; however, the method described in this article won't really help with the underlying problems of managing a large team of developers who are all making changes to a database's schema.
I have encountered many paper-based, communication-heavy methods such as this on previous projects and they have all failed to deliver any kind of sanity. The people who implement these kinds of processes think that merely prescribing a method and creating a few documents will solve the issues. Paper-based processes will never work whilst development teams are under enormous pressure to deliver new functionality; the simple truth is that corners will always be cut, which introduces massive holes into the delivery process and leads to utter confusion.
What is needed is a 21st century solution that uses automation to leverage the power of our machines. Ideally the development team would be able to work with SQL just as they do with other parts of the system, using a source control system. All that would then be needed is a tool that can work out what has changed in the scripts and apply those changes to a target database.
With this kind of automated approach using a version control system, everything would be audited and the developers would be freed from onerous paper shuffling. The DBAs would love it, as they could audit every single change easily and at a time that is convenient to them, not whenever a developer shoves a piece of paper in front of them.
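To make that concrete, here is a minimal sketch (the table and column names are hypothetical) of the kind of guarded change script such a tool might generate, so that the same script can be run safely against any target database regardless of its current state:

-- Only add the column if it is not already there, so the script
-- can be re-run against Dev, Test or Production without error.
IF NOT EXISTS (SELECT 1 FROM INFORMATION_SCHEMA.COLUMNS
               WHERE TABLE_NAME = 'Customer' AND COLUMN_NAME = 'Email')
BEGIN
    ALTER TABLE dbo.Customer ADD Email varchar(255) NULL
END
GO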
Malcolm Leach
Malcolm
DB Ghost - Build, compare and synchronize from source control = Database Change Management for SQL Server
www.dbghost.com
May 27, 2004 at 7:18 am
I agree with the previous comments that this is a very good topic, but I would also like to read about real-world examples that are not paper-based. Does anyone use Visual SourceSafe for DDL scripts, stored procedures and DTS packages? What other options are on the market for performing electronic change control?
May 27, 2004 at 7:34 am
Check out http://www.dbghost.com for an electronic solution from SourceSafe to production - software to enable a process, not just a diff tool.
May 27, 2004 at 7:42 am
I agree with the above. Paper-based systems are a problem in and of themselves; most of us are getting less used to paper, not more. Even email-based "forms" are paper-like and don't work that well. The thing I've found over the years is that any process needs to be lightweight and not get in the way of the work. If it does, it won't be followed, or will only be followed sporadically. The easier you can make it, the less resistance you'll get.
For VB or Web work, the tools automatically check out and check in the files you work on and even give you a box for comments. Even then the comments aren't always entered unless someone "forces" the users to enter them, which is a big part of any VCS: you must have some type of administrative controls in place that force people to comply.
I've used VSS, but it was a hack, and it only worked in conjunction with server-side permissions to limit developers to a single db, scripts to look for changed objects that I could then verify were in VSS, buy-in from management to enforce administrative penalties (they lost object DDL rights), and a little training.
http://www.sqlservercentral.com/columnists/sjones/vcspart1.asp
http://www.sqlservercentral.com/columnists/sjones/vcspart2.asp
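For anyone who wants a starting point for the "look for changed objects" script, here is a minimal sketch against sysobjects (the cutoff date and object types are illustrative). Because a drop/create workflow recreates objects, crdate works as a rough "last changed" marker:

-- List user objects created (or recreated by drop/create scripts)
-- since the cutoff, so each can be verified against VSS.
SELECT name, type, crdate
FROM sysobjects
WHERE crdate > '20040501'
  AND type IN ('U', 'P', 'V', 'TR', 'FN')
  AND OBJECTPROPERTY(id, 'IsMSShipped') = 0
ORDER BY crdate DESC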
May 27, 2004 at 9:03 am
We have a system at the company I work for that I think works well for us. It uses a combination of tools (all online - no paper) including Rational Clearquest (for database change requests), Redgate SQL Compare (for creating base/change/rollback scripts) and SourceSafe (for saving/updating the 3 scripts for each change). Each change is reviewed and migrated from Dev to Test and from Test to Production by a DBA. No developer has DBO permissions except in Dev.
This has greatly decreased the problems we find when implementing changes to our production databases. The article lacks much helpful information on how to set up a system to monitor and save changes, but it is a good start at evaluating the process.
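To illustrate the change/rollback pairing (the table and column names here are hypothetical), the two scripts checked into SourceSafe for a simple change might look like this:

-- Change script: run by the DBA when migrating Dev -> Test -> Production
ALTER TABLE dbo.Orders ADD ShippedDate datetime NULL
GO

-- Rollback script: kept alongside the change so the DBA can back it
-- out if problems appear after deployment
ALTER TABLE dbo.Orders DROP COLUMN ShippedDate
GO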
Have a great day,
Linda Leslie
Premier Lease & Loan Services
May 27, 2005 at 2:00 am
Hi,
Change control management is required for one reason above all: you want to know who broke it.
Online systems like Flare or Rational Clearquest are helpful for tracking who can request a change and to whom it is assigned.
Scheduled updates to Prod are also helpful (for example, 11 am on Saturdays): you can then see the next day whether something broke, and roll back if it did.
Managing the differences between environments (Dev/QA/Prod), as well as the different versions of the SQL database, is the trick. I've found that using MS Visio works like a charm when it comes to the design of the database (tables & views), as you can update the Visio diagram to represent the database, and vice versa. Visio sheets are also easier to maintain for viewing purposes - you can print them if you want to - although I agree that printing them is futile if design changes are then decided on and made on paper.
Maintaining stored procedures, triggers, UDFs, etc. is best handled by generating scripts of them. Putting these scripts into some form of code storage (MS SourceSafe / TortoiseSVN / CVS / etc.) is not a bad idea - which I believe I will start doing ASAP. Versioning of the database design can be maintained by saving the Visio file under an incremented name every time you make a change, and then documenting in your project documentation the change made in each version. The scripts can be versioned similarly; getting these files into your code storage will let you stop worrying about their whereabouts and help you get them back from anywhere in the world.
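As an example, a stored procedure script saved into source control might follow the usual drop/create pattern so that it can be re-applied to any environment (the procedure and table names are hypothetical):

IF EXISTS (SELECT 1 FROM sysobjects
           WHERE name = 'usp_GetCustomer' AND type = 'P')
    DROP PROCEDURE dbo.usp_GetCustomer
GO
CREATE PROCEDURE dbo.usp_GetCustomer
    @CustomerID int
AS
    SELECT CustomerID, Name, Email
    FROM dbo.Customer
    WHERE CustomerID = @CustomerID
GO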
Christiaan Pretorius
May 27, 2005 at 5:01 am
Hi Christiaan,
Having ALL of your SQL objects, not just stored procedures, triggers and UDFs, under source control is the ultimate solution. This way every single object gets the benefit of check out/check in and a full version history, down to the last character - who, when, why and what.
Using a modelling tool such as Visio, ERWin etc. is great for maintaining a logical model but, when it comes to physically deploying the changes, most of these tools are flaky at best at generating a delta script of the changes. Have you ever tried to generate a delta script from the tool and actually run it to upgrade a target database? If your changes are anything other than simple, the chances are that the script will fail, leaving you with a schema in an unknown state and a debugging job to do. Alternatively you could hand-code an alter script, but that robs you of your valuable time.
The other problem with a development team of any size sharing a database model is concurrent development. Do you have a single model file that each developer can check out/in? If so, your model file becomes a single-threaded bottleneck that gets worse as the number of developers/DBAs on the team grows. This isn't such a big problem if the team is small and/or the number of changes to the schema is low but, unless your company is standing still, that kind of situation rarely continues. At some point the business will need to adapt to changing market conditions, and that almost always means lots of database schema changes. You can't have several different developers updating different versions of the model file, as that would lead to utter chaos.
A core problem with modelling tools, and with any other process that relies on delta scripts, is storing those delta scripts as part of the 'change package' in the source control system. Although they can be linked to all the other changes made for a particular change request, they are simply a 'new' script as far as the source control system is concerned. If you are using advanced configuration management concepts, such as a main codeline and isolated worksets that get merged back into it when a piece of work is finished, then delta scripts are useless for a visual compare and merge operation. For example, if you needed to add a column to a table you could either:
1. Code (or produce using a diff tool, modelling tool etc.) a delta script that states:
ALTER TABLE x ADD y int NULL
give it a name such as ChangeRequest1254delta.sql and check it in as part of your change package.
OR
2. Check out the drop/create script held in source control for table x, modify the CREATE statement to include the new column and then check it in as part of your change package.
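For illustration, the drop/create script for table x held in source control might then look like this (the column names and types are hypothetical); note that the whole script, not a delta, is what gets versioned:

IF EXISTS (SELECT 1 FROM sysobjects WHERE name = 'x' AND type = 'U')
    DROP TABLE dbo.x
GO
CREATE TABLE dbo.x (
    id int NOT NULL PRIMARY KEY,
    y int NULL  -- the new column added under this change package
)
GO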
If you use method 1, and another developer has modified the same table (also using method 1, but with a conflicting change) and has already merged their change package back into the main codeline in the meantime, you will never see that change or be notified of it by the source control system. Only when both delta scripts have been executed against the test database and the functionality tested will any issues come to light. As we all know, the earlier in the development lifecycle you can spot potential problems, the cheaper they are to fix.
This brings me on to the same situation but with both developers using method 2. When you perform your merge, the source control system will immediately flag a conflict between your table drop/create script and the one checked in by the other developer. This gives you a chance to review the other developer's changes, decide whether they affect your change package and perform any necessary rework. This eliminates the wasted time (by the deployment and test staff etc.) seen in method 1. It is also much better to rework something when you have just finished it than some days (or weeks!) later when the test team tell you something is wrong.
Although I have used the main codeline/isolated workset paradigm here the underlying principles remain the same when working with simple source control systems such as Visual SourceSafe and pessimistic check outs.
The main advantage of this method of working is that it is extremely easy for developers as they just work in the same manner as they do for normal application code. Making a process simple to understand and easy to use is the best way to ensure it is adhered to, period.
The only problem with method 2 is that you need a magic black box that can look at a set of drop/create scripts and apply them to a target database seamlessly, without having to drop and recreate that database. That is, you need something that can build a database from those scripts and use it as the source for a compare and upgrade of an existing database.
Once you have reached this logical conclusion you have to develop a tool. But how much would that cost in terms of money and resources to develop? Too much for most IT shops, I suspect.
Fortunately such a tool already exists and it's called DB Ghost (http://www.dbghost.com). It is the only tool on the market today that enables you to use your source control system for SQL in the same way that you do for all other application code.
Once you've 'clicked' with the power of this approach I guarantee you will never work any other way with SQL code again.
Malcolm
DB Ghost - Build, compare and synchronize from source control = Database Change Management for SQL Server
www.dbghost.com
May 27, 2005 at 6:14 am
Malcolm,
Your point is well made and it is good advice.
Thanks
Christiaan