June 29, 2007 at 8:23 am
Hi Andy - I mentioned cross-continent development as an example to illustrate my point that a source control system is necessary because it doesn't care about time or distance boundaries.
Terry - I'm not specifically talking about DB Ghost here - I'm making a point about source control and how it should be used.
However, in answer to your question: DB Ghost and its way of doing things works with ANY source control system.
This is evidenced by our major customers, who use it with VSS, CVS, Subversion, Vault, PVCS Dimensions, Telelogic Synergy, MKS Source Integrity, Perforce, Microsoft TFS and Seapine Surround. All DB Ghost actually requires to start the process off is a set of CREATE scripts on disk that describe the entire schema, so which SCM those scripts came from is basically irrelevant.
Because the developers make their changes only to the object creation scripts, no diff scripts are necessary and there is, therefore, one single source for the schema. It is also easy to look at the schema history in the SCM, as the list shows the names of the actual objects that were modified rather than a sequentially numbered set of diff/change scripts.
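To give an idea of what that looks like, each object gets its own CREATE script under source control, something along the lines of this made-up example:

-- dbo.Customer.sql -- one CREATE script per object, checked into source control
-- (invented object; developers edit this file directly rather than writing diff scripts)
CREATE TABLE dbo.Customer
(
    CustomerID   int           NOT NULL IDENTITY(1,1),
    CustomerName nvarchar(100) NOT NULL,
    CreatedDate  datetime      NOT NULL CONSTRAINT DF_Customer_CreatedDate DEFAULT (GETDATE()),
    CONSTRAINT PK_Customer PRIMARY KEY CLUSTERED (CustomerID)
);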
Malcolm
DB Ghost - Build, compare and synchronize from source control = Database Change Management for SQL Server
www.dbghost.com
June 29, 2007 at 8:35 am
Terry,
I think you're doing your company a favor by having a solid process in place. Adding more software to your process often forces the process to change without significant gains in control or quality. Keep it simple, keep it flexible.
We also use a process similar to yours, but with some minor differences. The differences probably stem from our different organizational structure and methodology, but the approach is fundamentally the same.
Database developer sandbox.
Shared Application developer playground.
QA/Testing environment.
Production/Customer delivery.
In the database developer sandbox, the developer can go hog-wild. When they have tested their solution, they push it into the next zone.
For tables and data, the scripts the developers write are simply the diff scripts that take a table from one build to the next. We do not modify the CREATE TABLE script in order to generate a change script; it's not that hard to write a change script that modifies one or more tables and manages any data migration needed as a result of the schema change, so I don't see the need for software to produce diff scripts. We have also set up a naming convention for SQL files that lets any developer see whether someone else has changed the same object among the "myriad" of changes that build up over many months or years.
Once a table has been modified by a change script, you can generate the CREATE TABLE script in your sleep -- from SSMS, SMO, or other free utilities -- and we only keep that script for record-keeping purposes. All new databases are created from a "clean" database backup, maintained by the DBA and built up by the change scripts (after they leave the QA/Test zone). All of this is done using batch files, osql/sqlcmd, and SQL script files, and all of the .bat and .sql files are under the same version control (we use MS VSS, but it doesn't matter which one you use).
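For illustration, one of those change scripts might look something like this (the table, columns and file-naming convention are invented for the example, not our actual ones):

-- ChangeScript_0042_Customer_SplitName.sql (file name follows a made-up convention)
-- Takes dbo.Customer from the previous build to this one and migrates the existing data.
ALTER TABLE dbo.Customer ADD FirstName nvarchar(50) NULL,
                             LastName  nvarchar(50) NULL;
GO

-- Populate the new columns from the old one before it is dropped.
UPDATE dbo.Customer
SET    FirstName = LEFT(CustomerName, CHARINDEX(' ', CustomerName + ' ') - 1),
       LastName  = LTRIM(SUBSTRING(CustomerName, CHARINDEX(' ', CustomerName + ' ') + 1, 100));
GO

ALTER TABLE dbo.Customer DROP COLUMN CustomerName;
GO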
For textual objects (stored procedures, views, functions) we don't bother with ALTER statements. We simply have drop-and-create statements in the SQL file that defines the object; the db developer modifies the syntax in the file and checks it back into version control, where it is ready for the next step. The only special cases are dynamically created objects and functions used in CHECK or DEFAULT constraints.
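A file of that kind might look roughly like the following (the procedure is just an invented example):

-- dbo.usp_GetCustomer.sql -- the whole definition lives in this one file
IF OBJECT_ID(N'dbo.usp_GetCustomer', N'P') IS NOT NULL
    DROP PROCEDURE dbo.usp_GetCustomer;
GO

CREATE PROCEDURE dbo.usp_GetCustomer
    @CustomerID int
AS
BEGIN
    SET NOCOUNT ON;
    SELECT CustomerID, FirstName, LastName
    FROM   dbo.Customer
    WHERE  CustomerID = @CustomerID;
END
GO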
We automate (with batch files) pushing builds into the QA/Testing zone. The automation labels/marks all DB, application, and unit-testing code in source control, and the labeled version is then pushed into QA/Testing. Part of the automation runs scripts to check for conformance to standards and the inclusion of certain files or scripts. Because of the automation and version control, we get better consistency and quality, and we spend less than 2 minutes running and documenting a build! Additionally, our final push into Production/Customer delivery is the equivalent of the last QA-approved build, so there's almost no additional work involved there.
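As a rough sketch of the script side of that automation (the file names and variable here are invented), the batch file might hand off to a sqlcmd-mode driver script along these lines:

-- Build_QA.sql -- run with: sqlcmd -S <server> -d <database> -i Build_QA.sql
-- (invented driver script; the batch file labels source control first, then invokes this)
:setvar ScriptDir "C:\Build\Labeled\Scripts"
:on error exit

PRINT 'Applying change scripts for this build...';
:r $(ScriptDir)\ChangeScript_0042_Customer_SplitName.sql
:r $(ScriptDir)\dbo.usp_GetCustomer.sql

PRINT 'Build applied successfully.';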
Good luck, Terry.
~Marc
June 29, 2007 at 8:36 am
I hope I'm not hijacking this thread but I feel the discussion is a good one.
Malcolm (sorry I misspelled it before), the solution I talked about does use a "schema.sql" file as the master copy of the database schema (the same approach your product seems to take), but if you are not storing the diffs, how do I, as a remote dev, get your changes applied to my sandbox database without dropping the whole thing?
The approach I talked about uses both: the full schema file to represent the 'official' version, and diff files that team members can apply to update their own database environments.
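For illustration, one of those diff files could look something like this; the version-number guard and table are just one possible way to keep patches applied in order, not necessarily what we actually do:

-- diff_0015.sql (invented numbering) -- applied by each developer to bring a sandbox forward
-- Assumes a dbo.SchemaVersion table recording the last patch applied.
IF (SELECT MAX(VersionNumber) FROM dbo.SchemaVersion) <> 14
BEGIN
    RAISERROR('Sandbox is not at version 14; apply earlier diffs first.', 16, 1);
    RETURN;
END

ALTER TABLE dbo.Customer ADD Email nvarchar(255) NULL;

INSERT INTO dbo.SchemaVersion (VersionNumber, AppliedDate)
VALUES (15, GETDATE());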
You bring up a good point, and one I hadn't considered: using the SCM to lock the 'schema.sql', since that resource serves as the official version. If the patch/diff generation process takes an exclusive lock on the schema.sql file to ensure that only one diff is being generated at a time, then that is much better than the 'hey, I'm about to generate diff 15' approach that I mentioned in my first response.
Now, in any sandbox approach I highly recommend that devs don't get married to their copy of the database: they should periodically wipe it and recreate it from the official scripts, and for any data they are using or any tables they are working on that are not yet part of the official schema, they should keep their own scripts to create those objects/data until that feature is ready to be merged into the official database schema.
I see your product does allow pre/post scripts to run, so I can see how you could batch this up with ant/nant/msbuild and have any SCM lock the schema.sql file even if you are not using VSS.
June 29, 2007 at 8:49 am
Me again...
Just wanted to be clear that I'm not against Red Gate's products. Rather, I like them.
But I've seen development groups get too excited about buying new, cool-looking software that isn't the right fit for them, or that ends up not being used to its full potential.
It's all about trying it on to see if it fits.
Sometimes, I wish that the money that was spent on certain purchases had gone into my salary instead.
~Marc
June 29, 2007 at 9:36 am
Hi Terry,
Don't get me wrong - your process isn't bad (sorry if I came across badly back there) - it's just not as efficient as it could be and, in fact, we aren't very far apart in most of our thinking. Perhaps a little history of where we came from might help...
<shimmering effect> We started out writing DB Ghost because we had virtually the exact same process that you use and, although we were just five guys sitting in the same cubicle, we still managed to create problems when moving to the QA environment, such as re-using primary key values for static data, overwriting each other's logic changes and writing new code against columns that had been removed by someone else.
We asked ourselves, "We didn't have these problems with release 1.0, so why do we get them now?" The answer, of course, is that in R1 we just worked on the create scripts and then threw away and rebuilt the database whenever we did a release. So, we mused, all we needed was a black box that could look at our create scripts, automatically work out what was different, and then make those changes to the target database without losing any data. We had some full-and-frank discussions over whether it was possible - some toys were thrown, harsh words were spoken and some of us were even sent to bed early. However, we decided to write the "black box", and that's where DB Ghost came from (actually there was nearly a punch-up over the name as well, so maybe some counselling is in order). </shimmering effect>
The custom scripts feature (before and after sync) is there so that any data migrations, etc. (i.e. anything that requires semantic knowledge) can be slotted in.
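For example, a post-sync script slotted in there might do something like the following (the tables and columns here are purely illustrative):

-- PostSync_BackfillCountryCode.sql (illustrative only)
-- Runs after the sync has added dbo.Customer.CountryCode; fills it from existing data.
UPDATE c
SET    c.CountryCode = ISNULL(a.CountryCode, 'US')
FROM   dbo.Customer AS c
LEFT JOIN dbo.Address AS a
       ON a.CustomerID = c.CustomerID
      AND a.IsPrimary  = 1;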
Generally speaking, the developers all work in their "wild west" sandbox databases and, when they're ready, they check in their changes, back up the sandbox database and then use DB Ghost to update that sandbox database from the latest set of CREATE scripts in source control. Better yet, if the scripts are extracted from a baseline/labelled set, then you have a known point to work from in source control to answer questions like "what has been checked in since I last synched my database?" As a purist I would suggest simply rebuilding the sandbox database from the scripts and then re-populating the data, to avoid those annoying issues you get when corrupted data gives phantom failures during testing.
Another subtle benefit is that DB Ghost not only adds new objects and updates existing ones but also removes anything that isn't in the source. It thus tidies up your local database, whereupon you can do a last run through unit testing to make absolutely sure nothing was missed from the check-in...
Malcolm
DB Ghost - Build, compare and synchronize from source control = Database Change Management for SQL Server
www.dbghost.com
June 29, 2007 at 11:05 am
We've been using Red Gate for quite some time now (maybe forever, now that I think about it) and it is a product that I use almost daily. We are a software vendor and have to create custom upgrade scripts for our customers. When I say custom, I mean that each client that needs to upgrade must have a script created that is tailored to their database, as our customers can be at any version and patch level in our development tree and have to get to the destination version.
We have been using Red Gate to create the synchronization script, but it always (and I mean always) requires manual tweaking for data conversions and the like before it goes to a customer site. Red Gate works great as a diff tool, but we are very excited about what DBGhost can do. I downloaded a trial license many months ago to do some ‘proof of concept’ testing on automating our upgrades and I was extremely pleased with the results. We will be getting a second trial here very soon to document the ‘proof’ for the $$ spenders here to approve the product.
I think it is in everyone’s best interest to at least review the DBGhost website and take a look at the capabilities their product has. I originally found the DBGhost product from an article here at SSC that, if I remember correctly, was a product trial testimonial for Red Gate or one of the other big dogs. The author briefly mentioned DBGhost, so I searched the web and found their site. I plan on using their entire product suite and I think it will all add much value to our operation, as well as free up more DBA resources to concentrate on other tasks.
BTW… We are using MKS for our source control.
June 29, 2007 at 7:58 pm
I've heard great things about both Red Gate and Innovartis (dbGhost) from people. I haven't really used either myself, but people seem to like both.