April 4, 2012 at 8:48 pm
Comments posted to this topic are about the item Treat the Database like Code
April 5, 2012 at 12:27 am
I've been doing this for a few years and, like you say, all it takes is discipline. The benefits are easy enough to see.
All that is needed is a database project which you can upgrade via "schema compare". This, however, is not always that handy when merging from a dev environment to a test environment. Here I instead recommend that you create a folder structure for table changes, functions, stored procedures, etc., where you check in alteration scripts. This makes merging and deployment go smoothly.
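A minimal sketch of what one such checked-in alteration script might look like (the folder layout, file name, table, and column are all illustrative, not from the original post); making each script idempotent means it can be re-run safely when environments are merged:

```sql
-- Hypothetical repository layout:
--   /Tables  /Functions  /StoredProcedures  /AlterationScripts
-- File: AlterationScripts/0012_add_customer_email.sql
-- Guarded so re-running the script against an already-upgraded database is a no-op.
IF NOT EXISTS (SELECT 1 FROM sys.columns
               WHERE object_id = OBJECT_ID(N'dbo.Customer')
                 AND name = N'Email')
BEGIN
    ALTER TABLE dbo.Customer ADD Email nvarchar(256) NULL;
END
```

Numbering the files (0012_...) keeps the deployment order explicit when scripts from several developers are merged into one release.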
April 5, 2012 at 1:32 am
If nothing else, this can be used to encourage the least disciplined to be better team players. In my experience it really helps to stop those developers who change a local copy of the database without considering that such a change must be replicated elsewhere.
Gaz
-- Stop your grinnin' and drop your linen...they're everywhere!!!
April 5, 2012 at 3:04 am
Comparing schemas is great, as the deployed database is far more likely to reflect yours! I've found it's useful to version your database and scripts so scripts are only run against the correct version of the database. Source control is useful for maintaining the deployed versions versus those in development, on separate branches or utilising labels.
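One common way to implement that version check is a guard at the top of each upgrade script. This is only a sketch: the `dbo.SchemaVersion` table and the version numbers are assumptions for illustration, not something from the post:

```sql
-- Refuse to run unless the database is at the version this script expects.
DECLARE @current int = (SELECT MAX(VersionNumber) FROM dbo.SchemaVersion);
IF @current <> 41
BEGIN
    RAISERROR(N'Expected schema version 41, found %d. Aborting.', 16, 1, @current);
    RETURN;
END

-- ...schema and data changes for version 42 go here...

-- Record the new version so the next script's guard can verify it.
INSERT INTO dbo.SchemaVersion (VersionNumber, AppliedAt)
VALUES (42, SYSUTCDATETIME());
```

With this in place, running scripts out of order (or twice) fails loudly instead of silently corrupting the target.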
April 5, 2012 at 3:39 am
This is one area I do find hard to make easy. I do agree it absolutely has to be done, but getting a team of people to use a consistent way of doing it is not easy. Still, we seem to muddle through.
Anyone got any really foolproof ways to implement this? We would be up for using paid tools if they help effective organisation. We tend to have a dev server from which changes are rolled out to test and then to production, pretty standard stuff. Anyone in favour of
A) the 'delta capture' approach or would everyone go with
B) record all changes as script in the first place
or perhaps B) with A) as a check?
The main problem is multiple people recording multiple changes which then need to be rolled up into one change script. Some features may then not make the cut, or be delayed, meaning more slicing and dicing. Should everyone work with one master script (for each version) to be checked in and out as needed?
April 5, 2012 at 5:10 am
call.copse (4/5/2012)
This is one area I do find hard to make easy. I do agree it absolutely has to be done, but getting a team of people to use a consistent way of doing it is not easy. Still, we seem to muddle through. Anyone got any really foolproof ways to implement this? We would be up for using paid tools if they help effective organisation. We tend to have a dev server from which changes are rolled out to test and then to production, pretty standard stuff. Anyone in favour of
A) the 'delta capture' approach or would everyone go with
B) record all changes as script in the first place
or perhaps B) with A) as a check?
The main problem is multiple people recording multiple changes which then need to be rolled up into one change script. Some features then may not make the cut, or be delayed, meaning more slicing and dicing. Should everyone work with one master script (for each version) to be checked in and out as needed?
Excellent questions. I laid out a lot of my approaches over the years in the book SQL Server Team-Based Development (free download for the e-book). I wrote it a few years ago, though, and the tools, and therefore the processes, have been refined somewhat since then. It uses Microsoft and Red Gate tools (the same tools I'd recommend now). However, it does lay down a framework for working with either a shared development environment or multiple individual environments. I prefer the latter approach now (I used to prefer the former).
I strongly recommend using B. All database development is driven through source control, just like app dev. That way, you can more easily support branching in order to have multiple development streams, one for the current release bug fixes, one for the next release, and one for the release after that one (and frequently one or more for an integration release with other apps). That's the toughest problem to fix from a single shared development environment. In fact, I've never been able to support branches from a single shared environment. I don't think it's possible.
If you have to go with A, you're going to have a harder time identifying a specific point in time in the code, and that's one of the best parts of using source control.
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
- Theodore Roosevelt
Author of:
SQL Server Execution Plans
SQL Server Query Performance Tuning
April 5, 2012 at 5:47 am
As Grant says, I think "B" is your best option. We've been doing this for years using TortoiseHg. I don't have ANY IDEA how we used to manage the process withOUT source control. It has saved our bacon countless times - earlier this week, as a matter of fact! We branch the code into 3 different environments: Prod, Maint, & Trunk. Maint & Trunk each represent a different release that different folks are working on. Prod is essentially a copy of what is currently in Production, so we can reproduce Production issues. We script the DDL differences between each release into a set of patch files that are in Hg, and we include the version number (and increment number) in the name of each patch file. Also, all of our text-based objects (procs, funcs, etc.) are checked in.
April 5, 2012 at 6:35 am
Spot-on, Grant, as usual. Working with earlier versions of SQL and VSS was a true pain...until VSS saved your a$$ because you could bring back a prior version.
Now with the move to Studio as the IDE for all things SQL, creating a solution for TFS or VSS is not only a no-brainer, it should be a requirement. It does require a little bit of evangelism in training developers to update scripts to apply changes to objects. But the first time their complex change to a stored procedure disappears because they edited it directly and the next person used the script, like they were supposed to, you find that the light comes on and it rarely ever happens again.
And if you can't stand VSS or TFS, see Red Gate. But start treating object manipulation like code projects the same way your .NET people do, and reap the benefits. Don't wait to get bitten. Buy the first aid today.
------------
Buy the ticket, take the ride. -- Hunter S. Thompson
April 5, 2012 at 6:39 am
Steve,
I could be cynical, but the number of source/version control editorials and articles on SQLServerCentral and SimpleTalk has increased greatly since RedGate started making SQL Source Control. The fact that many of them cover SVN instead of better choices like git or Mercurial disappoints me. But the fact that the topic is being brought up and discussed is a good thing for the community overall.
April 5, 2012 at 6:52 am
chrisn-585491 (4/5/2012)
Steve, I could be cynical, but the number of source/version control editorials and articles on SQLServerCentral and SimpleTalk has increased greatly since RedGate started making SQL Source Control. The fact that many of them cover SVN instead of better choices like git or Mercurial disappoints me. But the fact that the topic is being brought up and discussed is a good thing for the community overall.
I guess there is a depth of material on how to do the basics and now it is down to new features and processes. I tend to find processes are more important in doing my job.
Gaz
-- Stop your grinnin' and drop your linen...they're everywhere!!!
April 5, 2012 at 7:04 am
G Bryant McClellan (4/5/2012)
Spot-on, Grant, as usual. Working with earlier versions of SQL and VSS was a true pain...until VSS saved your a$$ because you could bring back a prior version. Now with the move to Studio as the IDE for all things SQL, creating a solution for TFS or VSS is not only a no-brainer, it should be a requirement. It does require a little bit of evangelism in training developers to update scripts to apply changes to objects. But the first time their complex change to a stored procedure disappears because they edited it directly and the next person used the script, like they were supposed to, you find that the light comes on and it rarely ever happens again.
And if you can't stand VSS or TFS, see Red Gate. But start treating object manipulation like code projects the same way your .NET people do, and reap the benefits. Don't wait to get bitten. Buy the first aid today.
Actually, Red Gate supports TFS too, but let's not get into sales. Ha!
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
- Theodore Roosevelt
Author of:
SQL Server Execution Plans
SQL Server Query Performance Tuning
April 5, 2012 at 7:06 am
This is an excellent (and timely) topic. We're pondering this same question right now, but we have a challenge that we haven't quite figured out how to overcome. Any suggestions would be greatly appreciated.
Here's the deal. We have a third-party app that we use. The database is open to us for modification (and we do modify the vendor's scripts a little bit, and we also have a fair chunk of our own scripts). It would be a fairly trivial task to use a source control tool for our own stuff, but how do we incorporate new versions from the vendor?
Keep in mind, when the vendor supplies a new version, it's in the form of a complete build, with potentially hundreds of changed sprocs, views, functions, and indexes. Anybody have any brilliant ideas?
April 5, 2012 at 7:14 am
What has always worked well for me is to create a conversion script that encompasses not only DDL changes from one version of the database to another, but also DML. Too many people concentrate only on schema changes and forget that data can become corrupted or lost if not handled properly. Scripts also ensure that schema changes and data changes are executed in the proper order. I wrote a system stored procedure that lets me execute SP and UDF files, also in their proper order, all within the conversion script. It took a bit of extra time and care, but it sure made testing easy. The conversion script, along with the SP and UDF script files, was stored in SVN. When ready, they could be executed in the test and then production databases just by invoking the conversion script.
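The shape of such a conversion script might look like the following sketch (the table and column names are invented for illustration): DDL first, then the DML that migrates existing data, all inside one transaction so a failure leaves nothing half-converted:

```sql
BEGIN TRY
    BEGIN TRANSACTION;

    -- DDL: split a single Name column into FirstName/LastName
    ALTER TABLE dbo.Person ADD FirstName nvarchar(100) NULL,
                               LastName  nvarchar(100) NULL;

    -- DML: migrate existing data before the old column is dropped.
    -- Run via EXEC so the statement compiles after the columns exist,
    -- even though we are still in the same batch as the ALTER.
    EXEC (N'UPDATE dbo.Person
            SET FirstName = LEFT(Name, CHARINDEX('' '', Name + '' '') - 1),
                LastName  = LTRIM(STUFF(Name, 1, CHARINDEX('' '', Name + '' ''), ''''))');

    -- DDL again, only after the data is safe
    ALTER TABLE dbo.Person DROP COLUMN Name;

    COMMIT TRANSACTION;
END TRY
BEGIN CATCH
    IF @@TRANCOUNT > 0 ROLLBACK TRANSACTION;
    THROW;
END CATCH;
```

Ordering matters exactly as the post says: the UPDATE must run after the columns are added but before the old column is dropped, which is hard to guarantee with schema-compare tools alone.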
LinkedIn: https://www.linkedin.com/in/sqlrv
Website: https://www.sqlrv.com
April 5, 2012 at 7:17 am
chrisn-585491 (4/5/2012)
Steve, I could be cynical, but the number of source/version control editorials and articles on SQLServerCentral and SimpleTalk has increased greatly since RedGate started making SQL Source Control. The fact that many of them cover SVN instead of better choices like git or Mercurial disappoints me. But the fact that the topic is being brought up and discussed is a good thing for the community overall.
I suspect SVN is chosen so often because it's extremely popular. It might not be best in breed, but if the majority of people are using it...
For example, there's this (online, self-selected, therefore skewed) poll: http://java.dzone.com/polls/what-most-popular-source
and this review: http://www.smashingmagazine.com/2008/09/18/the-top-7-open-source-version-control-systems/
I don't have access to any of the large consulting firms who probably have more accurate numbers, but assuming these numbers are reasonably accurate, the mass is behind Subversion. Toss in TFS as THE solution from Microsoft and you can see why these two are the ones you'll see most often, even if Mercurial is better (I'm not convinced that Git is all that good).
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
- Theodore Roosevelt
Author of:
SQL Server Execution Plans
SQL Server Query Performance Tuning
April 5, 2012 at 7:20 am
Gary Varga (4/5/2012)
chrisn-585491 (4/5/2012)
Steve, I could be cynical, but the number of source/version control editorials and articles on SQLServerCentral and SimpleTalk has increased greatly since RedGate started making SQL Source Control. The fact that many of them cover SVN instead of better choices like git or Mercurial disappoints me. But the fact that the topic is being brought up and discussed is a good thing for the community overall.
I guess there is a depth of material on how to do the basics and now it is down to new features and processes. I tend to find processes are more important in doing my job.
Absolutely! Process is the key. The presentation that Steve was referencing was all about process. I did a 10-minute demo at the end, but everything else was talking about process. We've finally reached a point where (pick your tool, 'cause it'll work) we can focus on process and process improvement. My argument is pretty simple: developers have been working on these processes for years, and DBAs and database developers, by and large (there are exceptions, and good for you if you are one), are behind the curve on process.
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
- Theodore Roosevelt
Author of:
SQL Server Execution Plans
SQL Server Query Performance Tuning