July 12, 2004 at 3:43 pm
Comments posted to this topic are about the content posted at http://www.sqlservercentral.com/columnists/DFuller/databasechangemanagementbestpracticesachievinganau.asp
July 12, 2004 at 9:14 pm
Lots of good points highlighted and looking forward to reading part 2.
The absence of evidence is not evidence of absence.
Martin Rees
You can lead a horse to water, but a pencil must be lead.
Stan Laurel
July 12, 2004 at 9:57 pm
I am a proponent of the manual scripts method (with some automation).
Most of the drawbacks listed by the author can be summarised as "it's a big hassle for the DBA". This is true in many situations, especially for some table and index changes, and I am interested in seeing part 2.
For the bulk of development, which is stored procs, views, functions and new tables, I use a hierarchy of folders sorted in the order the scripts must be applied, and a script called RunSQL which runs all the scripts.
For reference data, my RunSQL script can read comma-delimited text files, and these are kept in VSS. A sketch of the idea is below.
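For illustration, the reference-data side boils down to something like this sketch (table, column and file names are made up, and I'm assuming a staging-table pattern with BULK INSERT):

    -- Load the delimited file into a staging table first.
    CREATE TABLE #StatusCodes_Staging (StatusCode INT, Description VARCHAR(100))

    BULK INSERT #StatusCodes_Staging
        FROM 'C:\ChangeScripts\ReferenceData\StatusCodes.csv'
        WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n')

    -- Update rows that already exist...
    UPDATE t
    SET    t.Description = s.Description
    FROM   dbo.StatusCodes t
    JOIN   #StatusCodes_Staging s ON s.StatusCode = t.StatusCode

    -- ...then insert the ones that don't, so the script is rerunnable.
    INSERT dbo.StatusCodes (StatusCode, Description)
    SELECT s.StatusCode, s.Description
    FROM   #StatusCodes_Staging s
    WHERE  NOT EXISTS (SELECT 1 FROM dbo.StatusCodes t
                       WHERE t.StatusCode = s.StatusCode)

    DROP TABLE #StatusCodes_Staging

Because every file is applied the same way, RunSQL just walks the folders in order and runs whatever it finds.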
July 28, 2004 at 5:09 am
I am fairly new to SQL Server and originally struggled with the manual approach, but I now use the following approach, which seems to work very well for me.
I have a Business Testing environment where I simply restore the Production environment.
I then document the following so that it can be easily replicated during the Production Close.
I then use Redgate SQL Compare to update this environment with the changes in the Development environment. I of course scrutinise the changes carefully.
I use Redgate SQL Data Compare to update reference data tables where fields have been added.
And I use DTS Backup 2000 to copy all the DTS packages.
Following Business Testing I adopt the same approach to update Production.
I save the reports and scripts created by Redgate, and back up the original DTS packages before deleting and overwriting them.
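In case it helps anyone: since SQL Server 2000 stores DTS packages in msdb.dbo.sysdtspackages, a quick way to snapshot them before overwriting is something like this sketch (the backup table name is just illustrative):

    -- Copy the current package rows (all saved versions) to a dated
    -- backup table before deleting or overwriting anything.
    SELECT *
    INTO   msdb.dbo.sysdtspackages_backup_20040728
    FROM   msdb.dbo.sysdtspackages

    -- To restore a package later, insert its rows back, e.g.:
    -- INSERT msdb.dbo.sysdtspackages
    -- SELECT * FROM msdb.dbo.sysdtspackages_backup_20040728
    -- WHERE  name = 'MyPackage'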
July 28, 2004 at 5:16 am
Therefore you cannot say who updated what code and when, with a complete history of all changes...
Making the change is not change management.
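Even a simple audit table that every change script writes to gives you that history. A minimal sketch (table and column names are invented):

    -- One row per change applied; every change script ends with an
    -- INSERT into this table.
    CREATE TABLE dbo.ChangeLog (
        ChangeID    INT IDENTITY(1,1) PRIMARY KEY,
        ScriptName  VARCHAR(200) NOT NULL,
        Description VARCHAR(500) NOT NULL,
        AppliedBy   VARCHAR(128) NOT NULL DEFAULT (SUSER_SNAME()),
        AppliedAt   DATETIME     NOT NULL DEFAULT (GETDATE())
    )

    -- For example, at the end of a change script:
    INSERT dbo.ChangeLog (ScriptName, Description)
    VALUES ('0042_AddStatusCodes.sql', 'Added StatusCodes reference table')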
July 28, 2004 at 7:28 am
Good layout of the common solutions. We currently use the manual approach with templates for stored procs and triggers (roughly the sketch below). This was working until we grew to about 12 developers. Now it has become a mess, since everyone is at a different level of SQL scripting, some better than others. We are currently looking for a new solution and are evaluating DB Ghost as a possible candidate. Has anyone used it or heard anything about it?
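Our template is roughly the following sketch (the proc, column and role names are placeholders):

    -- Drop-and-recreate template so the script can be rerun safely.
    IF EXISTS (SELECT 1 FROM dbo.sysobjects
               WHERE name = 'usp_GetCustomer' AND type = 'P')
        DROP PROCEDURE dbo.usp_GetCustomer
    GO

    CREATE PROCEDURE dbo.usp_GetCustomer
        @CustomerID INT
    AS
        SELECT CustomerID, CustomerName
        FROM   dbo.Customers
        WHERE  CustomerID = @CustomerID
    GO

    -- Re-apply permissions, since the DROP loses them.
    GRANT EXECUTE ON dbo.usp_GetCustomer TO AppRole
    GO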
Can't wait for part 2!
Jim
July 28, 2004 at 9:19 am
This looks like it's going to be a very good series of articles. I'm always looking for ways to improve how we manage database change. We are a small team: myself (DBA, and actually lead developer too) and two other developers. We maintain three separate environments (Development, Beta and Production).
Currently I am the only person responsible for any database changes - schema, stored procedures and permissions. I've implemented a fairly manual approach, but I use Visual Studio .NET with a database project to manage "groups" of changes. VS.NET will also do some automatic scripting of schema changes, but you are on your own as far as stored procedure change scripts are concerned. The drawback is that the "rollout" status of each DB project, as well as any interdependency between projects (project B requires project A to be rolled out first), is completely manual, and I track it all in an Excel spreadsheet for now (essentially the sketch below).
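If it's useful to anyone, the spreadsheet amounts to this (a sketch of the same thing as a table; names are invented):

    -- One row per DB project: what it depends on and where it has
    -- been rolled out so far.
    CREATE TABLE dbo.ProjectRollout (
        ProjectName VARCHAR(100) NOT NULL PRIMARY KEY,
        DependsOn   VARCHAR(100) NULL,   -- project that must go first
        DevDate     DATETIME     NULL,
        BetaDate    DATETIME     NULL,
        ProdDate    DATETIME     NULL
    )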
This system is effective for our needs now, but as the team grows, I'm sure we will need to make some changes.
Ross Environmental Services
July 28, 2004 at 10:58 am
Good teaser article to have us looking forward to the next part. This really just highlights some basic methods that are currently used. Some suggestions on better alternatives will be greatly appreciated when part 2 comes along. We use a separate change-tracking/approval system right now. I look forward to seeing how it can all be integrated a little better.
-Pete Schott
July 28, 2004 at 12:22 pm
Go to http://www.innovartis.co.uk and get the whitepaper.
July 28, 2004 at 1:53 pm
That's pretty disappointing. While I like looking at new products, it's a shame that this is pretty much leading up to "Use DB Ghost". How about those of us who just don't have a good software budget? It's difficult to justify purchasing another tool (even with the ROI calculator - that is a nice touch) to management. Anyone have some best practices that don't involve purchasing a new tool? 🙂
-Pete Schott
July 28, 2004 at 2:32 pm
These articles are by the author of the whitepaper and will be coming every Wednesday for the next few weeks. They form the basis of a methodology that doesn't require DB Ghost, though that methodology is the theory behind why the product was developed.
As far as not having a budget goes, you're stuck. Any good VCS product will require some $$. I was in the same boat and developed my own manual/automated solution, which is mostly a process. You can read about it here:
http://www.sqlservercentral.com/columnists/sjones/vcspart1.asp
http://www.sqlservercentral.com/columnists/sjones/vcspart2.asp
http://www.sqlservercentral.com/columnists/sjones/vcspart3.asp
http://www.sqlservercentral.com/columnists/sjones/vcspart4.asp
July 28, 2004 at 3:27 pm
Thanks, Steve. That makes me feel a little better. The first part was a direct quote from the white paper, and that didn't bode well for the rest of the series. 🙂 I'll take a look at your columns, too. The whole process is interesting - it's just a hard sell to buy new products for development.
I will be interested in the whole methodology behind this, to see what they did and the thinking behind it. The product concept looks like it could be useful; it's just tough to sell that sort of solution for our environment.
-Pete Schott
July 28, 2004 at 6:06 pm
It is, especially when you have to license all developers. I've not successfully licensed any product for everyone.
July 28, 2004 at 7:37 pm
Many thanks for the feedback. I have tried to write the articles (white paper) as generically as possible. I really want to convey the concepts of SQL object (and hence database) version control and automated generation of change scripts. After many years as a DBA, I haven't seen things improve much in this area. The ideas and process outlined are working successfully in production environments. Yes, the concept requires some software for automation, and it can be software of your choosing. I have tried to address some of the considerations for implementing an automated approach. If that is not an option, there are some good articles around - thanks Steve.
Thanks,
Darren Fuller
July 28, 2004 at 8:46 pm
The only reason I came across the whitepaper was because I could not wait for the ending: Darren was speaking on a topic that I am in the middle of, and I needed an answer before the next article was released. If this piece of software does what it says it can do, I have no problem paying for it. We have brought in 4 consultants to fix a f$#%^ed database that would have been in better shape if we had had some form of change control (and a good SQL DBA, but that's another story). For the price of this software, if it does what it says it does, part of my problems will be solved. I've heard pitches for software before, but this one is right on the money. I am in the process of setting up a lab to test the software, to see if it does what it says. If it does, thank you Darren.
p.s. Sorry if I took the wind out of your sails mate!!