May 6, 2010 at 10:01 am
If you're referring to DBPro, yeah, we've been using it for years, quite successfully too. It allows us to move our databases from source control to dev/qa/staging & out to prod. Great stuff. I'm actively evaluating 2010 now.
No, everyone assumes that when I say "Database Project". I actually hated DBPro. It forced you into a methodology and didn't let you structure the project how you wanted. It was overly complex (although flexible). It didn't let you use "ALTER" statements or structure scripts the way YOU want to. I guess I have to use it now since they did away with the Database Project template.
May 6, 2010 at 10:10 am
haha - didn't realise "grasshopper" was a forum participation level thing, sorry about that! 🙂
That all makes sense, although it's a shame you can't subcategorise scripts by object type, so you end up with a giant alphabetized list (I guess prefixing tables with tbl_ etc. mitigates that somewhat).
More generally though, you become reliant on having a sure-footed and trusted DBA in your setup, as any mistake he/she makes is going to be practically undetectable to you. An example might be an accidentally dropped stored procedure, an entirely new but unwanted object being created in the database, or worse, changes made by a DBA after your testing. I realise this is a management issue to some extent, and it often heads into a controversy that usually ends with "but ultimately you have to trust the DBA", but mistakes do happen, and in our shop it's been useful to have an objective third-party process that corroborates our claim that what is in source is actually what is in production.
(I'll accept that a DBA can, of course, alter or circumvent any process we create to maintain this control.)
May 6, 2010 at 10:17 am
bphipps-931675 (5/6/2010)
But you lose the benefits of a change log that manual management of your procedures provides.
The version of the procedure I have adds an additional call at the bottom of each script to an 'UpdateChangeCatalog' stored procedure, which logs to a table the date/time, username, sproc filename, version and a user comment for the stored procedure installed.
PVCS fills in the filename and version (revision), and the UpdateChangeCatalog procedure works out the username and date/time, so no manual management is needed:
-- ************************************************
-- Update the Change Log
-- ************************************************
EXEC UpdateChangeCatalog '$Workfile:$', '$Revision:$', 'Script to install procedure myProcedure'
GO
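For anyone who wants to borrow the idea, here is a rough sketch of what the catalog table and the logging procedure could look like. The table and column names here are only illustrative, not our actual definitions:
-- Illustrative sketch only: table and column names are examples
CREATE TABLE dbo.ChangeCatalog
(
    ChangeId    int IDENTITY(1,1) NOT NULL PRIMARY KEY,
    InstalledAt datetime     NOT NULL,
    InstalledBy sysname      NOT NULL,
    WorkFile    varchar(255) NOT NULL, -- filled in by the $Workfile:$ keyword
    Revision    varchar(50)  NOT NULL, -- filled in by the $Revision:$ keyword
    Comment     varchar(500) NULL
)
GO
CREATE PROCEDURE dbo.UpdateChangeCatalog
    @WorkFile varchar(255),
    @Revision varchar(50),
    @Comment  varchar(500) = NULL
AS
BEGIN
    SET NOCOUNT ON;
    -- The procedure works out the user and date/time itself, so each install
    -- script only has to pass the PVCS keywords and a comment.
    INSERT dbo.ChangeCatalog (InstalledAt, InstalledBy, WorkFile, Revision, Comment)
    VALUES (GETDATE(), SUSER_SNAME(), @WorkFile, @Revision, @Comment);
END
GO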
Another useful thing we do is add the revision number at the start of the procedure, just above the ALTER statement, so we can simply look at any database and instantly see which version of a stored procedure is installed. Again, PVCS maintains the version number, so no manual intervention is needed:
e.g.:
-- Create a stub if the procedure does not yet exist, so the ALTER below always succeeds
IF OBJECT_ID(N'[dbo].[mySproc]', N'P') IS NULL
BEGIN
    EXEC ('CREATE PROCEDURE [dbo].[mySproc] AS BEGIN SELECT 1 END')
END
GO
-- **************************************
-- $Revision: 1.0 $
-- **************************************
ALTER PROCEDURE [dbo].[mySproc]
(
May 6, 2010 at 10:38 am
Phil Factor (5/6/2010)
I really enjoy seeing scripts like this, since the whole topic of scripting is fascinating. What I really like about this is that someone has gone ahead and brewed a solution that fits their requirements like a glove. It is neat, because it will work with any version of SQL Server 2000+. I wouldn't like anyone to think that this is a recommended general solution, though. SMO is the way to go! (I still occasionally use DMO)
I agree with Phil here. It is nice to see scripts designed to source control a database. I would prefer to use SMO and something more flexible.
Jason...AKA CirqueDeSQLeil
_______________________________________________
I have given a name to my pain...MCM SQL Server, MVP
SQL RNNR
Posting Performance Based Questions - Gail Shaw
Learn Extended Events
May 6, 2010 at 10:41 am
I would have liked to see a more in-depth article on the facets of the scripts. I think this would have added more value to the article.
Jason...AKA CirqueDeSQLeil
_______________________________________________
I have given a name to my pain...MCM SQL Server, MVP
SQL RNNR
Posting Performance Based Questions - Gail Shaw
Learn Extended Events
May 6, 2010 at 10:47 am
CirquedeSQLeil (5/6/2010)
I would have liked to see a more in-depth article on the facets of the scripts. I think this would have added more value to the article.
I didn't want to go into too much detail about other things the script could do, as I wanted to make the article simple, straightforward and brief so people could just use it to get their code into a state to put in VC. In retrospect it might have been a good idea to put more detail in.
May 6, 2010 at 10:49 am
I think it was a useful article and it's certainly sparked off a lively and fascinating debate.
May 6, 2010 at 10:50 am
jacroberts (5/6/2010)
CirquedeSQLeil (5/6/2010)
I would have liked to see a more in-depth article on the facets of the scripts. I think this would have added more value to the article.
I didn't want to go into too much detail about other things the script could do, as I wanted to make the article simple, straightforward and brief so people could just use it to get their code into a state to put in VC. In retrospect it might have been a good idea to put more detail in.
Fair enough. At least it was considered. :-D
Jason...AKA CirqueDeSQLeil
_______________________________________________
I have given a name to my pain...MCM SQL Server, MVP
SQL RNNR
Posting Performance Based Questions - Gail Shaw
Learn Extended Events
May 6, 2010 at 11:00 am
Personally, I used SSIS to generate an IF EXISTS ... DROP, pull the text out of sys.syscomments, and write it all out to text files. We scripted out more than 2,000 objects in about 15 minutes.
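For anyone without an SSIS package handy, roughly the same output can be built with a single query. This is only a sketch, it assumes SQL 2005+, and it swaps sys.syscomments for sys.sql_modules to avoid the 4000-character row splits:
-- Build an "if exists drop" plus the full body for every stored procedure
SELECT
    s.name AS SchemaName,
    o.name AS ObjectName,
    'IF OBJECT_ID(''' + QUOTENAME(s.name) + '.' + QUOTENAME(o.name) + ''', ''P'') IS NOT NULL'
        + CHAR(13) + CHAR(10)
        + '    DROP PROCEDURE ' + QUOTENAME(s.name) + '.' + QUOTENAME(o.name)
        + CHAR(13) + CHAR(10) + 'GO' + CHAR(13) + CHAR(10)
        + m.definition AS ScriptText
FROM sys.objects AS o
JOIN sys.schemas AS s ON s.schema_id = o.schema_id
JOIN sys.sql_modules AS m ON m.object_id = o.object_id
WHERE o.type = 'P' -- stored procedures; widen the filter for views, functions, etc.
ORDER BY s.name, o.name;
-- Writing each row out to its own .sql file is still a job for SSIS, bcp or PowerShell.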
Nate
May 6, 2010 at 1:16 pm
Am I missing something here?
We use SourceSafe. We keep each proc separate. When I need to change one, I check it out, modify it, paste it into Query Analyzer or Management Studio depending on which version of the server, run it and test it.
When everything is cool, I check it back in.
The grants are role-based, not user-based, so there aren't that many, and they are in the scripts housed in SourceSafe. Seems pretty simple.
I do essentially the same thing for everything from Java code to DTS/SSIS source to documentation. A release to a customer is built out of SourceSafe.
May 6, 2010 at 1:29 pm
bill page-320204 (5/6/2010)
Am I missing something here? We use SourceSafe. We keep each proc separate. When I need to change one, I check it out, modify it, paste it into Query Analyzer or Management Studio depending on which version of the server, run it and test it.
When everything is cool, I check it back in.
The grants are role-based, not user-based, so there aren't that many, and they are in the scripts housed in SourceSafe. Seems pretty simple.
I do essentially the same thing for everything from Java code to DTS/SSIS source to documentation. A release to a customer is built out of SourceSafe.
It is the same thing really; this was just how to get the code into a VC system as individual files if they weren't created there in the first place. Once they are in VC, you just need to check out a file, edit it, then check it back in again, as you said.
May 6, 2010 at 1:34 pm
Wow, that's really labor intensive! Compare that to the integration between a Visual Studio C# project and Visual SourceSafe, where the checkout/check-in can be done with a mouse click.
May 6, 2010 at 1:35 pm
I also home-brewed a solution with C# (SMO) and PowerShell...
It is an important thing to do, and it would be pretty awesome if I hadn't needed to spend my own time creating it.
This is on my top 5 list of things I wish came with SQL Server out of the box.
native versioning... keep X versions of every stored proc (X being configurable)
exec dbo.usp_blahblah @myparam = 100; -- would run current
exec dbo.usp_blahblah @myparam = 100 {1}; --- would run previous.
exec dbo.usp_blahblah @myparam = 100 {2} ; --- would run two versions ago.
... kind of like error logs.
Get to work MS!
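Until then, the history half of it can be faked. This is only a sketch (the table and trigger names are made up, and it assumes SQL 2005+): a database-level DDL trigger that files away the text of every ALTER PROCEDURE as it happens, a bit like the error log analogy above.
-- Sketch only: keeps every ALTERed version of a procedure's text in a history table
CREATE TABLE dbo.ProcedureHistory
(
    HistoryId  int IDENTITY(1,1) NOT NULL PRIMARY KEY,
    ObjectName nvarchar(512) NOT NULL,
    Definition nvarchar(max) NULL,
    ChangedBy  sysname       NOT NULL DEFAULT SUSER_SNAME(),
    ChangedAt  datetime      NOT NULL DEFAULT GETDATE()
)
GO
CREATE TRIGGER ddl_KeepProcVersions
ON DATABASE
FOR ALTER_PROCEDURE
AS
BEGIN
    SET NOCOUNT ON;
    DECLARE @ed xml;
    SET @ed = EVENTDATA();
    -- Log which procedure was changed and the full ALTER text that was run
    INSERT dbo.ProcedureHistory (ObjectName, Definition)
    VALUES
    (
        @ed.value('(/EVENT_INSTANCE/SchemaName)[1]', 'nvarchar(128)') + N'.'
            + @ed.value('(/EVENT_INSTANCE/ObjectName)[1]', 'nvarchar(128)'),
        @ed.value('(/EVENT_INSTANCE/TSQLCommand/CommandText)[1]', 'nvarchar(max)')
    );
END
GO
It doesn't give you the neat {1}/{2} execution syntax, and pruning to the last X versions is left out, but at least the old text is never lost.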
Craig Outcalt
May 6, 2010 at 1:35 pm
bill page-320204 (5/6/2010)
Am I missing something here? We use SourceSafe. We keep each proc separate. When I need to change one, I check it out, modify it, paste it into Query Analyzer or Management Studio depending on which version of the server, run it and test it.
When everything is cool, I check it back in.
The grants are role-based, not user-based, so there aren't that many, and they are in the scripts housed in SourceSafe. Seems pretty simple.
I do essentially the same thing for everything from Java code to DTS/SSIS source to documentation. A release to a customer is built out of SourceSafe.
Essentially, that is the same thing we do. For each build, we concatenate all of the changes together using a utility I developed (see earlier posting), give that to QA for proper testing, and eventually use it to upgrade all customer databases.
No, you're not missing something. It just appears that a number of individuals can't get their heads around a proper source control system.
May 6, 2010 at 1:49 pm
There is a lot of good info in the posts in this discussion, and it makes good sense. Some good ideas to add to the arsenal.
I have set up many of the things mentioned here in a similar way for the team I am on. I agree fully that source control is just as important for a DB as for any other programming.
I created a "homegrown" app. out of necessity that uses SMO to script all of our DB objects. We do this to get our custom headers for the objects that require it. Mainly so the script can be run and re-run with no alterations. (This excludes tables, as mentioned by others previously there are many issues scripting them but we do keep them in source control to compare changes at the least as is scripted) We script:
Assemblies
Functions
StoredProcedures
Tables
Types_XMLSchemaCollections
Views
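As an example of the kind of re-runnable header I mean, a view script ends up looking roughly like this (the view, column and table names here are made up for illustration; it is the same stub-then-ALTER idea as the procedure example earlier in the thread):
IF OBJECT_ID(N'[dbo].[vw_Example]', N'V') IS NULL
BEGIN
    -- Create a placeholder so the ALTER below always has something to alter
    EXEC ('CREATE VIEW [dbo].[vw_Example] AS SELECT 1 AS PlaceHolder')
END
GO
ALTER VIEW [dbo].[vw_Example]
AS
SELECT c.CustomerId, c.CustomerName -- hypothetical columns
FROM dbo.Customers AS c             -- hypothetical table
GO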
We utilize SSMS solutions and projects, where each solution represents a DB and each project represents a different type of object. (We set it up this way because of SSMS's lack of robustness in its solution/project explorer.)
So we could have:
Solution - "NameOfDB"
..Project - Assemblies
..Project - Functions
..Project - StoredProcedures
..Project - Tables
..Project - Types_XMLSchemaCollections
..Project - Views
Solution - "NameOfOtherDB"
..Project - Assemblies
..Project - Functions
..Project - StoredProcedures
..Project - Tables
..Project - Types_XMLSchemaCollections
..Project - Views
(SSMS solutions and projects LACK A LOT of the capabilities you would find in similar solutions and projects for C# or VB.NET, but it is better than nothing. MS should have done much better.)
All DB work is done from these scripted source files: check out to make changes, apply to the development DB ONLY, and check in when complete.
When we are ready to migrate a release, we script the whole development DB out again and compare it to VSS, disregarding specifically named objects that reside on dev only. If they match, we are good. If they don't, the developer is notified to correct it.
To actually migrate the changes, we use SQL compare tools to create a single script that migrates the version differences. It handles all the relational issues, etc., so it is the easy way out.