March 10, 2011 at 6:49 am
I could not find any Davelopment Tools that were released in the last ten years on MSDN.... 😀
I wonder if this is another victim of the Question submission tool's edit function. 😛
March 10, 2011 at 6:51 am
mohammed moinudheen (3/9/2011)
I had no idea about the answer; I guessed it and got it right. 🙂 Of late, there are so many questions on Visual Studio tools in the QotD section.
It would be great if someone could share names of books on this topic for beginners.
The Visual Studio 2010 documentation is online here:
http://msdn.microsoft.com/en-us/library/dd831853.aspx
It does not have any information on Davelopment tools. 😉
March 10, 2011 at 6:52 am
SanDroid (3/10/2011)
I could not find any Davelopment Tools that were released in the last ten years on MSDN.... 😀 I wonder if this is another victim of the Question submission tool's edit function. 😛
I suspect it's more a victim of my poor grasp of English, which is rather disappointing given it is my native language 🙂
Jamie Thomson
http://sqlblog.com/blogs/jamie_thomson
March 10, 2011 at 7:30 am
We recently upgraded to Visual Studio 2010 Ultimate and I had some free training time, so I dove into the database project stuff. At first I thought I was wasting my time, but like Jamie said, it all started to "click" at a certain point, and you get it. And it's pretty awesome what you can do, especially with the refactoring of columns - automatically changing all the objects that reference that column - and having each object as its own file (yes, 1000 per folder limit due to performance, heh). What can take me hours on end took 2 minutes with this tool. Seriously.
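To get a feel for what that column refactoring automates, here is a minimal T-SQL sketch of the dependency hunt you would otherwise do by hand before renaming a column. The table dbo.Customer and column LastName are made-up names; substitute your own.

-- dbo.Customer and LastName are hypothetical; substitute your own table and column.
-- Lists every object whose definition references the column you intend to rename -
-- the legwork that the project's rename refactoring does for you.
SELECT
    OBJECT_SCHEMA_NAME(d.referencing_id) AS referencing_schema,
    OBJECT_NAME(d.referencing_id)        AS referencing_object
FROM sys.sql_expression_dependencies AS d
WHERE d.referenced_id = OBJECT_ID(N'dbo.Customer')
  AND d.referenced_minor_name = N'LastName';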
March 10, 2011 at 7:39 am
I want everyone to understand something.
1000 files is a performance recommendation created a few years ago, using a 32-bit, single-core processor system running what I would call a limited configuration as the baseline.
I have actually had to change this default configuration value in every version of Database Tools. 😎
If you have a good dev system with plenty of memory and more than one CPU core, and you are not running a 32-bit Windows install on 64-bit hardware, importing databases with almost 8000 objects is no real problem.
:smooooth:
The amount of time these tools save you is worth any system lag you may experience.
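For anyone wondering whether their own database would blow past that default, a rough sketch is to count objects per schema and type before importing. This assumes the import creates roughly one script file per object, grouped by schema and object type; the exact grouping can vary with import settings.

-- Counts non-system objects per schema and type; each would typically become
-- its own .sql file when the database is imported into the project.
SELECT
    SCHEMA_NAME(o.schema_id) AS schema_name,
    o.type_desc              AS object_type,
    COUNT(*)                 AS object_count
FROM sys.objects AS o
WHERE o.is_ms_shipped = 0
GROUP BY SCHEMA_NAME(o.schema_id), o.type_desc
ORDER BY object_count DESC;

If any one folder looks like it will hold well over 1000 files, you know up front whether to adjust the layout or, as described above, just live with the lag.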
March 10, 2011 at 7:46 am
SanDroid (3/10/2011)
The amount of time these tools save you is worth any system lag you may experience.
I've encountered *a lot* of people that disagree with that sentiment 🙂
I, however, agree wholeheartedly.
Jamie Thomson
http://sqlblog.com/blogs/jamie_thomson
March 10, 2011 at 8:04 am
SanDroid (3/10/2011)
If you have a good dev system with plenty of memory and more than one CPU core, and you are not running a 32-bit Windows install on 64-bit hardware, importing databases with almost 8000 objects is no real problem.
I'm surprised to hear this given that Visual Studio is 32-bit only. Good news if true, though.
Jamie Thomson
http://sqlblog.com/blogs/jamie_thomson
March 10, 2011 at 9:58 am
Duncan Pryde (3/10/2011)
Jamie Thomson (3/10/2011)
The big benefits as far as I can see them are:
-Development-time error checking (i.e. find out about errors before you actually run the code, so you won't get caught out by deferred name resolution)
-Declarative development (i.e. you define what the database state should be and the tool works out how to get it to that state, as opposed to you having to author all of the ALTER statements)
-Code analysis (i.e. it highlights bad coding practices)
The second one does look like it might swing it. Up to now, we've tended to hand-write upgrade and rollback scripts for each release. It works, but it's time-consuming and quite clunky. I imagine this would be an improvement on that approach.
That's the biggie... My deployment routine is to restore a copy of the target locally and get VS to generate the deployment scripts for me... some checking is necessary of course, and some small edits have to be made, but it works, and it gets the target database in line with your model time and time again. It takes me less than a day, all included, to recreate the live environment locally and on test for four databases, generate scripts, tidy them up, and deploy to test. Where I used to work, a database team of 4, serving a similarly sized development team, took 2 or 3 times as long to put together a deployment script.
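To make the declarative side concrete, here is a minimal sketch. The project holds only the desired end state, and the deployment engine works out the change script. dbo.Customer and its columns are made-up names, and the generated script shown in the comment is simplified, not the tool's exact output.

-- In the project, the table is defined declaratively - just the desired end state:
CREATE TABLE dbo.Customer
(
    CustomerId  int          NOT NULL PRIMARY KEY,
    LastName    nvarchar(50) NOT NULL,
    CreatedDate datetime     NOT NULL
);

-- If the target database already has dbo.Customer but without CreatedDate,
-- the generated deployment script would contain something along the lines of:
-- ALTER TABLE dbo.Customer
--     ADD CreatedDate datetime NOT NULL
--     CONSTRAINT DF_Customer_CreatedDate DEFAULT (GETDATE());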
For our last release, however, another developer did the deployment, and while he started with a generated deployment script, he couldn't be bothered to follow my established procedure (one script per DB into test, and a second after bug fixes, so you eventually have a part 1 and a part 2 script to run on production - maybe a part 3 if the bug fixes needed further fixing). Instead he hand-cranked extra items into it rather than generating a catch-up script from the testing phase... we've spent a few days tracking down and fixing the errors he introduced! Missed procedures, one procedure created with a rogue "drop procedure..." statement after it so that every time it ran it dropped another procedure, missing static data... all manner of issues!
March 10, 2011 at 10:06 am
dave.farmer (3/10/2011)
For our last release, however, another developer did the deployment, and while he started with a generated deployment script, he couldn't be bothered to follow my established procedure (one script per DB into test, and a second after bug fixes, so you eventually have a part 1 and a part 2 script to run on production - maybe a part 3 if the bug fixes needed further fixing). Instead he hand-cranked extra items into it rather than generating a catch-up script from the testing phase... we've spent a few days tracking down and fixing the errors he introduced! Missed procedures, one procedure created with a rogue "drop procedure..." statement after it so that every time it ran it dropped another procedure, missing static data... all manner of issues!
That's a horror story. If any anecdote illustrates the value of VS DB Tools, it's that one right there!
Jamie Thomson
http://sqlblog.com/blogs/jamie_thomson
March 10, 2011 at 10:07 am
For performance reasons, the recommendation is that 1000 files per folder not be exceeded. But no reference was included in the explanation, so the justification for the correct answer is rather subjective.
If you add, say, a 1,001st file, does performance drop by 0.00001 percent or by 10 percent?
March 10, 2011 at 10:11 am
Nice question.
Jason...AKA CirqueDeSQLeil
_______________________________________________
I have given a name to my pain...MCM SQL Server, MVP
SQL RNNR
Posting Performance Based Questions - Gail Shaw
Learn Extended Events
March 10, 2011 at 1:12 pm
Interesting question.
March 11, 2011 at 1:21 pm
Interesting question and discussion.
Tom