November 27, 2008 at 5:01 pm
Comments posted to this topic are about the item The Full Potential of SQL 2000
November 27, 2008 at 6:30 pm
Tony:
I see some UDFs in the Matrix workbench: they're not going to work on SQL Server 7.0. :-)
-- RBarryYoung, (302)375-0451
blog: MovingSQL.com, Twitter: @RBarryYoung
Proactive Performance Solutions, Inc. "Performance is our middle name."
November 27, 2008 at 10:15 pm
Heh... you know I'd show up with the idea that it "can all be done in T-SQL". Now, if we can just get Microsoft to stop deprecating stuff! :-)
--Jeff Moden
Change is inevitable... Change for the better is not.
November 30, 2008 at 9:05 am
Hello to all,
Regarding the potential of the various versions of SQL Server: at my company we migrated from SQL Server 2000 to 2005, but all of our databases still run in 8.0 compatibility mode.
I have been studying Reporting Services 2008 and, so far, none of the new features has made a difference for me, so I'm thinking of implementing Reporting Services 2005 instead.
Thanks,
Brazil
November 30, 2008 at 3:20 pm
Why build using managed languages? Why build using object oriented languages? Why build using high level languages? Let's all go back to assembler.
The editorial is inane. Why would you want to do things the hard way when you've been given a productivity tool that does them more simply and faster? Why would you want to build anything on a platform at its end of life?
I get that the article is rhetorical, but most people don't jump on new features just because they're new. They do it because those features provide a simple framework for being more productive, more agile, cheaper to build with, and easier to maintain. Just imagine what would happen to your proprietary SQL Server 7.0 framework the moment a body walks out the door.
November 30, 2008 at 3:21 pm
Hello Steve,
I agree with you that while excitement sells, it doesn't necessarily mean progress. And not everyone wants to blow the same whistle and reach for the newest brass ring. But when it comes to things like matrix workbenches, there seems to be a need for a reality check too. What you are encouraging is going where there is absolutely no intent. Yes, you can play around. You can also try to manage an enterprise's data with Fortran. So it's not surprising that some space cadets think nothing of using the server for factor analysis. If this is for 'real' and for 'real' data, then here's the checkmate: that's crazy. Professionals know the tools of their trade, or should. You don't play chess with checkers. If you want to deal with matrices or multivariate analysis, you go with software where that's the intent. You check out the IMSL library, SAS, SPSS, or BMDP. They've been around a hell of a lot longer than SQL Server, and their stuff is burned in. You don't go cobbling together SELECT and backdoor UPDATE statements. MS users should learn more about broads, as in horizons.
best,
steve dassin
P.S. If you want real arrays in real data management, you need look no further than the combination of Dataphor/SQL Server.
November 30, 2008 at 10:35 pm
SQL 2000 is a great, stable platform and it's served me well in many situations. I've had all kinds of businesses run well on it, including this one. We upgraded to 2005, but we didn't need to and SQL 2000 would likely run this site for a long time to come.
That being said, I'm not sure I'd continue to develop on this platform since support (mainstream) is essentially done and you never know when you'll run into an issue you need help on. People continue to find bugs all the time with new development.
SQL 2005 and 2008 are great evolutions of the product, and I'd recommend them to anyone. However, I'm not sure I'd upgrade existing applications without a good reason. Continuing development might be a reason to do so.
And thanks for the editorial, Tony. Nice to have a break.
December 1, 2008 at 7:08 am
I like your question, "Have I even gotten close to exploiting the full potential of SQL Server 2000?" The same could be asked of SQL 2005. With SQL Server, or any other software, you will never see its full potential as long as new versions get pumped out every 2-3 years. It's all about business and making money, not about making SQL Server better... until the next version.
I really enjoy working with SQL Server, as most of you do. I only wish that instead of making new versions, they would just continue adding to the existing foundation: features, functions, processes. All of these could be additional dollars, and you would only pay for the upgrades you need.
Just my 2 cents' worth.
Rudy
December 1, 2008 at 8:38 pm
That's exactly what SQL Server 2008 is... the ultimate service pack for 2005 :-)
--Jeff Moden
Change is inevitable... Change for the better is not.
December 1, 2008 at 8:48 pm
It's always a balance. We complain and ask for things to be fixed, stabilized, sped up, and there are changes made in those areas, but it's a business and there is a push to implement new features to create sales.
I'm with you for the most part. I wish we'd get a stability release every 2 years and a brand-new version every 4, but it looks like it's going the other way: a new version every 30 months or so.
December 2, 2008 at 6:24 am
Either way, it's still a very good product. I've used Sybase and am now forced to learn Oracle; once you see and use them, you will appreciate SQL Server even more. I just hope that Microsoft will continue to listen to the SQL communities and find a way to still make money while keeping us all happy.
Just my 2 cents worth.
Rudy
December 2, 2008 at 5:57 pm
I know lots of good folks who like it, but I don't care for Oracle much either. Many will disagree with me, but I find it too limiting.
--Jeff Moden
Change is inevitable... Change for the better is not.
December 14, 2008 at 4:51 am
I'm not sure if I buy the idea that it is becoming simpler and simpler to write applications because the tools we use are getting better and better. It is this belief that keeps driving us on to upgrade to the latest bleeding edge of Microsoft product whatever the inconvenience and cost. It may be a good idea, but we need to make a conscious decision.
Over the past twenty years, the lead-time for new applications has been getting longer and longer, they are getting more and more expensive, and the failure rate has remained constant. There have been some huge breakthroughs, certainly, and the expectations, and demands for quality and compliance have increased enormously, but basically, the increase in the complexity of the software tools we use has not been mirrored in more rapid or successful application development. There has always been a huge gap between marketing and reality in the IT industry.
Best wishes,
Phil Factor
December 14, 2008 at 9:53 am
Phil Factor (12/14/2008)
I'm not sure if I buy the idea that it is becoming simpler and simpler to write applications because the tools we use are getting better and better. It is this belief that keeps driving us on to upgrade to the latest bleeding edge of Microsoft product whatever the inconvenience and cost. It may be a good idea, but we need to make a conscious decision. Over the past twenty years, the lead-time for new applications has been getting longer and longer, they are getting more and more expensive, and the failure rate has remained constant. There have been some huge breakthroughs, certainly, and the expectations, and demands for quality and compliance have increased enormously, but basically, the increase in the complexity of the software tools we use has not been mirrored in more rapid or successful application development. There has always been a huge gap between marketing and reality in the IT industry.
Very well said. That just about sums up the reasons why I don't care for DTS, SSIS, CLRs, Business Objects, and a host of other flashy computational aberrations that supposedly enable people to be more productive. People have to become familiar with many areas to do what... import a simple file? Do a simple split? Create a running total? Join a couple of tables? How many times have you seen a DTS or SSIS job where something (supposedly) can't be done and people revert to writing an ActiveX component or a Perl script or CLR... etc., etc.? I'm seeing that in my current job, a lot! And everything they're writing that way is either slow or horribly and unnecessarily complex. For example, they have a very complex file type to import that defies all conventional methods of import. They wrote a DTS job that uses Perl scripts, ActiveX, and a couple of other computational flavors. It takes 40 minutes on a file of just 30,000 rows and 215 columns just to get the file ready for import, never mind doing the actual import. Using 100% T-SQL in a comparatively short sproc, I get the same thing done PLUS the actual import in 92 seconds.
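To illustrate the general shape of a "100% T-SQL" file import like the one described above, here is a minimal sketch only: the table names, column layout, and file path are all hypothetical, and it assumes the SQL Server service account can read the file and that the login has bulk-load permission.

```sql
-- Sketch: stage a flat file with BULK INSERT, then shred it set-based.
-- All object names and the file path below are made up for illustration.
CREATE TABLE dbo.ImportStaging
(
    RawLine VARCHAR(8000) NOT NULL   -- one raw text line per row
)

-- Pull the whole file in a single set-based operation
BULK INSERT dbo.ImportStaging
FROM 'C:\Imports\inbound.txt'        -- hypothetical path
WITH (ROWTERMINATOR = '\n', TABLOCK)

-- Validate/shred with set-based T-SQL instead of row-by-row script code
INSERT INTO dbo.TargetTable (Col1, Col2)
SELECT SUBSTRING(RawLine, 1, 10),
       SUBSTRING(RawLine, 11, 25)
  FROM dbo.ImportStaging
```

The point of the pattern is that both the load and the transform happen as set-based operations inside the engine, with no ActiveX, Perl, or per-row scripting layer in between.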
Microsoft keeps adding/releasing products to "make it easier" to use SQL Server. As a result, people who know nothing of databases are now using and abusing it. It's made SQL Server much more popular, but it sure plays hell on systems when these people touch the data.
Some folks say I'm being stubborn about not using tools other than T-SQL. I guess that's pretty much true... when I can take a complex file import from 40+ minutes down to 90 seconds, or rewrite a 24-hour, 62-database dupe-check that would usually fail as a very lean T-SQL sproc that does 93 databases in 15 minutes and hasn't failed yet, I'm thinking that lots of folks just haven't taken the time to really explore the full potential of SQL Server and, especially, T-SQL. Further, it only took me three days to do it, including final acceptance testing. The original dupe-check, written in C#, took 2 developers 2 weeks to produce something slow and unreliable that didn't leave enough time in a day to actually cover the full 93-database requirement.
As you can tell, I not only agree that all these flashy products HAVEN'T increased the productivity, efficiency, accuracy, or performance of applications, I believe they've generally caused a decrease in all of that and an increase in the cost of getting products to market, even if the market is "in-house usage". There are exceptions, of course, like Reporting Services, but for the most part, I think most folks have fallen for the Microsoft marketing strategy and the "you're stupid if you don't do this" mentality... I think the flash of all that other stuff gets in their eyes when it comes to making common-sense and innovative applications.
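For the "simple split" case mentioned earlier, here is a sketch of the classic set-based approach that runs fine on SQL Server 2000: split a delimited string with a Tally (numbers) table instead of a loop, cursor, or external script. The table name and the sample list are made up for illustration.

```sql
-- One-time setup: a Tally table of sequential integers (1..8000).
-- The syscolumns cross join is the usual SQL 2000 row-generation trick.
SELECT TOP 8000 IDENTITY(INT, 1, 1) AS N
  INTO dbo.Tally
  FROM master.dbo.syscolumns sc1
 CROSS JOIN master.dbo.syscolumns sc2

ALTER TABLE dbo.Tally ADD PRIMARY KEY CLUSTERED (N)

-- Split a comma-delimited list without any loops or cursors.
DECLARE @List VARCHAR(8000)
SET @List = ',' + 'alpha,bravo,charlie' + ','   -- wrap in delimiters

SELECT SUBSTRING(@List, N + 1,
                 CHARINDEX(',', @List, N + 1) - N - 1) AS Item
  FROM dbo.Tally
 WHERE N < LEN(@List)
   AND SUBSTRING(@List, N, 1) = ','   -- each delimiter marks an element start
```

Each row of Tally that lands on a delimiter yields one element ('alpha', 'bravo', 'charlie' here), so the whole split is a single set-based SELECT.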
--Jeff Moden
Change is inevitable... Change for the better is not.
December 14, 2008 at 4:52 pm
Are we sure it takes longer for new applications? I know we're in a mode of enhancing applications in small ways on a regular basis, but it seems that we do often develop new applications in months or even weeks instead of years.
Is quality down? Not sure about that, it's not great now, but it wasn't great before. At least not 10-15 years ago.
I do agree that more and more we have people that are less and less qualified touching applications and making life hard for us.
We upgrade too much, and I think that's a problem. Instead of charging for support and having longer lifecycles, we have shorter lifecycles, and often free or discounted support (and we get what we pay for). I think I'd like to have us go back to 4-5 year time frames for new versions and have support for 8-12 years for each one.