August 9, 2007 at 6:33 am
Hi
I am looking for some guidelines or a document with generally accepted principles on the optimization of T-SQL, i.e. general practices which should be followed to get the fastest response from the server.
Thanks,
August 9, 2007 at 6:53 am
August 10, 2007 at 8:11 am
Mark, you don't give your level of experience with databases or sql server, so it is a bit difficult to target advice for you.
1) Mentoring is the best way to pick this stuff up.
2) Check Microsoft's website for sql server best practices.
3) Learn how to use Profiler so you can modify a query and compare before/after results to see if it is better or worse and why. The same goes for SET STATISTICS IO ON and the actual query plan in Query Analyzer or SSMS (a quick example follows this list).
4) Cursors are BAD!!! DO NOT USE THEM! Learn how to wrap your mind around set-based logic.
5) Indexes are good - mostly. Learn how to develop an indexing strategy.
6) Avoid interim tables - mostly. This one takes study, practice, and some intuition.
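To make item 3 concrete, here is a minimal sketch of wrapping a query with SET STATISTICS IO/TIME so you can compare a before/after change; the table and column names (dbo.Orders, OrderTotal) are invented for illustration.

-- Hypothetical example: measure I/O and CPU for one version of a query,
-- then repeat with the rewritten version and compare the Messages output.
SET STATISTICS IO ON;
SET STATISTICS TIME ON;

SELECT SUM(OrderTotal) AS GrandTotal
FROM dbo.Orders;          -- run version 1, note logical reads and CPU time

-- ...change the query or the indexing, run it again, compare the numbers...

SET STATISTICS IO OFF;
SET STATISTICS TIME OFF;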
Depending on your level of knowledge, there are a number of books from beginner through expert to help you be more effective designing, developing and tuning against sql server.
Best,
Kevin G. Boles
SQL Server Consultant
SQL MVP 2007-2012
TheSQLGuru on googles mail service
August 10, 2007 at 10:47 am
I'll disagree with #4. There are instances where cursors are usable. Before using them, be sure you have looked at all other alternatives.
August 10, 2007 at 10:52 am
To explain and gain an understanding of the very few exceptions to #4 requires a lot of effort from both parties and still has a low probability of success --> it is still a pretty good rule, IMHO. 🙂
Best,
Kevin G. Boles
SQL Server Consultant
SQL MVP 2007-2012
TheSQLGuru on googles mail service
August 10, 2007 at 12:35 pm
I will have to delve into that once the basics have been applied.
Thanks all,
August 10, 2007 at 1:25 pm
I have run into a couple of situations where a cursor-based solution was a lot faster than the set-based solutions I was attempting: 45 minutes for the cursor vs. many hours for the set-based solution. I didn't have a lot of time to investigate other options, as I had other projects that I needed to work on as well, so we went with the cursor-based solution. I no longer work for that company, but if given the opportunity to improve that process, I would take it.
August 13, 2007 at 6:40 am
I agree with TheSqlGuru. We should avoid cursors at all costs. They are a performance hog and can bring the server to a halt. Once you have a system for how to get around using cursors, setting things up doesn't take long. However, if it is a small amount of data, using a cursor is safe.
Gettingthere
August 13, 2007 at 9:14 am
Even small amounts of data can be highly inefficient with cursor access. A cursor that loops through just 100 rows of data will require 100 8K page reads just for data access. If all 100 rows in this tiny table fit on one 8K page (not unreasonable), that is two orders of magnitude less efficient. Now think about executing this one or two hundred times a minute. Such a waste of server resources.
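A rough sketch of the comparison described above, using an invented dbo.SmallLookup table; the cursor fetches the 100 rows one at a time, while the set-based statement gets the same answer in a single pass.

-- Cursor version: one fetch per row (illustration only; table and columns are made up).
DECLARE @Amount money, @Total money;
SET @Total = 0;

DECLARE c CURSOR LOCAL FAST_FORWARD FOR
    SELECT Amount FROM dbo.SmallLookup;

OPEN c;
FETCH NEXT FROM c INTO @Amount;
WHILE @@FETCH_STATUS = 0
BEGIN
    SET @Total = @Total + @Amount;
    FETCH NEXT FROM c INTO @Amount;
END;
CLOSE c;
DEALLOCATE c;

-- Set-based version: the same total from one statement and one scan of the page(s).
SELECT SUM(Amount) AS Total FROM dbo.SmallLookup;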
Best,
Kevin G. Boles
SQL Server Consultant
SQL MVP 2007-2012
TheSQLGuru on googles mail service
August 13, 2007 at 9:29 am
I am a firm believer in "Never say never." I will not tell anyone to NEVER use cursors. Although the times to use them are few and far between, they do have their uses, and you should know how AND when to use them as well as when they are inappropriate.
I try to avoid cursors, but I have found times when they are useful, such as at my current employer, where I am using cursors and dynamic SQL to build views, archive tables, and insert/update/delete stored procedures. The scripts may not use formal cursors (I am using temp tables and a WHILE loop, but it is the same principle), but this is a case where cursors would be appropriate (see the sketch at the end of this post).
It is a matter of looking at what you are trying to accomplish and using the right tools for the job.
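Here is a rough sketch of the temp-table/WHILE-loop pattern described above, used to generate a simple view per table with dynamic SQL; all names (#Tables, the vw_ prefix) are invented and the real scripts will certainly differ.

DECLARE @TableName sysname, @Sql nvarchar(max);

-- Build a driver list of tables to process.
SELECT name AS TableName, 0 AS Processed
INTO #Tables
FROM sys.tables;

WHILE EXISTS (SELECT 1 FROM #Tables WHERE Processed = 0)
BEGIN
    SELECT TOP (1) @TableName = TableName
    FROM #Tables
    WHERE Processed = 0;

    -- Build and run the dynamic SQL for this table.
    SET @Sql = N'CREATE VIEW dbo.vw_' + @TableName
             + N' AS SELECT * FROM dbo.' + QUOTENAME(@TableName) + N';';
    EXEC sp_executesql @Sql;

    UPDATE #Tables SET Processed = 1 WHERE TableName = @TableName;
END;

DROP TABLE #Tables;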
August 13, 2007 at 11:10 am
It's not just cursors... single-row processing loops are the real culprit. Cursors inherently are single-row and they use loops. The real rule is that RBAR (pronounced "ree-bar", a "Modenism" for "Row By Agonizing Row") should be avoided at all costs.
--Jeff Moden
Change is inevitable... Change for the better is not.
August 14, 2007 at 9:07 am
Yes, cursors should be avoided, but SOMETIMES (not very often) they are appropriate. That is all I am saying. As DBAs/developers we need to know how to use them, when they are appropriate, and when they are not. You can't just give a blanket statement: never use them. As I mentioned earlier, I know of at least one case where cursor-based processing was more efficient than the set-based solutions I had tried. I know I hadn't exhausted all the possibilities, but time constraints forced us to use the cursor-based solution because it met our requirements time-wise. I would still love to find a more efficient set-based solution, but I don't work for that company anymore.
August 14, 2007 at 9:38 am
I have some clients with "unfortunate data structures" that do require cursor processing. I also use them occasionally when looping through system objects for one reason or another. So I do agree with Lynn's statement that sometimes they really do have to be used.
Best,
Kevin G. Boles
SQL Server Consultant
SQL MVP 2007-2012
TheSQLGuru on googles mail service
August 14, 2007 at 6:09 pm
Whatever... I've been working with SQL Server for 12 years and I've never had to write a cursor. Yeah, sometimes the occasional While loop in a function (have even gotten away from that), but never a cursor (except for demos on how bad they are). And the use of While loops for me is less frequent than most others' use of a cursor, and those While loops normally control sets of data, not RBAR.
Use cursors if you think you need to... I simply haven't found the need.
--Jeff Moden
Change is inevitable... Change for the better is not.
August 15, 2007 at 10:56 am
There is almost always a way to not use a cursor.
I have used cursors in rare instances to speed up performance when a set-based solution updating millions of rows would take hours (probably due to I/O constraints). Using a cursor to break it up into chunks of 10,000-20,000 records at a time (still semi-set-based) brought the run time down to minutes.
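A minimal sketch of the chunking idea, assuming a hypothetical dbo.BigTable with a Processed flag; the poster drove the batches with a cursor, but the same effect can be had with UPDATE TOP in a loop.

DECLARE @BatchSize int, @Rows int;
SET @BatchSize = 10000;   -- tune between 10,000 and 20,000 as described above
SET @Rows = 1;

WHILE @Rows > 0
BEGIN
    UPDATE TOP (@BatchSize) dbo.BigTable
    SET NewValue = OldValue * 1.1,   -- stand-in for whatever the real update is
        Processed = 1
    WHERE Processed = 0;

    SET @Rows = @@ROWCOUNT;          -- stop once a pass updates no rows
END;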