June 3, 2015 at 3:26 pm
I did not see the use of CTE included in the comparisons.
June 3, 2015 at 8:49 pm
Mickey Stuewe (6/2/2015)
I'm looking forward to you reading your article.Mickey
Heh... I've gotta ask... slip or something else, Mickey?
--Jeff Moden
Change is inevitable... Change for the better is not.
June 3, 2015 at 9:29 pm
LOL. That's what happens when I'm trying to type too fast. 🙂
Mickey
Jeff Moden (6/3/2015)
Mickey Stuewe (6/2/2015)
I'm looking forward to you reading your article.Mickey
Heh... I've gotta ask... slip or something else, Mickey?
Mickey Stuewe
Sr Database Developer
My blog
Follow me on twitter: @SQLMickey
Connect with me on LinkedIn
--------------------------------------------------------------------------
I laugh loudly and I laugh often. Just ask anyone who knows me.
June 3, 2015 at 9:45 pm
Getting back to it, you've got a good touch for writing. You should do it more often.
--Jeff Moden
Change is inevitable... Change for the better is not.
June 3, 2015 at 11:00 pm
Thanks Jeff. I plan on it. 🙂
Mickey
Mickey Stuewe
Sr Database Developer
My blog
Follow me on twitter: @SQLMickey
Connect with me on LinkedIn
--------------------------------------------------------------------------
I laugh loudly and I laugh often. Just ask anyone who knows me.
June 4, 2015 at 1:04 pm
Nice article. Thanks for taking time to write it.
But why were temp tables analyzed in a way they should not be used? It is a very bad practice to use
select column(s)
into #table
from table.
If you create the temp table up front with column definitions and a proper primary key, then you will see a completely different picture.
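As a sketch of the contrast Leo is drawing (table and column names here are illustrative, not from the article):

```sql
-- The pattern being criticized: SELECT...INTO infers the schema
-- and creates the temp table as an unindexed heap on the fly.
SELECT CustomerID, StoreID
INTO #CustomersByStore
FROM dbo.Customer;

-- The suggested alternative: define the temp table up front,
-- including a primary key, before loading it.
CREATE TABLE #CustomersByStore
(
    RowID      int IDENTITY(1,1) NOT NULL PRIMARY KEY, -- dummy identity PK
    CustomerID int NOT NULL,
    StoreID    int NOT NULL
);

INSERT INTO #CustomersByStore (CustomerID, StoreID)
SELECT CustomerID, StoreID
FROM dbo.Customer;
```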
Leo Peysakhovich
June 4, 2015 at 1:51 pm
Leo Peysakhovich (6/4/2015)
Nice article. Thanks for taking time to write it. But why were temp tables analyzed in a way they should not be used? It is a very bad practice to use
select column(s)
into #table
from table.
If you create the temp table up front with column definitions and a proper primary key, then you will see a completely different picture.
Leo Peysakhovich
"It Depends". Ostensibly, you only put into the Temp Table exactly what you need and there's no need for a PK or a Clustered Index because you're going to use everything in the Temp Table.
--Jeff Moden
Change is inevitable... Change for the better is not.
June 23, 2015 at 7:02 am
Jeff Moden (6/4/2015)
Leo Peysakhovich (6/4/2015)
Nice article. Thanks for taking time to write it. But why were temp tables analyzed in a way they should not be used? It is a very bad practice to use
select column(s)
into #table
from table.
If you create the temp table up front with column definitions and a proper primary key, then you will see a completely different picture.
Leo Peysakhovich
"It Depends". Ostensibly, you only put into the Temp Table exactly what you need and there's no need for a PK or a Clustered Index because you're going to use everything in the Temp Table.
"It depends". But based on my experience, I have seen many examples where this technique improve stored procedure performance dramaticaly. And it is never get it worth. With statement "select column into #temp from table" I have seen performance problems very often. So, I am advocating even for the temp tables - create them up front and with at least dummy (identity) column as a PK.
June 23, 2015 at 7:39 am
I've never seen such problems as you mention with SELECT/INTO, Leo, but that doesn't mean they don't exist. Thanks for the warning.
--Jeff Moden
Change is inevitable... Change for the better is not.
August 12, 2016 at 5:25 am
Top-down design usually results in failure; see any centralized government.
Bottom-up design is used by evolution and in my opinion is a thousand times more intelligent than humans.
The two extremes are monolithic SELECT statements or discrete procedural steps. The article is one-dimensional, concerned only with CPU. I believe many dimensions need to be considered, such as disk, ease of maintenance, stability, and clarity.
Most of the queries I support are joining 30 tables with complex filtering criteria. These need to be broken up since the optimizer gets overwhelmed at 8 tables in my experience. I've had huge success with modularizing monolithic queries by using derived tables for every table in a query. It leads to predictability and better plans.
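A sketch of that modularization (illustrative names, not from the article): each base table in the join is wrapped in a derived table that selects only the columns needed and carries that table's own filters.

```sql
-- Each table becomes its own derived table with its own filtering,
-- so the optimizer gets a pre-pruned input at every join.
SELECT c.CustomerID, o.OrderDate
FROM (
    SELECT CustomerID
    FROM dbo.Customer
    WHERE Region = 'West'          -- this table's own filter
) AS c
JOIN (
    SELECT CustomerID, OrderDate
    FROM dbo.[Order]
    WHERE OrderDate >= '20160101'  -- this table's own filter
) AS o ON o.CustomerID = c.CustomerID;
```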
August 12, 2016 at 5:48 am
One trick I use is to write stored procedures using temporary tables. Then, when the stored procedure is finished and working, I go back through it and turn the references to temporary tables into derived tables, replacing each temporary table with the code used to create it.
That's the best of both worlds.
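A minimal sketch of that conversion, assuming a hypothetical store/customer schema (not from the article):

```sql
-- Development version: materialize the intermediate result in a temp table.
SELECT StoreID, COUNT(*) AS CustomerCount
INTO #StoreCounts
FROM dbo.Customer
GROUP BY StoreID;

SELECT s.StoreName, c.CustomerCount
FROM dbo.Store AS s
JOIN #StoreCounts AS c ON c.StoreID = s.StoreID;

-- Final version: the temp table's defining query becomes a derived table.
SELECT s.StoreName, c.CustomerCount
FROM dbo.Store AS s
JOIN (
    SELECT StoreID, COUNT(*) AS CustomerCount
    FROM dbo.Customer
    GROUP BY StoreID
) AS c ON c.StoreID = s.StoreID;
```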
August 12, 2016 at 6:03 am
Agree with other posters that once you hit large datasets, temp tables are the way to go. And to solve a very complex problem, sometimes those temp table results are used more than once in a later process.
August 12, 2016 at 7:58 am
A worthwhile article, but I feel the need to trot out my hobbyhorse.
You write about the "best" solution, without defining best. If I read the article correctly, by "best" you mean "running in the least amount of time".
Having spent more years developing software than most of your readers have been alive, my experience is that (assuming the proper result set is returned) "best" is more often appropriately defined as "easy to maintain", which often means "easy to understand at a glance, often by a relatively inexperienced person". With that definition, breaking things down into a series of steps is usually the "better" choice. It is easier to rewrite an easy to understand script to gain more performance (if/when needed) than it is to train up new people to quickly read and understand "sophisticated" scripts sufficiently reliably to perform maintenance.
Summary: if you called this "Writing Faster T-SQL" I would not complain, but you didn't, and I am.
August 12, 2016 at 9:47 am
Stan Kulp-439977 (8/12/2016)
One trick I use is to write stored procedures using temporary tables. Then, when the stored procedure is finished and working, I go back through it and turn the references to temporary tables into derived tables, replacing each temporary table with the code used to create it. That's the best of both worlds.
That's funny. I do almost the opposite.
I'll write using CTEs to begin with, because I can still order them within the WITH statement to provide logical flow, but I don't have to write create...insert statements for them all.
So I get a functional query very quickly. Then I do a performance pass or two where I find the CTEs which affect performance the most and convert them to temp tables.
I'm done when I run out of time or find a good enough query. Wait, where have I heard that before?
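The opposite workflow can be sketched the same way (hypothetical schema, illustrative only):

```sql
-- First pass: CTEs give a readable top-to-bottom logical flow
-- without any CREATE TABLE / INSERT scaffolding.
WITH StoreCounts AS
(
    SELECT StoreID, COUNT(*) AS CustomerCount
    FROM dbo.Customer
    GROUP BY StoreID
)
SELECT s.StoreName, c.CustomerCount
FROM dbo.Store AS s
JOIN StoreCounts AS c ON c.StoreID = s.StoreID;

-- Performance pass: the CTE that hurts the most becomes a temp table,
-- so its result is materialized once instead of re-evaluated.
SELECT StoreID, COUNT(*) AS CustomerCount
INTO #StoreCounts
FROM dbo.Customer
GROUP BY StoreID;

SELECT s.StoreName, c.CustomerCount
FROM dbo.Store AS s
JOIN #StoreCounts AS c ON c.StoreID = s.StoreID;
```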
August 12, 2016 at 11:07 am
Your example states your procedure will return customer count for each employee but your temp table holds customer count by store. Am I missing something? Is there only 1 employee per store?
I tend to fall in with the 'temp table, divide and conquer' crowd. For me this approach evolved over time as my group learned to deal with performance issues. Testing alternatives with realistic record counts is my main takeaway from your very interesting, useful discussion-provoking article.