June 2, 2006 at 11:11 am
I too like temporary tables. I don't use them too often, but I find them a good alternative to cursors. Recently I had reason to use a global temp table, which is scary and which I don't really like, so if anyone has better suggestions, let me know.
I have an sp which queries a linked server. The linked server has about twenty databases with the same table structure, along with other miscellaneous databases. I need to loop through each of those databases, run a query against it, and return the compiled results.
My current solution uses a comma-delimited string as a sort of array holding the database names, and a WHILE loop to step through all the databases. I then use EXEC to build a query from the database-name variable and load the results into a global temp table. Obviously I am very careful to drop the table at the end, but still...
Using a global temp table was the only way I could think of to join to and interact with the results of a query run with EXEC, and EXEC was the only method I could find that let me write and reuse one query in a stored procedure. Yes, it does have to be in an sp.
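In case it helps anyone picture it, here's a minimal sketch of that pattern; the linked server name (LinkedSrv), the table and column names, and the database list are all made-up placeholders, not taken from the actual system:

CREATE TABLE ##Results (DbName sysname, SomeValue int)

DECLARE @DbList varchar(1000), @DbName sysname, @Pos int, @Sql varchar(2000)

SET @DbList = 'SalesDb01,SalesDb02,SalesDb03'   -- the comma-delimited "array"

WHILE LEN(@DbList) > 0
BEGIN
    SET @Pos = CHARINDEX(',', @DbList)
    IF @Pos = 0
        SELECT @DbName = @DbList, @DbList = ''
    ELSE
        SELECT @DbName = LEFT(@DbList, @Pos - 1),
               @DbList = SUBSTRING(@DbList, @Pos + 1, LEN(@DbList))

    -- build the per-database query and run it; results land in the global temp table
    SET @Sql = 'INSERT ##Results (DbName, SomeValue) '
             + 'SELECT ''' + @DbName + ''', SomeValue '
             + 'FROM LinkedSrv.' + @DbName + '.dbo.SomeTable'
    EXEC (@Sql)
END

SELECT * FROM ##Results      -- the compiled results
DROP TABLE ##Results         -- don't leave the global temp table behind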
Something I just thought of is calling another sp from the main sp, passing the database variable from one to the other. Then I could compile the results in the main sp. If that works it should get rid of the global temp table, but not the EXEC command; still, it might be a step in the right direction.
June 2, 2006 at 1:16 pm
Very helpful - I learned some things I didn't know (and which I'd been doing wrong), but now I know how to fix them. I'd really like to know some of the pros and cons of using a SQL Server 2000 table data type vs. a temp table.
June 2, 2006 at 1:22 pm
I found #temp tables to be a nice tool for splitting up very complex queries and avoiding cursors. The one inconvenience is making sure you drop the #temp table (even after a transaction error).
SQL Server makes heavy use of tempdb behind the curtains. If you put tempdb on another disk you can improve performance, since you can join data from two sources (double the IO!) instead of scanning two tables on the same disk.
On one very uncommon occasion I needed to run a query, do some processing, and then run the same query again. A temp table saved me, since I only had to run the query once.
In a performance analysis I found you can save time, at a disk-space cost, if you use temp tables properly.
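A small sketch of that "run it once" idea; the table and column names here are invented for illustration:

-- run the expensive query once and keep the result in a temp table
SELECT OrderID, CustomerID, TotalDue
INTO   #OrderSnapshot
FROM   dbo.Orders
WHERE  OrderDate >= '20060101'

-- ...do the intermediate processing against the snapshot...
UPDATE #OrderSnapshot SET TotalDue = TotalDue * 1.05 WHERE CustomerID = 42

-- "run the same query again" without hitting dbo.Orders a second time
SELECT * FROM #OrderSnapshot

DROP TABLE #OrderSnapshot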
Jean
June 2, 2006 at 10:44 pm
http://support.microsoft.com/default.aspx?scid=kb;en-us;305977&Product=sql2k
You may never use a table variable outside a function ever again if you study Q3, Q4, and Q5 a bit...
--Jeff Moden
Change is inevitable... Change for the better is not.
June 3, 2006 at 6:57 am
Good stuff. Thanks for the info - and the links others have supplied to associated information are helpful too.
We have found #temp tables often significantly outperform derived tables in situations where there is a large amount of data. We often create "driver" tables of the primary keys for the data we want from several other tables. If this driver table is used in several queries, there is a marked performance improvement.
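For anyone who hasn't tried it, here's roughly what such a "driver" table looks like; the table and column names are made up for the example:

-- narrow temp table holding only the keys we care about
SELECT o.OrderID
INTO   #Driver
FROM   dbo.Orders o
WHERE  o.OrderDate >= '20060101'
  AND  o.OrderStatus = 'OPEN'

CREATE CLUSTERED INDEX IX_Driver ON #Driver (OrderID)

-- the same small key set then drives several follow-up queries
SELECT d.OrderID, od.ProductID, od.Qty
FROM   #Driver d
       JOIN dbo.OrderDetails od ON od.OrderID = d.OrderID

SELECT d.OrderID, s.ShipDate
FROM   #Driver d
       JOIN dbo.Shipments s ON s.OrderID = d.OrderID

DROP TABLE #Driver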
The key thing that most everyone here seems to agree on: for poorly performing queries, try different solutions. Don't fall in love with a single technique and expect it to work for everything.
June 3, 2006 at 11:42 am
"You may never use a table variable outside a function ever again..."
I don't see why you would say that. Table variables are certainly no panacea, but they can be useful under the right circumstances. The trick is to use the right approach at the right time. I think it would be very shortsighted to write them off altogether.
/*****************
If most people are not willing to see the difficulty, this is mainly because, consciously or unconsciously, they assume that it will be they who will settle these questions for the others, and because they are convinced of their own capacity to do this. -Friedrich August von Hayek
*****************/
June 3, 2006 at 2:59 pm
I don't see why you would say that. Table variables are certainly no panacea, but they can be useful under the right circumstances. |
Yep, like I said, maybe in a function... but consider what Microsoft stated in the article I mentioned (which I'm thinking you didn't read, so here's the URL again: http://support.microsoft.com/default.aspx?scid=kb;en-us;305977&Product=sql2k)...
1. "Table variables are created in the tempdb database similar to temporary tables." and "If memory is available, both table variables and temporary tables are created and processed while in memory (data cache). "
...so, NO advantage to using table variables over temporary tables there!
2. "Table variables do not maintain statistics like temporary tables can. Statistics cannot be created on table variables through automatic creation or by using the CREATE STATISTICS statement. Therefore, for complex queries on large tables, the lack of statistics may deter the optimizer to determine the best plan for a query, thus affecting the performance of that query."
...I'll translate... "TABLE VARIABLES ARE SLOW"! So no advantage there, either. In fact, if the table variable has any size to it or is called repetitively, it will be slower than a temporary table... much slower. But don't take my word for it... the following was in the article, as well...
"In general, you use table variables whenever possible except when there is a significant volume of data and [or] there is repeated use of the table."
The above should say "Don't use table variables for anything except when the data you're dealing with is in a function and then only because temp tables don't work there."
The Microsoft article does follow that sentence with "In that case, you can create indexes on the temporary table to increase query performance." I've found that, due to the statistics that are created on temporary tables during construction, you usually don't even need to do that. You can't have either with table variables because "Non-clustered indexes cannot be created on table variables, other than the system indexes that are created for a PRIMARY or UNIQUE constraint."
3. "Tables variables cannot be used in a INSERT EXEC or SELECT INTO statement."
... another great use shot to smithereens. Temporary tables can do this easily. Yeah, OPENQUERY works in many cases, but why make the same call to the same proc more than once if you don't need to?
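For illustration, this is the kind of thing temp tables allow and table variables don't; the proc and column names here are hypothetical:

-- capture a stored procedure's result set in a temp table via INSERT ... EXEC
CREATE TABLE #ProcResults (ID int, Name varchar(50))

INSERT #ProcResults (ID, Name)
EXEC dbo.usp_GetSomething

SELECT * FROM #ProcResults
DROP TABLE #ProcResults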
4. "You cannot use the EXEC statement or the sp_executesql stored procedure to run a dynamic SQL Server query that refers a table variable, if the table variable was created outside the EXEC statement or the sp_executesql stored procedure. "
... another losing proposition, although I agree that this won't come up often.
5. "Table variables result in fewer recompilations of a stored procedure as compared to temporary tables."
...Sounds like a great advantage for table variables, but it's not, because they're still comparatively slow since they don't use statistics. AND there are ways to avoid such recompiles... the best way is to define the temp tables at the beginning of the code so as not to mix DDL with DML... again, don't take my word for it... here are all the ways to avoid recompilations (whether you use temp tables or not)...
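To show what I mean by keeping the DDL up front, here's a rough outline (the proc and tables are invented for the example):

CREATE PROCEDURE dbo.usp_Example
AS
BEGIN
    -- all temp table DDL first...
    CREATE TABLE #Work   (OrderID int PRIMARY KEY, Amount money)
    CREATE TABLE #Totals (CustomerID int, Amount money)

    -- ...then nothing but DML below, so the DDL/DML interleaving
    -- that triggers recompiles never happens
    INSERT #Work (OrderID, Amount)
    SELECT OrderID, TotalDue FROM dbo.Orders

    INSERT #Totals (CustomerID, Amount)
    SELECT CustomerID, SUM(TotalDue) FROM dbo.Orders GROUP BY CustomerID

    SELECT * FROM #Totals
END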
http://support.microsoft.com/kb/243586/
... and anyone who says speed isn't important hasn't had to stay up all night baby-sitting someone else's garbage-called-code that takes hours and hours to run on a couple of lousy million rows, when the same thing could be done in under six minutes, data integrity intact (a true story there).
Here's another supposed advantage of table variables over temp tables...
6. "Transactions that involve table variables last only for the duration of an update on the table variable. Therefore, table variables require less locking and logging resources."
... who in their right mind would include a table variable or a temp table in a transaction? IT'S EXPENDABLE! So far as locking goes, temporary tables are scoped by connection... no one else has access to a temporary table, so who cares?!
7. "computed columns in the table type declaration cannot call user-defined functions"
... yet another downfall in the usefulness of table variables and, in some cases, of poorly written UDFs.
There's a lot more, but I think the above will suffice... to summarize, table variables and temp tables live in the same place, and that's where the similarity ends. Temp tables have far fewer restrictions than table variables and, if you're prone to writing good code anyway, you won't fall into any of the so-called problem areas of temp tables.
I'll say it again: the only place to use table variables is in a UDF, and then only because temp tables don't work there. And if you think you really, really need to use a table variable in a UDF, you may want to consider finding a method other than using a UDF.
Did I answer the question sufficiently?
--Jeff Moden
Change is inevitable... Change for the better is not.
June 4, 2006 at 7:03 pm
This thread has been very interesting for this process, since I've been involved with just about every item mentioned. I use a function to split the list of data sources into a table variable; I use a cursor (which I usually avoid) to process each data source. As I was rewriting existing code, I converted the myriad temp tables created by SELECT INTO statements (which have no place in a stored proc, IMHO - take the time to script a table, temp or variable) into permanent tables, so that I could replace a heap of custom SPs with just two key SPs - the processor and the data loader.
To reply to Jeff Moden's post - and he has some valid points - I have to say "each tool for its job". But I have worked for a company that had hundreds of SPs using temporary tables, and just about every one of them started with the following line:
IF OBJECT_ID('tempdb..<temp table>') IS NOT NULL DROP TABLE <temp table>
And even (to use Steve Rosenbach's example):
IF EXISTS "#Foo___...___000000017D5D" THEN DROP "#Foo___...___000000017D5D"
Our DBA had a job to check for "orphaned" temp tables and delete them. Perhaps we had some larger issues that led to every developer needing to check for the existence of "temp" tables; however, I'm not aware of this situation existing with table vars.
At the end of the day, there's a batch of tools and each has its place in the scheme of things - to say "thou shalt not..." with any one of them limits your effectiveness in finding the best solution.
Cor, this post ended up longer than I expected.
S.
June 5, 2006 at 6:41 am
Our DBA had a job to check for "orphaned" temp tables and delete them. |
If you mean find the actual orphaned tables in tempdb and delete them, be advised that orphans cannot happen because a temp table is automatically dropped when the connection that created it is terminated or dropped.
I could see it being a bit of a problem if you are pooling connections, but leaving a temp table open is just another type of "memory leak" caused by bad programmers.
If you mean in code, please explain further... what do you mean by an orphaned temp table?
--Jeff Moden
Change is inevitable... Change for the better is not.
June 5, 2006 at 7:55 am
Actually, I was quite familiar with the article before, and I re-read it just to be sure. Yes, table variables have limitations, but in cases where you are not dealing with large data sets they can be very useful, and they avoid recompilations of your stored procedures. This can be a large advantage, again, when you are dealing with frequently called sprocs that process small data sets. In this situation they are clearly superior to temp tables.
/*****************
If most people are not willing to see the difficulty, this is mainly because, consciously or unconsciously, they assume that it will be they who will settle these questions for the others, and because they are convinced of their own capacity to do this. -Friedrich August von Hayek
*****************/
June 5, 2006 at 10:27 am
Great idea!
Is there a way to lock the view while it is in use and being changed? Or how do you keep a second instance of your sp from stepping on the toes of the first one and consequently having them both return bad data?
June 5, 2006 at 6:10 pm
By "orphaned temp tables" I mean temp tables in tempdb that no longer have any connecting processes/sessions/etc. I know they are dropped when the connection terminates, and when I first saw this in the test environment I assumed it was a leftover from development in Query Analyzer or similar.
When I saw this in the production environment I got an inkling that something was amiss. Checking with the DBA revealed that there had been instances where temp tables remained after the end of the connection, and that there was a job to check for this. This was not a daily occurrence, mind, just something that was done when tempdb became a little bloated; one of the things to check.
I haven't seen this myself, and am quite curious to, because this shouldn't happen. This is about the extent of my knowledge on the situation. If you'd like more info, drop me an email and I'll see if I can dig something up.
S.
June 5, 2006 at 8:14 pm
Checking with the DBA revealed that there had been instances where temp tables remained after the end of the connection |
I'm thinking that someone was using the SP_OA procs and forgot to shut the door, which left connections open with their associated temp tables... or maybe some GUI code has a connection leak. In either case, you good folks may have a bigger problem than you think and are merely treating the symptoms of it by searching for and dropping the temp tables those connection leaks leave behind.
--Jeff Moden
Change is inevitable... Change for the better is not.
June 5, 2006 at 9:03 pm
That wasn't an issue in this particular case. Being an ad hoc process it is only going to be running once at any given time: in the production environment it is scheduled and in the test environment it is pretty much my baby.
In a multi-user situation I would consider using owner/schema prefixes (that "dbo." thing), but I would test this extensively first - perhaps by using "SELECT system_user" or similar, though in certain circumstances the "user" can change. (This is a 2000 issue. In 2005, with schemas, this need not apply; however, my experience with 2005 is a bit thin right now.)
Alternatively, you could store a value somewhere and refer to it when the proc starts, i.e. 1 = already running, bail out; 0 = fine, keep going. The problem here is that if the proc terminates without resetting the value, then the proc will always bail out until you manually reset it.
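Something like this is what I have in mind (the control table and names are invented, and it carries exactly the caveat just mentioned):

IF EXISTS (SELECT 1 FROM dbo.ProcessControl
           WHERE ProcessName = 'MyLoader' AND IsRunning = 1)
    RETURN   -- already running, bail out

UPDATE dbo.ProcessControl SET IsRunning = 1 WHERE ProcessName = 'MyLoader'

-- ...do the real work...

UPDATE dbo.ProcessControl SET IsRunning = 0 WHERE ProcessName = 'MyLoader'
-- if the proc dies before this last line, the flag stays at 1 and every
-- later run bails out until someone resets it by hand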
As to an explicit *lock* of a view... I've not had to do this, so I will need to do some investigating.
S.
June 6, 2006 at 6:52 pm
SQL Server 7 (if I remember correctly) had a problem creating a temp table inside a procedure. SELECT INTO ... WHERE 1=2 was the only way to get around it.
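For anyone who hasn't seen the trick, it looked something like this (the source table name is just an example):

-- copies the table's structure into an empty temp table; no rows are copied
SELECT *
INTO   #EmptyCopy
FROM   dbo.SomeTable
WHERE  1 = 2

DROP TABLE #EmptyCopy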
I liked the clear and simple approach of the article.
Cheers,
Win