December 10, 2001 at 3:09 pm
I design databases and program applications that use them, and I am presently on contract with a client who uses SQL Server 2000. I have been hired to design a commission tracking database and prepare all of the necessary stored procedures, views, etc., to extract data for a reporting system I am also designing and programming. I have run into an interesting roadblock: the DBA refuses to allow me to use either local or global temporary tables in situations where one would use a cursor. Having found that I get better performance using a temporary table to hold the specific data I am looping through for manipulation and reporting, I do not understand the DBA's refusal to allow temporary tables. Any ideas as to why this might be the case?
Thank you.
Hawkeye67
December 10, 2001 at 3:16 pm
Personally I hate temp tables, but I don't REFUSE them; I think performance suffers. If he will not allow temp tables, you could build a permanent table and use the SPID as a column to match rows with a connection.
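Something along these lines (table, column, and parameter names are just placeholders for illustration):

-- Permanent work table shared by all connections; the spid column
-- keeps each connection's rows separate.
CREATE TABLE CommissionWork (
    spid       int   NOT NULL,   -- filled from @@SPID
    SalesRepID int   NOT NULL,
    Amount     money NOT NULL
)
GO

-- Inside the procedure: clear any leftovers for this connection, then work.
DELETE CommissionWork WHERE spid = @@SPID

INSERT CommissionWork (spid, SalesRepID, Amount)
SELECT @@SPID, SalesRepID, Amount
FROM Commissions
WHERE PeriodEnd = @PeriodEnd   -- @PeriodEnd is a hypothetical proc parameter

SELECT SalesRepID, SUM(Amount) AS Total
FROM CommissionWork
WHERE spid = @@SPID
GROUP BY SalesRepID

-- Clean up when done.
DELETE CommissionWork WHERE spid = @@SPID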
Steve Jones
December 10, 2001 at 3:20 pm
I see nothing wrong with temp tables when used wisely. Certainly Steve's idea of a permanent table and a SPID can often achieve the same thing. Another option would be a table-returning function. I'm sure you know, but in general any kind of loop operation is expensive, far more so than the hit to create a temp table. I'd take a second look to see if you can't produce some or all of it using a set-based approach.
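In SQL Server 2000 an inline table-valued function can often stand in for a temp table in a join. A minimal sketch, with invented table and column names:

CREATE FUNCTION dbo.fn_CommissionTotals (@PeriodEnd datetime)
RETURNS TABLE
AS
RETURN (
    SELECT SalesRepID, SUM(Amount) AS TotalAmount
    FROM Commissions
    WHERE PeriodEnd = @PeriodEnd
    GROUP BY SalesRepID
)
GO

-- Used like a table, no temp object or loop required:
SELECT r.RepName, c.TotalAmount
FROM SalesReps r
JOIN dbo.fn_CommissionTotals('20011130') c
  ON c.SalesRepID = r.SalesRepID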
Note - I agree about global temp tables - horrible concept.
Andy
December 10, 2001 at 3:22 pm
I also hate globals. They cause problems and make for bad solutions. What about doing the loops on the client and calculating there?
Steve Jones
December 11, 2001 at 1:51 am
Could you get away with using 'derived tables' instead in your query?
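For example (again with made-up names), the aggregate that might otherwise go into a temp table can sit in the FROM clause as a derived table:

SELECT r.RepName, t.TotalAmount
FROM SalesReps r
JOIN (SELECT SalesRepID, SUM(Amount) AS TotalAmount
      FROM Commissions
      GROUP BY SalesRepID) AS t
  ON t.SalesRepID = r.SalesRepID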
Steven
December 11, 2001 at 5:14 am
Don't allow temp tables - must make for an interesting system.
The good thing about temp tables is that they clear themselves up when you close the connection and so you don't have to mess around with recovery code.
I'll agree that for things like web applications they can end up slowing the system down due to contention on system tables but for most situations they can make the code a lot simpler and more efficient.
Global temp tables are handy for bcp and the like but that's about it.
I suspect your DBA has had experience with poor code that filled up tempdb, or with people doing SELECT INTOs in v6.5 and bringing down the server - if you're sensible, there's nothing wrong with them (there's a sketch of the difference at the end of this post).
Cursors never.
DTS - only when needed and never to control.
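A rough sketch of the SELECT INTO issue mentioned above (table names invented). In v6.5 the first form held locks on tempdb's system tables for the life of the statement; creating the table first keeps that window short:

-- Risky in v6.5: locks tempdb system tables while the whole SELECT runs.
SELECT SalesRepID, Amount
INTO #work
FROM Commissions   -- a long-running scan keeps tempdb locked up

-- Safer: create the table first (brief lock), then load it.
CREATE TABLE #work (SalesRepID int NOT NULL, Amount money NOT NULL)
INSERT #work (SalesRepID, Amount)
SELECT SalesRepID, Amount
FROM Commissions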
December 11, 2001 at 6:32 am
I would like to reply to each of you, as you gave me a number of things to think about, but instead I will make some general comments.
I agree that bad code using temp tables, be they global or local, can seriously jeopardize the system, but over the last 6 years, since starting with SQL Server 6.5, I have learned to use them sparingly. Most importantly, I create them with specific fields in the stored procedure and then use the INSERT INTO ... syntax rather than SELECT INTO ..., because INSERT INTO avoids the locking problems nigelrivett spoke about. But where you are preparing dynamic SQL statements to be executed within the stored procedure, especially where you might have a number of nested dynamic SQL executions, global temp tables come in very handy, particularly when you are not sure of the actual number of fields the query will need to return once the database has been in use and changed over time.
For example, I recently did reporting for a law enforcement application accessing a SQL Server 2000 database I did not design. This database had over 175 tables, and it used what the designer called a Universal Table to access certain data in other tables that could not be joined without first extracting the necessary keys from the Universal Table. In addition, the database allowed certain tables containing fee category information for fines and payment types to change by adding or subtracting columns related to fee type. As a consequence, and in an attempt to write stored procedures that would not have to be re-written every time the fee categories changed, I had to create a series of dynamic SQL statements that would return global temporary tables containing all the necessary fee categories and the data pertaining to them, even though I had no idea of the number of categories or their names. In other words, I treated the stored procedures the way you would program a COM object, allowing interfaces to be added without changing the contract of the original interface the accessing program relies on to retrieve information.
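A minimal sketch of that pattern, with invented table and column names. The point is that a global temp table created inside EXEC() outlives the EXEC scope (a local #table would not), so a later statement can read columns that were not known until runtime. SELECT INTO is used here precisely because the column list is not known in advance:

DECLARE @cols varchar(2000), @sql varchar(4000)

-- Build the column list from a (hypothetical) fee category lookup.
SELECT @cols = COALESCE(@cols + ', ', '') + FeeCategory
FROM FeeCategories

SET @sql = 'SELECT CaseID, ' + @cols +
           ' INTO ##FeeReport FROM FeePivot'
EXEC (@sql)

-- Visible here because the table is global, not local to the EXEC batch.
SELECT * FROM ##FeeReport
DROP TABLE ##FeeReport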
I do understand the problem with tempdb and the use of temp tables, but I have never had occasion to use more than 4 temp tables in any stored procedure, and I always make it a practice to drop each temp table immediately after I am finished with it rather than waiting to clear tempdb at the end of the stored procedure.
I will certainly consider the option of doing all the calculating at the client level, but, as you know, the server generally is more powerful for these purposes.
I also make it a general practice to open and close connections to the database only when needed.
I appreciate all of the input here and thank you all for taking the time to answer my questions.
Hawkeye67
December 11, 2001 at 10:17 am
An interesting explanation, and not without merit. Part of my hesitation is that over the years I have seen more (obscure) bugs with views and temp tables than with normal tables (or cursors, for that matter). I think the programming complexities of the server give more chance that memory will leak or some unforeseen bug will crop up with views or temp tables, so I use them minimally.
I think a permanent table that uses SPIDs (not my invention, one I adopted from another) is a great solution to not using temp tables. In SQL 7/2000 you can even place this on a separate file or filegroup if it receives heavy use and have control over its I/O. It can also be indexed, which can further improve performance.
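A sketch of that setup (database name, file path, and table names are placeholders):

-- Give the work table its own filegroup...
ALTER DATABASE CommissionDB ADD FILEGROUP WorkTables
ALTER DATABASE CommissionDB ADD FILE
    (NAME = CommissionWork1,
     FILENAME = 'D:\Data\CommissionWork1.ndf')
TO FILEGROUP WorkTables
GO

-- ...create the SPID-keyed table on it, and index the spid column.
CREATE TABLE CommissionWork (
    spid       int   NOT NULL,
    SalesRepID int   NOT NULL,
    Amount     money NOT NULL
) ON WorkTables

CREATE CLUSTERED INDEX IX_CommissionWork_spid
ON CommissionWork (spid)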
Not that your method is invalid or will not work; this is, IMHO, a better alternative.
Thanks for the feedback and debate.
Steve Jones