November 17, 2006 at 9:39 pm
You can rely on SQL Server for this.
As soon as execution reaches MyCn.Close you can be sure the table is dropped.
Just don't forget to close connections.
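For example (just a sketch; the table name is made up), you can watch the table live and die with the connection:

-- Illustration only: the table name is invented
CREATE TABLE #ConnDemo (ID int NOT NULL)

-- while this connection is open, tempdb lists the table (the name is padded out by the server)
SELECT Name FROM tempdb..sysobjects WHERE Name LIKE '#ConnDemo%'

-- after MyCn.Close the connection is gone, and re-running the query above
-- from another connection returns no rows: the table has been dropped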
_____________
Code for TallyGenerator
November 17, 2006 at 10:00 pm
In a properly designed system, if you create a temporary table, you should drop it. You should not rely on the system to clean up after you.
November 17, 2006 at 10:12 pm
Out of curiosity - how do you "undeclare" declared objects and variables (including table variables) if you cannot rely on the system to clean up after you?
_____________
Code for TallyGenerator
November 18, 2006 at 4:11 am
DELETE FROM dbo.USP_Demo WHERE Name in ('@UserID', '@UserName')
No, seriously. I've always seen it stated that any instantiated object should be explicitly destroyed in application design after you have finished using it. I have always considered this a best practice, and it was presented as such to me in various articles. I also think it's nice to have a single standard for all #[#]temp tables... you finished using it, you release the resources.
November 18, 2006 at 4:31 am
Yes, it would be nice to have a single standard for all tables, so how do you drop table variables?
Do you actually follow your rule that "any instantiated object should be explicitly destroyed in application design after you finish using it" when you drop a table? Do you drop all the indexes, PK and DEFAULT constraints before dropping the table, or do you rely on the system for that?
If you rely on the system here, why don't you rely on it there?
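For example (the names below are made up), there is no DROP for a table variable at all:

DECLARE @Demo TABLE (ID int NOT NULL)

INSERT INTO @Demo (ID) VALUES (1)
SELECT ID FROM @Demo

-- there is no statement to "drop" it:
-- DROP TABLE @Demo   -- syntax error
-- the variable is simply deallocated by the server when the batch or SP ends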
_____________
Code for TallyGenerator
November 18, 2006 at 12:06 pm
Doh! I am seeing the utility of both points, especially if you are using a connected DataSet as opposed to a disconnected one. It seems you would have to build another SP that runs regularly to check the state of the connection and kill any SPs that are still active even though the user connection has gone down.
This could be complicated and could really cause a serious slowdown in your I/O if it were run every x amount of time... Clean-up is never easy, nor is it always one size fits all...
[Not really sure if that was helpful or just more fodder for discussion]
I wasn't born stupid - I had to study.
November 18, 2006 at 7:05 pm
Let me put it this way. I drop all instantiated objects on the application side. I also trust the server to do its job. But being true to myself and my style, I do and will keep destroying objects after their life is over, both client side and server side. But I guess it's just personal choice at this point.
November 18, 2006 at 7:42 pm
Here we are.
"Drop all instanciated objects application side" - agree. But #Tables are not "application side", right?
They are not even part of the database you are connected to. It's Server feature, and this feature belongs to this particular connection within the scope of this particular SP.
Your scope is over - #Tables are dropped.
> being true to myself and my style
That's what I was talking about from the beginning. It's neither good nor bad practice. It's just your style.
My style is to create small SPs and call them as modules from a "main" routine. I know the scope of each SP and I can manage #Tables accordingly.
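Something like this (procedure names invented just for the sketch):

CREATE PROC dbo.InnerModule
AS
-- #MainWork was created by the caller, so it is still in scope here
INSERT INTO #MainWork (ID) SELECT 2

-- #InnerWork belongs to this procedure's scope only
CREATE TABLE #InnerWork (ID int NOT NULL)
GO

CREATE PROC dbo.MainRoutine
AS
CREATE TABLE #MainWork (ID int NOT NULL)
INSERT INTO #MainWork (ID) SELECT 1

EXEC dbo.InnerModule

SELECT ID FROM #MainWork   -- returns 1 and 2
-- #InnerWork is already gone: it was dropped when InnerModule exited
GO

EXEC dbo.MainRoutine

DROP PROC dbo.MainRoutine
DROP PROC dbo.InnerModule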
_____________
Code for TallyGenerator
November 18, 2006 at 8:20 pm
The other issue to consider is system performance and scalability. When does SQL Server destroy your #table temporary tables? Do they automatically get destroyed and free up disk space in tempdb when they go out of scope, or do they get destroyed by a SQL Server system process that runs periodically? If it is a separate system process, how often does it run?
It just makes sense: if you create a temporary table (CREATE TABLE #tablename or SELECT * INTO #tablename) in a stored procedure, you should explicitly drop that table before exiting the procedure.
If T-SQL had commands to deallocate variables (table variables included), you would do that as well prior to exiting.
It is called Best Practices.
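If you follow that practice, the pattern looks something like this (the object names are only placeholders):

CREATE PROC dbo.BestPracticeDemo
AS
SELECT Name
INTO #Work
FROM sysobjects

-- ... work with #Work ...

-- explicit clean-up before exiting, instead of relying on end of scope
IF OBJECT_ID('tempdb..#Work') IS NOT NULL
DROP TABLE #Work
GO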
November 18, 2006 at 8:52 pm
Before you start an argument you had better learn the subject. Then you won't be asking all those questions.
SQL Server will check for all kinds of objects created or declared within the scope anyway. It will run a routine to drop or deallocate scope-dependent objects whether you drop them yourself or not.
So it does not make any sense to drop something before exiting from the procedure if it will be automatically dropped on exit from the procedure anyway.
It's called wasting time.
_____________
Code for TallyGenerator
November 18, 2006 at 9:19 pm
Sorry to disappoint you, but I am knowledgeable about the subject, although I may not be an expert. It is better to explicitly destroy that which is explicitly created. It is coding such as yours in application systems that results in things like memory leaks and buffer overruns.
What I learned as a developer of application systems does carry over into database systems as well. Take this example (although contrived): a stored procedure creates a 10 MB temporary table and runs in under 1 second. The table is not destroyed before the procedure exits, and it takes 3 seconds before SQL Server actually destroys it and releases the disk space in tempdb. If this procedure runs once every second, it will use 30 MB of disk space. Now have 10 users each running that procedure and you are using 300 MB of disk space. If the table had been dropped before the procedure completed, in this scenario those figures would drop to 10 MB and 100 MB. Increase the number of users to 100 and see what the difference would be.
Just because the stored procedure that created the temporary table has finished does not mean the table itself is automatically dropped; it has simply gone out of scope and is no longer visible to other processes. You are leaving it up to SQL Server to take care of cleaning up your mess, when you should be doing it.
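One rough way to test whether the space really lingers (the procedure name below is invented; substitute the one that builds the temp table) is to compare tempdb space around the call:

EXEC tempdb..sp_spaceused        -- note "unallocated space" before the call

EXEC dbo.ContrivedTempTableProc  -- hypothetical procedure from the scenario above

EXEC tempdb..sp_spaceused        -- compare the figures after the call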
It is not wasting time; pointless arguments where neither of us is going to be swayed to the other's side are a waste of time.
November 18, 2006 at 10:18 pm
To make sure you are really knowledgeable about the subject, you had better run some tests.
Because the text above says you are not.
CREATE PROC dbo.ScopeTest
AS
SELECT TOP 1000000
       IDENTITY(int, 1, 1) AS ID,
       O.Name AS ObjectName,
       C.Name AS ColumnName
INTO #ScopeTest
FROM sysobjects O, syscolumns C, syscolumns C1

-- still inside the procedure: the temp table is listed in tempdb
SELECT * FROM tempdb..sysobjects WHERE Name LIKE '#ScopeTest%'
GO

-- before the call: no such table
SELECT * FROM tempdb..sysobjects WHERE Name LIKE '#ScopeTest%'

-- during the call, the SELECT inside the procedure returns a row for #ScopeTest
EXEC dbo.ScopeTest

-- after the call: the table is already gone
SELECT * FROM tempdb..sysobjects WHERE Name LIKE '#ScopeTest%'

DROP PROC dbo.ScopeTest
The SP won't finish until all objects in its scope are dropped.
_____________
Code for TallyGenerator
November 18, 2006 at 11:21 pm
Like I said, arguing is pointless. Neither of us is going to be swayed to the other side. You can continue to write your code your way and I will continue to write code my way.
We can agree to disagree.
November 18, 2006 at 11:40 pm
We can agree that your statement about "best practices" is wrong.
I proved that all your assumptions and statements in favour of dropping #tables were wrong.
You cannot find any reason why dropping temporary tables before exiting the SP would be useful.
It just adds a tiny overhead to compile the additional statement(s) and perform the same operation the server does automatically on exit from the SP.
Yes, this overhead is really tiny, so I wouldn't tell anybody "don't do it".
But what I can tell is "there is no point in doing it".
That's what I stated on the first page of this topic. You started arguing with it. Would you still?
_____________
Code for TallyGenerator
November 19, 2006 at 8:19 am
So we all agree to disagree. Let's close this one before it gets out of hand.