December 9, 2014 at 8:48 am
Trying to go back to the original problem, I think the SP that is supposed to fill your temp table has some sort of problem. Either the data is not there or the code is wrong. I made a test and you can certainly use temp tables across nested procedures.
CREATE PROCEDURE dbo.TempTableCreation
AS
    -- Temp table created here is visible to any procedure this one calls
    CREATE TABLE #OneTable(
        IntCol int,
        StringCol varchar(50)
    );

    EXECUTE dbo.TempTableLoad;

    SELECT *
    FROM #OneTable;

    DROP TABLE #OneTable;
GO

CREATE PROCEDURE dbo.TempTableLoad
AS
    -- Inserts into the temp table created by the calling procedure
    INSERT INTO #OneTable
    VALUES (1, 'One'), (2, 'Two');
GO

EXECUTE dbo.TempTableCreation;
GO

-- The same works from an ad hoc batch:
CREATE TABLE #OneTable(
    IntCol int,
    StringCol varchar(50)
);

EXECUTE dbo.TempTableLoad;

SELECT *
FROM #OneTable;
December 9, 2014 at 9:32 am
Sean Lange (12/9/2014)
Here is where I am going to disagree with Gail and Grant. I have worked on a system that had many procs nested like this and it was a serious PITA to maintain. What you end up with is a bunch of stored procedures that can't work on their own because they require temp tables created in previous procedures. You end up with a tightly coupled group of procedures. It seems clever during development but in the long run it is painful. Debugging procedures like that (especially when you were not the author) is like the spaghetti code of old. I would recommend rethinking the process and roll the logic into a single stored procedure to save yourself hours and hours of regret down the road.
Yes, I'm starting to see that. And part of it is limitations of SQL Server. It's a pain to debug right now, too. But I'm probably falling back on my old experience with SQL Server, and I mean old, where there was no way to step through code (did I say OLD experience?). And I'm getting back into coding T-SQL again after a long hiatus.
And back then, trying to debug a long comprehensive proc, particularly when making changes, was a huge chore.
What are the current tools for stepping through code? I don't believe I have the Developer edition of SQL Server.
December 9, 2014 at 9:39 am
DSNOSPAM (12/9/2014)
Sean Lange (12/9/2014)
Here is where I am going to disagree with Gail and Grant. I have worked on a system that had many procs nested like this and it was a serious PITA to maintain. What you end up with is a bunch of stored procedures that can't work on their own because they require temp tables created in previous procedures. You end up with a tightly coupled group of procedures. It seems clever during development but in the long run it is painful. Debugging procedures like that (especially when you were not the author) is like the spaghetti code of old. I would recommend rethinking the process and roll the logic into a single stored procedure to save yourself hours and hours of regret down the road.
Yes, I'm starting to see that. And part of it is limitations of SQL Server. It's a pain right now to debug it too. But I'm probably falling back on my old experience with SQL Server, and I mean old, where there was no way to step through code (did I say OLD experience). And I'm getting back into coding T-SQL again after a long hiatus.
And back then, trying to debug a long comprehensive proc, particularly when making changes, was a huge chore.
What are the current tools for stepping through code? I don't believe I have a developers version of SQL Server.
I would suggest that if your procedure is so complicated that you need to step through it, you can probably gain a lot by simplifying your code. Oftentimes what I have seen in those cases is that the procedure is trying to do too much. What I really mean here is that the business rules should not be part of the stored procedures. This is all part of the separation of business layers. That isn't always possible, of course.
_______________________________________________________________
Need help? Help us help you.
Read the article at http://www.sqlservercentral.com/articles/Best+Practices/61537/ for best practices on asking questions.
Need to split a string? Try Jeff Moden's splitter http://www.sqlservercentral.com/articles/Tally+Table/72993/.
Cross Tabs and Pivots, Part 1 – Converting Rows to Columns - http://www.sqlservercentral.com/articles/T-SQL/63681/
Cross Tabs and Pivots, Part 2 - Dynamic Cross Tabs - http://www.sqlservercentral.com/articles/Crosstab/65048/
Understanding and Using APPLY (Part 1) - http://www.sqlservercentral.com/articles/APPLY/69953/
Understanding and Using APPLY (Part 2) - http://www.sqlservercentral.com/articles/APPLY/69954/
December 9, 2014 at 10:10 am
I would suggest that if your procedure is so complicated that you need to step through it, you can probably gain a lot by simplifying your code. Oftentimes what I have seen in those cases is that the procedure is trying to do too much. What I really mean here is that the business rules should not be part of the stored procedures. This is all part of the separation of business layers. That isn't always possible, of course.
+1
Don Simpson
December 9, 2014 at 11:08 am
Not complicated. Just tedious to debug when changes need to be made.
I guess it also goes against all my reflexes for other programming languages, which is to break a problem down into logical components.
SQL Server seems to throw other restrictions in the way.
I put it all into one proc. Seems to be working. Haven't had a chance to look at the code from the other fellow who showed that they do persist across procs.
December 9, 2014 at 2:15 pm
I would suggest that if your procedure is so complicated that you need to step through it, you can probably gain a lot by simplifying your code. Oftentimes what I have seen in those cases is that the procedure is trying to do too much. What I really mean here is that the business rules should not be part of the stored procedures. This is all part of the separation of business layers. That isn't always possible, of course.
BTW, this is for analysis purposes of underlying data. There is no multi-tier environment in place for this, i.e. nowhere else for there to be business logic. People will run the stored proc by hand to generate an outcome table to then use elsewhere for further analysis.
So, the business rules have to be part of the stored procs, and I seem to be caught in the dilemma of "don't use multiple stored procs" and "make simple stored procs". These two ideas seem in conflict.
December 10, 2014 at 3:55 am
Everything is a balance. Just make sure you're breaking down the procs appropriately, not just to arrive at arbitrary small queries. But some degree of nesting, depending on how and where you do it, is fine. The main thing is making the code work well. Have you examined your execution plans?
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
- Theodore Roosevelt
Author of:
SQL Server Execution Plans
SQL Server Query Performance Tuning
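As a concrete starting point for examining plans, the actual execution plan can be captured directly from T-SQL (the procedure name below is just a placeholder for whatever you are investigating):

```sql
-- Return the actual execution plan (as XML) alongside the results.
SET STATISTICS XML ON;

EXECUTE dbo.TempTableCreation;  -- the procedure under investigation

SET STATISTICS XML OFF;
```

In SSMS, "Include Actual Execution Plan" (Ctrl+M) gives you the same information graphically.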
December 10, 2014 at 3:57 am
DSNOSPAM (12/9/2014)
I guess it also goes against all my reflexes for other programming languages, which is to break a problem down into logical components.
SS seems to throw other constrictions in the way.
YES!
The logical code reuse that it seems like we should get within SQL Server, as suggested by the objects available, just doesn't really work well. The nature of the language naturally leads to repetition.
December 10, 2014 at 9:34 am
If you are just coming back to SQL from a long break (i.e. pre 2005) then you might want to look at Common Table Expressions as an alternative.
From my experience, a lot of 'old school' temp table use came about because either the underlying server infrastructure was not large enough to handle a lot of in-memory processing, or the developer struggled to get their head around nested correlated and sub-queries and so resorted to storing interim steps of the data manipulation process in temp tables.
CTEs give you a way to separate the correlated queries into separate components and test these individually to confirm that each unit gives you the results you expect; and because the query optimizer sees the whole plan as a single unit of work, it normally still does a pretty good job of executing the plan.
Local temp, global temp and in-memory tables all have their place, but there are often better modern alternatives to the old hoops we had to jump through.
If you want some guidance on how you might approach the code in a single procedure, post some sample code here and we will help you refactor it.
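To make that concrete, here is a hypothetical sketch of replacing interim temp tables with chained CTEs. The table and column names (dbo.Sales and so on) are invented purely for illustration:

```sql
-- Sketch: each CTE replaces what might otherwise be an interim temp table.
-- Names (dbo.Sales, CustomerID, OrderDate, Amount) are invented for illustration.
WITH MonthlyTotals AS (
    -- Step 1: aggregate raw rows (previously dumped into a temp table)
    SELECT CustomerID,
           DATEFROMPARTS(YEAR(OrderDate), MONTH(OrderDate), 1) AS OrderMonth,
           SUM(Amount) AS MonthTotal
    FROM dbo.Sales
    GROUP BY CustomerID,
             DATEFROMPARTS(YEAR(OrderDate), MONTH(OrderDate), 1)
),
RankedMonths AS (
    -- Step 2: rank each customer's months (a second interim step)
    SELECT CustomerID, OrderMonth, MonthTotal,
           ROW_NUMBER() OVER (PARTITION BY CustomerID
                              ORDER BY MonthTotal DESC) AS rn
    FROM MonthlyTotals
)
-- Final step: best month per customer, all in one optimizable statement
SELECT CustomerID, OrderMonth, MonthTotal
FROM RankedMonths
WHERE rn = 1;
```

During development you can test each step in isolation by selecting directly from the CTE you are working on before adding the next one.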
December 10, 2014 at 11:03 am
sqldriver (12/9/2014)
Grant Fritchey (12/9/2014)
doesn't everything in sql server? 😉
Still trying to figure out what the use-case for multi-statement table-valued user-defined functions is. But, other than that, yes!
Their one use is for people to demonstrate why they're bad.
Let's not confuse the *need* or usefulness with the quality of the execution.
I'd think that there is a tremendous need to have a way to save a set of parameterized statements whose results can be invoked within SELECT statements. A "saved parameterized multi-statement query" would be a damn nice feature especially if it can be used inline. The piss-poor performance issue makes a FUNCTION unattractive for that purpose, but that doesn't negate the usefulness.
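For what it's worth, when the logic fits in a single statement, an inline table-valued function already gives most of that without the multi-statement performance penalty. A sketch, with invented object names:

```sql
-- Sketch: an inline TVF is a saved, parameterized query usable in a SELECT.
-- Names (dbo.OrdersSince, dbo.Orders, dbo.Customers) are invented for illustration.
CREATE FUNCTION dbo.OrdersSince (@Cutoff date)
RETURNS TABLE
AS
RETURN
    SELECT OrderID, CustomerID, OrderDate, Amount
    FROM dbo.Orders
    WHERE OrderDate >= @Cutoff;
GO

-- Usable inline; the optimizer expands it into the outer query like a view:
SELECT c.CustomerName, o.Amount
FROM dbo.Customers AS c
JOIN dbo.OrdersSince('20140101') AS o
    ON o.CustomerID = c.CustomerID;
```

The gap Matt describes is precisely the multi-statement case, which this single-statement form cannot cover.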
----------------------------------------------------------------------------------
Your lack of planning does not constitute an emergency on my part...unless you're my manager...or a director and above...or a really loud-spoken end-user..All right - what was my emergency again?
December 10, 2014 at 11:43 am
Matt Miller (#4) (12/10/2014)
sqldriver (12/9/2014)
Grant Fritchey (12/9/2014)
doesn't everything in sql server? 😉
Still trying to figure out what the use-case for multi-statement table-valued user-defined functions is. But, other than that, yes!
Their one use is for people to demonstrate why they're bad.
Let's not confuse the *need* or usefulness with the quality of the execution.
I'd think that there is a tremendous need to have a way to save a set of parameterized statements whose results can be invoked within SELECT statements. A "saved parameterized multi-statement query" would be a damn nice feature especially if it can be used inline. The piss-poor performance issue makes a FUNCTION unattractive for that purpose, but that doesn't negate the usefulness.
+1
Don Simpson
December 10, 2014 at 1:07 pm
Matt Miller (#4) (12/10/2014)
sqldriver (12/9/2014)
Grant Fritchey (12/9/2014)
doesn't everything in sql server? 😉
Still trying to figure out what the use-case for multi-statement table-valued user-defined functions is. But, other than that, yes!
Their one use is for people to demonstrate why they're bad.
Let's not confuse the *need* or usefulness with the quality of the execution.
I'd think that there is a tremendous need to have a way to save a set of parameterized statements whose results can be invoked within SELECT statements. A "saved parameterized multi-statement query" would be a damn nice feature especially if it can be used inline. The piss-poor performance issue makes a FUNCTION unattractive for that purpose, but that doesn't negate the usefulness.
Functions allow you to document them; parameterized queries do not. That's one reason I've tried to move toward functions, but they have problems too. I know you did not specify "parameterized QUERY", but it jogged that part of my brain.
December 10, 2014 at 1:09 pm
aaron.reese (12/10/2014)
If you are just coming back to SQL from a long break (i.e. pre 2005) then you might want to look at Common Table Expressions as an alternative.From my experience a lot of 'old school' temp tables were because either the underlying server infrastructure was not large enough to handle a lot of in-memory processing or the developer struggled to get their head around nested co-related ans sub queries and so resorted to storing interim steps in the data manipulation process in temp tables.
CTEs give you a way to separate the co-related queries into a separate components and test these individually to confirm that the unit test gives you the results you expect, but because the query optimizer sees the whole plan as a single unit or work, normally still does a pretty good job of executing the plan.
local temp, global temp and in-memory tables all have their place, but there are often better modern alternatives to the old hoops we had to jump through.
If you want some guidance on how you might approach the code in a single procedure, post some sample code here and we will help you refactor it.
I tried to make CTEs work for something a while back, and the problem at hand did not lend itself to a CTE. I'll have to revisit.
Thanks.
December 10, 2014 at 1:10 pm
thanks, everyone, for your continued responses, and broadening input.
December 10, 2014 at 1:15 pm
Grant Fritchey (12/10/2014)
DSNOSPAM (12/9/2014)
I guess it also goes against all my reflexes for other programming languages, which is to break a problem down into logical components.
SS seems to throw other constrictions in the way.
YES!
The logical code reuse that it seems like we should get within SQL Server, as suggested by the objects available, just doesn't really work well. The nature of the language naturally leads to repetition.
Thanks for validating that.