May 11, 2006 at 6:50 am
Hi Jeff,
Great article that truly highlights and explains the issues.
In relation to your observation about using Temp table to store interim results, to which you then apply set-based logic, I have a problem that I have not been able to resolve.
In my application I have the ability to specify n user-defined queries, which are entirely defined in the database in a table QueryTable as below:
Column1: Query ID
Column2: Query Description
I'm using the dataset of all the records in the QueryTable to populate a datagrid in an ASP.NET page, with hyperlinks on each query that then go off to another page that returns the dynamic results associated to each query.
QUESTION: How do I return all the records in the QueryTable, together with the number of rows associated with each query?
I can put the rows into a Temp table, but I am then not sure how to calculate the returned rows for each query without using an RBAR approach on the Temp table.
Any suggestions would be greatly appreciated
--Mauro Ciaccio
May 11, 2006 at 7:46 pm
Sorry, Mirko... I missed your post from way back in February...
You wrote, "I don't need statistics on temp tables, because I usually know enough about their structure and data that will be populated to force optimal execution plan." All I can say is: do a measurable performance test using temp tables and table variables... force any execution plan you want (or not)... under otherwise identical conditions, the temp table will win every time simply because it uses statistics. It's just the nature of the beast...
You also wrote, "Temp tables cause sp recompilations, so in high traffic environment where sp may be executed N x 1000s times in an hour, sp recompilations significantly affect CPU usage." In the world of GUIs, that might be true because they are, more or less, RBAR in nature. However, you may still want to do a performance test, because even in the face of a recompile, the temp table may still beat the table variable. Also, if you correctly define the temp table at the beginning of the proc, you may not get any recompiles at all. Books Online says that if you mix DDL within DML, you will probably get a recompile. For batches, the maximum number of recompiles you should get (by not mixing DDL within DML) is 1. You may get a recompile no matter what you do if the data in a permanent table has changed enough... and it doesn't take much.
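Here's the shape of what I mean (just a sketch; the proc and source table names are made up):

CREATE PROCEDURE dbo.NoRecompileExample
AS
SET NOCOUNT ON

-- All of the temp table DDL comes first, completely defined up front
CREATE TABLE #Work
    (
    RowID   INT IDENTITY(1,1) PRIMARY KEY,
    SomeKey INT NOT NULL
    )

-- Only DML from here on... no DDL mixed in, so at most one recompile
INSERT INTO #Work (SomeKey)
SELECT SomeKey
FROM dbo.SomeSourceTable -- made-up permanent table

SELECT SomeKey, COUNT(*) AS KeyCount
FROM #Work
GROUP BY SomeKey
GO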
Anyway, I appreciate your position, but try some performance testing... you may change your mind. If the performance testing you do proves me wrong, I sure would like a copy of the code you did the testing with.
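If it helps, this is the kind of side-by-side test I'm talking about (rough sketch only; it assumes a dbo.Tally table of sequential numbers, and the sizes are placeholders):

SET NOCOUNT ON

-- Identical data in a temp table and a table variable
CREATE TABLE #Temp (N INT PRIMARY KEY)
DECLARE @Var TABLE (N INT PRIMARY KEY)

INSERT INTO #Temp (N) SELECT N FROM dbo.Tally WHERE N <= 100000
INSERT INTO @Var  (N) SELECT N FROM dbo.Tally WHERE N <= 100000

DECLARE @Start DATETIME

-- Temp table: the optimizer has statistics to work with
SET @Start = GETDATE()
SELECT COUNT(*) FROM #Temp t1 INNER JOIN #Temp t2 ON t1.N = t2.N
PRINT 'Temp table    : ' + STR(DATEDIFF(ms, @Start, GETDATE())) + ' ms'

-- Table variable: no statistics, so the optimizer guesses
SET @Start = GETDATE()
SELECT COUNT(*) FROM @Var t1 INNER JOIN @Var t2 ON t1.N = t2.N
PRINT 'Table variable: ' + STR(DATEDIFF(ms, @Start, GETDATE())) + ' ms'

DROP TABLE #Temp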
--Jeff Moden
Change is inevitable... Change for the better is not.
May 11, 2006 at 7:47 pm
Mauro, I'm looking at your post right now...
--Jeff Moden
Change is inevitable... Change for the better is not.
May 11, 2006 at 8:47 pm
Mauro,
Just trying to clarify... are you saying that each row of the QueryDescription column contains the code for a complete query and that you want to execute all of them AND return a rowcount for each query executed?
--Jeff Moden
Change is inevitable... Change for the better is not.
May 12, 2006 at 2:45 am
Jeff,
Yes, I did not really give a full explanation. Maybe the best thing is if I attach the actual tables.
In terms of UI I will have:
a) Summary page listing all the query options, with the number of records associated with each query
b) Detail pages with the result of each query.
Now what I have done allows me to easily provide the data required for (b). The problem I am having is how to provide the data for (a) without using an RBAR approach.
I can easily retrieve the query description, ID, etc. to display in (a). The issue is how to retrieve the count of records associated with each query.
Is that clearer? I hope so
CREATE TABLE [dbo].[BCTodoListCategory] (
    [BCTodoListCategoryPK] [int] IDENTITY (1, 1) NOT NULL ,
    [TodoListCategory] [varchar] (30) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL ,
    [TodoListCategoryDesc] [varchar] (200) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL ,
    [CommonSQL] [nvarchar] (2000) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL ,
    [CountSQL] [nvarchar] (2000) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL ,
    CONSTRAINT [PK_BCTodoListCategory] PRIMARY KEY CLUSTERED
    (
        [BCTodoListCategoryPK]
    ) ON [PRIMARY]
) ON [PRIMARY]
GO
CREATE TABLE [dbo].[BCTodoList] (
    [BCTodoListPK] [int] IDENTITY (1, 1) NOT NULL ,
    [BCTodoListCategoryFK] [int] NOT NULL ,
    [TodoListTitle] [varchar] (50) NOT NULL ,
    [TodoListDesc] [varchar] (200) NOT NULL ,
    [SP_Name] [varchar] (50) NOT NULL ,
    [WhereSQL] [nvarchar] (1000) NOT NULL ,
    CONSTRAINT [PK_BCTodoList] PRIMARY KEY CLUSTERED
    (
        [BCTodoListPK]
    ) ON [PRIMARY] ,
    CONSTRAINT [FK_BCTodoList_BCTodoListCategory] FOREIGN KEY
    (
        [BCTodoListCategoryFK]
    ) REFERENCES [dbo].[BCTodoListCategory] (
        [BCTodoListCategoryPK]
    )
) ON [PRIMARY]
GO
And to clarify how I build the dynamic SQL:
-- @BCTodoListPK is a parameter of the surrounding stored procedure
DECLARE @sql nvarchar(2000)
DECLARE @sqlwhere nvarchar(1000)

-- Pick up the common SELECT for this query's category...
SELECT @sql = B.CommonSQL
FROM BCTodoListCategory B
INNER JOIN BCTodoList A ON A.BCTodoListCategoryFK = B.BCTodoListCategoryPK
WHERE A.BCTodoListPK = @BCTodoListPK

-- ...and the query-specific WHERE clause
SELECT @sqlwhere = A.WhereSQL
FROM BCTodoList A
WHERE A.BCTodoListPK = @BCTodoListPK

--Add WHERE clause
SET @sql = @sql + ' ' + @sqlwhere
--print @sql

-- Capture the dynamic results in a temp table, then return them
CREATE TABLE #TodoByAssmntApp
(
/* Columns specified for each query */
)

INSERT INTO #TodoByAssmntApp
EXEC sp_executesql @sql

SELECT * FROM #TodoByAssmntApp

DROP TABLE #TodoByAssmntApp
May 12, 2006 at 2:47 am
P.S. Forgive the winks in the SQL code...too quick on the mouse clicks
May 12, 2006 at 7:20 am
"QUESTION: How do I return all the records in the QueryTable, together with the number of rows associated with each query?"
Mauro,
Not without RBAR.
Wait for Jeff to prove me wrong.
If you have to do it in a procedure, I would use a CURSOR (rough sketch below).
My preferred solution, though, would be to return the description and SQL to ASP, execute each SQL statement, and build the dataset manually for the datagrid.
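The cursor version would be shaped something like this (untested; it assumes each CountSQL from your DDL holds a complete count query):

DECLARE @pk INT, @sql NVARCHAR(2100), @cnt INT
DECLARE QueryCursor CURSOR LOCAL FAST_FORWARD FOR
    SELECT BCTodoListCategoryPK, N'SELECT @cnt = (' + CountSQL + N')'
    FROM dbo.BCTodoListCategory

OPEN QueryCursor
FETCH NEXT FROM QueryCursor INTO @pk, @sql
WHILE @@FETCH_STATUS = 0
    BEGIN
    -- Run each stored count query, capturing the result via an OUTPUT parameter
    EXEC sp_executesql @sql, N'@cnt INT OUTPUT', @cnt = @cnt OUTPUT
    PRINT CONVERT(VARCHAR(10), @pk) + ': ' + CONVERT(VARCHAR(10), @cnt)
    FETCH NEXT FROM QueryCursor INTO @pk, @sql
    END
CLOSE QueryCursor
DEALLOCATE QueryCursor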
Far away is close at hand in the images of elsewhere.
Anon.
May 12, 2006 at 7:28 am
David,
Personally, I was thinking of using a temp table and then iterating through each row to update the row count. At the end I could do a single SELECT on the temp table, and I would not need to do any ASP.NET processing (e.g. building a dataset). I would've thought that would be the quickest way, and conceptually it keeps all the logic with the DB functionality.
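Roughly what I had in mind (just a sketch; it assumes each CountSQL holds a complete count query):

CREATE TABLE #QueryCounts
    (
    RowNum    INT IDENTITY(1,1) PRIMARY KEY,
    QueryPK   INT NOT NULL,
    CountSQL  NVARCHAR(2000) NOT NULL,
    RowsFound INT NULL
    )

INSERT INTO #QueryCounts (QueryPK, CountSQL)
SELECT BCTodoListCategoryPK, CountSQL
FROM dbo.BCTodoListCategory

DECLARE @Row INT, @MaxRow INT, @sql NVARCHAR(2100), @cnt INT
SELECT @Row = 1, @MaxRow = MAX(RowNum) FROM #QueryCounts

-- Still RBAR: one execution per stored query
WHILE @Row <= @MaxRow
    BEGIN
    SELECT @sql = N'SELECT @cnt = (' + CountSQL + N')'
    FROM #QueryCounts
    WHERE RowNum = @Row

    EXEC sp_executesql @sql, N'@cnt INT OUTPUT', @cnt = @cnt OUTPUT

    UPDATE #QueryCounts SET RowsFound = @cnt WHERE RowNum = @Row
    SET @Row = @Row + 1
    END

-- One set-based read at the end feeds the summary page
SELECT QueryPK, RowsFound FROM #QueryCounts

DROP TABLE #QueryCounts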
Mauro
May 12, 2006 at 7:35 am
Yep, my first thought as well... still RBAR though.
Not sure it would be any faster than a CURSOR, though.
Far away is close at hand in the images of elsewhere.
Anon.
May 12, 2006 at 5:32 pm
Sorry I'm late...
I guess you should call it RBAR since you absolutely need to read a row at a time... I'd be tempted to call it a control loop because it processes a record set for each read. Guess I "failed as a developer", eh?
Because you're reading a relatively low volume of "control" records (code to be executed actually), I've gotta agree with Dave and say that I'm not sure that a WHILE/Temp table solution would be any faster than a cursor. If only one process at a time will be doing this, you may not even run into contention problems with the cursor. Still, because I'm also a creature of some habit, I prefer to avoid the cursor but don't see the harm here.
There is one non-RBAR solution but it would require a run through OSQL (does that redeem me?)... if it were a batch file, I'd use OSQL to write all of the records to a file (a single SELECT from the table) and then use that file as input to another OSQL command, thereby avoiding RBAR. Probably not an appropriate thing to do in a GUI, but dunno for sure because I'm a data troll, not a "GUI-guy".
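For the record, the batch file would be shaped something like this (server, database, and file names are made up):

REM Step 1: dump the stored queries to a script file (single SELECT, no RBAR)
OSQL -S MyServer -d MyDatabase -E -h-1 -n -Q "SET NOCOUNT ON SELECT CountSQL FROM dbo.BCTodoListCategory" -o AllCounts.sql

REM Step 2: feed that file back in as a script of its own
OSQL -S MyServer -d MyDatabase -E -h-1 -n -i AllCounts.sql -o CountResults.txt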
I'm going to go scrape the egg off my face now...
--Jeff Moden
Change is inevitable... Change for the better is not.
May 12, 2006 at 8:52 pm
To Mauro:
All your ToDo's must be done in UDFs. If you cannot predict their contents, you may set a trigger on the table to recompile the UDF related to the updated or inserted row.
After that, your query is going to look like this:
SELECT ..., CASE UserQuery WHEN 'Q1' THEN dbo.Q1(...) WHEN 'Q2' THEN dbo.Q2(...) ... END
....
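Spelled out a little (the QueryTable column names and the Q1/Q2 scalar UDFs here are hypothetical; each UDF returns the row count for its own hard-coded query):

SELECT  q.QueryID,
        q.QueryDescription,
        CASE q.UserQuery
            WHEN 'Q1' THEN dbo.Q1()
            WHEN 'Q2' THEN dbo.Q2()
        END AS RowsFound
FROM dbo.QueryTable q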
_____________
Code for TallyGenerator
May 15, 2006 at 2:18 am
"I'd be tempted to call it a control loop..."
Well it is
Maybe we should change it to RBARCL
"Because you're reading a relatively low volume of 'control' records"
You hit the nail on the head there, Jeff: 'low volume'.
"There is one non-RBAR solution but it would require a run through OSQL"
Yeah, but it might be a bit messy and it introduces another POF (point of failure).
"does that redeem me?"
Do you need redeeming? I was thinking more of revered.
"I'm going to go scrape the egg off my face now"
And I thought it was just a large grin
Far away is close at hand in the images of elsewhere.
Anon.
May 15, 2006 at 3:03 am
All,
Thanks very much for all the advice, and for supporting me in the trauma of having to use the "c" thing.
Sergiy, I did consider your approach too, but in our case the user can define new queries, so the UDF would need to process dynamic SQL, which it cannot.
Finally, so data trolls still use OSQL... great to know there is still a hard-core out there.
Mauro
May 15, 2006 at 9:06 pm
"Finally, so data trolls still use OSQL... great to know there is still a hard-core out there."
Yep... I sometimes even revert to the occasional batch file or VBS. I love brute force but what would you expect from a data troll?
"Maybe we should change it to RBARCL"
Rhymes with "debacle", doesn't it?
"Yeah, but it might be a bit messy and it introduces another POF"
Actually, I've had some good success with things like that (again, what do you expect from a data troll?). Our scheduling system takes both precedents and dependencies, and 99% of our nightly runs are accomplished through batch files that call OSQL to either run a query stored as a script in a local file or to simply run a stored procedure. Yeah, we could have used SQL's job scheduler, but with less control over the precedents and dependencies... and the DBAs really didn't want to give anyone the necessary permissions to maintain SQL's job scheduler. It is a "locked down" system... each person responsible for a job and the "scheduler DBA" are the only people granted access to the job folder. Works pretty well, but it requires that hard-core OSQL and batch file knowledge that Mauro was talking about.
"Do you need redeeming? I was thinking more of revered"
You, Sir, are way too kind. You made my week... thanks, David.
--Jeff Moden
Change is inevitable... Change for the better is not.