July 4, 2007 at 7:03 am
Hi,
For our CRM program we use a query to retrieve the results of the searches our users make. We limit the results we return to 50 at a time. Nevertheless, we want to know how many results there are in total, so we can set the number of result pages and things like that.
The query looks as follows:
exec sp_executesql N' SELECT [__SUB_QUERY__].[ACCOUNT_NAME],
[__SUB_QUERY__].[DUNS_ULTIMATE_NUMBER], [__SUB_QUERY__].[DUNS_NAME_OVERRIDE], [__SUB_QUERY__].[CHANNEL_ID], [__SUB_QUERY__].[AREA_NAME],
[__SUB_QUERY__].[COUNTRY], [__SUB_QUERY__].[GLOBAL_MARKET_SEGMENT_NAME], [__SUB_QUERY__].[GLOBAL_ACCOUNT_COORDINATOR], [__SUB_QUERY__].[GLOBAL_CLIENT_SERVICE_PARTNER],
[__SUB_QUERY__].[PRIORITY_STATUS], [__SUB_QUERY__].[PRIORITY], [__SUB_QUERY__].[CREATE_DATE], [__SUB_QUERY__].[UPDATE_DATE], [__SUB_QUERY__].[ACCOUNT_ID], [__SUB_QUERY__].[CHANNEL_NAME],
[__SUB_QUERY__].[AREA_CODE], [__SUB_QUERY__].[ACCOUNT_ROLE_GAC], [__SUB_QUERY__].[COUNTRY_3CHAR], [__SUB_QUERY__].[GLOBAL_MARKET_SEGMENT_CODE]
FROM
( SELECT
[Account].[ACCOUNT_NAME],
[Account].[DUNS_ULTIMATE_NUMBER], [Account].[DUNS_NAME_OVERRIDE], [Account].[CHANNEL_ID], [Area].[AREA_DESCR] [AREA_NAME], [COUNTRY].[COUNTRY_DESCR] [COUNTRY],
[GLOBAL_MARKET_SEGMENT].[GLOBAL_MARKET_SEGMENT_NAME], [Account].[ACCOUNT_ROLE_GAC] [GLOBAL_ACCOUNT_COORDINATOR], [Account].[ACCOUNT_ROLE_GCSP] [GLOBAL_CLIENT_SERVICE_PARTNER],
[Account].[PRIORITY_STATUS], [REFERENCE].[NAME_ENGLISH] [PRIORITY], [Account].[CREATE_DATE], [Account].[UPDATE_DATE], [Account].[ACCOUNT_ID], [Account].[CHANNEL_NAME], [Account].[AREA_CODE],
[Account].[ACCOUNT_ROLE_GAC], [Account].[COUNTRY_3CHAR], [Account].[GLOBAL_MARKET_SEGMENT_CODE], ROW_NUMBER() OVER(ORDER BY ([Account].[ACCOUNT_NAME]) ASC) [__ROW_NUMBER__]
FROM [Account]
LEFT JOIN [Area] ON ( [Account].[AREA_CODE] = [Area].[AREA_CODE] )
LEFT JOIN [COUNTRY] ON ( [Account].[COUNTRY_3CHAR] = [COUNTRY].[COUNTRY_3CHAR] )
LEFT JOIN [GLOBAL_MARKET_SEGMENT] ON ([Account].[GLOBAL_MARKET_SEGMENT_CODE] = [GLOBAL_MARKET_SEGMENT].[GLOBAL_MARKET_SEGMENT_CODE] )
LEFT JOIN [REFERENCE] ON ( [Account].[PRIORITY_STATUS] = [REFERENCE].[REFERENCE_ID] )
INNER JOIN [ACCOUNT_FILTER] AS [AF] ON ( [Account].[ACCOUNT_ID] = [AF].[ACCOUNT_ID] )
INNER JOIN [CLIENT] ON ( [ACCOUNT].[ACCOUNT_ID] = [CLIENT].[ACCOUNT_ID] )
WHERE ( ( [CLIENT].[CLIENT_NAME] LIKE @CLIENT_NAME ) )
AND ( [Account].[ACCOUNT_NAME] LIKE @ACCOUNT_NAME )
GROUP BY [Account].[ACCOUNT_NAME], [Account].[DUNS_ULTIMATE_NUMBER], [Account].[DUNS_NAME_OVERRIDE], [Account].[CHANNEL_ID],
[Area].[AREA_DESCR], [COUNTRY].[COUNTRY_DESCR], [GLOBAL_MARKET_SEGMENT].[GLOBAL_MARKET_SEGMENT_NAME], [Account].[ACCOUNT_ROLE_GAC], [Account].[ACCOUNT_ROLE_GCSP], [Account].[PRIORITY_STATUS],
[REFERENCE].[NAME_ENGLISH], [Account].[CREATE_DATE], [Account].[UPDATE_DATE], [Account].[ACCOUNT_ID], [Account].[CHANNEL_NAME], [Account].[AREA_CODE], [Account].[ACCOUNT_ROLE_GAC],
[Account].[COUNTRY_3CHAR], [Account].[GLOBAL_MARKET_SEGMENT_CODE] )
AS [__SUB_QUERY__]
WHERE [__SUB_QUERY__].[__ROW_NUMBER__] BETWEEN @__ROW_NUMBER__ AND @__ROW_NUMBER___1
ORDER BY [__SUB_QUERY__].[ACCOUNT_NAME] ASC '
,N'@CLIENT_NAME nvarchar(6),@ACCOUNT_NAME nvarchar(1),@__ROW_NUMBER__ bigint,@__ROW_NUMBER___1
bigint',@CLIENT_NAME=N'%John%',@ACCOUNT_NAME=N'%',@__ROW_NUMBER__=1,@__ROW_NUMBER___1=50
We tried to use @@ROWCOUNT, but then the result is always 50, which is indeed the number of rows for the outer select.
Is there a way to find the total number of results while putting as little load as possible on the database server?
thx.
July 4, 2007 at 7:42 am
@@ROWCOUNT always corresponds to the status of the last executed query. You will need to handle this using dynamic SQL; no other way comes to mind.
Cheers,
Sugeshkumar Rajendran
SQL Server MVP
http://sugeshkr.blogspot.com
July 5, 2007 at 1:05 am
You might be able to use ROW_NUMBER().
In your subquery, include another field ROW_NUMBER() OVER (ORDER BY ....) AS RowNo and give it any of the fields. Then if you can get the max of that, you'll have the number of rows in the subquery. Not a particularly nice solution, but it might work for you.
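A minimal sketch of that idea, with illustrative table and column names rather than the ones from the posted query (one way to "get the max" is a windowed MAX over the numbered rows, available in SQL Server 2005 and later):

```sql
-- Sketch only: number every row in the subquery, then read the highest
-- number back with a windowed MAX in the outer select.
SELECT sub.ACCOUNT_NAME,
       sub.RowNo,
       MAX(sub.RowNo) OVER () AS TotalRows  -- same value repeated on every row
FROM (
    SELECT ACCOUNT_NAME,
           ROW_NUMBER() OVER (ORDER BY ACCOUNT_NAME) AS RowNo
    FROM Account
) AS sub;
```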
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
July 5, 2007 at 1:24 am
Thx, I'll try that one right away. It would be a lot better than our old solution, where we executed the query twice: once without limitations and once with.
July 5, 2007 at 1:56 am
I'm afraid it doesn't work either.
In the inner select I can't take the MAX() yet, and when I use it in the outer select I get the maximum row number of the 50 results determined by @__ROW_NUMBER__ and @__ROW_NUMBER___1, not of the entire result set.
July 5, 2007 at 2:03 am
Perhaps insert the subquery portion into a temp table (or use a CTE), then you can use that in the outer query and again to get the rowcount.
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
July 5, 2007 at 3:08 am
Thx, that looks like the thing we need.
I still have a question though, since the concept of a CTE is fairly new to me (I'm still quite the junior employee).
I read that a CTE only exists in the scope of a single statement, but because this query is generated in code (there is actually no limit on the number of criteria our users can use), I would try to perform the three queries I need inside a transaction.
First the query that selects everything, then a row count, and finally the part that limits the number of rows to return.
So I wonder: is a transaction considered one statement, or will the temporary view of my first select be long gone before I get to the third query?
July 5, 2007 at 3:15 am
Ah, so it's an auto-generated query.
A transaction is not considered one statement, and you're right, the CTE would be gone by the second select. Since this is auto-generated, try the temp table approach. A little bit messier, but it should work well enough. If it's slow, or there are lots of records, you can try indexing the temp table.
DECLARE @TotalSubqueryRows int

-- Subquery
CREATE TABLE #TempSubQuery (...)
INSERT INTO #TempSubQuery (...)
SELECT <fields> FROM ...
-- How many rows in total in the subquery
SELECT @TotalSubqueryRows = @@ROWCOUNT
-- Outer query
SELECT ...
FROM #TempSubQuery
WHERE ...
-- ... other queries as necessary
DROP TABLE #TempSubQuery
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
July 5, 2007 at 4:25 am
If you do not mind the slight overhead of returning the total on each row, something like the following may work:
EXEC sp_executesql N'
WITH YourQuery
(
ACCOUNT_NAME
,DUNS_ULTIMATE_NUMBER
,DUNS_NAME_OVERRIDE
,CHANNEL_ID
,AREA_NAME
,COUNTRY
,GLOBAL_MARKET_SEGMENT_NAME
,GLOBAL_ACCOUNT_COORDINATOR
,GLOBAL_CLIENT_SERVICE_PARTNER
,PRIORITY_STATUS
,PRIORITY
,CREATE_DATE
,UPDATE_DATE
,ACCOUNT_ID
,CHANNEL_NAME
,AREA_CODE
,ACCOUNT_ROLE_GAC
,COUNTRY_3CHAR
,GLOBAL_MARKET_SEGMENT_CODE
)
AS
(
SELECT
Ac.ACCOUNT_NAME
,Ac.DUNS_ULTIMATE_NUMBER
,Ac.DUNS_NAME_OVERRIDE
,Ac.CHANNEL_ID
,Ar.AREA_DESCR
,T.COUNTRY_DESCR
,G.GLOBAL_MARKET_SEGMENT_NAME
,Ac.ACCOUNT_ROLE_GAC
,Ac.ACCOUNT_ROLE_GCSP
,Ac.PRIORITY_STATUS
,R.NAME_ENGLISH
,Ac.CREATE_DATE
,Ac.UPDATE_DATE
,Ac.ACCOUNT_ID
,Ac.CHANNEL_NAME
,Ac.AREA_CODE
,Ac.ACCOUNT_ROLE_GAC
,Ac.COUNTRY_3CHAR
,Ac.GLOBAL_MARKET_SEGMENT_CODE
FROM Account Ac
LEFT JOIN Area Ar
ON Ac.AREA_CODE = Ar.AREA_CODE
LEFT JOIN COUNTRY T
ON Ac.COUNTRY_3CHAR = T.COUNTRY_3CHAR
LEFT JOIN GLOBAL_MARKET_SEGMENT G
ON Ac.GLOBAL_MARKET_SEGMENT_CODE = G.GLOBAL_MARKET_SEGMENT_CODE
LEFT JOIN REFERENCE R
ON Ac.PRIORITY_STATUS = R.REFERENCE_ID
JOIN ACCOUNT_FILTER F
ON Ac.ACCOUNT_ID = F.ACCOUNT_ID
JOIN CLIENT C
ON Ac.ACCOUNT_ID = C.ACCOUNT_ID
WHERE C.CLIENT_NAME LIKE @CLIENT_NAME
AND Ac.ACCOUNT_NAME LIKE @ACCOUNT_NAME
)
SELECT
ACCOUNT_NAME
,DUNS_ULTIMATE_NUMBER
,DUNS_NAME_OVERRIDE
,CHANNEL_ID
,AREA_NAME
,COUNTRY
,GLOBAL_MARKET_SEGMENT_NAME
,GLOBAL_ACCOUNT_COORDINATOR
,GLOBAL_CLIENT_SERVICE_PARTNER
,PRIORITY_STATUS
,PRIORITY
,CREATE_DATE
,UPDATE_DATE
,ACCOUNT_ID
,CHANNEL_NAME
,AREA_CODE
,ACCOUNT_ROLE_GAC
,COUNTRY_3CHAR
,GLOBAL_MARKET_SEGMENT_CODE
,TotalRows
FROM (
SELECT Q.*
,D1.TotalRows
,ROW_NUMBER() OVER(ORDER BY Q.ACCOUNT_NAME ASC) AS RowNumber
FROM YourQuery Q
CROSS JOIN (
SELECT COUNT(*) AS TotalRows
FROM YourQuery Q1
) D1
) D
WHERE RowNumber BETWEEN @RowNumber AND @RowNumber1
ORDER BY ACCOUNT_NAME
'
,N'@CLIENT_NAME nvarchar(6), @ACCOUNT_NAME nvarchar(1) , @RowNumber bigint, @RowNumber1 bigint'
,@CLIENT_NAME = N'%John%', @ACCOUNT_NAME= N'%' , @RowNumber = 1, @RowNumber1 = 50
July 5, 2007 at 8:50 am
I would do two queries. The first is a simple SELECT COUNT(*) using all of the rest of the query (except reduce the derived tables to the bare minimum fields required to return a complete count, i.e. remove all fields not required for subsequent joins or filters). That gets you the count while allowing the optimizer to pick the most efficient (hopefully pure index usage) plan. It has the added benefit of putting the data that will be used in the REAL run, with the 50-row limit, into RAM cache for you.
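A rough sketch of that count-only pass, reusing the join and filter structure from the posted query but dropping the LEFT JOINs and the wide column list (the LEFT JOINs cannot remove rows, so they don't change the count):

```sql
-- Count-only pass: same filtering joins and WHERE clause, no wide select
-- list, so the optimizer can hopefully satisfy it from narrow indexes.
DECLARE @TotalRows int;

SELECT @TotalRows = COUNT(*)
FROM Account Ac
JOIN ACCOUNT_FILTER AF ON Ac.ACCOUNT_ID = AF.ACCOUNT_ID
JOIN CLIENT C          ON Ac.ACCOUNT_ID = C.ACCOUNT_ID
WHERE C.CLIENT_NAME LIKE @CLIENT_NAME
  AND Ac.ACCOUNT_NAME LIKE @ACCOUNT_NAME;
```

One caveat: if the GROUP BY in the original query is there to collapse duplicates produced by the CLIENT or ACCOUNT_FILTER joins, the plain COUNT(*) here will overcount; in that case something like COUNT(DISTINCT Ac.ACCOUNT_ID) may be closer to what you want, depending on what the grouping actually deduplicates.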
Oh, speaking of the 50-row limit: can you use SET ROWCOUNT 50, or perhaps TOP 50, to limit the rows? Either may wind up being more efficient than your current filtering mechanism.
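For reference, the two row-capping mechanisms look like this (illustrative column names; the key behavioral difference is that TOP is scoped to one statement, while SET ROWCOUNT stays in effect on the connection until you switch it off):

```sql
-- Option 1: TOP, scoped to a single statement
SELECT TOP 50 ACCOUNT_NAME
FROM Account
ORDER BY ACCOUNT_NAME;

-- Option 2: SET ROWCOUNT, which caps every following statement
-- on the connection until it is reset
SET ROWCOUNT 50;
SELECT ACCOUNT_NAME FROM Account ORDER BY ACCOUNT_NAME;
SET ROWCOUNT 0;  -- back to unlimited
```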
Best,
Kevin G. Boles
SQL Server Consultant
SQL MVP 2007-2012
TheSQLGuru on googles mail service
July 6, 2007 at 4:32 am
Thx for the help.
I'll try multiple approaches anyway to see what gives the best performance results.
As for the filtering mechanism, I think TOP 50 won't work, because when, for example, page 2 or 3 is loaded I need results 51-100 or 101-150...
And I think SET ROWCOUNT gives the same issue. I also read (I don't know if this is correct) that SET ROWCOUNT isn't really suited for use when you also perform a lot of other queries that don't have the same limits, which is the case for us.