April 3, 2005 at 4:39 am
Hi...
Recently I've run into a problem with the following stored procedure.
ALTER PROCEDURE dbo.Product_SelectByIndexAndSize (
    @index INT,
    @Size INT
)
AS
SET NOCOUNT OFF

DECLARE @LowerBound INT
DECLARE @UpperBound INT
SET @LowerBound = ((@index - 1) * @Size) + 1
SET @UpperBound = @LowerBound + @Size - 1

CREATE TABLE #Page (
    Id INT IDENTITY(1, 1) NOT NULL,
    ProductId INT
)

INSERT INTO #Page (ProductId)
SELECT ProductID
FROM Products
ORDER BY ProductName

SET NOCOUNT ON

SELECT P.*
FROM Products P,
     #Page T
WHERE P.ProductID = T.ProductId
    AND T.Id >= @LowerBound
    AND T.Id <= @UpperBound
ORDER BY T.Id
The stored procedure does data pagination at the database level. It takes a page index (starting from one) and a page size; for example, to return 10 rows per page for page #2, the parameters would be @index = 2, @Size = 10. I created this procedure in the Northwind sample database on SQL Server 2000 Developer Edition Service Pack 3.
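For example, the call for page #2 with 10 rows per page (against Northwind) looks like this:

-- Page #2, 10 rows per page: should return products 11-20 in ProductName order
EXEC dbo.Product_SelectByIndexAndSize @index = 2, @Size = 10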
The problem I encountered is that the output is NOT always returned (sometimes it is, sometimes it isn't), even though I use the same parameters every time.
Any help would be appreciated. Thx.
--
Regards,
Maximilian Haru Raditya
April 3, 2005 at 5:32 pm
Why are you turning NOCOUNT off when there are statements in there (e.g. the INSERT INTO) which will generate rowcount messages?
April 4, 2005 at 1:23 am
Drop the temporary table #Page at the end of your sp
Anders Dæmroen
epsilon.no
April 4, 2005 at 5:16 am
:: PW
:: Why are you turning NOCOUNT off, when there are statements in there (eg
:: the INSERT INTO) which will generate rowcount messages ?
Actually, I need the stored procedure to return the row count only from the last SELECT statement, not from the INSERT statement. But I tried your suggestion with a modification (the working version is shown below) and it seems to work. The thing I don't get is the relevance of SET NOCOUNT ON to the output of my stored procedure. Is there any explanation for this?
:: Dølle Døck
:: Drop the temporary table #Page at the end of you sp
According to the documentation in the SQL Server (Compiled HTML) Online Help:
A local temporary table created in a stored procedure is dropped automatically when the stored procedure completes. The table can be referenced by any nested stored procedures executed by the stored procedure that created the table. The table cannot be referenced by the process which called the stored procedure that created the table.
I think it shouldn't be a problem at all. BTW, I tried that before and it still doesn't work.
Here is the working version.
ALTER PROCEDURE dbo.Product_SelectByIndexAndSize (
    @index INT,
    @Size INT
)
AS
SET NOCOUNT ON

DECLARE @LowerBound INT
DECLARE @UpperBound INT
SET @LowerBound = ((@index - 1) * @Size) + 1
SET @UpperBound = @LowerBound + @Size - 1

CREATE TABLE #Page (
    Id INT IDENTITY(1, 1) NOT NULL,
    ProductId INT
)

INSERT INTO #Page (ProductId)
SELECT ProductID
FROM Products
ORDER BY ProductName

SELECT P.*
FROM Products P,
     #Page T
WHERE P.ProductID = T.ProductId
    AND T.Id >= @LowerBound
    AND T.Id <= @UpperBound
ORDER BY T.Id

SET NOCOUNT OFF
Anyway, thx for your attention. This helped me solve the problem...
--
Regards,
Maximilian Haru Raditya
April 4, 2005 at 11:21 am
Stops the message indicating the number of rows affected by a Transact-SQL statement from being returned as part of the results.
SET NOCOUNT { ON | OFF }
When SET NOCOUNT is ON, the count (indicating the number of rows affected by a Transact-SQL statement) is not returned. When SET NOCOUNT is OFF, the count is returned.
The @@ROWCOUNT function is updated even when SET NOCOUNT is ON.
SET NOCOUNT ON eliminates the sending of DONE_IN_PROC messages to the client for each statement in a stored procedure. When using the utilities provided with Microsoft® SQL Server™ to execute queries, it prevents "nn rows affected" from being displayed at the end of Transact-SQL statements such as SELECT, INSERT, UPDATE, and DELETE.
For stored procedures that contain several statements that do not return much actual data, this can provide a significant performance boost because network traffic is greatly reduced.
The setting of SET NOCOUNT is set at execute or run time and not at parse time.
SET NOCOUNT permissions default to all users.
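A minimal sketch of the difference, run in Query Analyzer against Northwind:

SET NOCOUNT OFF
SELECT TOP 5 ProductID FROM Products    -- messages pane shows "(5 row(s) affected)"

SET NOCOUNT ON
SELECT TOP 5 ProductID FROM Products    -- no rowcount message is sent to the client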
April 4, 2005 at 5:11 pm
It is now recommended to use a table variable rather than a temp table (see SQL Server Books Online).
Try this instead...
ALTER PROCEDURE dbo.Product_SelectByIndexAndSize (
    @index INT,
    @Size INT
)
AS
SET NOCOUNT ON

DECLARE @LowerBound INT
DECLARE @UpperBound INT
SET @LowerBound = ((@index - 1) * @Size) + 1
SET @UpperBound = @LowerBound + @Size - 1

-- Id column should be CLUSTERED since you want to search in a range. This is OK for small tables.
DECLARE @Page TABLE (
    Id INT IDENTITY(1, 1) NOT NULL PRIMARY KEY CLUSTERED,
    ProductId INT
)

INSERT INTO @Page (ProductId)
SELECT ProductID
FROM Products
ORDER BY ProductName

SELECT P.*
FROM Products P
    INNER JOIN @Page T ON P.ProductID = T.ProductId
WHERE T.Id BETWEEN @LowerBound AND @UpperBound
-- Ordering by T.Id is not necessary now thanks to the CLUSTERED INDEX
April 4, 2005 at 8:34 pm
Why do you insert the whole resultset into the temp table, when you only need @Size rows at a time?
This will consume a lot of time if the Products table has a lot of rows!
Why not use the TOP clause?
That ensures that only @Size rows are inserted into the temp table.
So the code would look like this:
DECLARE @query VARCHAR(500)
DECLARE @pagesize INT
DECLARE @pagenum INT

SET @pagesize = 3
SET @pagenum = 5

SET @query = ' DECLARE @maxid INT '
SET @query = @query + ' SELECT @maxid = MAX(ProductID) '
SET @query = @query + ' FROM Products '
SET @query = @query + ' WHERE ProductID IN '
SET @query = @query + ' (SELECT TOP ' + CAST(@pagesize * @pagenum AS VARCHAR) + ' ProductID '
SET @query = @query + ' FROM Products '
SET @query = @query + ' ORDER BY ProductID) '

SET @query = @query + ' SELECT TOP ' + CAST(@pagesize AS VARCHAR) + ' * '
SET @query = @query + ' INTO #page '
SET @query = @query + ' FROM Products '
SET @query = @query + ' WHERE ProductID > @maxid '
SET @query = @query + ' ORDER BY ProductID'

SET @query = @query + ' SELECT * FROM #page '
SET @query = @query + ' ORDER BY ProductName '
SET @query = @query + ' DROP TABLE #page '

EXEC(@query)
This will be much faster, because you won't insert all the data from the Products table into the temp table, and you will only select the rows you want.
Try it, it works 100%
Ezz Khayyat
April 5, 2005 at 5:52 am
:: JSBBS
:: It is now recommended to use the TABLE variable, rather than temp tables
:: (see SQL Books Online)
I didn't know that could be done, but you're missing @Page T in the FROM clause of the last SELECT statement. Under which Books Online topic can this information be found? And how do I measure whether there's any performance boost from your approach?
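One simple way to compare them might be to wrap each version with the time and I/O statistics options and compare the messages in Query Analyzer (just a sketch):

SET STATISTICS TIME ON
SET STATISTICS IO ON

EXEC dbo.Product_SelectByIndexAndSize @index = 2, @Size = 10   -- run once for each version

SET STATISTICS IO OFF
SET STATISTICS TIME OFF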
:: Ezz Khayyat
:: Why do you insert the whole resultset into the temp table, when you only need
:: @Size rows at a time?
:: This will consume a lot of time if the Products table has a lot of rows!
:: Why not use the TOP clause?
:: That ensures that only @Size rows are inserted into the temp table.
I do admit mine works inefficiently. Your way works, but if you look at the order of the final result, I don't think it's the expected result.
My goals with this model are that it should work with whatever column I want to sort by, and to eliminate any dependency on the object's primary key, like the one in your version. I want a SQL Server workaround for something like the LIMIT clause in MySQL. I wonder why there's no LIMIT equivalent in SQL Server; I think LIMIT is more flexible than TOP. Any comments?
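One pattern that comes close is nesting TOP queries. Just a sketch with hard-coded numbers (page #2, 10 rows per page); since TOP can't take a variable in SQL Server 2000, a real version would need dynamic SQL:

SELECT *
FROM (
    SELECT TOP 10 *
    FROM (
        SELECT TOP 20 *              -- @index * @Size rows from the start
        FROM Products
        ORDER BY ProductName ASC
    ) AS FirstRows
    ORDER BY ProductName DESC        -- keep only the last @Size of them
) AS PageRows
ORDER BY ProductName ASC             -- restore the original order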
Thx.
--
Regards,
Maximilian Haru Raditya
April 5, 2005 at 7:41 am
Ohh you are right! The results are not in the same order as you wanted them to be!
Well, the LIMIT clause is really missing in SQL Server!
But also, you have to consider that the Product IDs may not be sequential (like 30, 31, 32, 33, 34, 35)!
They could be like this (30, 36, 37, 39, 58, 62...),
so the T.Id >= @LowerBound AND T.Id <= @UpperBound won't work correctly, and you can't guarantee retrieving @Size records each time, because some rows may have been deleted from the Products table!
This is a really serious problem with only one solution (which is not efficient or optimal at all)!
Does anyone have a good idea how to solve this?
Ezz Khayyat
April 5, 2005 at 10:56 pm
:: Ezz Khayyat
:: Well, the LIMIT clause is really missing in SQL Server!
:: But also, you have to consider that the Product IDs may not be sequential
:: (like 30, 31, 32, 33, 34, 35)!
:: They could be like this (30, 36, 37, 39, 58, 62...),
:: so the T.Id >= @LowerBound AND T.Id <= @UpperBound won't work
:: correctly, and you can't guarantee retrieving @Size records each time,
:: because some rows may have been deleted from the Products table!
If you look closely, T.Id doesn't reflect the ProductID or any other field in the Products table; it is a newly generated identity produced by the INSERT ... SELECT. So I can be sure there won't be any missing Ids.
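A quick way to see this in Northwind (just a sketch): the generated Id is always a dense 1, 2, 3, ... sequence in ProductName order, even when the ProductID values have gaps.

CREATE TABLE #Page (
    Id INT IDENTITY(1, 1) NOT NULL,
    ProductId INT
)
INSERT INTO #Page (ProductId)
SELECT ProductID FROM Products ORDER BY ProductName
-- Id runs 1, 2, 3, ... regardless of gaps in ProductID
SELECT TOP 10 Id, ProductId FROM #Page ORDER BY Id
DROP TABLE #Page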
Well, then my other question: is there a better way to work around the missing LIMIT clause in SQL Server, one that promotes efficiency, flexibility, and optimality?
Thx.
--
Regards,
Maximilian Haru Raditya
April 6, 2005 at 3:58 am
Ohh yes, you are right! Sorry I didn't notice that!
So I guess there are no other solutions for this!
But this would still be much faster than doing the paging in ADO, since in that case the whole resultset is transferred to the calling client (ADO) and ADO then does the paging for you...
Ezz Khayyat