November 21, 2013 at 10:56 am
Which way is better for SQL paging, and why?
With a CTE
WITH MyCte AS
(
SELECT EmployeeID, EmployeeName, Age, Position, Address, PhoneNumber
,ROW_NUMBER() OVER (ORDER BY EmployeeID) AS RowNumber
FROM dbo.Employees
)
SELECT *
FROM MyCte
WHERE RowNumber BETWEEN StartIndex AND EndIndex
With a temp table
CREATE TABLE #tempSearch(
[EmployeeID] INT,
[EmployeeName] NVARCHAR(100),
[Age] INT,
[Position] NVARCHAR(100),
[Address] NVARCHAR(200),
[PhoneNumber] NVARCHAR(30),
[RowNumber] INT
)
INSERT INTO #tempSearch
([EmployeeID], [EmployeeName], [Age], [Position], [Address], [PhoneNumber], [RowNumber])
SELECT EmployeeID, EmployeeName, Age, Position, Address, PhoneNumber
,ROW_NUMBER() OVER (ORDER BY EmployeeID) AS RowNumber
FROM dbo.Employees
SELECT *
FROM #tempSearch
WHERE RowNumber BETWEEN StartIndex AND EndIndex
drop table #tempSearch
Thanks for your help 🙂
November 21, 2013 at 2:32 pm
What do you mean by "SQL Paging"?
Igor Micev, My blog: www.igormicev.com
November 21, 2013 at 2:53 pm
I would go for the CTE option since it doesn't require copying the whole table just to filter the results; the filtering happens in the same statement.
November 24, 2013 at 7:47 am
Like what I am doing in the above script?
November 24, 2013 at 1:20 pm
It depends. Small data sets will usually perform better with the CTE, whereas large data sets will usually perform better with a temp table. Regardless, the best answer is to test both ways with realistic data and choose the best option in that scenario.
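If it helps to see what "test both ways" looks like in practice, here's a minimal harness (just a sketch; plug in your own page values and run each candidate inside the same STATISTICS wrapper):

```sql
DECLARE @PageSize INT = 50
       ,@PageNum  INT = 100
       ,@Skip     INT;
SELECT @Skip = @PageSize * (@PageNum - 1); -- rows to skip before the requested page

SET STATISTICS TIME, IO ON;

-- Candidate 1: the CTE version from the original post
WITH MyCte AS
(
    SELECT EmployeeID, EmployeeName, Age, Position, Address, PhoneNumber
          ,ROW_NUMBER() OVER (ORDER BY EmployeeID) AS RowNumber
    FROM dbo.Employees
)
SELECT *
FROM MyCte
WHERE RowNumber BETWEEN @Skip + 1 AND @Skip + @PageSize;

SET STATISTICS TIME, IO OFF;
```

Run the temp-table version the same way and compare the CPU time, elapsed time, and logical reads that show up in the Messages tab.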
November 24, 2013 at 6:53 pm
ahmedhussein874 (11/21/2013)
Which way is better in SQL Paging and Why?? …
Neither. I use dynamic SQL with two TOPs and get returns in less than 500 ms from a million row table with a page size of 50.
--Jeff Moden
Change is inevitable... Change for the better is not.
November 25, 2013 at 7:03 am
Jeff Moden (11/24/2013)
ahmedhussein874 (11/21/2013)
Which way is better in SQL Paging and Why?? …
Neither. I use dynamic SQL with two TOPs and get returns in less than 500 ms from a million row table with a page size of 50.
+1 here as well, even though I'm not a real big fan of dynamic SQL. I supported an application whereby all paging was done this way, be it rendering results to a web page or paging an on-line report. It worked well.
Kurt
Kurt W. Zimmerman
SR DBA
Lefrak Organization
New York, NY
http://www.linkedin.com/in/kurtwzimmerman
November 25, 2013 at 7:54 am
Kurt W. Zimmerman (11/25/2013)
Jeff Moden (11/24/2013)
Neither. I use dynamic SQL with two TOPs and get returns in less than 500 ms from a million row table with a page size of 50.
+1 here as well, even though I'm not a real big fan of dynamic SQL. I supported an application whereby all paging was done this way, be it rendering results to a web page or paging an on-line report. It worked well.
Kurt
In keeping with the normal postings around here it would be most excellent if either of you could post an example so the OP can learn the correct way to do this type of thing.
_______________________________________________________________
Need help? Help us help you.
Read the article at http://www.sqlservercentral.com/articles/Best+Practices/61537/ for best practices on asking questions.
Need to split a string? Try Jeff Moden's splitter http://www.sqlservercentral.com/articles/Tally+Table/72993/.
Cross Tabs and Pivots, Part 1 – Converting Rows to Columns - http://www.sqlservercentral.com/articles/T-SQL/63681/
Cross Tabs and Pivots, Part 2 - Dynamic Cross Tabs - http://www.sqlservercentral.com/articles/Crosstab/65048/
Understanding and Using APPLY (Part 1) - http://www.sqlservercentral.com/articles/APPLY/69953/
Understanding and Using APPLY (Part 2) - http://www.sqlservercentral.com/articles/APPLY/69954/
November 25, 2013 at 10:32 am
Sean Lange (11/25/2013)
Kurt W. Zimmerman (11/25/2013)
Jeff Moden (11/24/2013)
Neither. I use dynamic SQL with two TOPs and get returns in less than 500 ms from a million row table with a page size of 50.
+1 here as well, even though I'm not a real big fan of dynamic SQL. I supported an application whereby all paging was done this way, be it rendering results to a web page or paging an on-line report. It worked well.
Kurt
In keeping with the normal postings around here it would be most excellent if either of you could post an example so the OP can learn the correct way to do this type of thing.
Absolutely. I hit the hay early last night so didn't go into detail. I'll see what I can do after work tonight. Of course you know that I'll have to generate the normal million row table to test on. 😀
--Jeff Moden
Change is inevitable... Change for the better is not.
November 25, 2013 at 10:39 am
Jeff Moden (11/25/2013)
Sean Lange (11/25/2013)
Kurt W. Zimmerman (11/25/2013)
Jeff Moden (11/24/2013)
Neither. I use dynamic SQL with two TOPs and get returns in less than 500 ms from a million row table with a page size of 50.
+1 here as well, even though I'm not a real big fan of dynamic SQL. I supported an application whereby all paging was done this way, be it rendering results to a web page or paging an on-line report. It worked well.
Kurt
In keeping with the normal postings around here it would be most excellent if either of you could post an example so the OP can learn the correct way to do this type of thing.
Absolutely. I hit the hay early last night so didn't go into detail. I'll see what I can do after work tonight. Of course you know that I'll have to generate the normal million row table to test on. 😀
I would assume no less, Jeff. 😉 I have done some server-side paging but never considered doing it the way I think you are doing this. Seems like an awesome idea!!! I look forward to reading it.
_______________________________________________________________
Need help? Help us help you.
Read the article at http://www.sqlservercentral.com/articles/Best+Practices/61537/ for best practices on asking questions.
Need to split a string? Try Jeff Moden's splitter http://www.sqlservercentral.com/articles/Tally+Table/72993/.
Cross Tabs and Pivots, Part 1 – Converting Rows to Columns - http://www.sqlservercentral.com/articles/T-SQL/63681/
Cross Tabs and Pivots, Part 2 - Dynamic Cross Tabs - http://www.sqlservercentral.com/articles/Crosstab/65048/
Understanding and Using APPLY (Part 1) - http://www.sqlservercentral.com/articles/APPLY/69953/
Understanding and Using APPLY (Part 2) - http://www.sqlservercentral.com/articles/APPLY/69954/
November 25, 2013 at 9:51 pm
Ok, here we go. The first thing that we need is some test data. Heh… I couldn’t bring myself to put a telephone number in an NVARCHAR column so please forgive that change to just VARCHAR.
Here’s the code to build an “Employees” table in TempDB and populate it with a million rows of data.
--===== Do this in a nice, safe place that everyone has.
USE tempdb
;
--===== Conditionally drop the "permanent table" test table
-- to make reruns easier.
-- This is NOT a part of the solution.
IF OBJECT_ID('tempdb.dbo.Employees','U') IS NOT NULL
AND DB_NAME() = 'tempdb' -- Just to be sure for testing
DROP TABLE tempdb.dbo.Employees -- Just to be sure for testing
;
--===== Create the "permanent table" test table.
-- This is NOT a part of the solution.
CREATE TABLE dbo.Employees
(
EmployeeID INT IDENTITY(0,1)
,EmployeeName NVARCHAR(100)
,Age INT
,Position NVARCHAR(100)
,Address NVARCHAR(200)
,PhoneNumber VARCHAR(30)
)
;
--===== Populate the "permanent table" test table.
-- This is NOT a part of the solution.
INSERT INTO dbo.Employees
(EmployeeName, Age, Position, Address, PhoneNumber)
SELECT TOP 1000000
EmployeeName = NEWID()
,Age = ABS(CHECKSUM(NEWID()))%100+1
,Position = NEWID()
,Address = NEWID()
,PhoneNumber = LEFT(NEWID(),30)
FROM master.sys.all_columns ac1
CROSS JOIN master.sys.all_columns ac2
;
--===== Add the likely Primary Key
ALTER TABLE dbo.Employees
ADD CONSTRAINT PK_Employees PRIMARY KEY CLUSTERED (EmployeeID)
;

Let's get the copy-to-a-Temp-Table thing out of the way right now. The following code looks at the last page (#20,000) of data, as do all of the examples.
--===== Note: If you convert this to non-dynamic SQL, it will be remarkably slow
DECLARE @PageSize INT
,@PageNum INT
,@Skip INT
,@SQL VARCHAR(8000)
;
--===== Preset what would normally be the input parameters
SELECT @PageSize = 50
,@PageNum = 20000
;
--===== Preset to change the page number to zero-based for easy calculations
-- and find the number of rows to skip
SELECT @PageNum = @PageNum - 1
,@Skip = @PageSize*@PageNum
;
SET STATISTICS TIME,IO ON;
SELECT TOP(@Skip+@PageSize)
RowNumber = ROW_NUMBER() OVER (ORDER BY EmployeeID)
,*
INTO #MyHead
FROM dbo.Employees
;
SELECT * FROM #MyHead WHERE RowNumber BETWEEN @Skip+1 AND @Skip+@PageSize
;
DROP TABLE #MyHead
;
SET STATISTICS TIME,IO OFF;
PRINT '---------------------------------------------------------------------------------------------------------'
GO 5

The results are abysmal.
Beginning execution loop
Table 'Employees'. Scan count 1, logical reads 34542, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
SQL Server Execution Times:
CPU time = 3385 ms, elapsed time = 7948 ms.
(1000000 row(s) affected)
(50 row(s) affected)
Table '#MyHead_____________________________________________________________________________________________________________000000000048'. Scan count 1, logical reads 35716, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
SQL Server Execution Times:
CPU time = 62 ms, elapsed time = 97 ms.
SQL Server Execution Times:
CPU time = 0 ms, elapsed time = 1 ms.
---------------------------------------------------------------------------------------------------------
Table 'Employees'. Scan count 1, logical reads 34542, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
SQL Server Execution Times:
CPU time = 3542 ms, elapsed time = 8019 ms.
(1000000 row(s) affected)
(50 row(s) affected)
Table '#MyHead_____________________________________________________________________________________________________________000000000049'. Scan count 1, logical reads 35716, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
SQL Server Execution Times:
CPU time = 63 ms, elapsed time = 89 ms.
SQL Server Execution Times:
CPU time = 0 ms, elapsed time = 1 ms.
---------------------------------------------------------------------------------------------------------
Table 'Employees'. Scan count 1, logical reads 34542, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
SQL Server Execution Times:
CPU time = 3510 ms, elapsed time = 7997 ms.
(1000000 row(s) affected)
(50 row(s) affected)
Table '#MyHead_____________________________________________________________________________________________________________00000000004A'. Scan count 1, logical reads 35716, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
SQL Server Execution Times:
CPU time = 62 ms, elapsed time = 92 ms.
SQL Server Execution Times:
CPU time = 0 ms, elapsed time = 0 ms.
---------------------------------------------------------------------------------------------------------
Table 'Employees'. Scan count 1, logical reads 34542, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
SQL Server Execution Times:
CPU time = 3448 ms, elapsed time = 8046 ms.
(1000000 row(s) affected)
(50 row(s) affected)
Table '#MyHead_____________________________________________________________________________________________________________00000000004B'. Scan count 1, logical reads 35716, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
SQL Server Execution Times:
CPU time = 62 ms, elapsed time = 86 ms.
SQL Server Execution Times:
CPU time = 0 ms, elapsed time = 1 ms.
---------------------------------------------------------------------------------------------------------
Table 'Employees'. Scan count 1, logical reads 34542, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
SQL Server Execution Times:
CPU time = 3650 ms, elapsed time = 8068 ms.
(1000000 row(s) affected)
(50 row(s) affected)
Table '#MyHead_____________________________________________________________________________________________________________00000000004C'. Scan count 1, logical reads 35716, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
SQL Server Execution Times:
CPU time = 78 ms, elapsed time = 85 ms.
SQL Server Execution Times:
CPU time = 0 ms, elapsed time = 0 ms.
---------------------------------------------------------------------------------------------------------
Batch execution completed 5 times.
To summarize, it takes about 3.5 CPU seconds and about 8 seconds elapsed to copy the data to the Temp Table. After that, it only takes about 62 ms of CPU and 90 ms duration to find the data even though the Temp Table is actually a HEAP. That should be a lesson, though: if you can somehow cache this information, you'd have the fastest paging method of them all.
Up next, the CTE method. This is a fast method but it’s not the fastest. “It Depends”, once again, rings true. It may be the “best” method of the 3 in this post because it has a good tradeoff between the number of logical reads and performance. Here’s the code. It uses the same variables just to keep everything equal.
--===== Note: If you convert this to non-dynamic SQL, it will be remarkably slow
DECLARE @PageSize INT
,@PageNum INT
,@Skip INT
,@SQL VARCHAR(8000)
;
--===== Preset what would normally be the input parameters
SELECT @PageSize = 50
,@PageNum = 20000
;
--===== Preset to change the page number to zero-based for easy calculations
-- and find the number of rows to skip
SELECT @PageNum = @PageNum - 1
,@Skip = @PageSize*@PageNum
;
SET STATISTICS TIME,IO ON;
WITH MyCte AS
(
SELECT EmployeeID, EmployeeName, Age, Position, Address, PhoneNumber
,ROW_NUMBER() OVER (ORDER BY EmployeeID) AS RowNumber
FROM dbo.Employees
)
SELECT *
FROM MyCte
WHERE RowNumber BETWEEN @Skip+1 AND @Skip+@PageSize
;
SET STATISTICS TIME,IO OFF;
PRINT '---------------------------------------------------------------------------------------------------------'
GO 5

And here are the results. MUCH better than the first attempt with the Temp Table. To summarize, it uses about 400 ms CPU and 420 ms duration. Not bad.
Beginning execution loop
(50 row(s) affected)
Table 'Employees'. Scan count 1, logical reads 34542, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
SQL Server Execution Times:
CPU time = 421 ms, elapsed time = 452 ms.
---------------------------------------------------------------------------------------------------------
(50 row(s) affected)
Table 'Employees'. Scan count 1, logical reads 34542, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
SQL Server Execution Times:
CPU time = 406 ms, elapsed time = 426 ms.
---------------------------------------------------------------------------------------------------------
(50 row(s) affected)
Table 'Employees'. Scan count 1, logical reads 34542, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
SQL Server Execution Times:
CPU time = 390 ms, elapsed time = 418 ms.
---------------------------------------------------------------------------------------------------------
(50 row(s) affected)
Table 'Employees'. Scan count 1, logical reads 34542, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
SQL Server Execution Times:
CPU time = 405 ms, elapsed time = 418 ms.
---------------------------------------------------------------------------------------------------------
(50 row(s) affected)
Table 'Employees'. Scan count 1, logical reads 34542, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
SQL Server Execution Times:
CPU time = 406 ms, elapsed time = 416 ms.
---------------------------------------------------------------------------------------------------------
Batch execution completed 5 times.
That’s a pretty good trade off. We can cut the duration in half if we don’t mind doubling the logical reads. That might not be a tradeoff that you want to make but, when I first used this method, it was back in SQL Server 2000 before they had such a thing as ROW_NUMBER().
--===== Note: If you convert this to non-dynamic SQL, it will be remarkably slow
--===== Assumes the same @PageSize, @PageNum, @Skip, and @SQL declarations
-- and presets as the previous examples (run them in the same batch)
SET @sql = '
SELECT TOP '+CONVERT(VARCHAR(10),@PageSize)+ ' * FROM dbo.Employees
WHERE EmployeeID NOT IN (SELECT TOP '+CONVERT(VARCHAR(10),@Skip)+' EmployeeID FROM dbo.Employees )
ORDER BY EmployeeID;'
SET STATISTICS TIME,IO ON;
EXEC (@SQL)
SET STATISTICS TIME,IO OFF;
PRINT '---------------------------------------------------------------------------------------------------------'
GO 5

Here are the results from that. Like I said… it cuts the CPU and Duration in half, but it doubles the number of logical reads.
Beginning execution loop
SQL Server parse and compile time:
CPU time = 0 ms, elapsed time = 42 ms.
(50 row(s) affected)
Table 'Employees'. Scan count 2, logical reads 69082, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
SQL Server Execution Times:
CPU time = 281 ms, elapsed time = 304 ms.
SQL Server Execution Times:
CPU time = 281 ms, elapsed time = 346 ms.
---------------------------------------------------------------------------------------------------------
SQL Server parse and compile time:
CPU time = 0 ms, elapsed time = 0 ms.
(50 row(s) affected)
Table 'Employees'. Scan count 2, logical reads 69082, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
SQL Server Execution Times:
CPU time = 219 ms, elapsed time = 248 ms.
SQL Server Execution Times:
CPU time = 219 ms, elapsed time = 248 ms.
---------------------------------------------------------------------------------------------------------
SQL Server parse and compile time:
CPU time = 0 ms, elapsed time = 0 ms.
(50 row(s) affected)
Table 'Employees'. Scan count 2, logical reads 69082, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
SQL Server Execution Times:
CPU time = 218 ms, elapsed time = 239 ms.
SQL Server Execution Times:
CPU time = 218 ms, elapsed time = 239 ms.
---------------------------------------------------------------------------------------------------------
SQL Server parse and compile time:
CPU time = 0 ms, elapsed time = 0 ms.
(50 row(s) affected)
Table 'Employees'. Scan count 2, logical reads 69082, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
SQL Server Execution Times:
CPU time = 202 ms, elapsed time = 238 ms.
SQL Server Execution Times:
CPU time = 202 ms, elapsed time = 238 ms.
---------------------------------------------------------------------------------------------------------
SQL Server parse and compile time:
CPU time = 0 ms, elapsed time = 0 ms.
(50 row(s) affected)
Table 'Employees'. Scan count 2, logical reads 69082, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
SQL Server Execution Times:
CPU time = 219 ms, elapsed time = 240 ms.
SQL Server Execution Times:
CPU time = 219 ms, elapsed time = 240 ms.
---------------------------------------------------------------------------------------------------------
Batch execution completed 5 times.
Now, let’s shift gears a bit. Who in their right mind is going to be looking at page #20,000? Certainly no one that got to page #20,000 by hitting the “Next Page” button. If we take a look at the performance of all 3 methods for a lookup on a more modest 20th page, all 3 work reasonably well with the CTE winning for lowest number of reads and the dynamic double-top method continues to be the fastest. Well, unless you consider that a properly indexed cached table has only 6 reads. 😉
--===== Note: If you convert this to non-dynamic SQL, it will be remarkably slow
DECLARE @PageSize INT
,@PageNum INT
,@Skip INT
,@SQL VARCHAR(8000)
;
--===== Preset what would normally be the input parameters
SELECT @PageSize = 50
,@PageNum = 20
;
--===== Preset to change the page number to zero-based for easy calculations
-- and find the number of rows to skip
SELECT @PageNum = @PageNum - 1
,@Skip = @PageSize*@PageNum
;
PRINT '========== Temp Table Method ===================================================';
SET STATISTICS TIME,IO ON;
SELECT TOP(@Skip+@PageSize)
RowNumber = ROW_NUMBER() OVER (ORDER BY EmployeeID)
,*
INTO #MyHead
FROM dbo.Employees
;
SELECT * FROM #MyHead WHERE RowNumber BETWEEN @Skip+1 AND @Skip+@PageSize
;
DROP TABLE #MyHead
;
SET STATISTICS TIME,IO OFF;
PRINT '========== CTE Method ==========================================================';
SET STATISTICS TIME,IO ON;
WITH MyCte AS
(
SELECT EmployeeID, EmployeeName, Age, Position, Address, PhoneNumber
,ROW_NUMBER() OVER (ORDER BY EmployeeID) AS RowNumber
FROM dbo.Employees
)
SELECT *
FROM MyCte
WHERE RowNumber BETWEEN @Skip+1 AND @Skip+@PageSize
;
SET STATISTICS TIME,IO OFF;
PRINT '========== Dynamic Double Top Method ===========================================';
--===== Note: If you convert this to non-dynamic SQL, it will be remarkably slow
SET @sql = '
SELECT TOP '+CONVERT(VARCHAR(10),@PageSize)+ ' * FROM dbo.Employees
WHERE EmployeeID NOT IN (SELECT TOP '+CONVERT(VARCHAR(10),@Skip)+' EmployeeID FROM dbo.Employees )
ORDER BY EmployeeID;'
SET STATISTICS TIME,IO ON;
EXEC (@SQL)
SET STATISTICS TIME,IO OFF;
PRINT '========== Cached Method =======================================================';
--===== Didn't include the time to build the cached table or the index on the cached table.
SELECT RowNumber = ROW_NUMBER() OVER (ORDER BY EmployeeID)
,*
INTO #Cached
FROM dbo.Employees
;
CREATE UNIQUE CLUSTERED INDEX IX_#Cached ON #Cached (RowNumber) WITH FILLFACTOR = 100
;
SET STATISTICS TIME,IO ON;
SELECT * FROM #Cached WHERE RowNumber BETWEEN @Skip+1 AND @Skip+@PageSize;
SET STATISTICS TIME,IO OFF;
DROP TABLE #Cached;
Results:
========== Temp Table Method ===================================================
Table 'Employees'. Scan count 1, logical reads 42, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
SQL Server Execution Times:
CPU time = 0 ms, elapsed time = 3 ms.
(1000 row(s) affected)
(50 row(s) affected)
Table '#MyHead_____________________________________________________________________________________________________________00000000005E'. Scan count 1, logical reads 37, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
SQL Server Execution Times:
CPU time = 0 ms, elapsed time = 32 ms.
SQL Server Execution Times:
CPU time = 0 ms, elapsed time = 1 ms.
========== CTE Method ==========================================================
(50 row(s) affected)
Table 'Employees'. Scan count 1, logical reads 42, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
SQL Server Execution Times:
CPU time = 0 ms, elapsed time = 82 ms.
========== Dynamic Double Top Method ===========================================
SQL Server parse and compile time:
CPU time = 0 ms, elapsed time = 0 ms.
(50 row(s) affected)
Table 'Employees'. Scan count 2, logical reads 82, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
SQL Server Execution Times:
CPU time = 0 ms, elapsed time = 47 ms.
SQL Server Execution Times:
CPU time = 0 ms, elapsed time = 47 ms.
========== Cached Method =======================================================
(1000000 row(s) affected)
(50 row(s) affected)
Table '#Cached_____________________________________________________________________________________________________________00000000005F'. Scan count 1, logical reads 6, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
SQL Server Execution Times:
CPU time = 0 ms, elapsed time = 35 ms.
--Jeff Moden
Change is inevitable... Change for the better is not.
November 25, 2013 at 10:15 pm
There are also a wad of paging methods in the following discussion.
http://www.sqlservercentral.com/Forums/Topic672980-329-1.aspx
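And if you happen to be on SQL Server 2012 or later, don't forget the built-in OFFSET/FETCH form of paging. Here's a quick sketch using the same variables as the examples above; test it against the other methods the same way before you commit to it:

```sql
DECLARE @PageSize INT = 50
       ,@PageNum  INT = 20000
       ,@Skip     INT;
SELECT @Skip = @PageSize * (@PageNum - 1);

SELECT EmployeeID, EmployeeName, Age, Position, Address, PhoneNumber
FROM dbo.Employees
ORDER BY EmployeeID            -- paging needs a deterministic ORDER BY
OFFSET @Skip ROWS
FETCH NEXT @PageSize ROWS ONLY;
```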
--Jeff Moden
Change is inevitable... Change for the better is not.
November 26, 2013 at 6:13 am
I enjoyed the comparison, Jeff, thank you. As for the Varchar column for a phone number, don't apologize for picking the right data type. That is, unless there are phone numbers somewhere in the world that contain Unicode (:w00t:), which might just cause a new twitch to develop in DBAs everywhere.
November 26, 2013 at 7:30 am
Thanks Jeff that is awesome!!!!
_______________________________________________________________
Need help? Help us help you.
Read the article at http://www.sqlservercentral.com/articles/Best+Practices/61537/ for best practices on asking questions.
Need to split a string? Try Jeff Moden's splitter http://www.sqlservercentral.com/articles/Tally+Table/72993/.
Cross Tabs and Pivots, Part 1 – Converting Rows to Columns - http://www.sqlservercentral.com/articles/T-SQL/63681/
Cross Tabs and Pivots, Part 2 - Dynamic Cross Tabs - http://www.sqlservercentral.com/articles/Crosstab/65048/
Understanding and Using APPLY (Part 1) - http://www.sqlservercentral.com/articles/APPLY/69953/
Understanding and Using APPLY (Part 2) - http://www.sqlservercentral.com/articles/APPLY/69954/