June 14, 2010 at 8:46 pm
Comments posted to this topic are about the item Paging and Versioning Using ROW_NUMBER()
June 14, 2010 at 11:45 pm
Dear Mr. Moore,
Perhaps this is not so smart a question, but I wonder whether this method can be used on MySQL databases as well, since as far as I know one can't use stored procedures in a MySQL database.
Hope to hear from you.
Kind regards,
Martin
June 15, 2010 at 2:42 am
Martin,
Unfortunately I don't know much about MySQL as my background is MS SQL Server.
However, you can definitely use stored procedures:
http://www.mysqltutorial.org/introduction-to-sql-stored-procedures.aspx
As for Row_Number() type functionality, I don't know if the partitioning capabilities are available, but I found the following link which discusses adding a rownumber field:
http://jimlife.wordpress.com/2008/09/09/displaying-row-number-rownum-in-mysql/
In conjunction with stored procedures, it should provide the same paging functionality, if this is what you were after.
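As a rough sketch of the end result (hedged heavily, since MySQL isn't my area, and the table and column names here are just placeholders), MySQL's LIMIT/OFFSET syntax appears to provide paging directly, without needing a row number at all:

```sql
-- Hypothetical MySQL paging: page 3 with a page size of 20.
-- OFFSET skips the rows belonging to pages 1 and 2; LIMIT caps the page size.
SELECT BookTitle, BookEdition, BookPublishDate, BookAuthor
FROM Books
ORDER BY BookTitle
LIMIT 20 OFFSET 40;
```

Treat that as an assumption to verify against the MySQL documentation rather than tested code.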
Hope that helps,
Regards,
Lawrence
June 15, 2010 at 4:53 am
A well-presented and enjoyable article, thanks.
Paul White
SQLPerformance.com
SQLkiwi blog
@SQL_Kiwi
June 15, 2010 at 6:57 am
Hi Lawrence, I have spent quite a bit of time recently using Row_Number() with CTEs for paging purposes and have tried to gauge how to get the best performance possible out of them.
While researching, I read that when you use a CTE in a query, it reruns the CTE for every column in the outer select that references it:
such that in your example query:
;WITH BookCTE (RowNumber, BookTitle, BookEdition, BookPublishDate, BookAuthor)
AS
(
SELECT
ROW_NUMBER() OVER (PARTITION BY BookAuthor ORDER BY BookPublishDate DESC),
BookTitle, BookEdition, BookPublishDate, BookAuthor
FROM dbo.Books
)
SELECT BookTitle, BookEdition, BookPublishDate, BookAuthor
FROM BookCTE
WHERE RowNumber=1
ORDER BY BookTitle
it would run the CTE 4 times -- once for each column in the outer select. For the example query here that is not a big deal, but if the CTE definition is complex and involves some sort of aggregate function or group by it can be a bit tricky.
So I have experimented with including only the minimum number of columns required in the CTE, and then having the outer query join back to the necessary tables for the select. Your example can't really be refined much using this method, since all of the columns in the select statement are "used" by the CTE... but if we pretended that you also needed the PageCount, PublisherName, and ISBN of the book, then using this method it would read as:
;WITH BookCTE (RowNumber, BookTitle, BookEdition, BookPublishDate, BookAuthor)
AS
(
SELECT
ROW_NUMBER() OVER (PARTITION BY BookAuthor ORDER BY BookPublishDate DESC),
BookTitle, BookEdition, BookPublishDate, BookAuthor
FROM dbo.Books
)
SELECT C.BookTitle, C.BookEdition, C.BookPublishDate, C.BookAuthor, B.PageCount, B.PublisherName, B.ISBN
FROM BookCTE C INNER JOIN Books B ON C.BookTitle = B.BookTitle -- this isn't a very good key; you'd actually want to join on the table's primary key
WHERE RowNumber=1
ORDER BY BookTitle
I bring this up for two reasons. First, I wanted to verify the assertion I read that this is indeed what occurs behind the scenes (since I don't remember where I read it). Second, I have noticed empirically that sometimes performance is better when I include all the select columns in the CTE and write a simple outer query, and sometimes it is better when I do it the way I described above. In my application, the CTE definition ends up being variable most of the time (based on the user's input parameters), so I am having to make my best guess as to which way performance will be better in the majority of cases.
Do you (or any of the other SQL gurus reading this) have any thoughts on the topic or best practices on how to write these queries when they get complicated, to keep performance from going way down?
Thanks in advance,
Anye
--
Anye Mercy
"Service Unavailable is not an Error" -- John, ENOM support
"You keep using that word. I do not think it means what you think it means." -- Inigo Montoya in "Princess Bride"
"Civilization exists by geologic consent, subject to change without notice." -- Will Durant
June 15, 2010 at 7:11 am
Anye,
Many thanks for your post.
It's news to me that the performance of a CTE query is based on the number of columns in the "Outer" query.
Rather, CTEs are generally very efficient as the processing is done with one pass of the data.
Certainly, a very quick investigation using SET STATISTICS IO does not raise any concerns.
For example, running:
SET STATISTICS IO ON
;WITH BookCTE (RowNumber, BookTitle, BookEdition, BookPublishDate, BookAuthor)
AS
(
SELECT
ROW_NUMBER() OVER (PARTITION BY BookTitle ORDER BY BookEdition DESC),
BookTitle, BookEdition, BookPublishDate, BookAuthor
FROM dbo.Books
)
SELECT BookTitle, BookEdition, BookPublishDate, BookAuthor
FROM BookCTE
WHERE RowNumber=1
ORDER BY BookTitle
...gives the following output:
Table 'Books'. Scan count 1, logical reads 2, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Running the same query but only returning a single column (e.g. BookTitle) yields the same IO results.
See if you can find the reference that stated this behaviour, or otherwise provide a setup that demonstrates it. Perhaps it occurs with very large tables, where the processing cannot be done fully in memory?
I'd obviously welcome other experts' views on this point.
Best regards,
Lawrence.
June 15, 2010 at 8:01 am
This article is a good example of how technical information can be written in a clear language.
Thanks for sharing this useful technique!!!
Roberto R.
June 15, 2010 at 8:07 am
Ok, I dug it up (or dug up something else that was related) and I apparently misread it -- the article I read says that the CTE is executed the number of times the CTE itself is referenced (i.e. # of joins) x the number of rows from the anchor.
His examples JOIN to the CTE multiple times and therein lies the performance hit, as opposed to # of columns from the CTE as I had previously believed.
However, it still leaves me curious as to why sometimes a simpler CTE with more joins in the outer query performs better than the other way around. I will play around with statistics on against some of my "hairier" queries and see what comes back.
--
Anye Mercy
"Service Unavailable is not an Error" -- John, ENOM support
"You keep using that word. I do not think it means what you think it means." -- Inigo Montoya in "Princess Bride"
"Civilization exists by geologic consent, subject to change without notice." -- Will Durant
June 15, 2010 at 8:12 am
Anye,
Thanks for posting a follow up.
It sounds like any useful findings you gather could form the basis for an interesting CTE article. 😉
With regards,
Lawrence.
June 15, 2010 at 8:46 am
Great article. I've used ROW_NUMBER() in CTE to evaluate (compare) data in the previous rows or next rows. It was a real life saver. 🙂
June 15, 2010 at 9:10 am
Regarding CTEs: a very powerful tool for segmentation, but you do have to be careful, as the engine will run the CTE query each time it's referenced. I would think, though I haven't explicitly observed it, that the engine would somehow optimize this behavior; since all the reads happen within the same transaction space, it ought to be impossible to get a different result when running a CTE for multiple inclusions.
I've written some monstrous inline table-valued functions that string 5 to 8 CTEs together, recalling the earlier CTEs in later segments, feeding prequalified data to work queries and paging CTEs and so on, only to find the query plan and overhead become monstrous. The showplan doesn't seem to indicate any savings for multiple references to a CTE; a question for Microsoft, I suppose.
I eventually stepped back from the giant CTE approach and started using table-valued variables; it's a different type of IO problem, but it seemed to be a more efficient solution than the monster CTE query.
When it comes to CTEs and finding the best balance between convenience and performance, you really have to try variations and compare results in the execution plan; as with so many SQL Server optimization topics, "it depends".
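To illustrate the table-variable approach with a minimal sketch (the table and column names are invented for the example), the intermediate result is materialized once and can then be read any number of times without re-running the underlying query:

```sql
DECLARE @Prequalified TABLE
(
    RowNumber INT,
    BookTitle VARCHAR(100),
    BookAuthor VARCHAR(100)
);

-- Populate once; unlike a CTE, later references do not re-run this query.
INSERT INTO @Prequalified (RowNumber, BookTitle, BookAuthor)
SELECT ROW_NUMBER() OVER (PARTITION BY BookAuthor ORDER BY BookTitle),
       BookTitle, BookAuthor
FROM dbo.Books;

-- Reference the materialized rows as often as needed.
SELECT BookTitle, BookAuthor FROM @Prequalified WHERE RowNumber = 1;
```

The trade-off, as noted, is the IO cost of writing the rows out once versus re-evaluating the CTE on every reference.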
Now, on to row numbering and paging; I wanted to chip in my 2 cents' worth.
One trick I've used to get a very tight grip on paging is to introduce a second ROW_NUMBER going in the opposite direction to the first, then summing the two to get a row count before page selection. It does introduce some additional overhead, and it can be significant, but if the benefit outweighs the cost it can be quite useful. The version below uses 0-based row and page indexes; the first page is 0, the first row is 0.
Note: removing the RN row number will significantly reduce overhead while continuing to allow you to use the page index functionality; you lose the row count and page count, but can still pull back a specific page in the sequence, accomplishing something like MySQL's LIMIT clause.
DECLARE @PageLimit INT, @PageIndex INT;
SELECT @PageLimit = 20, @PageIndex = 0;
SELECT
     [RowIndex]  = [IX]
    ,[RowCount]  = [IX] + [RN]
    ,[PageCount] = CEILING(1.0 * ([IX] + [RN]) / @PageLimit)
    ,[PageIndex] = FLOOR(1.0 * [IX] / @PageLimit)
    ...
FROM (
    SELECT
         [IX] = ROW_NUMBER() OVER (ORDER BY [foo] ASC) - 1
        ,[RN] = ROW_NUMBER() OVER (ORDER BY [foo] DESC)
        ...
) PageBase
WHERE FLOOR(1.0 * [IX] / @PageLimit) = @PageIndex;
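An alternative worth comparing (a sketch only, not tested against the same workload, and `SomeTable`/`[foo]` are placeholders): COUNT(*) OVER () returns the total row count on every row in a single window pass, which avoids the second, opposite-direction ROW_NUMBER and its extra sort:

```sql
DECLARE @PageLimit INT, @PageIndex INT;
SELECT @PageLimit = 20, @PageIndex = 0;

SELECT
     [RowIndex]  = [IX]
    ,[RowCount]  = [TotalRows]
    ,[PageCount] = CEILING(1.0 * [TotalRows] / @PageLimit)
    ,[PageIndex] = FLOOR(1.0 * [IX] / @PageLimit)
FROM (
    SELECT
         [IX]        = ROW_NUMBER() OVER (ORDER BY [foo] ASC) - 1
        ,[TotalRows] = COUNT(*) OVER ()   -- total rows without a second ordering
    FROM SomeTable   -- placeholder table name
) PageBase
WHERE FLOOR(1.0 * [IX] / @PageLimit) = @PageIndex;
```

For a given row i (0-based), IX + RN in the original works out to the total row count, so the two forms should return the same figures; only the plans differ.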
June 15, 2010 at 10:03 am
CTEs are generally poor performing. We have had to rewrite several for performance reasons. Your example breaks down when you have a table with a lot of rows, and takes over an order of magnitude longer to execute. You should not be doing any performance evaluation on such a small table. The CTE is the worst performing of the 3 methods I know of. You do not mention the third, which is joining a subquery back to the table.
select b.BookTitle, b.BookEdition, b.BookPublishDate, b.BookAuthor
from books b
inner join
(select booktitle, max(bookedition) as bookedition from books group by booktitle) q
on b.booktitle = q.booktitle and b.bookedition = q.bookedition;
I reinserted your data back into the table 17 times and updated the edition with the identity column:
Derived table:
Table 'Books'. Scan count 17, logical reads 15461
390 Milliseconds
Correlated subquery:
Table 'Books'. Scan count 17, logical reads 15461
400 Milliseconds
CTE:
Table 'Books'. Scan count 17, logical reads 21203
8103 Milliseconds!!!!!!!!!!
June 15, 2010 at 10:37 am
You can avoid a CTE by using a derived table to incorporate the ROW_NUMBER() function in the WHERE clause:
select * from
(select ROW_NUMBER() OVER (ORDER BY column_order_by) as RowNbr, col_1, col_2
from inner_table with (nolock)) as outer_table
where RowNbr < 5 -- use row number in the where clause
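The same derived-table pattern extends to fetching an arbitrary page by filtering on a row-number range (a sketch; the table and column names are the same placeholders as above):

```sql
DECLARE @PageSize INT, @PageNumber INT;
SELECT @PageSize = 20, @PageNumber = 3;

SELECT col_1, col_2
FROM
(SELECT ROW_NUMBER() OVER (ORDER BY column_order_by) AS RowNbr, col_1, col_2
 FROM inner_table) AS numbered
WHERE RowNbr BETWEEN (@PageNumber - 1) * @PageSize + 1
              AND @PageNumber * @PageSize;
```

Note that the row numbering here is 1-based, unlike the 0-based scheme in the earlier post.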
June 15, 2010 at 10:41 am
Thanks for your post Blah Baby.
It seems that the Row_Number() clause is the cause of the slow running, rather than the CTE itself.
(Worrying given the subject of my article was the former, and not the latter. 🙁 )
For example, the following is fast:
;WITH BookCTE (BookTitle, BookEdition)
AS
(
select booktitle, max(bookedition) as bookedition from books group by booktitle
)
SELECT b.BookTitle, b.BookEdition, b.BookPublishDate, b.BookAuthor
FROM dbo.Books b INNER JOIN BookCTE c ON b.BookTitle=c.BookTitle AND b.BookEdition=c.BookEdition
The issue of why ROW_NUMBER() is significantly slower as data volumes increase in this case is counter-intuitive to me, and needs additional research.
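One avenue worth investigating (an untested assumption on my part, not something from the article): ROW_NUMBER() with PARTITION BY needs the rows ordered by the partition and order columns, so an index matching that ordering may let the optimizer stream the rows and avoid an expensive sort as volumes grow:

```sql
-- Hypothetical supporting index: rows arrive already ordered by
-- (BookTitle, BookEdition DESC), so ROW_NUMBER() OVER (PARTITION BY BookTitle
-- ORDER BY BookEdition DESC) could be computed without a sort operator.
CREATE INDEX IX_Books_Title_Edition
ON dbo.Books (BookTitle, BookEdition DESC);
```

Checking the execution plan for a Sort operator before and after would confirm or refute this.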
Thanks again.
Lawrence.
June 15, 2010 at 3:18 pm
Lawrence,
Thank you for your answer.
I will have a look at the links you gave me.
Kind regards,
Martin