March 29, 2011 at 9:24 am
One thing that might be worth trying, though it rather depends on how large @DataSet is (I'm guessing it is 10,000 based on the Filter operator).
You appear to be using @DataSet and @m_PageID to pull out one page of records with the first subquery. Along the way you pull quite wide record sets for over 300,000 rows and then discard all but 10,000. Try using a much narrower query to filter just the key values into a temp table, which may be able to use relevant indexes. Then join to that temp table to select the wider recordset for only the appropriate rows, touching about 97% fewer rows in the base tables.
Assuming b.CombinedID is enough to identify the records, it would be something like:
; WITH cteFilter AS
(
    SELECT
        -- Subtract 1 before dividing so the first page holds a full @DataSet rows
        (((ROW_NUMBER() OVER (ORDER BY b.CombinedID) - 1) / @DataSet) + 1) AS UniquePageId,
        b.CombinedID
    FROM T_VM_GLACIER_BASE_FEEDS_FB b
    INNER JOIN T_VM_GLACIER_SWAPSIDE_FEEDS_FB s ON b.TaskId = s.TaskId
        AND b.BookName = s.BookName
        AND b.CombinedID = s.CombinedID
        AND b.RunVersion = s.RunVersion
    INNER JOIN #BaseMaxRunVersion bmrv ON s.TaskId = bmrv.TaskId
        AND s.BookName = bmrv.BookName
        AND s.RunVersion = bmrv.RunVersion
    WHERE b.TaskId = @TaskId
        AND b.MajorType <> 'STRSWAP'
)
SELECT CombinedID
INTO #CombinedIDsWeCareAbout
FROM cteFilter
WHERE UniquePageId = @m_PageId
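For that filter query to seek rather than scan, the base table wants an index keyed on the WHERE columns and covering the join columns. A hypothetical sketch (the column list is my assumption from the predicates above, not something checked against your schema):

```sql
-- Hypothetical covering index for the filter CTE; verify the column
-- choice against the actual schema and the execution plan.
CREATE NONCLUSTERED INDEX IX_BaseFeeds_TaskId_MajorType
    ON T_VM_GLACIER_BASE_FEEDS_FB (TaskId, MajorType)
    INCLUDE (BookName, CombinedID, RunVersion);
```

Check the actual plan afterwards; if the optimizer still scans, the INCLUDE list probably doesn't cover everything the query touches.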
Then add this into the first subquery...
FROM T_VM_GLACIER_BASE_FEEDS_FB b
INNER JOIN #CombinedIDsWeCareAbout c ON b.CombinedID = c.CombinedID
INNER JOIN T_VM_GLACIER_SWAPSIDE_FEEDS_FB s ON b.TaskId = s.TaskId
remove the (now redundant) row_number...
(SELECT
    --((ROW_NUMBER() OVER (ORDER BY b.CombinedID) / @DataSet) + 1) AS UniquePageId,
    b.CombinedID,
and the redundant where clause...
AND H.CombinedID = e.CombinedID
--WHERE UniquePageId = @m_PageId
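As an aside, it's worth seeing how the ROW_NUMBER paging arithmetic behaves. A toy example with a page size of 3 (subtracting 1 before the integer division keeps the first page full, which the straight `rn / @DataSet` form does not):

```sql
-- Toy illustration of the paging arithmetic, page size 3:
-- ((rn - 1) / 3) + 1 maps rows 1-3 to page 1, rows 4-6 to page 2, row 7 to page 3.
SELECT v.n,
       ((ROW_NUMBER() OVER (ORDER BY v.n) - 1) / 3) + 1 AS UniquePageId
FROM (VALUES (10), (20), (30), (40), (50), (60), (70)) AS v(n);
```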
March 31, 2011 at 1:04 am
Sorry for the late reply guys.
Paul, thanks for this idea. I'm trying it out and will let you know after testing.