June 14, 2010 at 9:52 am
Hi All,
The attached query runs for more than 5 minutes and then fails with the error below. I looked at the query execution plan and found that the index scan on the h_isActive column accounts for 50% of the cost.
I have created a non-clustered index on this column, but it made no difference. Please help me tune this query. I have attached the query, the table structure and the execution plan.
FYI, the SalesFact table contains 13 M rows, the Availsfact table contains 17 M rows and timepart contains only 10 rows.
Error:
====
An error occurred while executing batch. Error message is: Exception of type 'System.OutOfMemoryException' was thrown.
Regards
Raj
June 14, 2010 at 3:01 pm
Try this index to start:
SalesFact (h_IsActive) INCLUDE (sk_SalesFact,sk_BroadcastDate,sk_SpotScheduledStartTime,sk_SpotActualStartTime,sk_SpotStatus)
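Written out in full, that would be something like the following (the index name and the dbo schema are just placeholders, so adjust them to your own conventions):

CREATE NONCLUSTERED INDEX IX_SalesFact_h_IsActive   -- placeholder name
ON dbo.SalesFact (h_IsActive)
INCLUDE (sk_SalesFact, sk_BroadcastDate, sk_SpotScheduledStartTime,
         sk_SpotActualStartTime, sk_SpotStatus);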
You seem to have a cross join between the timepart table and the date dimension. There are 20,090 rows in the date dimension; after the 'join' to timepart, there are almost 200,000. Is that intentional?
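If it is not intentional, check the FROM clause for a join with no predicate between those two tables. The table and column names below are only examples (the real query is in the attachment), but the pattern looks like this:

SELECT COUNT(*)                        -- row count blows up because there is no join predicate
FROM dbo.DimDate   AS d                -- example name for the date dimension
     CROSS JOIN dbo.TimePart AS tp;    -- every date row is repeated once per timepart row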
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
June 16, 2010 at 2:32 am
moosamca (6/14/2010)
Hi All, The attached query runs for more than 5 minutes and then fails with the error below. I looked at the query execution plan and found that the index scan on the h_isActive column accounts for 50% of the cost.
I have created a non-clustered index on this column, but it made no difference. Please help me tune this query. I have attached the query, the table structure and the execution plan.
FYI, the SalesFact table contains 13 M rows, the Availsfact table contains 17 M rows and timepart contains only 10 rows.
Error:
====
An error occurred while executing batch. Error message is: Exception of type 'System.OutOfMemoryException' was thrown.
Regards
Raj
This error comes from SSMS when you try to return a very large number of rows to the results grid.
It is not a problem with SQL Server - it is only a limitation of SSMS.
According to the estimated query plan, your query would return over 43 million rows.
What are you planning to do with all this data?
If you want to create a file with this data, you should use an SSIS package or the Export Data wizard. You could even use the Results to File option in SSMS.
If you are just trying to optimize the query, you have to run it in a way that avoids returning massive amounts of data to SSMS.
You could, for example, use SELECT INTO.
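For example (the column list here is taken from the INCLUDE list above and the target table name and filter are just examples, so substitute the real query):

SELECT sf.sk_SalesFact, sf.sk_BroadcastDate, sf.sk_SpotStatus   -- your real column list goes here
INTO   dbo.SalesFact_TuningTest        -- new table; it must not already exist
FROM   dbo.SalesFact AS sf
WHERE  sf.h_IsActive = 1;              -- example filter; use the original WHERE clause

DROP TABLE dbo.SalesFact_TuningTest;   -- clean up when you are done

This runs the whole query on the server and still gives you the actual execution plan, but no rows are sent back to the SSMS grid, so the out-of-memory error does not occur.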
/SG