March 11, 2009 at 12:21 am
I have installed SQL Server 2005 (Standard Edition) with SP2 on a Windows Server 2003 SP2 machine. It has 4 GB of RAM and 8 processors.
Sometimes during query execution the CPU utilization goes to 100% and never comes back down until I restart SQL Server.
If I execute the stored procedure directly it works well and returns results in 10-30 seconds (around 5 million records).
When I execute the stored procedure through SqlHelper and .NET code, SQL Server is unable to return the complete results and the query times out.
I need to generate a report from the data fields that I am getting from the SP.
I process all the records in the .NET code and build the report results accordingly.
What other alternatives can I use to calculate the report?
Any prompt help will be appreciated.
March 11, 2009 at 1:55 am
I don't think you really want to process 5 million rows on the client side, especially if you're building a report. Store the results of your SP in a temp table and play with it on the database side to return the aggregated results to your .NET application.
The query doesn't just need to be executed; the results also need to be returned to the client, and that is a high-cost operation when we're talking about 5 million rows.
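A minimal sketch of that approach, with hypothetical table and column names (the temp table's columns must match the SP's result set exactly):

```sql
-- Hypothetical names throughout: replace #ReportData and the columns with your own.
-- 1. Capture the SP output once into a temp table on the server.
CREATE TABLE #ReportData (field1 int, field16 money /* ...remaining SP columns... */)

INSERT INTO #ReportData
EXEC dbo.GetReport @whereClause = '', @orderBy = NULL

-- 2. Aggregate on the server and return only the summary rows to .NET.
SELECT field1, COUNT(*) AS RecordCount, SUM(field16) AS Field16Total
FROM #ReportData
GROUP BY field1
```

This way only the aggregated rows cross the wire instead of 5 million detail rows.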
Gianluca
-- Gianluca Sartori
March 11, 2009 at 2:12 am
I thought about this option, but I have the following questions in my mind.
If I use a temp table, first I need to select the 5 million records from the source table and insert them into the temp table. After that I need to fetch each record from the temp table and apply my logic to it.
What if, instead of using a temp table, I use a SQL cursor to manipulate the data?
Which of the following operations costs the most?
1. Returning results to the .NET code
2. Using a temp table
3. Using cursors
March 11, 2009 at 2:25 am
If you're building a report you should be working with a reasonable set of records, and 5 million rows is far from reasonable, so you need to cut it down with aggregation.
Looping with cursors is the slowest possible solution, both inside SQL Server and in .NET.
You're getting five million rows from a stored procedure call, so maybe you should change the way you query the database: build a new stored procedure that returns the aggregated data you need, limited to a reasonable number of records to work with in a report.
If you can't build your own stored procedure (security restrictions?), put the data in a temp table, create the appropriate indexes and aggregate the data: it will always be faster than looping with a cursor. Always remember that the database is smarter than you at sorting, aggregating and playing around with data.
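Sketched in T-SQL, assuming the SP results have already been captured in a temp table (all names here are hypothetical):

```sql
-- Assuming #ReportData already holds the SP output.
-- An index on the grouping column helps the aggregate avoid a full scan/sort.
CREATE INDEX IX_ReportData_field1 ON #ReportData (field1)

-- Let the engine do the aggregation instead of looping row by row.
SELECT field1,
       COUNT(*) AS RecordCount,
       SUM(CASE WHEN field12 = 3 THEN 1 ELSE 0 END) AS Field12Matches
FROM #ReportData
GROUP BY field1
```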
Regards
Gianluca
-- Gianluca Sartori
March 11, 2009 at 2:28 am
Hi
It's difficult to tell without knowing exactly what you are trying to do. With the info given, I think using temp tables might be the best of the three options.
Using a cursor or processing 5 million records on the client is generally not done.
"Keep Trying"
March 11, 2009 at 2:31 am
The SP that I am using:
CREATE PROCEDURE [dbo].[GetReport]
(
    @whereClause varchar(2000),
    @orderBy varchar(250) = NULL
)
AS
SET NOCOUNT ON

DECLARE @sqlWhere AS varchar(2000)
SET @sqlWhere = ''
IF @whereClause <> ''
    SET @sqlWhere = ' WHERE ' + @whereClause
IF @orderBy IS NOT NULL AND LEN(@orderBy) > 0
    SET @sqlWhere = @sqlWhere + ' ORDER BY ' + @orderBy

-- Every column in a CTE must have a unique name, so the CASE
-- expressions and constant columns carry aliases.
EXEC('WITH Report AS
(
    SELECT field1, field2, field3, field4, field5, field6, field7,
        field8, field9, field10, field11,
        CASE field12 WHEN 3 THEN CAST(1 AS BIT) ELSE CAST(0 AS BIT) END AS field12Flag,
        CASE field13 WHEN 3 THEN CAST(1 AS BIT) ELSE CAST(0 AS BIT) END AS field13Flag,
        CASE field14 WHEN 3 THEN CAST(1 AS BIT) ELSE CAST(0 AS BIT) END AS field14Flag,
        CASE field15 WHEN 3 THEN CAST(1 AS BIT) ELSE CAST(0 AS BIT) END AS field15Flag,
        field16, field17, field18, field19,
        0 AS zeroValue, field18 AS field18Copy,
        CAST(0 AS BIT) AS zeroBit, field19 AS field19Copy
    FROM Table1
    INNER JOIN Table2 ON Table1FK = Table2PK
    INNER JOIN Table3 ON Table2FK = Table3PK
    LEFT OUTER JOIN Table4 ON Table1PK = Table4FK
    LEFT OUTER JOIN Table5 ON Table1PK = Table5PK
)
SELECT * FROM Report' + @sqlWhere)
RETURN @@ERROR
March 11, 2009 at 2:35 am
The SP returns the aggregate data and I am calling the SP 5 million times. One call returns the whole data to the code and the code does the remaining processing.
March 11, 2009 at 2:35 am
If you can alter the stored procedure code, aggregate the data with the appropriate group by clause, in order to get the data formatted the same way you need to display it in your report.
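For example, the final SELECT in the procedure could aggregate instead of returning raw rows. The grouping and aggregate columns below are hypothetical; pick the ones your report actually displays:

```sql
-- Instead of: SELECT * FROM Report
-- return one summary row per group, shaped like the report itself.
SELECT field1, field2,
       COUNT(*) AS RecordCount,
       SUM(field16) AS Field16Total
FROM Report
GROUP BY field1, field2
```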
-- Gianluca Sartori
March 11, 2009 at 2:42 am
Babar Javaid (3/11/2009)
The SP returns the aggregate data and I am calling the SP 5 million times. One call returns the whole data to the code and the code does the remaining processing.
You mean you're calling the SP 5 million times in a loop? That doesn't sound very efficient.
I suggest moving your code into the SP and pushing back to the client only the rows you need to display.
-- Gianluca Sartori
March 11, 2009 at 3:00 am
I am calling the SP just one time to get the complete result set.
March 11, 2009 at 3:04 am
Your result set is too big. Work on the server side.
-- Gianluca Sartori
March 11, 2009 at 5:44 am
5 million rows is not a report, that's a data move. Generally reports are much, much smaller.
Since you're using a completely ad hoc mechanism for the query, it's going to be difficult to suggest tuning opportunities. Also, as currently configured, you're exposed to a SQL injection attack. For an example, look up the phrase "Bobby Tables" on Google.
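One common mitigation, sketched here with hypothetical table and column names, is to pass values through sp_executesql parameters instead of concatenating a caller-supplied WHERE clause:

```sql
-- Only the parameter VALUES travel separately from the statement text;
-- the column list and predicate shape are fixed, so a caller can no
-- longer inject arbitrary SQL through the filter.
DECLARE @sql nvarchar(2000)
SET @sql = N'SELECT field1, field16 FROM Table1 WHERE field1 = @p1'
EXEC sp_executesql @sql, N'@p1 int', @p1 = 42
```

This does mean giving up the fully free-form @whereClause in favor of a known set of filterable columns, which is usually the right trade for a report.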
If you posted an actual execution plan, more specific recommendations could be made.
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
- Theodore Roosevelt
Author of:
SQL Server Execution Plans
SQL Server Query Performance Tuning
March 12, 2009 at 5:19 am
What other ways can I use to generate reports for 5-10 million records?
March 12, 2009 at 5:30 am
The thing is, no one reads 5 million records. No one reads 5000 records. Very few people read 500. A report should be a succinct display of interesting data, usually 1-3 pages. More than that is simply a data feed that you're going to use elsewhere to generate a report.
Most of the time people want to know that they have 5 million records, and then they look at the summaries. It's the summary they need, and knowing that it came from 5 million records is just a warm & fuzzy for them.
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
- Theodore Roosevelt
Author of:
SQL Server Execution Plans
SQL Server Query Performance Tuning