June 20, 2012 at 11:46 pm
I have a fact table that holds 65 GB of data, and clients are going to hit the database to retrieve a single record at a time.
The table has 58 columns and the result has to be returned based on 6 input parameters. I expect almost 20 lakh (about 2 million) hits against this table every 24 hours. Please guide me with some suggestions on what needs to be done so the result comes back in less than 300 milliseconds.
I'm using SQL Server 2008 R2.
June 21, 2012 at 1:00 am
Indexes are your friend here. Also think about archiving old data from the table (stuff that hasn't been read/updated for some time, easily found if you have a datetime column dating the entry).
Link to indexing: http://www.sqlservercentral.com/stairway/72399/
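As a starting point (just a sketch with made-up names, since we haven't seen your schema), a nonclustered index keyed on the six parameter columns lets the lookup seek instead of scan:
-- Hypothetical names: dbo.FactTable and p1..p6 stand in for your real table and columns.
CREATE NONCLUSTERED INDEX IX_FactTable_LookupParams
ON dbo.FactTable (p1, p2, p3, p4, p5, p6);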
Example of moving old data:
SELECT colA, colB, colC, colD
INTO dbo.archiveTable
FROM dbo.originalTable
WHERE colD BETWEEN '<start_date>' AND '<end_date>';

DECLARE @count1 INT;
DECLARE @count2 INT;

-- The counts should match, since dbo.archiveTable was just built from the same date range.
SET @count1 = ( SELECT COUNT(*) FROM dbo.originalTable WHERE colD BETWEEN '<start_date>' AND '<end_date>' );
SET @count2 = ( SELECT COUNT(*) FROM dbo.archiveTable );

IF @count1 = @count2
BEGIN
    PRINT 'Data archived successfully.';
    DELETE FROM dbo.originalTable WHERE colD BETWEEN '<start_date>' AND '<end_date>';
END

IF @count1 <> @count2 OR @count2 IS NULL
    PRINT 'Problem with archival, review and try again.';
---
Note to developers: Want to get the best help? Click here https://www.sqlservercentral.com/articles/forum-etiquette-how-to-post-datacode-on-a-forum-to-get-the-best-help (Jeff Moden)
My blog: http://uksqldba.blogspot.com
Visit http://www.DerekColley.co.uk to find out more about me.
June 21, 2012 at 6:26 am
Queries that return a single record can usually be tuned to a fare-thee-well. It just depends on the parameters being passed in and how well you can apply indexes.
With 58 columns you're only going to get a few rows of data per 8 KB data page, so you really want to avoid scans.
Without seeing the proposed queries and the data structure, I can't make any more recommendations than that.
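Purely as an illustration (the table, column, and parameter names below are invented, not your schema), a covering nonclustered index plus a parameterized lookup keeps the whole thing to a single index seek, so the wide base rows never get scanned:
-- Invented names; substitute your real six parameters and output columns.
CREATE NONCLUSTERED INDEX IX_FactTable_Covering
ON dbo.FactTable (p1, p2, p3, p4, p5, p6)
INCLUDE (resultColA, resultColB); -- cover the columns the query actually returns
GO
CREATE PROCEDURE dbo.GetFactRow
    @p1 INT, @p2 INT, @p3 INT, @p4 INT, @p5 INT, @p6 INT
AS
SELECT resultColA, resultColB
FROM dbo.FactTable
WHERE p1 = @p1 AND p2 = @p2 AND p3 = @p3
  AND p4 = @p4 AND p5 = @p5 AND p6 = @p6;
GO
Check the actual execution plan to confirm you're getting an index seek rather than a scan or a key lookup.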
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
- Theodore Roosevelt
Author of:
SQL Server Execution Plans
SQL Server Query Performance Tuning
June 21, 2012 at 8:04 am
Maybe you should provide more information:
- Structure of table
- Conditions to retrieve data (6 Input parameters)
...
A couple of questions/recommendations:
- Do you have indexes on the columns used in those retrieval conditions?
- Do you have partitioning on this table? (A rough sketch of what that could look like follows.)
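For example (a rough sketch with placeholder names and boundary dates, since the real schema isn't posted), partitioning by a date column might look like this:
-- Placeholder names and boundaries; adjust to your date column and data ranges.
CREATE PARTITION FUNCTION pfFactByMonth (DATETIME)
AS RANGE RIGHT FOR VALUES ('2012-01-01', '2012-02-01', '2012-03-01');

CREATE PARTITION SCHEME psFactByMonth
AS PARTITION pfFactByMonth ALL TO ([PRIMARY]);

-- Rebuild the clustered index on the scheme so the table is stored partitioned.
CREATE CLUSTERED INDEX CIX_FactTable_Date
ON dbo.FactTable (entryDate)
ON psFactByMonth (entryDate);
Note that table partitioning requires Enterprise Edition on SQL Server 2008 R2.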
June 21, 2012 at 11:21 am
Hi gpm.alwyn,
Along with the table structure, please also post the query you are currently using.
Regards,
Ravi.
June 22, 2012 at 8:59 am
Seems kinda unusual to me to have a large "fact table" used so heavily for "singleton lookups", but as others have suggested, proper indexing is what is needed here. There shouldn't be any problem returning a single row via a nonclustered index seek and bookmark lookup in under 300 ms, assuming your system (especially physical IO) is sufficiently beefy.
However, note that you may well need numerous nonclustered indexes to guarantee index seeks for all potential singleton hits, and that could really cause problems with your fact table load process, index maintenance, statistics updates, etc.
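If you do end up carrying several nonclustered indexes, one way to check whether each one earns its keep (a sketch only; run it in the database in question, and the table name here is hypothetical) is the index usage DMV:
-- Compare how often each index is read (seeks/scans/lookups)
-- against how often it has to be maintained (updates).
SELECT i.name AS index_name,
       s.user_seeks, s.user_scans, s.user_lookups, s.user_updates
FROM sys.dm_db_index_usage_stats AS s
JOIN sys.indexes AS i
  ON i.object_id = s.object_id
 AND i.index_id = s.index_id
WHERE s.database_id = DB_ID()
  AND s.object_id = OBJECT_ID('dbo.FactTable') -- hypothetical table name
ORDER BY s.user_updates DESC;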
Best,
Kevin G. Boles
SQL Server Consultant
SQL MVP 2007-2012
TheSQLGuru on googles mail service