How to increase SQL Server query performance for larger tables

  • I have a fact table that holds 65 GB of data, and clients are going to hit the database to retrieve a single record at a time.

    The table has 58 columns, and it has to return the result based on 6 input parameters. I expect almost 20 lakh (2 million) hits against this table every 24 hours. Please guide me with some valuable suggestions on what needs to be done to improve performance so that the result comes back in less than 300 milliseconds.

    I'm using SQL Server 2008 R2.

  • Indexes are your friend here. Also think about archiving old data from the table (stuff that hasn't been read/updated for some time, easily found if you have a datetime column dating the entry).

    Link to indexing: http://www.sqlservercentral.com/stairway/72399/
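    For example, assuming the six input parameters map to columns such as colA through colD below (purely illustrative names), a covering nonclustered index could support the singleton lookups with a seek:

    CREATE NONCLUSTERED INDEX IX_originalTable_lookup
        ON dbo.originalTable (colA, colB, colC, colD)   -- the filtered columns, most selective first
        INCLUDE (colE, colF);                           -- columns the query returns, to avoid key lookups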

    Example of moving old data:

    -- Copy the old rows into an archive table (SELECT ... INTO creates dbo.archiveTable)
    SELECT colA, colB, colC, colD
    INTO dbo.archiveTable
    FROM dbo.originalTable
    WHERE colD BETWEEN '<start_date>' AND '<end_date>';

    DECLARE @count1 INT;
    DECLARE @count2 INT;

    SET @count1 = ( SELECT COUNT(*) FROM dbo.originalTable WHERE colD BETWEEN '<start_date>' AND '<end_date>' );
    SET @count2 = ( SELECT COUNT(*) FROM dbo.archiveTable );

    -- Only remove rows from the original table once the counts match
    IF @count1 = @count2
    BEGIN
        PRINT 'Data archived successfully.';
        DELETE FROM dbo.originalTable WHERE colD BETWEEN '<start_date>' AND '<end_date>';
    END

    IF (@count1 <> @count2 OR @count2 IS NULL)
        PRINT 'Problem with archival, review and try again.';

    ---

    Note to developers:
    CAST(SUBSTRING(CAST(FLOOR(NULLIF(ISNULL(COALESCE(1,NULL),NULL),NULL)) AS CHAR(1)),1,1) AS INT) == 1
    So why complicate your code AND MAKE MY JOB HARDER??!

    Want to get the best help? Click here https://www.sqlservercentral.com/articles/forum-etiquette-how-to-post-datacode-on-a-forum-to-get-the-best-help (Jeff Moden)
    My blog: http://uksqldba.blogspot.com
    Visit http://www.DerekColley.co.uk to find out more about me.

  • Queries that return a single record can usually be tuned to a fare-thee-well. It just depends on the parameters being passed in and how well you can apply indexes.

    58 columns means that you're only going to get a few rows of data per 8 KB page on disk, so you really want to avoid scans.

    Without seeing the proposed queries and the data structure, I can't make any more recommendations than that.
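    If you want to verify whether a given lookup is seeking or scanning once the indexes are in place, something along these lines (table name, column names and values are placeholders) shows the logical reads and timings for the query:

    SET STATISTICS IO ON;    -- report logical reads per table
    SET STATISTICS TIME ON;  -- report CPU and elapsed time

    DECLARE @p1 INT = 42, @p2 INT = 7;   -- stand-ins for the six input parameters

    SELECT colA, colB, colC              -- in practice, list only the columns you need
    FROM dbo.FactTable                   -- hypothetical table name
    WHERE colA = @p1
      AND colB = @p2;

    SET STATISTICS IO OFF;
    SET STATISTICS TIME OFF;

    A seek on a suitable nonclustered index should show only a handful of logical reads; a scan of a 65 GB table will show millions.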

    "The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
    - Theodore Roosevelt

    Author of:
    SQL Server Execution Plans
    SQL Server Query Performance Tuning

  • Maybe you should provide more information:

    - Structure of table

    - Conditions to retrieve data (6 Input parameters)

    ...

    I have some recommendations here:

    - Do you have indexes on the columns which are used to retrieve the data?

    - Do you have partitions on this table?
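    As an illustration of the partitioning point, date-based partitioning of a fact table on SQL Server 2008 R2 (which requires Enterprise Edition) might be set up roughly like this; all object and column names here are hypothetical:

    -- Partition function: one boundary per year on a date column
    CREATE PARTITION FUNCTION pfFactDate (DATE)
        AS RANGE RIGHT FOR VALUES ('2011-01-01', '2012-01-01', '2013-01-01');

    -- Partition scheme: map every partition to PRIMARY for simplicity
    CREATE PARTITION SCHEME psFactDate
        AS PARTITION pfFactDate ALL TO ([PRIMARY]);

    -- The fact table (via its clustered index) is then created on the scheme
    CREATE TABLE dbo.FactSales
    (
        SaleDate DATE          NOT NULL,
        SaleKey  BIGINT        NOT NULL,
        Amount   DECIMAL(18,2) NULL
        -- ... remaining columns ...
    ) ON psFactDate (SaleDate);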

  • Hi gpm.alwyn,

    Along with the table structure, please provide the query which you are currently using.

    Regards,
    Ravi.

  • It seems kinda unusual to me to have a large "fact table" used so heavily for "singleton lookups", but as others have suggested, proper indexing is what is needed here. There shouldn't be any problem returning a single row using a nonclustered index seek and bookmark lookup in under 300ms, assuming your system (especially physical IO) is sufficiently beefy.

    However, I note that you may well need numerous NC indexes to guarantee index seeks for all potential singleton hits and that could really cause problems with your fact table load process, index maintenance, statistics updates, etc.
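    To make that concrete, guaranteeing seeks for different parameter combinations can end up meaning several nonclustered indexes with different leading columns (names below are purely illustrative):

    -- Each distinct lookup pattern may need its own index to guarantee a seek
    CREATE NONCLUSTERED INDEX IX_Fact_colA_colB ON dbo.FactTable (colA, colB) INCLUDE (colE);
    CREATE NONCLUSTERED INDEX IX_Fact_colC_colD ON dbo.FactTable (colC, colD) INCLUDE (colE);
    CREATE NONCLUSTERED INDEX IX_Fact_colF      ON dbo.FactTable (colF)       INCLUDE (colE);

    Every additional index adds write cost to the fact table load and more work for index maintenance and statistics updates, so there is a balance to strike.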

    Best,
    Kevin G. Boles
    SQL Server Consultant
    SQL MVP 2007-2012
    TheSQLGuru on googles mail service
