Index key with a maximum length of 1004 bytes

  • The statement below takes 2 minutes to execute (table scan).

    So I try adding a non-clustered index on all the fields in the SELECT, but I get this error:

    Adding the selected columns will result in an index key with a maximum length of 1004 bytes. The maximum permissible index length is 900 bytes - inserts/updates will fail.

    If I can't make a big composite index for this, is there a better way to speed this up?

    INSERT INTO #events
    SELECT logID, companyID, departmentID, companyCRN, departmentCRN, eventValue, state, county,
           orderID, orderNo, userID, event, eventValue, source, 0, 0, 0, 0, NULL, NULL, NULL, NULL, NULL,
           company, department, titleUnit
    FROM eventItems (NOLOCK)
    WHERE eventTime BETWEEN @fromDate AND @thruDate
      AND (@companyCRN IS NULL OR companyCRN = @companyCRN)
      AND ((@companyCRN IS NULL OR @deptCRN IS NULL) OR (companyCRN = @companyCRN AND departmentCRN = @deptCRN))
      AND (@fromCoCRN IS NULL OR companyCRN >= @fromCoCRN) -- Enforce company CRN ranges, if applicable
      AND (@thruCoCRN IS NULL OR companyCRN <= @thruCoCRN)
      AND pi2Status IN ('', 'NoRecordsFound', 'Success', 'SuccessWithWarnings')
      AND event LIKE 'Search-%' AND event <> 'Search-Image'
      AND source <> 'TPX'

  • INCLUDE columns?

    Only the columns in the WHERE clause need to be in the index key. The rest can be INCLUDE columns, and there's no limit on either the size or the number of those.

    You can create an index with a maximum key size over 900 bytes; that was a warning, not an error. However, should a row be inserted where the combined key columns actually do exceed 900 bytes, the insert will fail.
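
    For illustration, a covering index built on that pattern could look something like this (the index name, the dbo schema and the key column order are just placeholders - adjust them to your table and data):

    -- Sketch only: WHERE columns in the key, remaining SELECT columns as INCLUDEs,
    -- so the query is covered without the whole SELECT list counting against the
    -- 900-byte key limit. With this query shape only the leading eventTime range
    -- is really seekable (see the catch-all discussion below).
    CREATE NONCLUSTERED INDEX IX_eventItems_eventTime_covering
    ON dbo.eventItems (eventTime, companyCRN, departmentCRN, event, source, pi2Status)
    INCLUDE (logID, companyID, departmentID, eventValue, state, county,
             orderID, orderNo, userID, company, department, titleUnit);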

    That said, with the structure of the query as it is, any index is probably a waste of time. The catch-all query you have there (multiple WHERE clause predicates of the form (Col = @Var OR @Var IS NULL)) does not and will not perform well, regardless of indexes. There are two reasons. First, the execution plan is completely unstable: depending on the parameter values, the optimal indexes and operators change completely. Secondly, in cases like this the optimiser is designed to find a plan that will always work, regardless of parameter values. The downside of that is that it will never run very fast.

    Your best bet is to build up a SQL statement based on what parameters are actually passed, and then use sp_executesql to run it, passing in those parameters.
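
    For illustration only, a minimal sketch of that approach using the parameters from your query (the SELECT list is abbreviated and the parameter data types are guesses - substitute your real ones):

    -- Build the WHERE clause from the parameters that were actually supplied,
    -- then run it parameterised via sp_executesql so plans can still be reused.
    DECLARE @sql nvarchar(max);

    SET @sql = N'SELECT logID, companyCRN, departmentCRN, event, eventValue, source
                 FROM dbo.eventItems
                 WHERE eventTime BETWEEN @fromDate AND @thruDate
                 AND pi2Status IN ('''', ''NoRecordsFound'', ''Success'', ''SuccessWithWarnings'')
                 AND event LIKE ''Search-%'' AND event <> ''Search-Image''
                 AND source <> ''TPX''';

    IF @companyCRN IS NOT NULL
        SET @sql = @sql + N' AND companyCRN = @companyCRN';

    IF @companyCRN IS NOT NULL AND @deptCRN IS NOT NULL
        SET @sql = @sql + N' AND departmentCRN = @deptCRN';

    IF @fromCoCRN IS NOT NULL
        SET @sql = @sql + N' AND companyCRN >= @fromCoCRN';

    IF @thruCoCRN IS NOT NULL
        SET @sql = @sql + N' AND companyCRN <= @thruCoCRN';

    -- Parameters that don't appear in the statement are allowed; pass them all every time.
    EXEC sp_executesql @sql,
        N'@fromDate datetime, @thruDate datetime, @companyCRN varchar(50),
          @deptCRN varchar(50), @fromCoCRN varchar(50), @thruCoCRN varchar(50)',
        @fromDate, @thruDate, @companyCRN, @deptCRN, @fromCoCRN, @thruCoCRN;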

    I've replied to several threads on this. Perhaps a search on 'catch-all queries' will dig one up. Or, you can read through this - http://www.sommarskog.se/dyn-search.html

    Gail Shaw
    Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
    SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability

    We walk in the dark places no others will enter
    We stand on the bridge and no one may pass
  • Geez, long night. I meant the WHERE clause...

    Thanks Gail.
