How to have a 'text' type column included in an index

My text [type] column is not in any joins or WHERE clauses, but it is one of the fields in the SELECT list. My query is horribly slow. Any suggestions on how to get the column included in an index, given that I can't specify it in the "include" columns?

    Not sure what other info you guys will want, but I'm guessing you'll ask 🙂

SQL Server index keys are constrained to a maximum width of 900 bytes, so at best you could potentially include *part* of that text column.

    Typically, when you have to search a text/varchar(max) column, Full Text Indexing is the way to go.

    Can you show us an example of the offendingly slow query?
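A minimal Full Text setup might look like the sketch below. The table, column, and index names here are placeholders, not from the original post, and the key index must be a unique, single-column, non-nullable index:

```sql
-- Sketch only: assumes a table dbo.Documents with a text column Body
-- and a unique, single-column, non-nullable key index PK_Documents.
CREATE FULLTEXT CATALOG DocumentCatalog AS DEFAULT;
GO
CREATE FULLTEXT INDEX ON dbo.Documents (Body)
    KEY INDEX PK_Documents;
GO
-- Searches then use CONTAINS/FREETEXT instead of LIKE '%...%':
SELECT DocumentID
FROM dbo.Documents
WHERE CONTAINS(Body, 'refund');
```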

    Lowell


--Help us help you! If you post a question, make sure you include a CREATE TABLE... statement and an INSERT INTO... statement to give the volunteers here representative data. With your description of the problem, we can provide a tested, verifiable solution to your question! Asking the question the right way gets you a tested answer the fastest way possible!

  • Wow, thanks for the quick reply Lowell!

I don't have to search by that field; I just need to include it in the SELECT statement. Does your answer change?

  • Second the nomination for Full Text Indexing. I have seen tremendous improvements when using it.

    Steve Jimmo
    Sr DBA
    “If we ever forget that we are One Nation Under God, then we will be a Nation gone under." - Ronald Reagan

OK, I see what you are saying: you have a slow query, and at least a portion of the slowness is probably attributable to grabbing lots of rows with a large text field.

    Ideally, you should post the actual .sqlplan execution plan for us to look at;

    that's where we can really see things that might help. We could possibly identify missing indexes, better joins, missing SARGable arguments, out-of-date statistics, and so much more.

    Can you post that execution plan as an attachment here?
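One common way to capture the actual plan as XML (which SSMS can then save as a .sqlplan file) is sketched below; the query itself is elided:

```sql
-- Sketch: emit the actual execution plan as XML alongside the results
SET STATISTICS XML ON;
GO
-- ...run the slow query here...
GO
SET STATISTICS XML OFF;
```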

    Lowell


  • Here's the exec plan.

    Edit: By the way, my 'text' field is called "CorrespondenceFO".

    Here's my select statement:

USE [IARTS]
    GO

    DECLARE @getdate2 datetime,
            @Legal int,
            @StatusFailed int

    SET @getdate2 = GETDATE()
    SET @Legal = IARTS.dbo.cdv('CustNameType', 'Legal')
    SET @StatusFailed = 3

    SELECT TOP 1
        cl.CorrespondenceLogID,
        ct.LookupValue AS Template,
        cl.Subject,
        (SELECT IDNumber FROM IADS..ERMSGetCustomerInformation(c.CustomerNumber)) AS [DLNumber],
        cp.Cust_No AS CustomerNumber,
        CASE
            WHEN c.customerTypeID = -2 THEN c.CustomerName
            ELSE c.CustomerFirstName + ' ' + c.CustomerLastName
        END AS CustomerName,
        csd.CustomerEventID,
        REPLACE(CONVERT(varchar, cl.createdDate, 101), '/', '-') AS CreateDate,
        REPLACE(CONVERT(varchar, @getdate2, 101), '/', '-') AS DateFiled,
        cl.correspondenceFO
    FROM CorrespondenceLog cl
    INNER JOIN CorrespondenceTemplate ct ON cl.correspondenceTemplateID = ct.correspondenceTemplateID
    INNER JOIN CorrespondenceParty cp ON cl.correspondenceLogID = cp.correspondenceLogID
    INNER JOIN cust_name cn ON cp.cust_no = cn.cust_no
        AND cn.cust_name_type_id = @Legal
        AND cn.end_dt IS NULL
    INNER JOIN cspeed c ON cp.cust_no = c.CustomerNumber
    LEFT OUTER JOIN IADS..CustomerSanctionDetailNoticeLog sanctionLog ON sanctionLog.correspondenceLogID = cl.correspondenceLogID
        AND sanctionLog.sanctionNotice = 1
    LEFT OUTER JOIN IADS..CustomerSanctionDetailNoticeLog LiftLog ON LiftLog.correspondenceLogID = cl.correspondenceLogID
        AND LiftLog.liftNotice = 1
    LEFT OUTER JOIN IADS..CustomerSanctionDetail csd ON (LiftLog.customerSanctionDetailID = csd.customerSanctionDetailID OR sanctionLog.customerSanctionDetailID = csd.customerSanctionDetailID)
    WHERE cl.Void = 0
        AND (cl.ERMSStatus IS NULL OR cl.ERMSStatus = @StatusFailed)
        AND ct.ERMSDocument = 1
    ORDER BY cl.CorrespondenceLogID
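For what it's worth, a supporting index along the lines sketched below might help the WHERE/ORDER BY, though the index name and column choices are guesses that would need checking against the actual plan, and the filtered-index syntax requires SQL Server 2008 or later:

```sql
-- Hypothetical index; name and columns are assumptions based on the query above.
-- correspondenceFO itself is a text column, so it cannot be an INCLUDE column.
CREATE INDEX IX_CorrespondenceLog_ERMS
    ON dbo.CorrespondenceLog (CorrespondenceLogID)
    INCLUDE (CorrespondenceTemplateID, Subject, CreatedDate, ERMSStatus)
    WHERE Void = 0;
```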

"text" datatype fields aren't actually stored in the clustered index; they are stored off-row, in separate LOB pages. So, no, they really can't be part of an index, even if they are small enough to fit inside one. (Note that "Include" columns can extend an index past the 900-byte mark, since they aren't part of the index key. That doesn't change the storage issue with "text" and "ntext".)

Try this:

    USE ProofOfConcept;
    GO

    CREATE TABLE dbo.IndexTest (
        ID INT IDENTITY PRIMARY KEY,
        Col1 DATETIME,
        Col2 VARCHAR(8000));
    GO

    CREATE INDEX IDX_IndexTest ON dbo.IndexTest (Col1) INCLUDE (Col2);
    GO

    INSERT INTO dbo.IndexTest (Col1, Col2)
    VALUES (GETDATE(), REPLICATE('a', 8000));
    GO

    CREATE INDEX IDX_IndexTest2 ON dbo.IndexTest (Col1, Col2);
    GO

    DROP TABLE dbo.IndexTest;

    You'll get no error for the first index and insert, but the second index will fail on creation because there's data in there that it can't allow (more than 900 bytes).

    You can even Include a varchar(max) column in an index.

    But you can't include a text or ntext column, because of how they're stored.
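A quick way to see that difference is sketched below; the table and index names are made up for illustration:

```sql
-- Sketch: INCLUDE accepts varchar(max), but rejects text/ntext/image
CREATE TABLE dbo.LobTest (
    ID INT IDENTITY PRIMARY KEY,
    Col1 DATETIME,
    MaxCol VARCHAR(MAX),
    TextCol TEXT);
GO
CREATE INDEX IDX_LobTest_Max ON dbo.LobTest (Col1) INCLUDE (MaxCol);   -- succeeds
GO
CREATE INDEX IDX_LobTest_Text ON dbo.LobTest (Col1) INCLUDE (TextCol); -- fails
GO
DROP TABLE dbo.LobTest;
```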

    - Gus "GSquared", RSVP, OODA, MAP, NMVP, FAQ, SAT, SQL, DNA, RNA, UOI, IOU, AM, PM, AD, BC, BCE, USA, UN, CF, ROFL, LOL, ETC
    Property of The Thread

    "Nobody knows the age of the human race, but everyone agrees it's old enough to know better." - Anon

Great, Greg...can you post an Actual execution plan? That's where a few more details will help as well.

    Lowell


Oh boy! I don't know which index did it, but I got it to do my Top 5000 in about 4 minutes, whereas in previous attempts I had to kill the query after 20 or 30 minutes.

  • Lowell (6/10/2011)


Great, Greg...can you post an Actual execution plan? That's where a few more details will help as well.

    I've got it running now in ~4 minutes....but I'll attach it anyway.

OK, one improvement I see is some out-of-date statistics on the table CorrespondenceLog:

    the estimated plan expected 4 rows, but the actual execution ended up with almost 12 million.

    Can you run

    UPDATE STATISTICS CorrespondenceLog WITH FULLSCAN
    UPDATE STATISTICS CorrespondenceTemplate WITH FULLSCAN

    and see what impact that has?

    [edit] the statistics on CorrespondenceTemplate look bad as well
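As a side note, one way to check how stale the statistics are is sketched below (assumes SQL Server 2005 or later):

```sql
-- Sketch: last-updated date for each statistics object on the table
SELECT name, STATS_DATE(object_id, stats_id) AS LastUpdated
FROM sys.stats
WHERE object_id = OBJECT_ID('dbo.CorrespondenceLog');
```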

    Lowell


  • Lowell (6/10/2011)


OK, one improvement I see is some out-of-date statistics on the table CorrespondenceLog:

    the estimated plan expected 4 rows, but the actual execution ended up with almost 12 million.

    Can you run

    UPDATE STATISTICS CorrespondenceLog WITH FULLSCAN
    UPDATE STATISTICS CorrespondenceTemplate WITH FULLSCAN

    and see what impact that has?

    [edit] the statistics on CorrespondenceTemplate look bad as well

Will do. I'm guessing this is attributable to doing a full restore from the production database yesterday, although I'm not sure why it thought there should only be 4 rows in that table when there have ALWAYS been over 10 million...
