Making my query use fewer resources.

  • My question is about which approach is more efficient in terms of resource use. In the first part I store the IDs in a temp table and count from that, so that the page shows an accurate count. The paging query on its own gave an inaccurate count, which is why I had to put the IDs in a temp table. After that, the paging query runs again to return the rows that are displayed on the page.

    -- To maintain an accurate count, store the IDs in a temp table and retrieve the count from that
    DECLARE @IDTable TABLE
    (
        QID int
        ,DateCreated smalldatetime
    );

    INSERT INTO @IDTable (QID, DateCreated)
    SELECT M.QID, Q.DateCreated
    FROM tbl_Q_Tags_Map M
    JOIN tbl_Q Q ON M.QID = Q.QID
    WHERE TagID IN (SELECT Token FROM @TokenTable)
        AND Q.[Language] = @Language
        AND Q.StatusID = 60
    GROUP BY M.QID, Q.DateCreated
    HAVING COUNT(DISTINCT TagID) = @TokenCount;

    SELECT @QCount = COUNT(QID)
    FROM @IDTable;

    ----------------------------------------------------------
    -- GRAB ONLY THE ROWS FOR THE INITIAL DISPLAY
    ----------------------------------------------------------
    WITH Results AS (
        SELECT ROW_NUMBER() OVER (ORDER BY Q.DateCreated DESC) AS Row, M.QID
        FROM tbl_Q_Tags_Map M
        JOIN tbl_Q Q ON M.QID = Q.QID
        WHERE TagID IN (SELECT Token FROM @TokenTable)
            AND Q.[Language] = @Language
            AND Q.StatusID = 60
        GROUP BY M.QID, Q.DateCreated
        HAVING COUNT(DISTINCT TagID) = @TokenCount
    )
    SELECT
        A.Row
        ,A.QID
    FROM Results A
    JOIN tbl_Q B ON A.QID = B.QID
    WHERE Row BETWEEN @StartRowIndex AND @EndRowIndex
        AND B.[Language] = @Language
    ORDER BY Row ASC;

    Then the latter portion is changed to use the temp table:

    WITH Results AS (
        SELECT ROW_NUMBER() OVER (ORDER BY DateCreated DESC) AS Row, QID
        FROM @IDTable
    )
    SELECT
        A.Row
        ,A.QID
    FROM Results A
    JOIN tbl_Q B ON A.QID = B.QID
    WHERE Row BETWEEN @StartRowIndex AND @EndRowIndex
        AND B.[Language] = @Language
    ORDER BY Row ASC;

    Does using a temp table in the second part use more resources, and are there any modifications that would make it use fewer resources?
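
    Something else I have been wondering about, though I have not measured the cost: whether the total could come back with the page itself in one pass using COUNT(*) OVER(). A rough sketch, reusing the same tables and variables as above (and dropping the second join back to tbl_Q, since Language is already filtered inside):

    WITH Results AS (
        SELECT
            ROW_NUMBER() OVER (ORDER BY Q.DateCreated DESC) AS Row
            ,COUNT(*) OVER () AS TotalCount  -- counts the qualifying groups, i.e. the questions
            ,M.QID
        FROM tbl_Q_Tags_Map M
        JOIN tbl_Q Q ON M.QID = Q.QID
        WHERE TagID IN (SELECT Token FROM @TokenTable)
            AND Q.[Language] = @Language
            AND Q.StatusID = 60
        GROUP BY M.QID, Q.DateCreated
        HAVING COUNT(DISTINCT TagID) = @TokenCount
    )
    SELECT Row, QID, TotalCount
    FROM Results
    WHERE Row BETWEEN @StartRowIndex AND @EndRowIndex
    ORDER BY Row ASC;

    Would that be cheaper than filling the temp table, or does the windowed count scan just as much?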

  • If all you need is a count, don't insert rows into a table variable; just get the count directly:

    SELECT @QCount = COUNT(*)
    FROM tbl_Q_Tags_Map M
    JOIN tbl_Q Q ON M.QID = Q.QID
    WHERE TagID IN (SELECT Token FROM @TokenTable)
        AND Q.[Language] = @Language
        AND Q.StatusID = 60
    GROUP BY M.QID, Q.DateCreated
    HAVING COUNT(DISTINCT TagID) = @TokenCount;

    Change @TokenTable into a temp table and cluster it on Token.
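
    Something along these lines; I can't test it without your data, so treat it as a sketch and adjust the names:

    -- A temp table (unlike a table variable) gets statistics, and clustering on
    -- Token supports the IN (SELECT Token FROM ...) lookup in your queries.
    CREATE TABLE #TokenTable (
        Token int NOT NULL PRIMARY KEY CLUSTERED  -- use CREATE CLUSTERED INDEX instead if duplicate tokens are possible
    );

    -- Populate it the same way you currently populate @TokenTable,
    -- then reference #TokenTable instead of @TokenTable in the queries.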

    SQL DBA, SQL Server MVP (07, 08, 09). "It's a dog-eat-dog world, and I'm wearing Milk-Bone underwear." "Norm", on "Cheers". Also from "Cheers", from "Carla": "You need to know 3 things about Tortelli men: Tortelli men draw women like flies; Tortelli men treat women like flies; Tortelli men's brains are in their flies".

  • This is my token temp table:

    DECLARE @Tokens varchar(512) = '5,12';

    DECLARE @TokenTable TABLE
    (
        Token int
    );

    INSERT INTO @TokenTable (Token)
    SELECT Token FROM dbo.fn_SplitString(@Tokens, ',');

    If I run the query you suggested, the count comes out to 2 rather than the overall count of matching questions.

  • I think that it would help if you can post CREATE TABLE statements for all tables involved, INSERT statements with a few rows of sample data, and the expected result. That makes it much easier to understand and answer your question. Currently anything I could contribute would be guesswork.


    Hugo Kornelis, SQL Server/Data Platform MVP (2006-2016)
    Visit my SQL Server blog: https://sqlserverfast.com/blog/
    SQL Server Execution Plan Reference: https://sqlserverfast.com/epr/

  • Like Hugo said, being able to recreate what you have and knowing what question you're trying to answer is key to solving the problem. See the link in my signature if you need more info on what we're after.

    Without that, some general advice is to use SET STATISTICS IO to see how much you're reading. Also, you're using a table variable, not a temp table. They're both small in your example, but is that how big they are in your real environment? A pertinent difference is that temp tables have statistics, whereas table variables do not.
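
    For example, something like this around the statements you want to measure (a sketch; #IDTable here just mirrors your @IDTable):

    -- Report logical reads per table for each statement that follows
    SET STATISTICS IO ON;

    -- Temp-table equivalent of the @IDTable table variable; a temp table
    -- gets statistics, so the optimizer can estimate its row count.
    CREATE TABLE #IDTable (
        QID         int           NOT NULL PRIMARY KEY CLUSTERED,
        DateCreated smalldatetime NULL
    );

    -- ...run the INSERT and the paging SELECTs here and compare the reads...

    SET STATISTICS IO OFF;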

  • Here are the other tables:

    CREATE TABLE [dbo].[tbl_Q](

    [QID] [int] IDENTITY(1,1) NOT NULL,

    [StatusID] [tinyint] NOT NULL,

    [Question] [nvarchar](256) NOT NULL,

    [Answer] [nvarchar](2048) NULL,

    [DateCreated] [smalldatetime] NULL,

    CONSTRAINT [PK_tbl_Q] PRIMARY KEY CLUSTERED

    (

    [QID] ASC

    )WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]

    ) ON [PRIMARY]

    CREATE TABLE [dbo].[tbl_Q_Tags_Map](

    [TLID] [int] IDENTITY(1,1) NOT NULL,

    [QID] [int] NOT NULL,

    [TagID] [int] NOT NULL,

    CONSTRAINT [PK_tbl_Q_Tags_Map] PRIMARY KEY CLUSTERED

    (

    [TLID] ASC

    )WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]

    ) ON [PRIMARY]

    SELECT @TokenCount=COUNT(Token) FROM @TokenTable;

    My initial query, as first posted, was giving the correct result. My question was how to make it use fewer resources, and ScottPletcher suggested not using a temp table for the count and posted a script. That script does not work, because it gives me the wrong count. I want the count of questions, but it returns 2, which is the number of tokens in @TokenTable from my post above.
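
    My best guess is that, because of the GROUP BY, the variable assignment picks up one group's COUNT(*) (two tag rows per question) instead of the number of groups. Would wrapping it in a derived table and counting its rows, roughly as below, be the right way to count the questions without the ID table, or does that defeat the purpose?

    SELECT @QCount = COUNT(*)
    FROM (
        SELECT M.QID
        FROM tbl_Q_Tags_Map M
        JOIN tbl_Q Q ON M.QID = Q.QID
        WHERE TagID IN (SELECT Token FROM @TokenTable)
            AND Q.[Language] = @Language
            AND Q.StatusID = 60
        GROUP BY M.QID, Q.DateCreated
        HAVING COUNT(DISTINCT TagID) = @TokenCount
    ) AS Qualifying;  -- one row per qualifying question, so COUNT(*) is the question count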

  • I don't have any sample data, so I can't test the code. If I had data, I would have tested it first.

    SQL DBA, SQL Server MVP (07, 08, 09). "It's a dog-eat-dog world, and I'm wearing Milk-Bone underwear." "Norm", on "Cheers". Also from "Cheers", from "Carla": "You need to know 3 things about Tortelli men: Tortelli men draw women like flies; Tortelli men treat women like flies; Tortelli men's brains are in their flies".

  • Here is some sample data for tbl_Q_Tags_Map:

    TLID  TagID  QID
    1     5      284838
    2     5      291475
    3     5      291480
    4     5      291485
    5     12     284838
    6     12     291475
    7     12     291480
    8     12     291485

    Here's some sample data for tbl_Q. I only included the fields I thought would be relevant and excluded the other two:

    QID     StatusID  DateCreated
    284838  60        11/29/2010 9:07
    291475  60        1/22/2015 10:59
    291480  60        1/22/2015 11:07
    291485  60        2/15/2015 11:07

    If I use DECLARE @Tokens varchar(512) = '5,12', then @QCount should equal 4, but it returns 2, the number of tokens chosen.

  • Those are the CREATE TABLE statements I was asking for, but not the INSERT statements, nor the expected results.

    Also, the queries in your original question reference two table variables; one is declared and inserted into in the query itself, but the other one is apparently created and populated before that query runs.

    If you want my help in reducing resource usage, then make sure that I have what I need to do my work: a full repro script that I can copy from your post, paste into an empty database, and run. That way I can spend my time doing what you actually need help with, which is tuning your query, instead of having to pull bits and pieces of information from you in a long drawn-out discussion.


    Hugo Kornelis, SQL Server/Data Platform MVP (2006-2016)
    Visit my SQL Server blog: https://sqlserverfast.com/blog/
    SQL Server Execution Plan Reference: https://sqlserverfast.com/epr/

