Table function returns a varying number of records for an update statement

  • I have a stored procedure that runs a number of update statements, using a table function to select 10% of the records based on the year the data was collected.

    I'm working on a new table, which currently has twelve records, so two records should get the update statement applied (the update statement puts a timestamp in a column that marks the data for review). However, the update is being applied to a varying number of records; I have seen it applied to anywhere from zero to five records. I just cannot see what the issue is. If I use the table function to just select records, it always grabs exactly two. The table function returns the table's own primary key field for the join, so there should be no duplicates seen by the update statement. Unless I am missing something.

    The database is on SQL Server 2014, but the database compatibility level is 100 (2008).

    When I created a stripped-down database to test and post (2 tables, 1 view, 1 table-valued function), I stopped seeing a varying number of records affected by the update statement and always saw 2 records affected. The compatibility level of the new database was 120 (the biggest difference I saw in the database settings). If I change the compatibility level to 100, I start seeing a varying number of records affected.

    Does anyone know why the database at compatibility level 100 is selecting a varying number of records for the update statement?

    Thanks

    Below is the code to create a database, 2 tables with data, a view, and a table function.

    USE master;
    GO
    CREATE DATABASE Trouble;
    GO

    USE [Trouble]
    GO

    SET ANSI_NULLS ON
    GO
    SET QUOTED_IDENTIFIER ON
    GO
    CREATE TABLE [dbo].[sample](
    [Id] [int] IDENTITY(1,1) NOT NULL PRIMARY KEY,
    [sample_date] [date] NOT NULL,
    [sample_year] AS (datepart(year,[sample_date]))
    );
    GO

    SET IDENTITY_INSERT [sample] ON;
    INSERT INTO dbo.[sample]
    (Id, sample_date)
    VALUES (410, CAST(N'2019-11-01' AS Date))
    ,(487, CAST(N'2019-06-30' AS Date))
    ,(488, CAST(N'2019-06-29' AS Date))
    ,(489, CAST(N'2019-06-26' AS Date))
    ,(490, CAST(N'2019-06-27' AS Date))
    ,(491, CAST(N'2019-06-25' AS Date))
    ,(492, CAST(N'2019-06-28' AS Date))
    ,(493, CAST(N'2019-06-15' AS Date))
    ,(494, CAST(N'2019-06-16' AS Date))
    ,(495, CAST(N'2019-07-23' AS Date))
    ,(496, CAST(N'2019-07-31' AS Date))
    ,(497, CAST(N'2019-08-01' AS Date))
    ,(498, CAST(N'2019-07-25' AS Date))
    ,(499, CAST(N'2019-07-28' AS Date))
    ,(500, CAST(N'2019-07-27' AS Date))
    ,(501, CAST(N'2019-07-26' AS Date))
    ,(502, CAST(N'2019-07-24' AS Date))
    ,(503, CAST(N'2019-08-10' AS Date))
    ,(504, CAST(N'2019-08-11' AS Date))
    ,(505, CAST(N'2019-08-09' AS Date))
    ,(506, CAST(N'2019-08-08' AS Date))
    ,(507, CAST(N'2019-08-12' AS Date))
    ,(508, CAST(N'2019-08-06' AS Date))
    ,(509, CAST(N'2019-08-07' AS Date))
    ,(510, CAST(N'2019-08-15' AS Date))
    ,(511, CAST(N'2019-08-13' AS Date));
    GO
    SET IDENTITY_INSERT [sample] OFF;
    GO

    CREATE TABLE [dbo].[moss](
    [Id] [int] IDENTITY(1,1) NOT NULL PRIMARY KEY,
    [plot_sample_Id] [int] NOT NULL FOREIGN KEY REFERENCES [sample](Id),
    [transect_code] [int] NOT NULL,
    [moss_point_m_code] [int] NOT NULL,
    [moss_depth_cm] [float] NOT NULL,
    [moss_depth_comments] [varchar](255) NULL,
    [TenPercentQAFlag] [datetime] NULL,
    );
    GO

    INSERT [dbo].[moss] ([plot_sample_Id], [transect_code], [moss_point_m_code], [moss_depth_cm], [moss_depth_comments])
    VALUES (410, 1, 1, 4, N'test')
    ,(410, 1, 8, 6, N'test modification')
    ,(410, 2, 1, 2, NULL)
    ,(410, 2, 8, 0.1, N'trace')
    ,(410, 2, 18, 0.5, NULL)
    ,(410, 2, 22, 0, NULL)
    ,(410, 2, 29, 0, NULL)
    ,(410, 3, 1, 0, NULL)
    ,(410, 3, 8, 0, NULL)
    ,(410, 3, 18, 0, NULL)
    ,(410, 3, 22, 0, NULL)
    ,(410, 3, 29, 0, NULL);
    GO

    --create GUID generator view, used to randomly sort the data
    CREATE VIEW [dbo].[_generateNewId]
    AS
    SELECT NEWID() AS NewId

    GO


    --function to get 10% of records
    CREATE FUNCTION [dbo].[GetTenPercent_Moss]
    (
    @parentId int,
    @sampleyear int
    )
    RETURNS TABLE
    AS
    RETURN
    (
    SELECT TOP 10 PERCENT sample_year, C.plot_sample_Id, C.Id
    FROM dbo.[sample] P
    INNER JOIN dbo.moss C
    ON P.Id = C.plot_sample_Id
    WHERE C.plot_sample_Id = @parentId and sample_year = @sampleyear
    ORDER BY (SELECT [NewId] FROM dbo._generateNewId)
    );
    GO
    --end database creation

    Here is the code that mimics the stored procedure. It empties the TenPercentQAFlag field (used to identify records for review), then uses the view and table function to select the records to be updated.

    --ALTER DATABASE Trouble SET COMPATIBILITY_LEVEL = 100;
    --ALTER DATABASE Trouble SET COMPATIBILITY_LEVEL = 120;

    USE Trouble;
    SET NOCOUNT ON;
    DECLARE @SampleYear int = 2019;

    --remove any previous flag
    UPDATE [moss] SET TenPercentQAFlag = NULL;
    SET NOCOUNT OFF; --see the number of records updated in the message tab

    update T set T.TenPercentQAFlag = CURRENT_TIMESTAMP
    --select T.TenPercentQAFlag
    from dbo.moss T
    inner join
    (
    select GTP.Id
    from dbo.[sample] P
    cross apply dbo.GetTenPercent_Moss(P.Id, @SampleYear) GTP
    ) as TenPct
    on T.Id = TenPct.Id

    --clean up
    --USE master; ALTER DATABASE Trouble SET SINGLE_USER WITH ROLLBACK IMMEDIATE; DROP DATABASE Trouble;
  • To quote Ned Flanders, "Wow! As melon scratchers go, that's a honey doodle."

    I can confirm that I see the same issue and racked my brain trying to figure out how to fix this...

    The answer to "why is this happening?" is that NEWID() is a side-effecting operator, which is why it cannot be used in T-SQL inline functions. The workaround is to create a view that returns NEWID() and reference that view, which is what you're doing here. The NEWID-in-a-view trick is a hack, and this behavior is related to the unpredictability of side-effecting operators. Your function is not deterministic (the same result is not guaranteed even when you call it with the same parameters), so in that regard you are getting correct results regardless of compatibility level.

    It's baffling, too, how the SELECT statement that you commented out always works as expected but the UPDATE does not.
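    Here is a minimal way to see that non-determinism directly, as a sketch against the Trouble repro database above (whether it misbehaves on a given run depends on the plan the optimizer builds). Reference the same function twice with the same parameters and join the results on Id: a deterministic function would return the same 2 rows from both references, so the join would always return 2 rows, but here each reference can re-evaluate the NEWID view independently and the count can land anywhere from 0 to 2.

    USE Trouble;
    GO
    --two references to the same inline function with identical parameters
    SELECT a.Id
    FROM dbo.GetTenPercent_Moss(410, 2019) AS a
    INNER JOIN dbo.GetTenPercent_Moss(410, 2019) AS b
    ON a.Id = b.Id;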

    All that said, you can circumvent the problem by putting the filter IDs into a temp table, as shown below. This always works as you would expect on all versions of SQL Server (2008+):

    IF OBJECT_ID('tempdb..#TenPctIDs') IS NOT NULL DROP TABLE #TenPctIDs;
    select T.Id
    INTO #TenPctIDs
    from dbo.moss T
    cross apply
    (
    select GTP.Id
    from dbo.[sample] P
    cross apply dbo.GetTenPercent_Moss(P.Id, @SampleYear) GTP
    ) as TenPct
    WHERE T.Id = TenPct.Id

    update T set TenPercentQAFlag = CURRENT_TIMESTAMP
    from dbo.moss T
    WHERE T.Id IN (SELECT t.ID FROM #TenPctIDs AS t);

    "I cant stress enough the importance of switching from a sequential files mindset to set-based thinking. After you make the switch, you can spend your time tuning and optimizing your queries instead of maintaining lengthy, poor-performing code."

    -- Itzik Ben-Gan 2001

  • Thanks Alan,

    I've been scratching my head about it. The ten-percent function was designed to grab a random set of records, and using NEWID() is a pseudo-random way to get that set. It's not what I would have come up with, but sometimes you take what is given. (I did look at making the view use the RAND() function instead of NEWID(), but it didn't change the varying number of updates.)

    I'm really curious as to what is happening internally that makes a varying set of records get updated.

    One other fix a co-worker proposed was to also select the FK to the sample table (plot_sample_Id) from the function and use it as an additional part of the join clause.

    update T set T.TenPercentQAFlag = CURRENT_TIMESTAMP
    --select T.TenPercentQAFlag
    from dbo.moss T
    inner join
    (
    select GTP.Id, GTP.plot_sample_Id
    from dbo.[sample] P
    cross apply dbo.GetTenPercent_Moss(P.Id, @SampleYear) GTP
    ) as TenPct
    on T.Id = TenPct.Id
    AND T.plot_sample_Id = TenPct.plot_sample_Id

    I haven't fully tested it yet, but they have more experience than I do. Well, I know what I'll be doing in the morning.

  • I'm really curious as to what is happening internally that makes a varying set of records get updated.

    This is one of the more interesting problems I've seen in a while on this forum. It really baffled me. Even though the NEWID trick is a hack, the results for a specific T-SQL function should not change between versions unless it uses some undocumented or deprecated syntax. It almost seems worthy of being reported as a bug, if not for the fact that SQL Server 2008 is completely unsupported now.

    Post an update if you ever figure this out.

    "I cant stress enough the importance of switching from a sequential files mindset to set-based thinking. After you make the switch, you can spend your time tuning and optimizing your queries instead of maintaining lengthy, poor-performing code."

    -- Itzik Ben-Gan 2001

  • See what happens when you make the following change:

    ALTER FUNCTION [dbo].[GetTenPercent_Moss]
    (
    @parentId int,
    @sampleyear int
    )
    RETURNS TABLE
    AS
    RETURN
    (
    SELECT TOP (10) PERCENT
    [P].[sample_year]
    , [C].[plot_sample_Id]
    , [C].[Id]
    FROM
    [dbo].[sample] AS [P]
    INNER JOIN [dbo].[moss] AS [C]
    ON [P].[Id] = [C].[plot_sample_Id]
    CROSS APPLY (SELECT [NewId] FROM [dbo].[_generateNewId] AS [gni]) [ca1]
    WHERE
    [C].[plot_sample_Id] = @parentId AND [P].[sample_year] = @sampleyear
    ORDER BY
    [ca1].[NewId]
    );
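    A quick way to check the altered function is to re-run the UPDATE from the repro script above and count the flagged rows (this just reuses the posted script, including its @SampleYear variable and the Trouble database):

    USE Trouble;
    DECLARE @SampleYear int = 2019;

    --reset the flag, then re-apply the update through the altered function
    UPDATE dbo.moss SET TenPercentQAFlag = NULL;

    update T set T.TenPercentQAFlag = CURRENT_TIMESTAMP
    from dbo.moss T
    inner join
    (
    select GTP.Id
    from dbo.[sample] P
    cross apply dbo.GetTenPercent_Moss(P.Id, @SampleYear) GTP
    ) as TenPct
    on T.Id = TenPct.Id;

    --with the 12-row test data this should now consistently report 2
    SELECT COUNT(*) AS flagged_rows FROM dbo.moss WHERE TenPercentQAFlag IS NOT NULL;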
  • It's not about DB compatibility level.

    Run the UPDATE repeatedly - 10, 20 times at the same compatibility level.

    You'll get any number of rows updated - between zero and 5 (the biggest number I've seen).

    The problem is in the function. To be precise, the function is the problem.

    You have [sample] referenced twice - in the function and in the query the function is applied to.

    Every time the function is called in the CROSS APPLY, the set of NEWID values is regenerated, so you're getting two 10-percent sets, which may or may not overlap.

    I tried to simplify the query, and it gives me consistent results:

    DECLARE @sampleyear int = 2019; --same test year as the repro script

    UPDATE [moss] SET TenPercentQAFlag = NULL;

    update T set T.TenPercentQAFlag = CURRENT_TIMESTAMP
    --select T.TenPercentQAFlag
    from dbo.moss T
    inner join
    (SELECT TOP 10 PERCENT --sample_year, C.plot_sample_Id,
    C.Id
    FROM dbo.[sample] P
    INNER JOIN dbo.moss C
    ON P.Id = C.plot_sample_Id
    WHERE P.sample_year = @sampleyear
    ORDER BY (SELECT [NewId] FROM dbo._generateNewId)
    ) as TenPct
    on T.Id = TenPct.Id

    select * from [moss]

     

    P.S. You're filtering the records by a value in a non-indexed computed column.

    That guarantees a full table scan on every query.

    You'd better rethink this approach.
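    If that filter needs to stay, one option is to index the computed column itself (a sketch on my part, assuming nothing else depends on the column definition): DATEPART(year, sample_date) is deterministic, so sample_year can be indexed directly and the year filter can seek instead of scanning.

    --hypothetical index name, not part of the original script
    CREATE NONCLUSTERED INDEX IX_sample_sample_year
    ON dbo.[sample] (sample_year);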

     

    _____________
    Code for TallyGenerator

  • I figured it was something besides compatibility level, but I just could not see it.

    Lynn's solution meets my needs: it pulls the view reference out of the ORDER BY clause, exposes NewId as a column via CROSS APPLY, and then sorts by that column.

    Sergiy, unfortunately I left out a user requirement: the 10% is to be taken for each sample, not just for each year. I would have preferred by year rather than by sample. And yes, this is never going to be fast; unfortunately, that will have to be a fix for another time.
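    For whenever that revisit happens, here is a set-based sketch of my own (an assumption about the requirement, not code from this thread) that flags roughly 10% of moss rows per sample for one year in a single statement, using ROW_NUMBER() ordered by NEWID() so the random ranking is computed inside the one UPDATE rather than once per function call. Materializing the ranked IDs into a temp table first, as Alan showed, is still the belt-and-braces variant.

    DECLARE @SampleYear int = 2019;

    WITH ranked AS
    (
    SELECT C.Id,
    ROW_NUMBER() OVER (PARTITION BY C.plot_sample_Id ORDER BY NEWID()) AS rn,
    COUNT(*) OVER (PARTITION BY C.plot_sample_Id) AS cnt
    FROM dbo.moss C
    INNER JOIN dbo.[sample] P
    ON P.Id = C.plot_sample_Id
    WHERE P.sample_year = @SampleYear
    )
    UPDATE T
    SET T.TenPercentQAFlag = CURRENT_TIMESTAMP
    FROM dbo.moss T
    INNER JOIN ranked R
    ON T.Id = R.Id
    WHERE R.rn <= CEILING(0.1 * R.cnt); --rounds up, so 12 rows per sample flags 2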

  • Hi Tim,

    Can you ping us over the actual execution plan? I've got a hunch that there is a relatively simple cause here.

     

    😎

     

  • Eirikur Eiriksson wrote:


    Hi Tim,

    Can you ping us over the actual execution plan? I've got a hunch that there is a relatively simple cause here.

    😎

    Could be indexing, meaning the lack of a good covering index that would eliminate a clustered index scan. If your iTVF is as simple as what you posted, there is an index that could possibly help. Unfortunately, I dropped the database and won't be able to recreate it until later tonight after work.
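    In case it helps once the database is recreated, this is the kind of index I would guess at (an assumption, not something confirmed by a plan): the function filters moss on plot_sample_Id, and because Id is the clustered primary key it rides along in any nonclustered index, so a single-column index should cover what the iTVF returns.

    --hypothetical index name
    CREATE NONCLUSTERED INDEX IX_moss_plot_sample_Id
    ON dbo.moss (plot_sample_Id);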

     

  • Eirikur - Sure I've attached the plan running at Level 100.

    This is an area of SQL I am very much still learning. I can spot some things, but I'm still gaining the knowledge to fully read a plan. It's ugly, and it's from the original table function. I haven't had time to look at the plan with the modified function that Lynn provided; I won't get the opportunity until this weekend or Monday.

    The TVF really is that simple; there are about 20 different data collection methods, so there is a TVF for each one.

    Thanks!

  • Tim S wrote:

    Eirikur - Sure I've attached the plan running at Level 100.

    Am I missing something? Not sure where the plan is.

     

  • Lynn Pettis wrote:

    Am I missing something? Not sure where the plan is.

    Like in any other "job", we need a plan

    😎

     

  • I tried to attach the plan. I missed this message.

    Upload Errors:

    Trouble_Level100.sqlplan: Sorry, this file type is not permitted for security reasons.

    So I guess I won't be doing that. I'm not sure how else to get it in the thread.

  • One way: put it in a zip file, or rename it to Trouble_Level100.txt. It is actually just an XML file.

     

  • True. I need a new brain.

    Renamed the file to *.txt
