INSERT Statement Performance and CONVERT_IMPLICIT

  • In the following test, the first INSERT statement's execution plan shows an implicit conversion for the values, but the
    second one does not. In my opinion, implicit conversion causes performance problems, but I want to ask: am I overlooking
    another issue in this test case?

    DROP TABLE IF EXISTS TblTest
    CREATE TABLE [dbo].TblTest
    (
        Val1    VARCHAR(50),
        Val2    VARCHAR(50) NULL,
        IntVal1 BIGINT NULL
    )

    INSERT  TblTest 
    (Val1,Val2,IntVal1)
    VALUES ('Val1','Val2',1231231 )
    GO 30000

    ---- 27 seconds

    DECLARE @VarVal1 VARCHAR(50)='Germany'
    DECLARE @VarVal2 VARCHAR(50)='Europe'
    DECLARE @ValId BIGINT = 1231231
    INSERT INTO TblTest (Val1,Val2,IntVal1)
    VALUES (@VarVal1,@VarVal2,@ValId )
    GO 30000

    ---- 21 seconds

  • In your first query, the literal values are two varchar(4)s and an int, so they are implicitly cast to varchar(50) and bigint respectively when inserted. With your variables, the data types are declared and match the table, so they don't need to be cast. As far as causing performance problems: no, this won't be an issue for what you have. Problems with implicit conversions mainly surround things like WHERE and ON clauses. For example, suppose you have a varchar column that only stores integer values, and you write the clause YourNotIntColumn = 123. YourNotIntColumn would be implicitly cast to an int, and thus the query would be non-SARGable.
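    As an illustration (the table and column names here are hypothetical), compare:

    ```sql
    -- Non-SARGable: YourNotIntColumn is varchar, so the column side is wrapped
    -- in CONVERT_IMPLICIT to int and an index on it cannot be used for a seek.
    SELECT * FROM dbo.YourTable WHERE YourNotIntColumn = 123;

    -- SARGable: comparing in the column's own type converts only the literal,
    -- leaving the column untouched and the index usable.
    SELECT * FROM dbo.YourTable WHERE YourNotIntColumn = '123';
    ```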

    Thom~

    Excuse my typos and sometimes awful grammar. My fingers work faster than my brain does.
    Larnu.uk

  • Thom, thanks for your answer, but the test case is very simple; what can cause a performance problem here, according to this
    idea? I am seeing a Scalar Operator(CONVERT_IMPLICIT(varchar(50),[@2],0)) operation in the execution plan.

  • ESAT ERKEC - Monday, December 24, 2018 2:23 AM

    Thom, thanks for your answer, but the test case is very simple; what can cause a performance problem here, according to this
    idea? I am seeing a Scalar Operator(CONVERT_IMPLICIT(varchar(50),[@2],0)) operation in the execution plan.

    Isn't that what I addressed in my post? 'Val1' is not a varchar(50), and 1231231 is not a bigint; they are a varchar(4) and an int respectively. I'm not sure what else I can add here.
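    One quick way to see the types SQL Server infers for those literals (a side check, not part of the original test):

    ```sql
    -- SQL_VARIANT_PROPERTY reports the inferred type of a literal expression.
    SELECT SQL_VARIANT_PROPERTY('Val1', 'BaseType')  AS LiteralBaseType   -- varchar
          ,SQL_VARIANT_PROPERTY('Val1', 'MaxLength') AS LiteralMaxLength  -- 4
          ,SQL_VARIANT_PROPERTY(1231231, 'BaseType') AS NumericBaseType;  -- int
    ```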

    Thom~

    Excuse my typos and sometimes awful grammar. My fingers work faster than my brain does.
    Larnu.uk

  • ESAT ERKEC - Monday, December 24, 2018 12:54 AM

    In the following test, the first INSERT statement's execution plan shows an implicit conversion for the values, but the
    second one does not. In my opinion, implicit conversion causes performance problems, but I want to ask: am I overlooking
    another issue in this test case?

    DROP TABLE IF EXISTS TblTest
    CREATE TABLE [dbo].TblTest
    (
        Val1    VARCHAR(50),
        Val2    VARCHAR(50) NULL,
        IntVal1 BIGINT NULL
    )

    INSERT  TblTest 
    (Val1,Val2,IntVal1)
    VALUES ('Val1','Val2',1231231 )
    GO 30000

    ---- 27 seconds

    DECLARE @VarVal1 VARCHAR(50)='Germany'
    DECLARE @VarVal2 VARCHAR(50)='Europe'
    DECLARE @ValId BIGINT = 1231231
    INSERT INTO TblTest (Val1,Val2,IntVal1)
    VALUES (@VarVal1,@VarVal2,@ValId )
    GO 30000

    ---- 21 seconds

    Removing the implicit cast with CAST/CONVERT will not solve the problem, which is the difference between unparameterized and parameterized statements.
    😎

    If you look at the ParameterList section of the execution plans, you'll see the difference.
    Query 1
    <ParameterList>
      <ColumnReference Column="@3" ParameterDataType="int" ParameterCompiledValue="(1231231)" ParameterRuntimeValue="(1231231)" />
      <ColumnReference Column="@2" ParameterDataType="varchar(8000)" ParameterCompiledValue="'Val2'" ParameterRuntimeValue="'Val2'" />
      <ColumnReference Column="@1" ParameterDataType="varchar(8000)" ParameterCompiledValue="'Val1'" ParameterRuntimeValue="'Val1'" />
    </ParameterList>

    Query 2

    <ParameterList>
      <ColumnReference Column="@ValId" ParameterDataType="bigint" ParameterRuntimeValue="(1231231)" />
      <ColumnReference Column="@VarVal2" ParameterDataType="varchar(50)" ParameterRuntimeValue="'Europe'" />
      <ColumnReference Column="@VarVal1" ParameterDataType="varchar(50)" ParameterRuntimeValue="'Germany'" />
    </ParameterList>

    Changed query without implicit cast
    INSERT TblTest WITH (TABLOCKX)
    (Val1,Val2,IntVal1)
    VALUES (CONVERT(VARCHAR(50),'Val1',0),CONVERT(VARCHAR(50),'Val2',0),CONVERT(BIGINT,1231231,0) )

    The parameter list from the changed query

       <ParameterList>
         <ColumnReference Column="@3" ParameterDataType="int" ParameterCompiledValue="(1231231)" ParameterRuntimeValue="(1231231)" />
         <ColumnReference Column="@2" ParameterDataType="varchar(8000)" ParameterCompiledValue="'Val2'" ParameterRuntimeValue="'Val2'" />
         <ColumnReference Column="@1" ParameterDataType="varchar(8000)" ParameterCompiledValue="'Val1'" ParameterRuntimeValue="'Val1'" />
        </ParameterList>
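    If you'd rather pull the ParameterList from the plan cache than from SSMS, a sketch along these lines works (the LIKE filter is illustrative):

    ```sql
    -- Return the XML showplans of cached plans touching TblTest;
    -- the <ParameterList> element sits inside the query_plan XML.
    SELECT st.text, qp.query_plan
    FROM sys.dm_exec_cached_plans AS cp
    CROSS APPLY sys.dm_exec_sql_text(cp.plan_handle)   AS st
    CROSS APPLY sys.dm_exec_query_plan(cp.plan_handle) AS qp
    WHERE st.text LIKE N'%TblTest%';
    ```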

  • Thom, I want to ask another thing: by your approach, the following statement should not cause an implicit conversion, because
    the string values' lengths are 50, but the execution plan still shows an implicit conversion.

    INSERT  TblTest 
    (Val1,Val2,IntVal1)
    VALUES ('AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA',
    'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
    ,1231231 )

  • I suspect one of the reasons your query is "slower" is that in your first batch you are checking for existence, then dropping and creating the table TblTest in every batch; it isn't a "fair" test. It should be:
    CREATE TABLE [dbo].TblTest
    (
        Val1    VARCHAR(50),
        Val2    VARCHAR(50) NULL,
        IntVal1 BIGINT NULL
    )
    GO
    INSERT TblTest
    (Val1,Val2,IntVal1)
    VALUES ('Val1','Val2',1231231 )
    GO 30000

    ---- 27 seconds

    DECLARE @VarVal1 VARCHAR(50)='Germany'
    DECLARE @VarVal2 VARCHAR(50)='Europe'
    DECLARE @ValId BIGINT = 1231231
    INSERT INTO TblTest (Val1,Val2,IntVal1)
    VALUES (@VarVal1,@VarVal2,@ValId )
    GO 30000

    GO

    DROP TABLE dbo.TblTest

    As for performance, rather than using GO, why not use a tally table?
    WITH N AS(
      SELECT N
      FROM (VALUES(NULL),(NULL),(NULL),(NULL),(NULL),(NULL),(NULL),(NULL),(NULL),(NULL))N(N)),
    Tally AS(
      SELECT ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) AS I
      FROM N N1 --10
       CROSS JOIN N N2 --100
       CROSS JOIN N N3 --1000
       CROSS JOIN N N4 --10000
       CROSS JOIN N N5 --100000
      )
    INSERT INTO TblTest (Val1,Val2,IntVal1)
    SELECT TOP 30000 V.Val1, V.Val2, V.IntVal1
    FROM Tally
      CROSS JOIN (VALUES ('Val1','Val2',1231231 )) V(Val1,Val2,IntVal1);

    On my Sandbox server, the queries (after one attempt) ran in 1:51, 1:42, and less than 1 second respectively. As you can see, the tally table was far, far faster.

    Thom~

    Excuse my typos and sometimes awful grammar. My fingers work faster than my brain does.
    Larnu.uk

  • No Thom, they are separate queries; I wrote that table creation script to show the
    table architecture. In my opinion the implicit conversion causes this problem, because it consumes some CPU
    resources and this affects the execution time, but I cannot be sure.

  • ESAT ERKEC - Monday, December 24, 2018 2:58 AM

    No Thom, they are separate queries; I wrote that table creation script to show the
    table architecture. In my opinion the implicit conversion causes this problem, because it consumes some CPU
    resources and this affects the execution time, but I cannot be sure.

    Then why does the query with the tally table, which has implicit conversions for the insert, run in less than a second? 😀

    Thom~

    Excuse my typos and sometimes awful grammar. My fingers work faster than my brain does.
    Larnu.uk

  • Using GO [EXEC COUNT] is neither a realistic nor an accurate way of testing, as the per-execution overhead is far higher than the actual statement cost.
    😎

    Here is a more realistic test harness

    USE TEEST;
    GO
    SET NOCOUNT ON;

    IF OBJECT_ID(N'dbo.TBL_TEST_IMPLICIT') IS NOT NULL DROP TABLE dbo.TBL_TEST_IMPLICIT;
    DECLARE @SAMPLE_SIZE  INT = 1000000;
    ;WITH T(N) AS (SELECT X.N FROM (VALUES (0),(0),(0),(0),(0),(0),(0),(0),(0),(0)) X(N))
    , NUMS(N) AS (SELECT TOP(@SAMPLE_SIZE) ROW_NUMBER() OVER (ORDER BY @@VERSION) AS N
         FROM T T1,T T2,T T3,T T4,T T5,T T6)
    SELECT
      CONVERT(VARCHAR(8000),NEWID(),0) AS VAL_1
     ,CONVERT(VARCHAR(8000),NEWID(),0) AS VAL_2
     ,CONVERT(VARCHAR(50),NEWID(),0) AS VAL_3
     ,CONVERT(VARCHAR(50),NEWID(),0) AS VAL_4
     ,CONVERT(INT,ABS(CHECKSUM(NEWID())) % @SAMPLE_SIZE,0)  AS INT_1
     ,CONVERT(BIGINT,ABS(CHECKSUM(NEWID())) % @SAMPLE_SIZE,0) AS BINT_2
    INTO dbo.TBL_TEST_IMPLICIT
    FROM  NUMS  NM;

    IF OBJECT_ID(N'dbo.TBL_TEST_IMPLICIT_INSERT') IS NOT NULL DROP TABLE dbo.TBL_TEST_IMPLICIT_INSERT;
    CREATE TABLE dbo.TBL_TEST_IMPLICIT_INSERT
    (
      VAL_1  VARCHAR(50) NOT NULL
     ,VAL_2  VARCHAR(50) NOT NULL
     ,BINT_VAL BIGINT  NOT NULL
    );

    DECLARE @timer TABLE (T_TXT VARCHAR(50) NOT NULL, T_TS DATETIME2(7) NOT NULL DEFAULT (SYSDATETIME()));
    DECLARE @CHAR_BUCKET VARCHAR(50) = '';
    DECLARE @BIGINT_BUCKET BIGINT  = 0;

    TRUNCATE TABLE dbo.TBL_TEST_IMPLICIT_INSERT;

    INSERT INTO @timer(T_TXT) VALUES('DRY RUN');
    SELECT
      @CHAR_BUCKET = TI.VAL_1
     ,@CHAR_BUCKET = TI.VAL_2
     ,@BIGINT_BUCKET = TI.INT_1
    FROM dbo.TBL_TEST_IMPLICIT TI
    INSERT INTO @timer(T_TXT) VALUES('DRY RUN');

    INSERT INTO @timer(T_TXT) VALUES('IMPLICIT');
    INSERT INTO dbo.TBL_TEST_IMPLICIT_INSERT WITH (TABLOCKX) (VAL_1,VAL_2,BINT_VAL)
    SELECT
      TI.VAL_1
     ,TI.VAL_2
     ,TI.INT_1
    FROM dbo.TBL_TEST_IMPLICIT TI
    OPTION (MAXDOP 1);
    INSERT INTO @timer(T_TXT) VALUES('IMPLICIT');

    TRUNCATE TABLE dbo.TBL_TEST_IMPLICIT_INSERT;

    INSERT INTO @timer(T_TXT) VALUES('DRY RUN');
    SELECT
      @CHAR_BUCKET = TI.VAL_3
     ,@CHAR_BUCKET = TI.VAL_4
     ,@BIGINT_BUCKET = TI.BINT_2
    FROM dbo.TBL_TEST_IMPLICIT TI
    INSERT INTO @timer(T_TXT) VALUES('DRY RUN');

    INSERT INTO @timer(T_TXT) VALUES('NO CONVERSION');
    INSERT INTO dbo.TBL_TEST_IMPLICIT_INSERT WITH (TABLOCKX) (VAL_1,VAL_2,BINT_VAL)
    SELECT
      TI.VAL_3
     ,TI.VAL_4
     ,TI.BINT_2
    FROM dbo.TBL_TEST_IMPLICIT TI
    OPTION (MAXDOP 1);
    INSERT INTO @timer(T_TXT) VALUES('NO CONVERSION');

    SELECT
      T.T_TXT
     ,DATEDIFF(MICROSECOND,MIN(T.T_TS),MAX(T.T_TS)) AS DURATION
    FROM @timer T
    GROUP BY T.T_TXT
    ORDER BY DURATION ASC;

  • I made a test in SQLQueryStress under the same conditions with the two INSERT commands, and
    there is a difference in the average CPU cost per iteration, so I insist on my claim 🙂

  • ESAT ERKEC - Monday, December 24, 2018 6:03 AM

    I made a test in SQLQueryStress under the same conditions with the two INSERT commands, and
    there is a difference in the average CPU cost per iteration, so I insist on my claim 🙂

    OK, so what is it you're really asking here now? You've got the "why" of the implicit cast, so what are you after here? If it's validation, Eirikur has written a very useful script for you to test which shows the results of variables vs. VALUES.

    Thom~

    Excuse my typos and sometimes awful grammar. My fingers work faster than my brain does.
    Larnu.uk

  • Thom A - Monday, December 24, 2018 8:14 AM

    ESAT ERKEC - Monday, December 24, 2018 6:03 AM

    I made a test in SQLQueryStress under the same conditions with the two INSERT commands, and
    there is a difference in the average CPU cost per iteration, so I insist on my claim 🙂

    OK, so what is it you're really asking here now? You've got the "why" of the implicit cast, so what are you after here? If it's validation, Eirikur has written a very useful script for you to test which shows the results of variables vs. VALUES.

    SQLQueryStress has the same execution overhead as GO [EXECUTION COUNT]; it is not a realistic test at all.
    😎

  • Thanks for your reply, Eirikur. Could you correct me if I am wrong? In your test script, SQL Server has some idea about the inserted values' data types, because it can infer them from the source columns' data types; also, I could not find any CONVERT_IMPLICIT scalar operator in the following script's execution plan, whereas the simple INSERT INTO ... VALUES statement's execution plan includes a CONVERT_IMPLICIT operation. Although I am not sure about this approach, it seems SQL Server does not have any idea about the type of a literal value in quotes, so it tries to convert that value to the mapped insert column's data type.


     IF OBJECT_ID(N'dbo.TBL_TEST_IMPLICIT') IS NOT NULL DROP TABLE dbo.TBL_TEST_IMPLICIT;
    DECLARE @SAMPLE_SIZE  INT = 1000000;
    WITH T(N) AS (SELECT X.N FROM (VALUES (0),(0),(0),(0),(0),(0),(0),(0),(0),(0)) X(N))
    ,  NUMS(N) AS (SELECT TOP(@SAMPLE_SIZE) ROW_NUMBER() OVER (ORDER BY @@VERSION) AS N
         FROM T T1,T T2,T T3,T T4,T T5,T T6)
    SELECT
      CONVERT(VARCHAR(8000),NEWID(),0) AS VAL_1
     ,CONVERT(VARCHAR(8000),NEWID(),0) AS VAL_2
     ,CONVERT(VARCHAR(50),NEWID(),0) AS VAL_3
     ,CONVERT(VARCHAR(50),NEWID(),0) AS VAL_4
     ,CONVERT(INT,ABS(CHECKSUM(NEWID())) % @SAMPLE_SIZE,0)  AS INT_1
     ,CONVERT(BIGINT,ABS(CHECKSUM(NEWID())) % @SAMPLE_SIZE,0) AS BINT_2
    INTO dbo.TBL_TEST_IMPLICIT
    FROM  NUMS  NM;
    DROP TABLE IF EXISTS TBL_TEST_IMPLICIT_INSERT
    CREATE TABLE dbo.TBL_TEST_IMPLICIT_INSERT
    (
      VAL_1  VARCHAR(50) NOT NULL
     ,VAL_2  VARCHAR(50) NOT NULL
     ,BINT_VAL BIGINT  NOT NULL
    );

    INSERT INTO dbo.TBL_TEST_IMPLICIT_INSERT WITH (TABLOCKX) (VAL_1,VAL_2,BINT_VAL)
    SELECT
      TI.VAL_1
     ,TI.VAL_2
     ,TI.INT_1
    FROM dbo.TBL_TEST_IMPLICIT  TI

    INSERT INTO dbo.TBL_TEST_IMPLICIT_INSERT WITH (TABLOCKX) (VAL_1,VAL_2,BINT_VAL)
    SELECT
      TI.VAL_1
     ,TI.VAL_2
     ,TI.INT_1
    FROM dbo.TBL_TEST_IMPLICIT  TI

  • ESAT ERKEC - Monday, December 24, 2018 9:56 AM

    Thanks for your reply, Eirikur. Could you correct me if I am wrong? In your test script, SQL Server has some idea about the inserted values' data types, because it can infer them from the source columns' data types; also, I could not find any CONVERT_IMPLICIT scalar operator in the following script's execution plan, whereas the simple INSERT INTO ... VALUES statement's execution plan includes a CONVERT_IMPLICIT operation. Although I am not sure about this approach, it seems SQL Server does not have any idea about the type of a literal value in quotes, so it tries to convert that value to the mapped insert column's data type.


     IF OBJECT_ID(N'dbo.TBL_TEST_IMPLICIT') IS NOT NULL DROP TABLE dbo.TBL_TEST_IMPLICIT;
    DECLARE @SAMPLE_SIZE  INT = 1000000;
    WITH T(N) AS (SELECT X.N FROM (VALUES (0),(0),(0),(0),(0),(0),(0),(0),(0),(0)) X(N))
    ,  NUMS(N) AS (SELECT TOP(@SAMPLE_SIZE) ROW_NUMBER() OVER (ORDER BY @@VERSION) AS N
         FROM T T1,T T2,T T3,T T4,T T5,T T6)
    SELECT
      CONVERT(VARCHAR(8000),NEWID(),0) AS VAL_1
     ,CONVERT(VARCHAR(8000),NEWID(),0) AS VAL_2
     ,CONVERT(VARCHAR(50),NEWID(),0) AS VAL_3
     ,CONVERT(VARCHAR(50),NEWID(),0) AS VAL_4
     ,CONVERT(INT,ABS(CHECKSUM(NEWID())) % @SAMPLE_SIZE,0)  AS INT_1
     ,CONVERT(BIGINT,ABS(CHECKSUM(NEWID())) % @SAMPLE_SIZE,0) AS BINT_2
    INTO dbo.TBL_TEST_IMPLICIT
    FROM  NUMS  NM;
    DROP TABLE IF EXISTS TBL_TEST_IMPLICIT_INSERT
    CREATE TABLE dbo.TBL_TEST_IMPLICIT_INSERT
    (
      VAL_1  VARCHAR(50) NOT NULL
     ,VAL_2  VARCHAR(50) NOT NULL
     ,BINT_VAL BIGINT  NOT NULL
    );

    INSERT INTO dbo.TBL_TEST_IMPLICIT_INSERT WITH (TABLOCKX) (VAL_1,VAL_2,BINT_VAL)
    SELECT
      TI.VAL_1
     ,TI.VAL_2
     ,TI.INT_1
    FROM dbo.TBL_TEST_IMPLICIT  TI

    INSERT INTO dbo.TBL_TEST_IMPLICIT_INSERT WITH (TABLOCKX) (VAL_1,VAL_2,BINT_VAL)
    SELECT
      TI.VAL_1
     ,TI.VAL_2
     ,TI.INT_1
    FROM dbo.TBL_TEST_IMPLICIT  TI

    Hi Esat,
    the implicit conversion is there; if you view the execution plan's XML, you'll see it.
    😎
    If the data types of the source and the destination do not match, SQL Server will always add an implicit type conversion, as long as the conversion is allowed in general.
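    In other words, making the conversion explicit (a sketch against the tables from the script above) removes the CONVERT_IMPLICIT from the plan, though the conversion work itself still happens:

    ```sql
    -- Cast the source columns to the destination types up front; the plan now
    -- shows explicit CONVERTs instead of CONVERT_IMPLICIT, at a similar cost.
    INSERT INTO dbo.TBL_TEST_IMPLICIT_INSERT (VAL_1, VAL_2, BINT_VAL)
    SELECT CONVERT(VARCHAR(50), TI.VAL_1)
          ,CONVERT(VARCHAR(50), TI.VAL_2)
          ,CONVERT(BIGINT, TI.INT_1)
    FROM dbo.TBL_TEST_IMPLICIT AS TI;
    ```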
