The conversion of the varchar value overflowed an int column

  • The script below returns this error:
    The conversion of the varchar value '4103049600' overflowed an int column

    declare @jsondate varchar(40) = '/Date(4103049600000+0700)/'
    Set @jsondate = Replace(@jsondate,'/','')
    Set @jsondate = '\/'+@jsondate+'\/'

    Select cast(convert(char(12),
    dateadd(ms,1*substring(@jsondate,18,3),
    dateadd(ss,1*substring(@jsondate,8,10),'19700101'))
    ,120) + stuff(substring(@jsondate,21,5),4,0,':')
        as datetimeoffset(4));

    However, if I change 4103049600000+0700 to 1328029200000+0700, it works fine.

  • 2,147,483,647 is the maximum value for an INT, so the error makes sense.
    As you have not asked a question, I'm not sure whether that is the answer you were looking for.

    The absence of evidence is not evidence of absence
    - Martin Rees
    The absence of consumable DDL, sample data and desired results is, however, evidence of the absence of my response
    - Phil Parkin

  • Phil Parkin - Sunday, January 28, 2018 8:46 AM

    2,147,483,647 is the maximum value for an INT, so the error makes sense.
    As you have not asked a question, I'm not sure whether that is the answer you were looking for.

    +1

    ...

  • Yes, I am talking about the error when the value comes in like that: "4103049600000+0700".
    The problem is that this value is dynamic.

    I changed it as below:

    declare @jsondate varchar(40) = '/Date(4103049600000+0700)/'--'/Date(4103049600000+0000)/'

    Set @jsondate = Replace(@jsondate,'/','')
    Set @jsondate = '\/'+@jsondate+'\/'
    declare @jsndate2 varchar(40) = (select cast(substring(@jsondate,8,10) as bigint))

     Select cast(convert(char(12),
     dateadd(ms,1*substring(@jsondate,18,3),
     dateadd(ss,1* (Select cast(@jsndate2 as bigint)) ,'19700101'))
      ,120) + stuff(substring(@jsondate,21,5),4,0,':')
                as datetimeoffset(4));

    It returns the error below:

    Arithmetic overflow error converting expression to data type int.

  • Did you even read my response?

    The absence of evidence is not evidence of absence
    - Martin Rees
    The absence of consumable DDL, sample data and desired results is, however, evidence of the absence of my response
    - Phil Parkin

  • Yes, I read it, and I know the maximum is 2,147,483,647, but the dynamic value returns more than that. My question is: how can I solve that?

  • ikrami2000 - Sunday, January 28, 2018 2:59 PM

    Yes, I read it, and I know the maximum is 2,147,483,647, but the dynamic value returns more than that. My question is: how can I solve that?

    You could try bigint instead of int.

    The absence of evidence is not evidence of absence
    - Martin Rees
    The absence of consumable DDL, sample data and desired results is, however, evidence of the absence of my response
    - Phil Parkin

  • Really, I don't know where to put it. I changed it like this:
    dateadd(ss,1* (Select cast(@jsndate2 as bigint)) ,'19700101'))

    but it is not working.

  • ikrami2000 - Sunday, January 28, 2018 3:16 PM

    Really, I don't know where to put it. I changed it like this:
    dateadd(ss,1* (Select cast(@jsndate2 as bigint)) ,'19700101'))

    but it is not working.

    Could you please show your desired output, for example? What are you trying to do?

  • I need a readable date.

  • ikrami2000 - Monday, January 29, 2018 12:11 AM

    I need a readable date.

    The problem is with the DATEADD function. Please change your query so that it does not overflow DATEADD, because DATEADD accepts an int as its increment parameter, whereas you're trying to pass a bigint to it.

    The first example (1328029200000+0700) worked because its seconds value was accepted as an int by DATEADD; the second example (4103049600000+0700) failed because DATEADD could not treat that value as an int, i.e. it is a bigint.

    To conclude, you'll need to rework the DATEADD call so that the increment derived from 4103049600000+0700 stays within the INT range before proceeding to the output. Please note that the rest is declared correctly; the only issue is the DATEADD call for 4103049600000+0700.
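    As a cross-check outside SQL Server (a sketch in Python, not part of the thread's T-SQL), the INT ceiling explains exactly which timestamps fail: any epoch-seconds value above 2,147,483,647 overflows a 32-bit int, which is the classic "year 2038" limit.

    ```python
    from datetime import datetime, timedelta, timezone

    INT_MAX = 2_147_483_647  # SQL Server INT upper bound

    ok_seconds = 1328029200000 // 1000   # the value that worked
    bad_seconds = 4103049600000 // 1000  # the value that overflowed

    print(ok_seconds <= INT_MAX)   # True  - fits in INT
    print(bad_seconds <= INT_MAX)  # False - does not fit

    # The cutoff expressed as a date: the "year 2038" boundary.
    cutoff = datetime(1970, 1, 1, tzinfo=timezone.utc) + timedelta(seconds=INT_MAX)
    print(cutoff.isoformat())  # 2038-01-19T03:14:07+00:00
    ```

    Any JSON date later than early 2038 will therefore overflow when its whole seconds value is fed to DATEADD in one go.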

  • Hi. As mentioned, the parameter passed to the DATEADD function is an integer. When you pass a bigint it is converted to an integer before being passed to the function and it will fail if it is out of bounds.

    As a workaround, you can calculate the minutes and seconds portions and pass them in separately, like this:

    DECLARE @jsondate VARCHAR(40) = '/Date(4103049600000+0700)/';

    SET @jsondate = '\/' + REPLACE(@jsondate, '/', '') + '\/';

    DECLARE @Milliseconds BIGINT = SUBSTRING(@jsondate, 18, 3);
    DECLARE @Seconds BIGINT = SUBSTRING(@jsondate, 8, 10);
    DECLARE @Minutes BIGINT = @Seconds / 60;

    SET @Seconds = @Seconds - (@Minutes * 60);

    SELECT
     CAST(
      CONVERT(
       CHAR(10),
       DATEADD(MILLISECOND, @Milliseconds,
        DATEADD(SECOND, @Seconds,
         DATEADD(MINUTE, @Minutes, '19700101')))
       ,120) + STUFF(SUBSTRING(@jsondate, 21, 5), 4, 0, ':')
     AS DATETIMEOFFSET(4));

    The SQL Guy @ blogspot

    @SeanPearceSQL

    About Me
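    The split above can be sanity-checked outside SQL Server. This is a hedged Python sketch (not the thread's T-SQL) showing that after splitting the epoch value into minutes, seconds and milliseconds, every piece fits in a 32-bit int, and the reassembled date is correct:

    ```python
    from datetime import datetime, timedelta

    ms = 4103049600000                 # epoch milliseconds from the JSON date
    milliseconds = ms % 1000
    seconds = ms // 1000
    minutes = seconds // 60            # pass the minutes separately...
    seconds = seconds - minutes * 60   # ...so the seconds remainder is < 60

    INT_MAX = 2_147_483_647
    # Each component is now small enough for an INT parameter.
    assert all(v <= INT_MAX for v in (milliseconds, seconds, minutes))

    result = (datetime(1970, 1, 1)
              + timedelta(minutes=minutes)
              + timedelta(seconds=seconds)
              + timedelta(milliseconds=milliseconds))
    print(result.date())  # 2100-01-08
    ```

    This mirrors why the workaround avoids the overflow: 4,103,049,600 seconds becomes only 68,384,160 minutes, comfortably inside the INT range.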

  • Sean Pearce - Monday, January 29, 2018 1:33 AM

    Hi. As mentioned, the parameter passed to the DATEADD function is an integer. When you pass a bigint it is converted to an integer before being passed to the function and it will fail if it is out of bounds.

    As a workaround, you can calculate the minutes and seconds portions and pass them in separately, like this:

    DECLARE @jsondate VARCHAR(40) = '/Date(4103049600000+0700)/';

    SET @jsondate = '\/' + REPLACE(@jsondate, '/', '') + '\/';

    DECLARE @Milliseconds BIGINT = SUBSTRING(@jsondate, 18, 3);
    DECLARE @Seconds BIGINT = SUBSTRING(@jsondate, 8, 10);
    DECLARE @Minutes BIGINT = @Seconds / 60;

    SET @Seconds = @Seconds - (@Minutes * 60);

    SELECT
     CAST(
      CONVERT(
       CHAR(10),
       DATEADD(MILLISECOND, @Milliseconds,
        DATEADD(SECOND, @Seconds,
         DATEADD(MINUTE, @Minutes, '19700101')))
       ,120) + STUFF(SUBSTRING(@jsondate, 21, 5), 4, 0, ':')
     AS DATETIMEOFFSET(4));

    Thanks, Sean, for your query, but I'm getting an error while executing:

    Msg 8114, Level 16, State 5, Line 4
    Error converting data type varchar to bigint.

    Can we really assign a SUBSTRING result to a BIGINT variable like that? I'm stuck here.

  • Working from the excellent suggestion by Sean, but tweaking slightly, try this:

    DECLARE @jsondate VARCHAR(40) = '/Date(4103049600000+0700)/'

    SET @jsondate = Replace(@jsondate,'/','\/')

    DECLARE @Milliseconds as INT = substring(@jsondate,18,3),
            @Seconds as INT = CAST(substring(@jsondate,8,10) as BIGINT) % 60, -- remainder once the minutes are taken out
            @Minutes as INT = CAST(substring(@jsondate,8,10) as BIGINT)/60

    SELECT CAST(
        CONVERT(
            CHAR(10),
            DATEADD(MILLISECOND, @Milliseconds,
             DATEADD(SECOND, @Seconds,
              DATEADD(MINUTE, @Minutes, '19700101')))
            ,120) + STUFF(SUBSTRING(@jsondate, 21, 5), 4, 0, ':')
        AS DATETIMEOFFSET(4));

    ...
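  • The end-to-end parse the thread arrives at can be sketched outside SQL Server as well. This is a hedged Python equivalent (the helper name `parse_json_date` is mine, not from the thread); like the final query, it takes the date from the epoch milliseconds and simply tacks the parsed +HHMM offset on as the time zone:

    ```python
    import re
    from datetime import datetime, timedelta, timezone

    def parse_json_date(s):
        """Parse a /Date(<millis><+|->HHMM)/ string into an aware datetime.

        Hypothetical helper mirroring the thread's T-SQL logic: build the
        timestamp from the epoch milliseconds, then attach the offset.
        """
        m = re.fullmatch(r'\\?/Date\((-?\d+)([+-])(\d{2})(\d{2})\)\\?/', s)
        if m is None:
            raise ValueError('not a JSON /Date(...)/ string: %r' % s)
        millis = int(m.group(1))
        sign = 1 if m.group(2) == '+' else -1
        offset = timezone(sign * timedelta(hours=int(m.group(3)),
                                           minutes=int(m.group(4))))
        dt = datetime(1970, 1, 1) + timedelta(milliseconds=millis)
        return dt.replace(tzinfo=offset)

    print(parse_json_date('/Date(4103049600000+0700)/').isoformat())
    # 2100-01-08T00:00:00+07:00
    ```

    That matches the DATETIMEOFFSET value the final query produces for the dynamic input, with no 32-bit overflow anywhere along the way.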

  • Thank you very much, HappyGeek and subramaniam.chandrasekar.
