group records by week

I have a table, and I want a query that shows, by week, how many records are being created.

I'm trying to pull the value from t_stamp like this, just to see the value:

SELECT DATEADD(S, t_stamp, '1970-01-01 00:00:00')
FROM #dataCapture

It fails with: Arithmetic overflow error converting expression to data type int.

INSERT INTO #dataCapture
VALUES (4, NULL, 15.8773460388184, NULL, NULL, 192, 1641918620183)

Thanks

    CREATE TABLE [dbo].[#dataCapture](
    [tagid] [int] NOT NULL,
    [intvalue] [bigint] NULL,
    [floatvalue] [float] NULL,
    [stringvalue] [nvarchar](255) NULL,
    [datevalue] [datetime] NULL,
    [dataintegrity] [int] NULL,
    [t_stamp] [bigint] NOT NULL,
    PRIMARY KEY CLUSTERED
    (
    [tagid] ASC,
    [t_stamp] ASC
    )WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
    ) ON [PRIMARY]
    GO
The DATEADD function is failing. The increment parameter is an int, and you are passing in a bigint.
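To make that concrete, a minimal sketch (assuming, for now, that t_stamp is a count of milliseconds; that assumption is examined further down the thread):

SELECT DATEADD(S, t_stamp, '1970-01-01 00:00:00')   -- fails: 1641918620183 exceeds the int maximum of 2,147,483,647
FROM #dataCapture

SELECT DATEADD(S, CAST(t_stamp / 1000 AS int), '1970-01-01 00:00:00')   -- ~1.64 billion fits in an int
FROM #dataCapture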

    For better, quicker answers, click on the following...
    http://www.sqlservercentral.com/articles/Best+Practices/61537/

    For better answers on performance questions, click on the following...
    http://www.sqlservercentral.com/articles/SQLServerCentral/66909/

That's my problem; I'm not sure how to make it work along with the grouping.

    Thx.

  • You have provided an error message, but not asked any questions.

    The absence of evidence is not evidence of absence.
    Martin Rees

    You can lead a horse to water, but a pencil must be lead.
    Stan Laurel

I have it displaying datetime now. How can I get it grouped by week, and with a count?

SELECT *
, myDate = CONCAT(
      CAST(DATEADD(SECOND, t_stamp / 1000, '1970/1/1') AS DATE)
    , ' '
    , CAST(DATEADD(SECOND, t_stamp / 1000, '1970/1/1') AS TIME)
  )
FROM #dataCapture

  • I suspect that part of the problem is identifying what the value in t_stamp means.

From your query, it appears that you are assuming it to be the number of seconds that have elapsed since 1 Jan 1970.

There are 604,800 seconds in a week (86,400 * 7).

So 1641918620183 / 604800 is roughly 2,714,812 weeks, or roughly 52,207 years, which is clearly not a plausible date.

Now, assuming that the value in t_stamp is actually milliseconds since 1 Jan 1970, the following code seems to do the trick:

    CREATE TABLE #dataCapture (
    tagid int NOT NULL
    , intvalue bigint NULL
    , floatvalue float NULL
    , stringvalue nvarchar(255) NULL
    , datevalue datetime NULL
    , dataintegrity int NULL
    , t_stamp bigint NOT NULL
    , PRIMARY KEY CLUSTERED ( tagid ASC, t_stamp ASC ));
    GO

    INSERT INTO #dataCapture ( tagid, intvalue, floatvalue, stringvalue, datevalue, dataintegrity, t_stamp )
    VALUES ( 4, NULL, 15.8773460388184, NULL, NULL, 192,1641918620183 );

SELECT *
-- split t_stamp into whole weeks plus a millisecond remainder, so each part fits in an int
, dt_stamp = DATEADD(MILLISECOND, CAST( (t_stamp % 604800000) AS int ), DATEADD(WEEK, CAST( (t_stamp / 604800000) AS int ), CONVERT(datetime, '1970-01-01 00:00:00', 120) ))
FROM #dataCapture AS dc

    DROP TABLE #dataCapture;
A simplified query to return dt_stamp as a datetime, and wk_stamp as the first day of the week.

You can then use the value in wk_stamp to do your grouping, as in the sketch after the query below.

    SELECT *
    , dt_stamp = DATEADD(SECOND, CAST( (t_stamp / 1000) AS int ), CONVERT(datetime, '1970-01-01 00:00:00', 120) )
    , wk_stamp = DATEADD(dd, DATEDIFF(dd, 0, DATEADD(SECOND, CAST( (t_stamp / 1000) AS int ), CONVERT(datetime, '1970-01-01 00:00:00', 120) )) /7 *7, 0)
    FROM #dataCapture AS dc
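For the original question (a count of records per week), a minimal sketch building on wk_stamp, wrapping the expression in a derived table so the alias can be grouped on:

SELECT wk_stamp, record_count = COUNT(*)
FROM (
    SELECT wk_stamp = DATEADD(dd, DATEDIFF(dd, 0, DATEADD(SECOND, CAST( (t_stamp / 1000) AS int ), CONVERT(datetime, '1970-01-01 00:00:00', 120) )) /7 *7, 0)
    FROM #dataCapture
) AS weekly
GROUP BY wk_stamp
ORDER BY wk_stamp;

Note that the DATEDIFF(dd, 0, ...) /7 *7 floor makes each week start on a Monday, since day 0 (1900-01-01) was a Monday.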
Thanks, the above query worked great. If I wanted to do a daily extract of records, say from 00:00 to 11:59, how would that query look?

    Thx.

I'd like to pass in a parameter like @date, and then have the query dump the data from the table to a pipe-delimited (csv) file.

Can this be achieved?

Thx.
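One common way to do this (a sketch, not from this thread: the server, database, and output path names are hypothetical, and it assumes the bcp utility with Windows authentication) is to have the calling script or job build the day's t_stamp boundaries into a query and hand it to bcp in queryout mode with a pipe terminator:

bcp "SELECT tagid, intvalue, floatvalue, stringvalue, datevalue, dataintegrity, t_stamp FROM YourDb.dbo.sqlt_data_1_2022_01 WHERE t_stamp >= 1640995200000 AND t_stamp < 1641081600000" queryout "C:\extract\capture_2022-01-01.csv" -c -t "|" -S YourServer -T

Here -c writes character data, -t "|" sets the pipe field terminator, and -T uses a trusted connection; the two bigint literals are the millisecond boundaries for 2022-01-01, calculated the same way as the range filter shown later in the thread.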

  • Bruin wrote:

Thanks, the above query worked great. If I wanted to do a daily extract of records, say from 00:00 to 11:59, how would that query look?

    Thx.

What does that have to do with grouping how many "records" are created by week???

    And what day of the week does your week start on?

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
    ________Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.

    Change is inevitable... Change for the better is not.


    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)

I was getting an idea of how many records are accumulating per week, and trying to figure out how to read the data.

Now the request has shifted to pulling the previous day's records to a file, which will be used in another process yet to be created.

    Thanks.

  • Bruin wrote:

Thanks, the above query worked great. If I wanted to do a daily extract of records, say from 00:00 to 11:59, how would that query look?

    Since the data is stored as a bigint in the table, the best way of filtering that data is to calculate the starting value and ending value of the range of 'dates' you want.

    For example:

    Declare @current_date datetime = '2022-01-12 10:22:21.183';

    Declare @ts_start bigint = datediff_big(ms, '1970-01-01 00:00:00.000', cast(@current_date - 1 As date))
    , @ts_end bigint = datediff_big(ms, '1970-01-01 00:00:00.000', cast(@current_date As date))

    Select @ts_start, @ts_end;

    Then filter in the where clause as:

WHERE t_stamp >= @ts_start
AND t_stamp < @ts_end

It appears the value you have in t_stamp is the number of milliseconds since 1970-01-01. To convert that value to the correct date/time with millisecond accuracy would be:

Select date_ss = dateadd(second, 1641918620183 / 1000, '1970-01-01')
, date_ms = dateadd(millisecond, 1641918620183 % 1000, dateadd(second, 1641918620183 / 1000, '1970-01-01'))

With that said, I would create a persisted computed column on the table using the formula above, and index that column; a sketch follows. Once you have that column, you can query it with normal date filtering.
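A sketch of that persisted computed column (assumptions: the permanent table is dbo.sqlt_data_1_2022_01, as in a later post, and t_stamp / 1000 still fits in an int, which holds for dates before 2038; CONVERT with style 120 keeps the expression deterministic, which PERSISTED requires):

ALTER TABLE dbo.sqlt_data_1_2022_01
ADD dt_stamp AS DATEADD(millisecond, t_stamp % 1000,
        DATEADD(second, CAST(t_stamp / 1000 AS int),
            CONVERT(datetime, '1970-01-01 00:00:00', 120))) PERSISTED;

CREATE NONCLUSTERED INDEX IX_sqlt_data_dt_stamp
ON dbo.sqlt_data_1_2022_01 (dt_stamp);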

    Note: you also need to verify that the source of the data is sending you that data in the same time zone.  If they are sending the data as UTC then you would need to account for that in your conversion.

    Jeffrey Williams
    “We are all faced with a series of great opportunities brilliantly disguised as impossible situations.”

    ― Charles R. Swindoll

    How to post questions to get better answers faster
    Managing Transaction Logs

I'm a little confused by:

    Declare @current_date datetime = '2022-01-12 10:22:21.183';

    Declare @ts_startbigint = datediff_big(ms, '1970-01-01 00:00:00.000', cast(@current_date - 1 As date))

    , @ts_endbigint = datediff_big(ms, '1970-01-01 00:00:00.000', cast(@current_date As date))

    Select @ts_start, @ts_end;

    --

    Msg 102, Level 15, State 1, Line 3

    Incorrect syntax near '='.

    Msg 137, Level 15, State 2, Line 6

    Must declare the scalar variable "@ts_start".
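The two errors come from the paste itself: the space between each variable name and its type was lost, so @ts_start bigint collapsed into the single identifier @ts_startbigint. Msg 102 is the = that then follows a declaration with no type, and Msg 137 follows because @ts_start was never actually declared. With the spaces restored, it runs:

Declare @current_date datetime = '2022-01-12 10:22:21.183';

Declare @ts_start bigint = datediff_big(ms, '1970-01-01 00:00:00.000', cast(@current_date - 1 As date))
, @ts_end bigint = datediff_big(ms, '1970-01-01 00:00:00.000', cast(@current_date As date));

Select @ts_start, @ts_end;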

What if it has to be retrofitted to SQL 2012? Is datediff_big available?

    Thanks for your help.
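It is not: DATEDIFF_BIG was introduced in SQL Server 2016. On SQL 2012, one workaround (a sketch, relying on the boundaries being whole days so there is no millisecond remainder, and on the second count fitting in an int, which holds until 2038) is to take the difference in seconds and widen it to bigint before scaling to milliseconds:

Declare @current_date datetime = '2022-01-12 10:22:21.183';

Declare @ts_start bigint = CAST(DATEDIFF(second, '1970-01-01', CAST(@current_date - 1 AS date)) AS bigint) * 1000
, @ts_end bigint = CAST(DATEDIFF(second, '1970-01-01', CAST(@current_date AS date)) AS bigint) * 1000;

Select @ts_start, @ts_end;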

     

     

This is getting one day of information; can it be done more efficiently? A sketch of an alternative follows the query.

    SELECT count(*)
    , dt_stamp = convert(date,DATEADD(SECOND, CAST( (t_stamp / 1000) AS int ), CONVERT(datetime, '1970-01-01 00:00:00', 120) ))
FROM sqlt_data_1_2022_01 AS dc WITH (NOLOCK)
    where
    convert(date,DATEADD(SECOND, CAST( (t_stamp / 1000) AS int ), CONVERT(datetime, '1970-01-01 00:00:00', 120) )) = '2022/01/01'
    group by convert(date,DATEADD(SECOND, CAST( (t_stamp / 1000) AS int ), CONVERT(datetime, '1970-01-01 00:00:00', 120) ))
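A more index-friendly sketch (following the range-filter advice above; the table name is taken from the query, and DATEDIFF_BIG needs SQL 2016+, otherwise the 2012 workaround above applies): compute the day boundaries once as bigint and compare the raw t_stamp column, so no conversion runs per row:

DECLARE @day date = '2022-01-01';

DECLARE @ts_start bigint = DATEDIFF_BIG(millisecond, '1970-01-01', @day)
, @ts_end bigint = DATEDIFF_BIG(millisecond, '1970-01-01', DATEADD(day, 1, @day));

SELECT record_count = COUNT(*)
, dt_stamp = @day
FROM sqlt_data_1_2022_01 AS dc
WHERE dc.t_stamp >= @ts_start
AND dc.t_stamp < @ts_end;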
