Group records during capture

  • I have a table that is being written to by another application. A couple of things to note: I can't alter the schema, and I'm unable to upgrade the SQL Server version at this time.

    I need some help in a couple of areas. The first is how to grab the last 15 minutes of information from this table. The second is how to grab a single record per minute.

    After you create the table, load the data, and run this query, you will get 6 records. Since the floatvalue hasn't changed for this tagid across these 6 records, how could I get just 1 record for the 2024-12-10 09:30 timeframe? I basically want 1 record per minute per tagid when the floatvalues are the same.

    select *,
           DATEADD(s, t_stamp/1000, '1969-12-31 19:00:00') as 'DateRecorded'
    from [dbo].[sqlt_data_1_2024_cpy]
    order by t_stamp desc

    Thanks.

    Insert into [dbo].[sqlt_data_1_2024_cpy]
    values(27002,NULL,2202.01,NULL,NULL,192,1733841007518);

    Insert into [dbo].[sqlt_data_1_2024_cpy]
    values(27002,NULL,2202.01,NULL,NULL,192,1733841017649);

    Insert into [dbo].[sqlt_data_1_2024_cpy]
    values(27002,NULL,2202.01,NULL,NULL,192,1733841027650);

    Insert into [dbo].[sqlt_data_1_2024_cpy]
    values(27002,NULL,2202.01,NULL,NULL,192,1733841037653);

    Insert into [dbo].[sqlt_data_1_2024_cpy]
    values(27002,NULL,2202.01,NULL,NULL,192,1733841047654);

    Insert into [dbo].[sqlt_data_1_2024_cpy]
    values(27002,NULL,2202.01,NULL,NULL,192,1733841057657);


    CREATE TABLE [dbo].[sqlt_data_1_2024_cpy](
    [tagid] [int] NOT NULL,
    [intvalue] [bigint] NULL,
    [floatvalue] [float] NULL,
    [stringvalue] [varchar](255) NULL,
    [datevalue] [datetime] NULL,
    [dataintegrity] [int] NULL,
    [t_stamp] [bigint] NOT NULL,
    PRIMARY KEY CLUSTERED
    (
    [tagid] ASC,
    [t_stamp] ASC
    )WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON, OPTIMIZE_FOR_SEQUENTIAL_KEY = OFF) ON [PRIMARY]
    ) ON [PRIMARY]
    GO
  • You should know how to use date functions by now!

    1. DATEADD(minute, -15, CURRENT_TIMESTAMP)

    2. ROW_NUMBER() OVER (PARTITION BY DATEDIFF(minute, '2000', datevalue), floatvalue ORDER BY datevalue) AS rn

  • That would be true if working with a normal date, but the Unix timestamp stored in t_stamp needs to be converted, I'm thinking.

     

  • Bruin wrote:

    That would be true if working with a normal date, but the Unix timestamp stored in t_stamp needs to be converted, I'm thinking.

    Yep, t_stamp stored as BIGINT can be converted to DATETIME2(3). The formula in the posted query appears to be incorrect:

    DATEADD(s,t_stamp/1000,'1969-12-31 19:00:00') as 'DateRecorded'

    For a full explanation of Unix timestamp to T-SQL type conversion, there's a Jeff Moden article:

    https://www.sqlservercentral.com/articles/convert-unix-timestamps-to-date-times-in-sql-server-part1

    This issue came up a few months ago in another topic by the same member.

    https://www.sqlservercentral.com/forums/topic/row-changes-per-hour-by-tagid

    This query demonstrates Jeff's iTVF and the conversion method we used in the previous post:

    select *, DATEADD(s,t_stamp/1000,'1969-12-31 19:00:00') as 'DateRecorded' 
    from #sqlt_data_1_2024_cpy t
    cross apply dbo.msUnixTStoDATETIME2(t.t_stamp) jm_unx_tstamp
    cross apply (values (cast(dateadd(millisecond, t.t_stamp%1000, dateadd(second, t.t_stamp/1000, '1970-01-01')) as datetime2(3)))) v(ss_dt)
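
    As an aside on why the hard-coded '1969-12-31 19:00:00' base is risky: it bakes a fixed -5 hour offset into the conversion, so the result is off by an hour during daylight saving time. If the instance happens to be SQL Server 2016 or later (the OP can't upgrade, and the current version isn't stated), a hedged alternative is to convert against the true '1970-01-01' epoch and then shift with AT TIME ZONE; the time-zone name below is only an assumption:

    select t.*,
           v.ss_dt as DateRecordedUtc,                          -- epoch milliseconds converted against the 1970-01-01 UTC epoch
           v.ss_dt at time zone 'UTC'
                   at time zone 'Eastern Standard Time' as DateRecordedLocal   -- assumed zone; substitute your own
    from #sqlt_data_1_2024_cpy t
    cross apply (values (cast(dateadd(millisecond, t.t_stamp % 1000,
                         dateadd(second, t.t_stamp / 1000, '1970-01-01')) as datetime2(3)))) v(ss_dt);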

    No one shall expel us from the paradise that Cantor has created for us.


  • Using the above query:

    select *, DATEADD(s,t_stamp/1000,'1969-12-31 19:00:00') as 'DateRecorded'
    from #sqlt_data_1_2024_cpy t
    cross apply dbo.msUnixTStoDATETIME2(t.t_stamp) jm_unx_tstamp
    cross apply (values (cast(dateadd(millisecond, t.t_stamp%1000, dateadd(second, t.t_stamp/1000, '1970-01-01')) as datetime2(3)))) v(ss_dt)

    how could I get the last 15 minutes of activity, pulling the unique record per minute?

    Many thanks

  • Bruin wrote:

    How could I get the last 15 minutes of activity, pulling the unique record per minute?

    Scroll up a few posts and see Ken McKelvey's reply. Once t_stamp has been converted to DATETIME2(3), you can use the built-in T-SQL date functions.

    Ken McKelvey wrote:

    1. DATEADD(minute, -15, CURRENT_TIMESTAMP)

    Instead of CURRENT_TIMESTAMP, substitute the converted t_stamp column 'ss_dt' (or 'TheDateTime' if using JM's function).
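
    For example, a minimal sketch against the posted dbo table, using the VALUES-based conversion from the query above (assuming the epoch values in t_stamp are UTC, hence SYSUTCDATETIME() rather than the local CURRENT_TIMESTAMP):

    select t.*, v.ss_dt as DateRecorded
    from dbo.sqlt_data_1_2024_cpy t
    cross apply (values (cast(dateadd(millisecond, t.t_stamp % 1000,
                         dateadd(second, t.t_stamp / 1000, '1970-01-01')) as datetime2(3)))) v(ss_dt)
    where v.ss_dt >= dateadd(minute, -15, sysutcdatetime());    -- last 15 minutes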

    Ken McKelvey wrote:

    2. ROW_NUMBER() OVER (PARTITION BY DATEDIFF(minute, '2000', datevalue), floatvalue ORDER BY datevalue) AS rn

    This code creates a sequence with boundaries defined by the PARTITION BY. The DATEDIFF function derives a value from the converted t_stamp that is constant for all rows within the same minute. The second parameter of DATEDIFF is the starting datetime, which Ken set to '2000' as an arbitrary date that falls before any converted t_stamp. Zero would also work, as far as I know.
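
    Putting the two pieces together, a rough sketch (same UTC assumption as above; tagid is added to the PARTITION BY because the OP wants one row per minute per tagid):

    with conv as
    (
        select t.*, v.ss_dt
        from dbo.sqlt_data_1_2024_cpy t
        cross apply (values (cast(dateadd(millisecond, t.t_stamp % 1000,
                             dateadd(second, t.t_stamp / 1000, '1970-01-01')) as datetime2(3)))) v(ss_dt)
        where v.ss_dt >= dateadd(minute, -15, sysutcdatetime())        -- last 15 minutes
    ), numbered as
    (
        select c.*,
               row_number() over (partition by c.tagid,
                                               datediff(minute, '2000', c.ss_dt),  -- constant within one minute
                                               c.floatvalue
                                  order by c.ss_dt) as rn
        from conv c
    )
    select *
    from numbered
    where rn = 1                                                       -- one row per tagid/minute/floatvalue
    order by tagid, ss_dt;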

     

    No one shall expel us from the paradise that Cantor has created for us.


  • Instead of trying to convert the Unix timestamp to a datetime2 and then truncating that value down to the minute, why not just truncate the timestamp itself to the minute?

    That should just be: t_stamp / 60000 * 60000

    To get the last 15 minutes, convert your cutoff datetime to a bigint and query for t_stamp >= that value.

    DECLARE @from_timestamp bigint = {function to convert datetime to timestamp}
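
    For reference, on the posted sample data all six rows truncate to the same minute bucket: 1733841007518 / 60000 = 28897350 and 28897350 * 60000 = 1733841000000, which is 2024-12-10 14:30:00 UTC (09:30 Eastern). A rough sketch of the all-bigint approach, assuming the stored epoch values are UTC (DATEDIFF at second precision fits an int until 2038; DATEDIFF_BIG is available on 2016+ if preferred):

    declare @from_timestamp bigint =
        cast(datediff(second, '1970-01-01', dateadd(minute, -15, getutcdate())) as bigint) * 1000;

    select tagid,
           floatvalue,
           t_stamp / 60000 * 60000 as minute_bucket_ms,        -- timestamp truncated to the minute
           min(t_stamp)            as first_t_stamp
    from dbo.sqlt_data_1_2024_cpy
    where t_stamp >= @from_timestamp                           -- last 15 minutes, all bigint math
    group by tagid, floatvalue, t_stamp / 60000 * 60000
    order by tagid, minute_bucket_ms;

    This returns one row per tagid per minute per repeated floatvalue without converting anything to a datetime until display time.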

    Jeffrey Williams
    “We are all faced with a series of great opportunities brilliantly disguised as impossible situations.”

    ― Charles R. Swindoll

    How to post questions to get better answers faster
    Managing Transaction Logs

