group records by week

  • Looks like there was a copy/paste error - the statements should be:

    Declare @current_date datetime = '2022-01-12 10:22:21.183';

    Declare @ts_start bigint = datediff_big(ms, '1970-01-01 00:00:00.000', cast(@current_date - 1 As date))
    , @ts_end bigint = datediff_big(ms, '1970-01-01 00:00:00.000', cast(@current_date As date))

    Select @ts_start, @ts_end;

    It looks like DATEDIFF_BIG wasn't introduced until SQL Server 2016.  So instead of calculating the start/end to the millisecond - you can calculate to the second and multiply by 1000 to get to milliseconds.

    Declare @currentDate datetime = '20220101';

    Declare @ts_start bigint = cast(1000 As bigint) * datediff(second, '1970-01-01 00:00:00.000', cast(@currentDate As date))
    , @ts_end bigint = cast(1000 As bigint) * datediff(second, '1970-01-01 00:00:00.000', cast(@currentDate + 1 As date));

    Select count(*)
    , dt_stamp = cast(dt.dt_stamp As date)
    From sqlt_data_1_2022_01 dc With (NoLock)
    Cross Apply (Values (dateadd(millisecond, dc.t_stamp % 1000, dateadd(second, dc.t_stamp / 1000, '1970-01-01')))) As dt(dt_stamp)
    Where dc.t_stamp >= @ts_start
    And dc.t_stamp < @ts_end
    Group By
    cast(dt.dt_stamp As date);

    If you want to increase the date range - adjust the settings when defining @ts_start and @ts_end.  For example, to include the full month:

    Declare @start_date datetime = '20220101'
    , @end_date datetime = '20220131';

    Declare @ts_start bigint = cast(1000 As bigint) * datediff(second, '1970-01-01 00:00:00.000', cast(@start_date As date))
    , @ts_end bigint = cast(1000 As bigint) * datediff(second, '1970-01-01 00:00:00.000', cast(@end_date + 1 As date));

    So again - filter the table by the actual column instead of converting the column value to a datetime and then filtering.

    And I will state it again, if possible - add a computed column to the table, persist it and put an index on that column.  Then you can query that column directly using normal date functions.
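
    A rough sketch of what that could look like - untested, with the table name taken from the examples above, so validate against your actual schema first:

    -- Persisted computed column converting t_stamp (Unix epoch milliseconds) to datetime.
    -- 25567 is 1970-01-01 expressed as an integer day offset, which keeps the
    -- expression deterministic so it can be persisted.
    Alter Table dbo.sqlt_data_1_2022_01
    Add dt_stamp As dateadd(millisecond, t_stamp % 1000, dateadd(second, t_stamp / 1000, 25567)) Persisted;

    -- Index the new column so date-range queries can seek instead of scan.
    Create Index IX_sqlt_data_1_dt_stamp On dbo.sqlt_data_1_2022_01 (dt_stamp);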


    Jeffrey Williams
    “We are all faced with a series of great opportunities brilliantly disguised as impossible situations.”

    ― Charles R. Swindoll

    How to post questions to get better answers faster
    Managing Transaction Logs

  • I had posted another response - not sure where it went.  As I stated before - you should calculate the start/end range of the t_stamp column instead of converting that to a date/time value.

    Declare @start_date datetime = '20220101'
    , @end_date datetime = '20220131';

    Declare @ts_start bigint = cast(1000 As bigint) * datediff(second, '1970-01-01 00:00:00.000', cast(@start_date As date))
    , @ts_end bigint = cast(1000 As bigint) * datediff(second, '1970-01-01 00:00:00.000', cast(@end_date + 1 As date));

    Select count(*)
    , dt_stamp = dateadd(millisecond, dc.t_stamp % 1000, dateadd(second, dc.t_stamp / 1000, '1970-01-01'))
    From sqlt_data_1_2022_01 dc With (NoLock)
    Cross Apply (Values (dateadd(millisecond, dc.t_stamp % 1000, dateadd(second, dc.t_stamp / 1000, '1970-01-01')))) As dt(dt_stampt)
    Where dc.t_stamp >= @ts_start
    And dc.t_stamp < @ts_end
    Group By
    cast(dt.dt_stamp As date);

    DATEDIFF_BIG was introduced in SQL Server 2016 - the way around that on earlier versions is to calculate in seconds and multiply by 1,000 to get to milliseconds.

    Using CROSS APPLY to calculate the actual dt_stamp, we can simplify the query and use normal date functions against that column.  You can now create groupings by day/week/month as you need - for example, the weekly grouping sketched below.
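
    A sketch of that weekly grouping (untested - dateadd/datediff with week rounds each value down to the Monday of its week, since SQL Server's day 0, 1900-01-01, was a Monday):

    Select count(*)
    , week_start = dateadd(week, datediff(week, 0, dt.dt_stamp), 0)
    From sqlt_data_1_2022_01 dc With (NoLock)
    Cross Apply (Values (dateadd(millisecond, dc.t_stamp % 1000, dateadd(second, dc.t_stamp / 1000, '1970-01-01')))) As dt(dt_stamp)
    Where dc.t_stamp >= @ts_start
    And dc.t_stamp < @ts_end
    Group By
    dateadd(week, datediff(week, 0, dt.dt_stamp), 0);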

    I still think you should consider adding a persisted computed column to the table.  That way - the conversion from t_stamp to an actual datetime is already done - and you can add an index on that column.

    Another 'trick' would be to replace the references to '1970-01-01' with the integer value 25567.  This will eliminate any implicit conversions.

    Declare @start_date datetime = '20220101'
    , @end_date datetime = '20220131';

    Declare @ts_start bigint = cast(1000 As bigint) * datediff(second, 25567, cast(@start_date As date))
    , @ts_end bigint = cast(1000 As bigint) * datediff(second, 25567, cast(@end_date + 1 As date));

    Select count(*)
    , dt_stamp = dateadd(millisecond, dc.t_stamp % 1000, dateadd(second, dc.t_stamp / 1000, 25567))
    From sqlt_data_1_2022_01 dc With (NoLock)
    Cross Apply (Values (dateadd(millisecond, dc.t_stamp % 1000, dateadd(second, dc.t_stamp / 1000, 25567)))) As dt(dt_stampt)
    Where dc.t_stamp >= @ts_start
    And dc.t_stamp < @ts_end
    Group By
    cast(dt.dt_stamp As date);

    To confirm that 25567 is actually 1970-01-01, run: Select cast(25567 As datetime);

    Jeffrey Williams
    “We are all faced with a series of great opportunities brilliantly disguised as impossible situations.”

    ― Charles R. Swindoll

    How to post questions to get better answers faster
    Managing Transaction Logs

  • Thanks for the updates: I ran the 2nd example and received the following on the line

    cast(dt.dt_stamp As date);

    Msg 529, Level 16, State 2, Line 14
    Explicit conversion from data type bigint to date is not allowed.

  • Sorry - having problems with copying:

    Declare @start_date datetime = '20220101'
    , @end_date datetime = '20220131';

    Declare @ts_start bigint = cast(1000 As bigint) * datediff(second, 25567, cast(@start_date As date))
    , @ts_end bigint = cast(1000 As bigint) * datediff(second, 25567, cast(@end_date + 1 As date));

    Select count(*)
    , cast(dt.dt_stamp As date)
    From sqlt_data_1_2022_01 dc With (NoLock)
    Cross Apply (Values (dateadd(millisecond, dc.t_stamp % 1000, dateadd(second, dc.t_stamp / 1000, 25567)))) As dt(dt_stamp)
    Where dc.t_stamp >= @ts_start
    And dc.t_stamp < @ts_end
    Group By
    cast(dt.dt_stamp As date);

    Not sure why you couldn't figure that out - I provided several examples and you should be able to take those examples and modify them for your requirements.

    You should not take a script and just run it...you need to review and validate before running.

    Jeffrey Williams
    “We are all faced with a series of great opportunities brilliantly disguised as impossible situations.”

    ― Charles R. Swindoll

    How to post questions to get better answers faster
    Managing Transaction Logs

  • That's very fast retrieval... thank you!!!  Now that this query shows me how much data is being written to the table per day, could it be altered to get 8-hour intervals for the previous day?

    00:00 - 08:00

    08:00 - 16:00

    16:00 - 23:59

    The end game is that I need to pull information from this table into a CSV file for other requirements. After seeing how much data is stored daily, I think I need to make 3 passes to create 3 files based upon the time ranges above.

    Thanks for your help

  • You can break it out any way you want - now that you have your datetime column it is just a matter of defining the value.  How you go about that depends on how you are outputting the data to the CSV files.

    I would build a reporting table with the appropriate categories - then, using either SSIS or PowerShell, loop over each category, pull that data from the reporting table, and output it to a file using the category as part of the filename.

    If your question is about how to determine the category - that would just be a case expression:

    CASE WHEN datepart(hour, dt.dt_stamp) < 8 THEN 1
    WHEN datepart(hour, dt.dt_stamp) < 16 THEN 2
    ELSE 3
    END

    This is just one way - there are other methods available.

    Jeffrey Williams
    “We are all faced with a series of great opportunities brilliantly disguised as impossible situations.”

    ― Charles R. Swindoll

    How to post questions to get better answers faster
    Managing Transaction Logs

  • Any way to make it all happen in an SP without having to invoke SSIS or PowerShell? Dump 3 files based upon the CASE above?


  • Bruin wrote:

    Any way to make it all happen in an SP without having to invoke SSIS or PowerShell? Dump 3 files based upon the CASE above?

    Yeah - you can probably build something using xp_cmdshell and BCP to create the files.  You'll need to do some research on how to set that up.
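
    Strictly as an untested sketch - the server name, database, output path, and example date below are placeholders, and xp_cmdshell is disabled by default (it has to be enabled via sp_configure) - one call per time bucket might look like:

    -- First 8-hour bucket of the day: 00:00 - 08:00.
    Declare @ts_start bigint = cast(1000 As bigint) * datediff(second, 25567, '20220105')
    , @ts_end bigint = cast(1000 As bigint) * datediff(second, 25567, '20220105 08:00');

    Declare @cmd varchar(4000) = 'bcp "Select * From YourDatabase.dbo.sqlt_data_1_2022_01 Where t_stamp >= '
    + cast(@ts_start As varchar(20)) + ' And t_stamp < ' + cast(@ts_end As varchar(20))
    + '" queryout "C:\Exports\bucket1.csv" -c -t, -T -S YourServer';

    Exec master..xp_cmdshell @cmd;

    Repeat with the 08:00-16:00 and 16:00-24:00 ranges to produce the other two files.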

    Jeffrey Williams
    “We are all faced with a series of great opportunities brilliantly disguised as impossible situations.”

    ― Charles R. Swindoll

    How to post questions to get better answers faster
    Managing Transaction Logs

  • Not having any luck trying to insert the CASE into the previous query.

    CASE WHEN datepart(hour, dt.dt_stamp) < 8 THEN 1
    WHEN datepart(hour, dt.dt_stamp) < 16 THEN 2
    ELSE 3
    END

    I'd like to see the breakout.

    Declare @ts_start bigint = cast(1000 As bigint) * datediff(second, 25567, cast(@start_date As date))
    , @ts_end bigint = cast(1000 As bigint) * datediff(second, 25567, cast(@end_date + 1 As date));

    Select count(*)
    , cast(dt.dt_stamp As date)
    From sqlt_data_1_2022_02 dc With (NoLock)
    Cross Apply (Values (dateadd(millisecond, dc.t_stamp % 1000, dateadd(second, dc.t_stamp / 1000, 25567)))) As dt(dt_stamp)
    Where dc.t_stamp >= @ts_start
    And dc.t_stamp < @ts_end
    Group By
    cast(dt.dt_stamp As date);

  • What problem are you having?  I would put that CASE expression in another CROSS APPLY - then it can be included in the SELECT and the GROUP BY like any other column, as sketched below.  Or just repeat the CASE expression in both the select and group by...
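
    A rough sketch of that shape (untested, reusing the @ts_start/@ts_end declares from the earlier examples):

    Select count(*)
    , dt_stamp = cast(dt.dt_stamp As date)
    , tb.time_bucket
    From sqlt_data_1_2022_02 dc With (NoLock)
    Cross Apply (Values (dateadd(millisecond, dc.t_stamp % 1000, dateadd(second, dc.t_stamp / 1000, 25567)))) As dt(dt_stamp)
    Cross Apply (Values (Case When datepart(hour, dt.dt_stamp) < 8 Then 1
    When datepart(hour, dt.dt_stamp) < 16 Then 2
    Else 3 End)) As tb(time_bucket)
    Where dc.t_stamp >= @ts_start
    And dc.t_stamp < @ts_end
    Group By
    cast(dt.dt_stamp As date)
    , tb.time_bucket;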


    Jeffrey Williams
    “We are all faced with a series of great opportunities brilliantly disguised as impossible situations.”

    ― Charles R. Swindoll

    How to post questions to get better answers faster
    Managing Transaction Logs

  • Would I be able to give each row a Label?

    , CASE WHEN datepart(hour, dt.dt_stamp) < 8 THEN 1
    WHEN datepart(hour, dt.dt_stamp) < 16 THEN 2
    ELSE 3
    END As HrReported

    Instead of having 1, 2, 3, I want something like 'MidNightTo8:00am' for 1.

    Thx.

  • Did you know that we've had a DATE data type in SQL Server for many years now? There is no need to use the old DATETIME that was inherited from Sybase many decades ago.

    Then, when you get a chance, read the ISO 8601 standards. One of the standard date formats uses weeks; it's rather popular in the Scandinavian countries. Its display format is "yyyyWww-[1-7]", where ww is the week within the year, followed by the day of the week within that week. Microsoft does not support this, but when you build your calendar table (you do have a calendar table, don't you?) you can download it as a character string from the Internet or compute it yourself.
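
    A rough T-SQL sketch of computing that format (it ignores the year-boundary edge cases, where the ISO year can differ from the calendar year):

    Declare @d date = '20220112';

    Select concat(year(@d), 'W'
    , right('0' + cast(datepart(iso_week, @d) As varchar(2)), 2)
    , '-', (datepart(weekday, @d) + @@datefirst + 5) % 7 + 1);

    The last expression remaps the DATEFIRST-dependent weekday to the ISO numbering (Monday = 1 through Sunday = 7).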

    Please post DDL and follow ANSI/ISO standards when asking for help. 

  • jcelko212 32090 wrote:

    There is no need to use the old DATETIME...

    That's what so many people say but that's seriously wrong.  Although DATETIME doesn't meet some of the ISO 8601 standards, it meets a whole lot more than DATE, TIME, and DATETIME2 ever will.  They were a great idea but their implementation behind the scenes is as horrible and ignorant as those that programmed Excel to allow a date of 1900-02-29.

    And, if you learn how to use the temporal datatypes and functions correctly, there's little reason for all the READs that a Calendar table will generate.  There are actually very few things that actually justify the use of a Calendar table.

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
    ________Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.

    Change is inevitable... Change for the better is not.


    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)

  • Would I be able to create different output files based on the CASE in the SP?

    , CASE WHEN datepart(hour, dt.dt_stamp) < 8 THEN ..... Create output file1
    WHEN datepart(hour, dt.dt_stamp) < 16 THEN .... Create output file2
    ELSE .... Create output file3
    END

    Thx.

  • Bruin wrote:

    Would I be able to create different output files based on the CASE in the SP?

    , CASE WHEN datepart(hour, dt.dt_stamp) < 8 THEN ..... Create output file1
    WHEN datepart(hour, dt.dt_stamp) < 16 THEN .... Create output file2
    ELSE .... Create output file3
    END

    Thx.

    That is not how SQL works - SQL itself cannot create an output file.  The file has to be created outside the query, e.g. by BCP via xp_cmdshell as mentioned earlier.

    Jeffrey Williams
    “We are all faced with a series of great opportunities brilliantly disguised as impossible situations.”

    ― Charles R. Swindoll

    How to post questions to get better answers faster
    Managing Transaction Logs
