Query: Want to see "most active users" over the last year in minute increments

  • I want to see when we had the most people online doing "stuff".

    We have a table that shows the user, the thing they were doing (serving up page X, downloading file Y, taking training video Z).

    What I am a bit confused about is how do I handle the issue of breaking down concurrent activities over time.

    I want to group these activities into buckets by minute.

    Do I just pull everything by minute and group, count, etc. on the minute mark?

    What if at 7:05 PM and 55 seconds we have 100 users during that minute, then at 7:06 PM and 0 seconds we have 110 users, and at 7:06 PM and 10 seconds we have 50 users?

    So how do I slice that out?

    Max users per minute? Average users per minute?

    Just not sure what the best way to accomplish this is.

    When I'm done I want to graph it to see the peaks and valleys over a 24-hour period, etc.

    Just not sure of the best approach for deciding who gets put into which group/bucket.

    Thanks!

  • You'll have to group by the increment that you want to track.

    Here's a rough example, where I'm just guessing at your table structure, but assuming all times have milliseconds in them, so you need to round them down to the nearest minute. In the end, you may want to change it to, say, 10-minute increments, or hours, or whatever makes the report look OK and not too detailed.

    SELECT COUNT(userid) AS userCount,
           DATEADD(minute, DATEDIFF(minute, 0, visitingDate), 0) AS NearestMinute
    FROM YourData
    WHERE visitingDate BETWEEN '2011-10-01 00:00:00.000' AND '2011-10-07 23:59:59.997'
    GROUP BY DATEADD(minute, DATEDIFF(minute, 0, visitingDate), 0)
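
    If you do want coarser buckets later, the same pattern stretches; here's a rough sketch, still only guessing at the YourData/userid/visitingDate names, that groups into 10-minute increments by flooring the minute offset before converting it back to a datetime:

    -- Sketch only: 10-minute buckets; YourData, userid and visitingDate are assumed
    -- table/column names carried over from the example above, not a known schema.
    -- Integer division by 10 floors each row's minute offset into its 10-minute slot.
    -- COUNT(DISTINCT userid) might fit "people online" better if a user can show up
    -- more than once inside one bucket.
    SELECT COUNT(userid) AS userCount,
           DATEADD(minute, (DATEDIFF(minute, 0, visitingDate) / 10) * 10, 0) AS TenMinuteBucket
    FROM YourData
    WHERE visitingDate BETWEEN '2011-10-01 00:00:00.000' AND '2011-10-07 23:59:59.997'
    GROUP BY DATEADD(minute, (DATEDIFF(minute, 0, visitingDate) / 10) * 10, 0)
    ORDER BY TenMinuteBucket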

    Are you going to graph that? That's, what, 1,440 plotted points per day?

    Lowell


    --help us help you! If you post a question, make sure you include a CREATE TABLE... statement and an INSERT INTO... statement to give the volunteers here representative data. With your description of the problem, we can provide a tested, verifiable solution to your question! Asking the question the right way gets you a tested answer the fastest way possible!

  • Yeah, I am expecting some large variances, so I figured I'd start small and aggregate out.

    The T-SQL isn't the issue (though thanks for the above); I was just wondering if my approach of grouping into "buckets" was flawed.
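
    For the "start small and aggregate out" idea, one way to roll per-minute counts up is to take the max (the peak) or the average of each minute's count inside a larger window; a rough sketch, again assuming the hypothetical YourData/userid/visitingDate names from the earlier post:

    -- Sketch only: per-minute counts rolled up to hourly peaks and averages.
    -- YourData, userid and visitingDate are assumed names, not a known schema.
    ;WITH PerMinute AS (
        SELECT DATEADD(minute, DATEDIFF(minute, 0, visitingDate), 0) AS MinuteBucket,
               COUNT(userid) AS userCount
        FROM YourData
        GROUP BY DATEADD(minute, DATEDIFF(minute, 0, visitingDate), 0)
    )
    SELECT DATEADD(hour, DATEDIFF(hour, 0, MinuteBucket), 0) AS HourBucket,
           MAX(userCount) AS peakUsersPerMinute,      -- the "max per minute" view
           AVG(1.0 * userCount) AS avgUsersPerMinute  -- the "average per minute" view
    FROM PerMinute
    GROUP BY DATEADD(hour, DATEDIFF(hour, 0, MinuteBucket), 0)
    ORDER BY HourBucket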
