Tracking Concurrent Logins

  • I have a simple table

    create table user_logins (
        userlogin datetime,
        userlogout datetime,
        logindomain varchar(50),
        username varchar(50),
        clientname varchar(50)
    )

    A record is inserted into the table when a user logs in, and the userlogout field in that record is updated when they log out. If the system crashes while users are logged in, their userlogout fields are left null. For reporting purposes, a missing logout should be treated as ending at that user's next login with the same username from the same clientname; if there is no such login within the following week, the logout date should be set to the Sunday after the login date, at 1am.

    I managed to solve that part by rebuilding the table one record at a time with a cursor, checking whether the next record was a valid next login for the current record; if so, I used that login time, otherwise I set the logout to the upcoming Sunday at 1am.
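    In plain terms, the cursor logic boils down to something like the following Python sketch (Python only to show the shape of the rule; the `next_sunday_1am` helper and the exact handling of a login that falls on a Sunday are my own assumptions, since the rule doesn't spell that out):

```python
from datetime import datetime, timedelta

def next_sunday_1am(login):
    # Upcoming Sunday at 1am; a login already on a Sunday rolls to the
    # following Sunday (an assumption -- the rule doesn't say).
    days_ahead = (6 - login.weekday()) % 7 or 7  # weekday(): Mon=0 .. Sun=6
    return (login + timedelta(days=days_ahead)).replace(
        hour=1, minute=0, second=0, microsecond=0)

def repair_logouts(rows):
    # rows: dicts with userlogin, userlogout, username, clientname,
    # assumed already sorted by userlogin (the cursor order).
    for i, r in enumerate(rows):
        if r["userlogout"] is None:
            nxt = next((s for s in rows[i + 1:]
                        if s["username"] == r["username"]
                        and s["clientname"] == r["clientname"]), None)
            # Use the next login from the same user/client if there is one,
            # otherwise fall back to the upcoming Sunday at 1am.
            r["userlogout"] = (nxt["userlogin"] if nxt
                               else next_sunday_1am(r["userlogin"]))
    return rows
```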

    I need to find useful information from this result set, such as the maximum number of concurrently logged-in users for a week. The only way I can think of to do this is to build some kind of while loop over the entire time period, incrementing by the minute (or perhaps the second), and on each iteration of the loop selecting the number of records whose userlogin and userlogout fields bracket the date I have generated.

    something like

    declare @counterdate datetime
    set @counterdate = '02/01/08 01:00:00'

    declare @resultstable table (
        currenttime datetime,
        logindomain varchar(50),
        clientname varchar(50),
        username varchar(50),
        numberoflogins int
    )

    while @counterdate <= '02/08/08 01:00:00'
    BEGIN
        insert into @resultstable
        select
            @counterdate,
            a.logindomain,
            a.clientname,
            a.username,
            count(*)
        from user_logins a
        where @counterdate between a.userlogin and a.userlogout
        group by
            a.logindomain,
            a.clientname,
            a.username

        set @counterdate = dateadd(ss, 1, @counterdate)
    END

    select
        logindomain,
        clientname,
        username,
        max(numberoflogins)
    from @resultstable
    group by
        logindomain,
        clientname,
        username

    Is there a simpler way to do this? Looping 604,800 times to generate this seems a bit obscene...
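    For what it's worth, the logic doesn't actually need a fixed time step at all: the number of concurrent logins only changes at a login or a logout, so it's enough to sort those events and scan them once. A rough Python sketch of that idea (not T-SQL, just to show the shape of it; the tie-breaking at equal timestamps is a judgment call):

```python
def max_concurrent(sessions):
    # sessions: (login, logout) pairs; logouts assumed already repaired
    # (no nulls). Works with datetimes or any comparable values.
    events = []
    for login, logout in sessions:
        events.append((login, 1))    # +1 at login
        events.append((logout, -1))  # -1 at logout
    # Process logouts before logins at the same instant so back-to-back
    # sessions aren't double counted (flip the sort key for inclusive
    # BETWEEN semantics).
    events.sort(key=lambda e: (e[0], e[1]))
    current = peak = 0
    for _, delta in events:
        current += delta
        peak = max(peak, current)
    return peak
```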



    --Mark Tassin
    MCITP - SQL Server DBA
    Proud member of the Anti-RBAR alliance.

  • I'm strapped for time, so I might not be able to experiment with this, but what if you created the equivalent of a table of numbers, but instead made it a table of minutes? In other words, populate a table with one-minute increments over the time frame in question, then simply join against that from your other table. Again, just a guess, but I'll bet it will be a lot more efficient than what you're currently doing.
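    To make the idea concrete, here's the minutes-table join in miniature, sketched in Python rather than T-SQL (my own assumption: one count per one-minute bucket whose start falls inside a session):

```python
from datetime import datetime, timedelta

def logins_per_minute(sessions, start, end):
    # sessions: (login, logout) pairs. Emulates joining a minutes table
    # against the logins table on bucket-start BETWEEN login AND logout.
    counts = {}
    t = start
    while t < end:
        counts[t] = sum(1 for login, logout in sessions
                        if login <= t <= logout)
        t += timedelta(minutes=1)
    return counts
```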

    Another option, which would take a bit more exploration and testing than I'm currently capable of: Itzik Ben-Gan has a method for matching current and previous occurrences using a CTE with ROW_NUMBER() and OVER (PARTITION BY...). It looks like that's also a possibility, although your problem is a bit different.

    If I can, I'll get back around to trying that first thing and I'll report back unless someone else comes up with something before I get to it.

    "The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
    - Theodore Roosevelt

    Author of:
    SQL Server Execution Plans
    SQL Server Query Performance Tuning

  • I tried it. It worked.

    I took your table & loaded it with data (still beta testing Red Gate's SQL Data Generator). I then created this table:

    CREATE TABLE tableofdates
    (
        BeginTime DATETIME
        ,EndTime DATETIME
    )

    I loaded that with data for the last week with a quick & dirty query that can easily be replaced with something more set-based, but I was moving quick (forgive us our trespasses, Jeff):

    DECLARE @MyDate datetime
    SET @MyDate = '3/1/2008'

    WHILE @MyDate < '3/8/2008'
    BEGIN
        INSERT INTO tableofdates
        SELECT @MyDate
            ,DATEADD(mi, 1, @MyDate)

        SET @MyDate = DATEADD(mi, 1, @MyDate)
    END

    Then I wrote this little query to pull the data together. It really did show the simultaneous connections by minute.

    SELECT t.begintime
        ,t.endtime
        ,COUNT(u.username)
    FROM user_logins u
    JOIN tableofdates t
        -- count sessions covering each minute bucket, not sessions that
        -- merely started in it
        ON t.begintime BETWEEN u.userlogin AND u.userlogout
    GROUP BY t.begintime
        ,t.endtime
    ORDER BY t.begintime
        ,t.endtime

    You could probably take this and do some more tweaks to turn it into something even more useful. With no indexes on either table and a thousand rows, it ran in 56ms. I then reran it with 100,000 rows in the user_logins table and got everything back in 1057ms. Again, I think more than a few tweaks can be made to how I put this together to make it more robust, faster, and more useful. But it might be enough to get you started.


  • Thanks! I was trying to think of a set-based/relational method to make it work instead of the cumbersome loop... 43 seconds at this stage for a week's worth of data isn't too bad. Beats the 5 minutes or so the looping select I had been using was taking.



    --Mark Tassin

  • Did you try putting an index on the time table? I think it'll make a bit of a difference. Same with your user login table.


  • Actually, that was after indexing both tables.

    I created the time table as just a list of one-minute entries and indexed that. I indexed the logins table by login, by logout, and by both (figured more can't hurt).

    It's got about 200,000+ logins to play with, and by the minute a week is still 10,080 entries... so 43 seconds isn't really a bad result.



    --Mark Tassin

  • Based on the query above, I'm seeing slightly different results. I'm getting an index seek on userlogin when I've got it indexed, and a scan on tableofdates (no shock, since we're hitting the entire table). Processing 400,000 rows is only taking 235ms once I've got the indexes in place. I also tried it with userlogin clustered, but that was a tad slower at 360ms.


  • oddly enough I get the reverse... I get a seek on the times table... and a scan on the logins... hmmm



    --Mark Tassin

  • Actually, I'm using a slightly different query:

    select * from (
        select
            logindomain,
            max_login = max(logins)
        from (
            select
                a.logindomain,
                b.timeentry,
                logins = count(*)
            from user_logins a
            join #times_list b
                on b.timeentry between a.userlogin and a.userlogout
            group by
                b.timeentry,
                a.logindomain
        ) main
        group by
            main.logindomain
    ) a
    order by a.max_login desc
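    Same caveat as before (Python rather than T-SQL, just to show the shape): the nested query amounts to counting logins per (domain, minute) and then taking the max per domain:

```python
def max_logins_by_domain(sessions, minutes):
    # sessions: (logindomain, login, logout) tuples.
    # minutes: iterable of minute-bucket start times (the #times_list rows).
    minutes = list(minutes)
    per_minute = {}
    for domain, login, logout in sessions:
        for t in minutes:
            if login <= t <= logout:
                per_minute[(domain, t)] = per_minute.get((domain, t), 0) + 1
    best = {}
    for (domain, _), n in per_minute.items():
        best[domain] = max(best.get(domain, 0), n)
    # highest concurrency first, like the ORDER BY max_login DESC
    return sorted(best.items(), key=lambda kv: -kv[1])
```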



    --Mark Tassin

  • Ah, well, that makes sense then. Still, I think you'd be able to avoid the scan. Look at the execution plan and see what criteria are being used by the scan. If they don't have an index, you could put one in place.

    Either way, I'm glad it's working.


  • Well, it's an index scan... as opposed to a table scan... so I can accept that...

    I have no scan predicates, just output columns: logindomain and the login and logout times... :/ Created an index on those... still get an index scan...



    --Mark Tassin
