January 10, 2017 at 9:17 am
At least for me, I only want a last login if it's changed since yesterday; I don't need 50 updates just to bump the row to the latest second.
By adding a WHERE clause that checks whether the last updated date is substantially older, you could avoid the deadlocks, I think.
I'd add a check to the WHERE clause that LastLogon is older than midnight of today:
UPDATE master.dbo.servo
SET LastLogon = GETDATE()
WHERE UserName = SUSER_SNAME()
AND LastLogon < DATEADD(dd, DATEDIFF(dd, 0, GETDATE()), 0) -- midnight of the current day
you could also change the process to just do a blind insert, which will remove the deadlock issue, and have a background job delete older entries, and keep just one row per user.
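To sketch what I mean by the blind-insert approach (names here are just placeholders for whatever logging table you end up with, say one with UserName and Dt columns):

```sql
-- in the logon trigger: blind insert only, no UPDATE,
-- so concurrent logins never fight over the same row
INSERT INTO master.dbo.UserLogon (UserName, Dt)
VALUES (SUSER_SNAME(), GETDATE());

-- in a background job, run every few minutes:
-- delete everything but the newest row per user
DELETE l
FROM master.dbo.UserLogon l
WHERE l.Dt < (SELECT MAX(l2.Dt)
              FROM master.dbo.UserLogon l2
              WHERE l2.UserName = l.UserName);
```

The trigger path stays as cheap as possible, and the cleanup cost moves off the login path entirely.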
Lowell
January 10, 2017 at 9:34 am
Good idea. I'll try that. Thanks again.
January 11, 2017 at 12:19 pm
I promise to not keep bugging you on this 🙂 but I've run into one more issue with my cleanup script.
Here is my setup:
--Create table
USE MASTER
CREATE TABLE UserLogon
(
UserName NVARCHAR(255),
Dt DATETIME
)
GO
--Grant rights on the table
GRANT INSERT ON master.dbo.UserLogon TO public;
GO
GRANT UPDATE ON master.dbo.UserLogon TO public;
GO
GRANT SELECT ON master.dbo.UserLogon TO public;
GO
--Create trigger
CREATE TRIGGER tr_UserLogon ON ALL SERVER FOR LOGON
AS
INSERT INTO master.dbo.UserLogon
SELECT SUSER_SNAME(), GETDATE()
GO
--Enable trigger
ENABLE TRIGGER tr_UserLogon ON ALL SERVER;
I followed your advice to create the following stored procedure that cleans up this table to only keep the latest login. Here is what I created:
CREATE PROCEDURE sp_UserLogon
AS
WITH t AS(
SELECT UserName, Dt,
ROW_NUMBER() OVER(PARTITION BY UserLogon.UserName ORDER BY UserLogon.Dt DESC) pos
FROM UserLogon
)
DELETE UserLogon
FROM UserLogon
JOIN t ON UserLogon.UserName = t.UserName
AND UserLogon.Dt = t.Dt
WHERE t.pos <> 1
SELECT UserName, Dt FROM UserLogon ORDER BY UserName
GO
I put this sp in a job that runs every 10 minutes in order to keep the table from growing so fast. It's also the way I access the table when I want to check the logins.
The issue I'm apparently running into is that I will sometimes have duplicate usernames in the result set, like this:
user1    2017-01-11 13:09:35.590
user2    2017-01-11 13:16:57.130
user2    2017-01-11 13:16:57.097
user2    2017-01-11 13:16:57.300
user2    2017-01-11 13:16:57.277
user3    2017-01-11 13:16:36.360
If I execute the sp a couple of times it will trim the results and I will finally get a distinct list with the max date/time. I can only assume it's because user2 (the main application user) is opening connections so fast that ROW_NUMBER doesn't assign a unique value.
Would you mind giving any suggestions on how to remedy this? I've wondered whether adding an identity column to my table and using that would help, but I'm not sure. If my design is completely terrible, feel free to tell me 🙂 I'm just starting out and eager to learn.
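In case it helps, here is roughly what I had in mind with the identity column (the column name Id is just what I'd pick; I haven't actually tried this yet):

```sql
-- give each logon row a unique key, so two rows with the
-- exact same Dt can no longer both match the delete join
ALTER TABLE master.dbo.UserLogon ADD Id INT IDENTITY(1,1);

-- rank by Dt, breaking ties on Id, and delete by the unique key
WITH t AS (
    SELECT Id,
           ROW_NUMBER() OVER (PARTITION BY UserName
                              ORDER BY Dt DESC, Id DESC) AS pos
    FROM master.dbo.UserLogon
)
DELETE FROM master.dbo.UserLogon
WHERE Id IN (SELECT Id FROM t WHERE pos > 1);
```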
January 11, 2017 at 2:03 pm
no problem with asking for another look, I'm glad to help.
Yeah, you could potentially be seeing new logins created after your delete, so they show up in your query; you'd have to be on a pretty busy server, though, to see three or four new connections that quickly.
I only have a slight tweak to suggest; what you have would work fine with no changes.
You could delete directly from the CTE, no need to join the table again; as far as pointers to the data are concerned, the CTE really is the table:
CREATE PROCEDURE sp_UserLogon
AS
WITH t AS(
SELECT
ROW_NUMBER() OVER(PARTITION BY UserLogon.UserName ORDER BY UserLogon.Dt DESC) pos,
*
FROM UserLogon
)
DELETE FROM t
WHERE pos > 1
SELECT UserName, Dt FROM UserLogon ORDER BY UserName
GO
Lowell
January 11, 2017 at 2:20 pm
Ah, good suggestion on deleting directly from the CTE. That should make things a little faster. Thanks again.