April 16, 2020 at 4:17 am
I have been asked to create a report showing which rows had a column updated, and the date each update happened. What is the best way to create the report? The table has about 3 million rows, and the column the report is based on is an INT.
Thanks in advance.
April 16, 2020 at 4:44 am
Do you have a trigger on the relevant columns that writes the change info somewhere, or how are you collecting the change information? Awfully thin on details.
April 16, 2020 at 4:55 am
I do not have a trigger on the column yet. Would creating an AFTER UPDATE trigger and collecting the data be the best way to create the report?
My concern with a trigger was that I didn't want to affect the performance of other processes just for this particular report.
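If the trigger route is chosen, a minimal sketch might look like the following. All object and column names here (dbo.MyTable, MyIntColumn, RowId, the change-log table) are assumptions, since the thread never names the actual table:

```sql
-- Sketch only: table, key, and column names are assumptions.
CREATE TABLE dbo.MyTable_ChangeLog
(
    RowId     INT          NOT NULL,
    OldValue  INT          NULL,
    NewValue  INT          NULL,
    ChangedAt DATETIME2(0) NOT NULL DEFAULT SYSDATETIME()
);
GO

CREATE TRIGGER dbo.trg_MyTable_Audit
ON dbo.MyTable
AFTER UPDATE
AS
BEGIN
    SET NOCOUNT ON;

    -- Log only rows where the tracked column actually changed.
    INSERT INTO dbo.MyTable_ChangeLog (RowId, OldValue, NewValue)
    SELECT i.RowId, d.MyIntColumn, i.MyIntColumn
    FROM inserted AS i
    JOIN deleted  AS d ON d.RowId = i.RowId
    WHERE EXISTS (SELECT d.MyIntColumn EXCEPT SELECT i.MyIntColumn); -- NULL-safe inequality
END;
```

Filtering on the inserted/deleted comparison keeps the trigger from logging updates that didn't touch the tracked column, which limits the overhead the poster is worried about.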
April 16, 2020 at 8:21 am
Well, perhaps not, but how else do you capture update events on a table?
April 16, 2020 at 11:57 am
You could capture the update events using Extended Events. If there's a specific stored procedure used to update the table, you can use rpc_completed and filter for the procedure name or the object_id. If it's ad hoc batches, you can use sql_batch_completed and filter for a combination of 'UPDATE' and the table name. This would allow you to capture the update statement and the values passed to it. You won't get a before value, but you will be able to see the exact values passed.
Using this method is much less intrusive than a trigger and will absolutely affect performance radically less than a trigger will. Plus, you won't need to modify your existing structures to put this into place.
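A session along the lines described might be sketched as below. The session name, file target, table name (MyTable), and procedure name (MyUpdateProc) are assumptions for illustration:

```sql
-- Sketch only: session, table, and procedure names are assumptions.
CREATE EVENT SESSION [CaptureTableUpdates] ON SERVER
ADD EVENT sqlserver.sql_batch_completed
(
    ACTION (sqlserver.sql_text, sqlserver.client_app_name)
    WHERE sqlserver.like_i_sql_unicode_string(sqlserver.sql_text, N'%UPDATE%MyTable%')
),
ADD EVENT sqlserver.rpc_completed
(
    ACTION (sqlserver.sql_text)
    WHERE object_name = N'MyUpdateProc' -- filter on the proc, if one is used
)
ADD TARGET package0.event_file
(
    SET filename = N'CaptureTableUpdates.xel'
);
GO
ALTER EVENT SESSION [CaptureTableUpdates] ON SERVER STATE = START;
```

The event_file target keeps the captured statements out of memory pressure on the server, and the session can be stopped and dropped without touching the table itself.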
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
- Theodore Roosevelt
Author of:
SQL Server Execution Plans
SQL Server Query Performance Tuning
April 16, 2020 at 12:31 pm
...and the column report is created on is INT.
Thanks in advance.
Can you clarify what this means, please?
The absence of evidence is not evidence of absence
- Martin Rees
The absence of consumable DDL, sample data and desired results is, however, evidence of the absence of my response
- Phil Parkin
April 25, 2020 at 9:48 am
One solution:
Schedule a job and stored procedure. Take the record ID (primary key?) and a hash of all the other columns, and store them in a new table. The stored procedure should then repeat the process and check the hash values against those that are stored. Any records with a changed hash value have been updated and should trigger the report. Then save the new hash values to the table. Not pretty, but even with 3M records, you won't be taking up a lot of space.
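The hash-snapshot idea above might be sketched like this; the table, key, and column names (dbo.MyTable, RowId, Col1..Col3) are assumptions:

```sql
-- Sketch only: table, key, and column names are assumptions.
-- Snapshot table holding a per-row hash of the non-key columns.
CREATE TABLE dbo.MyTable_HashSnapshot
(
    RowId   INT           NOT NULL PRIMARY KEY,
    RowHash VARBINARY(32) NOT NULL
);
GO

-- Run by the scheduled job: rows whose hash differs from the snapshot have changed.
WITH Curr AS
(
    SELECT RowId,
           HASHBYTES('SHA2_256', CONCAT(Col1, '|', Col2, '|', Col3)) AS RowHash
    FROM dbo.MyTable
)
SELECT c.RowId
FROM Curr AS c
LEFT JOIN dbo.MyTable_HashSnapshot AS s ON s.RowId = c.RowId
WHERE s.RowId IS NULL OR s.RowHash <> c.RowHash;
-- ...report on those rows, then save the new hashes back into dbo.MyTable_HashSnapshot.
```

A delimiter between columns avoids hash collisions from adjacent values concatenating the same way (e.g. '12'+'3' vs '1'+'23').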
If the records have a ROWVERSION (timestamp) column, then you could just log the largest rowversion value and check to see if there are any that are larger than the last logged value. Those records have been 'touched'. You need to be careful, because the rowversion value is assigned when the row is modified, before the transaction is committed.
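The rowversion high-water-mark check could be sketched as below, assuming a ROWVERSION column named RowVer on dbo.MyTable and a hypothetical watermark table:

```sql
-- Sketch only: table and column names are assumptions.
DECLARE @LastLogged BINARY(8) =
    (SELECT MAX(LoggedRowVer) FROM dbo.ChangeWatermark);

-- Rows touched since the last run.
SELECT RowId, RowVer
FROM dbo.MyTable
WHERE RowVer > @LastLogged;

-- Record the new high-water mark for the next run.
INSERT INTO dbo.ChangeWatermark (LoggedRowVer, LoggedAt)
SELECT MAX(RowVer), SYSDATETIME()
FROM dbo.MyTable;
```

Because rowversion values are assigned at modification time rather than commit time, a run under READ COMMITTED can miss rows from in-flight transactions, which is the caveat raised above.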
Viewing 7 posts - 1 through 7 (of 7 total)