September 11, 2002 at 2:19 pm
Hi, all! I would like to be able to "parse" the transaction log in order to get some kind of history of what has been added, changed, or deleted since a certain point in time. Ideally, I would need information on the tables/records/fields the actions were performed on. I want to use this to implement an import/export utility and to create a history of certain operations in the database.
If I cannot do that, is there any way for me to catch the moment a transaction log entry is created and get the data I need? I know I can use triggers and timestamps
for added/changed records, but I would like a generic solution that covers all tables in my database.
Thanks!
P.S. I already know about http://www.lumigent.com, but that is not a solution I can afford.
September 11, 2002 at 3:34 pm
One option is to put triggers on all the tables. There are several tools available that will do it for you, and it is not that hard to roll your own if it needs to be free. The other option is to run a trace to log all the changes.
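As a rough sketch of the roll-your-own trigger idea (the Customers table and AuditLog table here are made-up names; you would repeat or generate one trigger per table you want covered, and copy the inserted/deleted rows as well if you need the actual field values):

-- Shared audit table
create table AuditLog (
    AuditID   int identity(1,1) primary key,
    TableName sysname  not null,
    Action    char(1)  not null,           -- I = insert, U = update, D = delete
    ChangedAt datetime not null default getdate(),
    ChangedBy sysname  not null default suser_sname()
)
go

-- One trigger per audited table (example: a hypothetical Customers table)
create trigger trg_Customers_Audit on Customers
for insert, update, delete
as
begin
    -- rows in both inserted and deleted = update; only inserted = insert; only deleted = delete
    if exists (select * from inserted) and exists (select * from deleted)
        insert AuditLog (TableName, Action) values ('Customers', 'U')
    else if exists (select * from inserted)
        insert AuditLog (TableName, Action) values ('Customers', 'I')
    else if exists (select * from deleted)
        insert AuditLog (TableName, Action) values ('Customers', 'D')
end
go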
Andy
September 12, 2002 at 7:24 am
There are a few different ways to go with this. Obviously there is the Lumigent solution, and I will agree with you on that one: it is kind of expensive for small companies/individuals. The other is to read the log with fn_dblog. Parsing the info after reading the log is another can of worms.
To read the entire log (and for a database with a large number of transactions I do not recommend using this function in this manner), your statement will look like:
Select * from ::fn_dblog('', '')
This will give you every entry in the transaction log. You can customize the statement to get the desired results.
Ex: Select * from ::fn_dblog('', '')
    where [Begin Time] is null
      and [End Time] is null
      and [Transaction ID] in (select [Transaction ID]
                               from ::fn_dblog('', '')
                               where [Begin Time] >= '2002/09/01 00:00:00:000'
                                  or [End Time] <= '2002/09/12 08:08:59:999')
This is looking for the transactions from the first of September to today (9/12) at 8:08:59 AM. It is also omitting the begin- and end-transaction entries themselves.
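If it helps as a starting point for the parsing, something along these lines narrows it down to the actual data-change records (I am going from memory on the column names and LOP_ values, so check them against your own output):

Select [Current LSN], Operation, Context, [Transaction ID]
from ::fn_dblog('', '')
where Operation in ('LOP_INSERT_ROWS', 'LOP_MODIFY_ROW', 'LOP_DELETE_ROWS')

You still have to tie each of those records back to its table and row yourself, which is where it really turns into a can of worms.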
The trigger option may be a bit easier to implement and maintain, though.
September 12, 2002 at 9:45 am
Thanks for the answer. I will try to run the trace utility, although I think I can only get some of the information that way (for example, I do not know if I can get the RecordId and TableId/Name of the record being changed). Do you know whether this trace can be set up to "listen" for changes, or can it only be used as a "run-now" kind of thing?
I will look into it, thanks again.
Oana
September 12, 2002 at 9:52 am
Thanks for the answer. I did try to use fn_dblog, but I did not understand the meaning of some of the fields returned there. Do you have any idea how to parse that? Is there anything available somewhere?
Also, I understand fn_dblog is an undocumented function, so I won't get any support from Microsoft on it, and I need my solution to be as reliable as possible. I would rather not use triggers because of the performance hit, but that might be the solution in the end.
Thanks,
Oana
September 12, 2002 at 5:47 pm
In a word: Log Explorer 3.0. It works.
September 16, 2002 at 2:51 pm
Read up on the sp_trace_... stored procedures. I've set up something similar to monitor utilization and "bad queries".
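Roughly what mine looks like, cut down to a sketch (the file path, the database name, and the event/column choices here are just for illustration; adjust for what you actually need):

declare @TraceID int, @maxfilesize bigint, @on bit, @dbid int
set @maxfilesize = 50
set @on = 1

-- create the trace definition; option 2 = TRACE_FILE_ROLLOVER, .trc is appended to the file name
exec sp_trace_create @TraceID output, 2, N'C:\Traces\ChangeAudit', @maxfilesize, NULL

-- event 12 = SQL:BatchCompleted; columns: 1 TextData, 3 DatabaseID, 11 LoginName, 12 SPID, 14 StartTime
exec sp_trace_setevent @TraceID, 12, 1, @on
exec sp_trace_setevent @TraceID, 12, 3, @on
exec sp_trace_setevent @TraceID, 12, 11, @on
exec sp_trace_setevent @TraceID, 12, 12, @on
exec sp_trace_setevent @TraceID, 12, 14, @on

-- only trace the one database (column 3 = DatabaseID, logical AND, comparison EQ)
set @dbid = db_id(N'MyDatabase')
exec sp_trace_setfilter @TraceID, 3, 0, 0, @dbid

-- start it (1 = start, 0 = stop, 2 = close and delete the definition)
exec sp_trace_setstatus @TraceID, 1

-- later, read the file back on the server
-- select * from ::fn_trace_gettable(N'C:\Traces\ChangeAudit.trc', default)

That is a server-side trace, so it keeps collecting until you stop it with sp_trace_setstatus, i.e. it is the "listen" kind of thing rather than run-now (though it will not survive a server restart unless you recreate it). It gives you the statements and who ran them, not the individual RecordId.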