January 15, 2007 at 3:58 pm
Can anyone recommend a good tool for reading SQL 2000 database transaction logs?
Some of the DBs I maintain generate a lot of log from the hundreds of jobs/replications running against them. I need to look at the transaction logs to pinpoint which job or procedure is causing my miseries. I can't use Profiler, as I'm not certain when this occurs.
TIA
January 15, 2007 at 4:25 pm
There is no good help from MS in this regard, but you can use the fn_dblog() function...
http://www.novicksoftware.com/UDFofWeek/Vol1/T-SQL-UDF-Volume-1-Number-27-fn_dblog.htm
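A minimal sketch, assuming a SQL 2000 build where these column names apply (fn_dblog is undocumented, so the columns can differ between versions):
[code="sql"]
-- SQL 2000 calls system table-valued functions with the :: prefix;
-- the two NULL arguments mean "scan the whole active log"
SELECT  [Current LSN],
        Operation,
        [Transaction ID],
        [Transaction Name],
        [Begin Time]
FROM    ::fn_dblog(NULL, NULL)
WHERE   Operation = 'LOP_BEGIN_XACT'   -- one row per transaction start
ORDER BY [Current LSN]
[/code]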
It is better to use third-party tools like Lumigent Log Explorer or ApexSQL Log...
http://www.lumigent.com/products/le_sql.html
MohammedU
Microsoft SQL Server MVP
January 15, 2007 at 11:37 pm
Yes, third-party tools are the best option for this. ApexSQL and Red Gate both have tools with the same features. Also, you need to have the database in the Full recovery model.
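You can check and switch the recovery model like this (MyDb is a placeholder for your database name):
[code="sql"]
-- Returns FULL, BULK_LOGGED or SIMPLE
SELECT DATABASEPROPERTYEX('MyDb', 'Recovery') AS RecoveryModel

-- Switch to full recovery so every change is fully logged
ALTER DATABASE MyDb SET RECOVERY FULL
[/code]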
January 16, 2007 at 8:05 am
Examining tran logs is not for the faint-hearted. However, I'm slightly interested in why you consider this a misery; there seem to have been a number of posts about tran logs lately. What do you expect to do when you have identified which update/insert/delete is causing the growth - not allow it? Each change has to be logged, that's the point, so knowing this, what do you hope to gain? You should also be aware that index rebuilds and such cause log growth.
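If it's the growth itself you're chasing, a quick way to see how full each log is at a given moment:
[code="sql"]
-- One row per database: log size (MB), log space used (%), status
DBCC SQLPERF(LOGSPACE)
[/code]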
[font="Comic Sans MS"]The GrumpyOldDBA[/font]
www.grumpyolddba.co.uk
http://sqlblogcasts.com/blogs/grumpyolddba/
January 16, 2007 at 10:50 am
Actually, if you use a third-party tool, I don't think reading the log is too bad. Lumigent, Red Gate, ApexSQL, and Golden Gate all have tools that work well.
The biggest problem is when you don't know the time of the change or the person making it - then you just have a lot of data to weed through.
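One way around not knowing the time is to sample log usage from a scheduled job and keep the history; this is just a rough sketch with made-up names, not something the tools above give you:
[code="sql"]
-- Sampling table (names are illustrative)
CREATE TABLE dbo.LogSpaceHistory (
    SampleTime      DATETIME      NOT NULL DEFAULT GETDATE(),
    DatabaseName    SYSNAME       NOT NULL,
    LogSizeMB       DECIMAL(15,2) NOT NULL,
    LogSpaceUsedPct DECIMAL(5,2)  NOT NULL,
    Status          INT           NOT NULL
)

-- Run from a job every few minutes; DBCC SQLPERF(LOGSPACE) returns
-- one row per database, which INSERT...EXEC captures into the table
INSERT dbo.LogSpaceHistory (DatabaseName, LogSizeMB, LogSpaceUsedPct, Status)
EXEC ('DBCC SQLPERF(LOGSPACE)')
[/code]
Then you only have to read the log for the window where the numbers jump.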
January 16, 2007 at 1:24 pm
Steve - yeah, agreed, it's not too bad until you have several GB of logs to go through! It also assumes you know what you're looking for.
[font="Comic Sans MS"]The GrumpyOldDBA[/font]
www.grumpyolddba.co.uk
http://sqlblogcasts.com/blogs/grumpyolddba/
January 16, 2007 at 2:31 pm
Thanks for all your replies. I'll look into these tools.
BTW, the reason I need to know what's causing the log outbursts (probably a job) is to find out whether the statements used can be optimized. Also, I'll be looking into the bulk-logged recovery model.
Thanks again.
January 17, 2007 at 7:44 am
That's an interesting view, but you can't usually optimise an update or insert to use less tran log. I agree you can optimise the queries, but an insert is an insert. One pointer on data loads, however: if tables are being cleared each day (for example), make sure that you truncate rather than delete. Deleting the rows from a 1GB table will require about 1.5GB of tran log space; truncating the table (and then dropping it, if it goes too) will not really use any log space at all. (It's also much quicker on big tables.)
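The difference in practice (dbo.StagingTable is just a stand-in for whatever gets cleared):
[code="sql"]
-- Fully logged: every deleted row is written to the tran log,
-- so clearing a 1GB table can need around 1.5GB of log space
DELETE FROM dbo.StagingTable

-- Minimally logged: only the page deallocations are recorded,
-- so it uses almost no log space and is much quicker
TRUNCATE TABLE dbo.StagingTable
[/code]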
[font="Comic Sans MS"]The GrumpyOldDBA[/font]
www.grumpyolddba.co.uk
http://sqlblogcasts.com/blogs/grumpyolddba/