July 18, 2005 at 10:24 am
Dear DBAs,
I would like to get some information from the cluster.log file. Does anyone know how to use LogParser.exe on a cluster log file? What would the -i:<input_format> be? EVT? ETW?
Thanks a lot
Kenny
August 7, 2005 at 9:01 am
I too would like to know if LogParser will do cluster logs. I'm reading up on it, and I guess I'll have to make an input template file.
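I haven't tried it myself, but since cluster.log is plain text, the TEXTLINE input format might get you partway there without a template. Something along these lines (the path is only an example, adjust it to wherever your cluster log lives):
LogParser.exe -i:TEXTLINE -o:CSV "SELECT Text FROM C:\Windows\Cluster\cluster.log WHERE Text LIKE '%ERR%'"
TEXTLINE only gives you the whole line as one Text field, though, so you'd still have to split out the timestamp and component yourself.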
For now I use a Perl script to merge both node logs and create a tab-delimited file.
If you know Perl, the script looks like this:
use strict;
use warnings;
my $folder="C:\\FolderWithClusterLogs\\";
my $file1="Clusterlog_node1";
my $file2="Clusterlog_node2";
open(XCLS, ">".$folder."clustcollect.tsv" ) or die "Can't open output file clustcollect.tsv: $!";
sub parse {
open(CLS, $folder.$_[0].".txt" ) or die "Can't open $_[0].txt: $!";
while (<CLS>) { # read the node's cluster log line by line
chomp;
s/^"//; #crop leading and trailing quotes from some messages
s/"$//;
# pull the process/thread ids, date and time out of lines like
# 00000abc.00000de4::2005/07/18-10:24:35.123 [FM] some message
my $cemp = $_;
$cemp =~ /^(\S{8})\.(\S{8})\:\:(\d{4}\/\d\d\/\d\d)-(\d\d):(\d\d):(\d\d)\.(\d\d\d)/;
my $c1=$1;
my $c2=$2;
my $c3=$3;
my $c4=$4;
my $c5=$5;
my $c6=$6;
my $c7=$7;
# second pass: grab the component tag (e.g. [FM]) and the rest of the message
$cemp =~ /^\S{8}\.\S{8}\:\:\d{4}\/\d\d\/\d\d-\d\d:\d\d:\d\d\.\d\d\d (\[\S*\]|.+>:|.+:)( |)(.+)/;
my $c8=$1;
my $c10=$3;
$c1=~s/^0+// if $c1;
$c2=~s/^0+// if $c2;
print XCLS "$_[0]\t$c1\t$c2\t$c3\t$c4:$c5:$c6\t$c7\t$c8\t$c10\n";
}
close CLS;
}
parse($file1);
parse($file2);
close XCLS;
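Save it under any name you like (I use clustparse.pl, but the name doesn't matter), put the two node logs in the folder as Clusterlog_node1.txt and Clusterlog_node2.txt, and run it from a command prompt:
perl clustparse.pl
It writes clustcollect.tsv into the same folder.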
There are some error lines that don't get parsed properly (follow the warning messages to find the failed parses). You can open the output file clustcollect.tsv with Excel. Add the appropriate headers and use AutoFilter to look at specific modules. I freeze the window so the headers and AutoFilter pulldown boxes always show. I also do a lot of color coding to make the log more readable in Excel.
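If you would rather stay in LogParser, once the header row is in place you should also be able to query the merged file with the TSV input format. Something like this (the Component column name is just whatever you called it in your header row):
LogParser.exe -i:TSV "SELECT * FROM clustcollect.tsv WHERE Component = '[FM]'" -o:DATAGRID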
You can get a handle on the messages here: