Log Monitoring through Perl

Hi,

I am new to Perl. I want to write a Perl script to monitor logs, specifically to watch for exceptions or any other error strings being logged. I have a directory (on Solaris) with multiple log files, which keep rolling over to .gz files in the same directory after some time. These log files keep growing in size until they roll over.
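
To give an idea of the layout, one of these log directories looks roughly like this (the file names below are made up, just to show the pattern):

app.log             <- currently active, still growing
app.log.1.gz        <- already rolled over and compressed
app.log.2.gz
server.log
server.log.1.gz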

In test.cfg I have listed all the log directories I want to read, and I am copying the exception and error traces out to separate files.
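
For reference, test.cfg just lists one log directory per line, something like this (the paths below are only examples):

/opt/solitare/logs/app1
/opt/solitare/logs/app2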

#!/usr/local/bin/perl -w
use strict;

# Read the list of log directories from the config file.
my $config_file = "/opt/solitare/perl_script/test.cfg";
open(CONFIG, $config_file)
    or die("Could not open config file $config_file: $!");
my @config = <CONFIG>;
close(CONFIG);

# Recreate the error_log directory once, before the loop, so results
# from earlier directories are not wiped out on each pass.
system("rm -r /opt/solitare/error_log/");
mkdir("/opt/solitare/error_log/", 0777);

foreach my $logdir (@config)
{
    chomp($logdir);
    chdir($logdir) or die("not able to chdir to $logdir\n");
    opendir(DIR, $logdir) or die("not able to open $logdir dir\n");

    while (my $filename = readdir(DIR))
    {
        next unless $filename =~ /\.log$/;
        print("file name is = $filename\n");

        # Opening in append mode creates the file, so no separate "touch" is needed.
        my $outfile = "outfile";
        open(OUTFILE, ">>$outfile")
            or die("cannot open output file $outfile: $!\n");
        open(FILE, $filename)
            or die("Could not open $filename input log file: $!");

        # Copy every line that looks like part of an exception or error trace.
        while (my $line = <FILE>)
        {
            if ($line =~ /Exception/ || $line =~ /at[\t ]/ || $line =~ /Error/)
            {
                print OUTFILE ($line);
            }
        }
        close(FILE);
        close(OUTFILE);

        # Move the collected trace to the backup directory.
        rename($outfile, "/opt/solitare/error_log/$filename.bk");
        print("scanning of $filename file is completed\n");
    }
    closedir(DIR);
}
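
To show what the regex in the script is meant to catch, here are some made-up sample lines of the kind I am copying out (a typical Java-style stack trace plus an error message):

java.lang.NullPointerException: something went wrong
        at com.example.Foo.bar(Foo.java:42)
        at com.example.Main.main(Main.java:10)
Error: could not connect to database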

If I use this script on log files that are not currently active or growing in size, it works fine. The problem is with the files that are still being written to.

I hope my query is clear, as this is the first time I have posted a question on any forum. :slight_smile:

Thanks.