Monitoring log file for entries - Find command & sorting

hi,

I would like to monitor a log file which rolls over every time the server is restarted.

I want to grep for a string, and to be more efficient I'd like to grep only newly appended data, so something like 'tail -f' would do. However, as the log rolls over, I think 'tail -F' is what I need, but that doesn't seem to be available on Solaris.
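The closest I've come up with is faking 'tail -F' with a small polling loop that rewinds whenever the file shrinks, i.e. after a rollover. Rough, untested sketch; the filename and the 5-second sleep are just placeholders:

#!/bin/sh
# rough stand-in for 'tail -F': poll the log, rewind when it shrinks (rollover)
LOG=./test1.txt
SEEN=0
while :; do
    TOTAL=`wc -l < "$LOG" | awk '{print $1}'`
    if [ "$TOTAL" -lt "$SEEN" ]; then
        SEEN=0    # fewer lines than last pass: the log rolled over
    fi
    # scan only the lines appended since the last pass
    # (older Solaris /usr/bin/tail wants 'tail +N' instead of '-n +N')
    tail -n +`expr $SEEN + 1` "$LOG" | grep 'error_message_here'
    SEEN=$TOTAL
    sleep 5
done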

So any ideas are welcome. The log file(s) are named:

"test*.txt"

The string I'm looking for is "error_message_here".

So I was going to do something like this:

ls -lr | grep -i 'test.txt' | grep -i 'error_message'  # and i can then send this to email etc.

Or should I use the find command? But then how would I cat out the message?

So far I have:

find . -name 'test*.txt' | xargs ls -lr | head -1 

The output is this:

-rw-r--r-- 1 hrid hrinkp 601456524 Aug 22 09:06 ./test1.txt

So I find the files by name, then list/sort them with ls -lr, then take the top entry, and that is the file I'd like to grep/cat for the message.

How would I do that now? Because I've used ls -l, it's in long format, so I need to pick out the column holding the filename (./test1.txt) that I want to search, i.e. something like:
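I was thinking of pulling the last column out with awk and handing it to grep, roughly like this (untested; assumes the filenames have no spaces):

find . -name 'test*.txt' | xargs ls -lr | head -1 | awk '{print $NF}' | xargs grep -i 'error_message_here'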

Thanks

The first line puts a list of all test*.txt files in reverse order into $1, $2, ... Note: you don't have to use -l when you just want the names sorted in reverse order.

set -- $(ls -r test*.txt)

Now $1 holds the name of the file you want to grep, so do it:

grep error_message $1 | mail ...
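Put together, it could look like this (the mailx recipient and subject are placeholders; note that -r reverses the name sort, while ls -t would put the most recently modified file first if that is what you actually want):

set -- $(ls -r test*.txt)
grep error_message "$1" | mailx -s "log errors" admin@example.com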

That seems to be all.


Thanks