How to speed up grep?

hi

I'm grepping files for a specific keyword, and the files are very big. Assume there are 10 daily log files, each more than 200 MB, and I have to grep all of them for specific keywords.

for example,

  1. first I'll grep for an error message
  2. after that I'll do one more grep for the keyword "info"
  3. ...

So I'm running grep over the log files many times, and the whole process is quite slow. Can anyone tell me how to speed this up?

Regards,
vijay

Show what you did, and provide input samples as needed.

I assume you are doing this:

grep 'error message' FILE | grep 'info'

You want to do it in a single grep, then. If you are sure that "info" comes after the error message on the line, you can do it like this:

grep "error message.*info" FILE

Or did you mean something else?
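If the two keywords can appear in either order, the `.*` pattern above only catches one ordering. A minimal sketch of a one-pass AND match that works regardless of order, using awk (the sample file and its contents are hypothetical):

```shell
# Create a small sample log (hypothetical data, just for illustration).
printf 'error message then info\ninfo only\nerror message only\n' > sample.log

# awk applies both patterns to each line in a single pass over the file,
# so the 200 MB log is read only once instead of once per grep.
awk '/error message/ && /info/' sample.log

rm sample.log
```

This prints only lines containing both strings, in either order.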

Go to your log folder and try:

grep -i " error message" `find ./ -type f`

It will search every file in that folder for " error message".


If the output is too long, pipe it through a pager. I tried this on Solaris 10:

grep -i " error message" `find ./ -type f` | more

Not sure of your exact requirement, particularly whether the searches depend on each other. The "egrep" command can search for multiple messages in one pass:

ls -1tr logfile* | while read FILENAME
do
    egrep -i "message 1|message 2|message 3|message 4" "${FILENAME}"
done

I sometimes use "egrep" to eliminate known good messages (egrep -iv) and then look only at the exceptions.
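When the pattern list grows, keeping it in a file and loading it with `-f` avoids a long alternation on the command line and still scans each log only once. A minimal sketch, with hypothetical pattern and log file names:

```shell
# Pattern file: one pattern per line (hypothetical patterns).
printf 'message 1\nmessage 3\n' > patterns.txt

# Sample log (hypothetical data).
printf 'message 1 seen\nmessage 2 seen\nmessage 3 seen\n' > sample.log

# -f reads the patterns from the file; all patterns are matched
# in a single pass over the log.
grep -i -f patterns.txt sample.log

rm patterns.txt sample.log
```

The same file works for the exception approach: `grep -iv -f known_good.txt` would drop every line matching any pattern in the file.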

Unless you only look at the logs every 10 days, there is little point in reading old logs a second time, unless the search criteria change after the event.