grep data from several files

Hello

I need help grepping data from several logs and generating one report.

The log files are generated in one directory, and their names start with v.

The log files contain this data (among other data). I want to grep this data and put it in one file.

/users/test/db/nameoftestcase

memory leak =50

error =10

The output I want to print:
original name of file /users/test/dbnameof/testcase/ memoryleak error

find /onedirectory/somewhere -name 'v*' |\
while IFS= read -r file
do
      grep -e '/users/test/db' -e 'memory leak' -e 'error =' "$file"
done > /users/test/dbnameof/testcase/memoryleak_error

Filenames with spaces in them are trouble.
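One space-safe variant of the loop above (a sketch only; it assumes bash, and /tmp/grep_demo with its contents is a made-up demo directory, not from the original thread) is to NUL-terminate the filenames:

```shell
# Create a throwaway directory with a log file whose name contains spaces
# (all paths and values here are illustrative).
mkdir -p /tmp/grep_demo
printf '/users/test/db/tc1\nmemory leak =50\nerror =10\n' > "/tmp/grep_demo/v log 1"

# -print0 emits NUL-terminated names; read -r -d '' splits on NUL,
# so spaces (and even newlines) embedded in filenames survive intact.
find /tmp/grep_demo -name 'v*' -print0 |
while IFS= read -r -d '' file
do
    grep -e '/users/test/db' -e 'memory leak' -e 'error =' "$file"
done > /tmp/grep_demo/report

cat /tmp/grep_demo/report
```

The `-print0` flag is a GNU/BSD find extension rather than strict POSIX, but it is the usual cure when filenames may contain whitespace.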

Why do you need the while loop?
Simply find <dir> -name <filename> | xargs grep -l -e <pattern> -e <pattern>

should give you the desired results.

Jim actually explained the rationale, which was to deal with filenames containing spaces; he used the while loop in order to be able to put quotes around the filename.

Try your example with filenames containing spaces and you will understand; with no spaces the xargs method is much more efficient.
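For what it's worth, GNU find and xargs can have the efficiency of xargs and space-safety at the same time via the NUL-delimiter flags. A sketch (the directory, filenames, and contents are invented for the demo):

```shell
# Fabricated sample data -- one file that matches, one that does not.
mkdir -p /tmp/xargs_demo
printf 'memory leak =50\n' > "/tmp/xargs_demo/v file a"
printf 'no match here\n'   > "/tmp/xargs_demo/v file b"

# -print0 and -0 pair NUL-terminated names, so embedded spaces are safe;
# grep -l lists only the names of the files that matched.
find /tmp/xargs_demo -name 'v*' -print0 |
xargs -0 grep -l -e 'memory leak' -e 'error =' > /tmp/xargs_demo/matched

cat /tmp/xargs_demo/matched
```

Note that grep -l only names the matching files; to reproduce the OP's report of the matching lines themselves, drop the -l.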

However, I believe the OP was asking for something more like this:

find /onedirectory/somewhere -name "v*" -exec \
    awk '/\/users\/test\/db/ { printf "%s\t%s\t", FILENAME, $0}
         /memory leak =/ { printf "%s\t", $0}
         /error =/ {print}' {} \; > reportFile
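To see what that awk program produces, here is a self-contained run on fabricated sample data (the /tmp/awk_demo directory and the values in it are invented for the demo):

```shell
# One sample log in the style the OP described.
mkdir -p /tmp/awk_demo
printf '/users/test/db/tc1\nmemory leak =50\nerror =10\n' > /tmp/awk_demo/v1.log

# Note the comma after each printf format string -- without it, awk
# concatenates $0 into the format string and mangles the output.
find /tmp/awk_demo -name 'v*' -exec \
    awk '/\/users\/test\/db/ { printf "%s\t%s\t", FILENAME, $0 }
         /memory leak =/     { printf "%s\t", $0 }
         /error =/           { print }' {} \; > /tmp/awk_demo/reportFile

cat /tmp/awk_demo/reportFile
```

Each matching file should yield one tab-separated report line: filename, test-case path, memory-leak field, error field.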