read a file for input and grep in another file

Hi,

I'm trying to read a file in a loop, grep for each string in a second file, and print the last field of the matching line. Then I redirect the output to another file, keeping it in the same order as the original file.

I've tried the following, but the output does not maintain the original order of the file I'm reading in. Any suggestion on how to fix this would be much appreciated.

for name in 'cat file.1'
do
grep $name | awk '{print $NF}' file.2
done >> file.3

Thanks,

Uh oh, useless use of cat :slight_smile: Try this (untested):

#!/bin/ksh

# create or clear the output file if it exists.
> file.3

while read name
do
  grep "$name" file.2 | awk '{print $NF}' >> file.3
done < file.1

exit 0

Please mention how large the original files are, and post some small but representative sample data from all three files, pointing out whether they are sorted in any useful and relevant way.

Two quite different ideas:

1) Convert file.1 into an egrep command and execute it on file.2 outputting to file.3.

2) Use "cat -n" to number each line of file.1, run the original grep loop against file.2 but send the output to a pipe, sort numerically on the first field to restore the order of the file, uniq (to remove duplicates), strip off the leading number, and output to file.3.
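Idea 2 might be sketched like this (untested against real data; it assumes each line of file.1 is a single whitespace-free search token, and the numbered pairs are deduplicated before the number is stripped):

```shell
#!/bin/sh
# Decorate-sort-undecorate: tag each match with the line number of
# its pattern in file.1, sort on that number to restore the original
# order, drop duplicates, then strip the number off again.
cat -n file.1 | while read num name
do
    grep "$name" file.2 | awk -v n="$num" '{print n, $NF}'
done | sort -n | uniq | awk '{print $2}' > file.3
```

Because the sort key is the pattern's position in file.1, the output comes out in file.1 order no matter what order grep finds the matches in.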

@gary_w
The script posted contains the same design problem, but it does fix the reason the original script never read file.1 at all: the wrong type of quotes (single quotes instead of backticks around the cat).

Cheers chaps, will give your suggestions a go.

I ended up using the following:

while read name
do
awk '$1 ~ '$name'' file.2 | awk '{print $NF, $4}' >> file.3
done < file.1
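For what it's worth, splicing $name into the awk program with nested quotes is fragile (it breaks if a pattern contains spaces or shell metacharacters). A sketch of the same loop using awk's -v option, which keeps the quoting simple and does the match and the print in one awk:

```shell
#!/bin/sh
# Same loop as above, but the search string is passed to awk with -v
# instead of being spliced into the program text by the shell.
while read name
do
    awk -v pat="$name" '$1 ~ pat {print $NF, $4}' file.2 >> file.3
done < file.1
```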

Thanks again for the help.