I have 20 files, each containing 50 lines with different values.
I would like to process these 20 files line by line and compute the average of the 3rd field ($3) across all 20 files, writing the results to an output file.
Instead of using join to generate a bunch of redundant intermediate files and then computing the average, I'm looking for a better way to do this directly.
awk offers associative arrays for precisely this type of problem. Make each item a key of an array, accumulate the sums, then print the results in the END block. If every key occurs in every file you can simply divide by the number of input files; otherwise you will need to collect both the sum and the count (divisor) for each key.
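A minimal sketch of this, with two tiny sample files standing in for the 20 real ones (the file names and the "tool" key in the first column are assumptions about your data layout); in real use you would pass all 20 files to awk and redirect to your output file.

```shell
# Create two small sample input files (placeholders for the real 20)
printf 'toolA x 10\ntoolB y 4\n' > f1.txt
printf 'toolA x 20\ntoolB y 6\n' > f2.txt

awk '
{
    sum[$1] += $3       # accumulate the 3rd field per key
    cnt[$1]++           # count occurrences, in case a key is missing from some file
}
END {
    for (k in sum)
        printf "%s %g\n", k, sum[k] / cnt[k]
}' f1.txt f2.txt | sort
# prints:
# toolA 15
# toolB 5
```

The `sort` is only there because awk's `for (k in array)` iterates in an unspecified order; drop it if order doesn't matter to you.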
The approach proposed by summer_cherry is clever, but note that it makes several assumptions: that all the files contain the same tools, in the same order, and that each file has exactly four columns.
Also, if there are many files, the command line built from the backticks will overflow. Moreover, it's excessive to use ls -l when you don't care about the long format, only the number of files. And since you are pasting the files, the number of columns already tells you how many files there are, so you can derive the divisor in awk from NF instead.
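A sketch of that paste-based variant, under the same four-columns-per-file assumption (the sample files are made up): after paste, the 3rd field of file i sits at column 4*(i-1)+3, and NF/4 gives the number of files.

```shell
# Sample four-column input files (placeholders for the real ones)
printf 'toolA x 10 a\ntoolB y 4 b\n' > g1.txt
printf 'toolA x 20 a\ntoolB y 6 b\n' > g2.txt

paste -d ' ' g1.txt g2.txt | awk '
{
    n = NF / 4                    # number of pasted files, from the column count
    s = 0
    for (i = 3; i <= NF; i += 4)  # 3rd column of each 4-column group
        s += $i
    printf "%s %g\n", $1, s / n
}'
# prints:
# toolA 15
# toolB 5
```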