size of these files is 5-20 MB. I want to scan each of these files one by one and extract multiple values from each of them. The extraction format is the same for every file.
The high-level code I am thinking of is:
while IFS= read -r line; do
    wf_dir=$(printf '%s\n' "$line" | cut -d ',' -f7)
    sess_dir=$(printf '%s\n' "$line" | cut -d ',' -f10)
    # ...this goes on for 7-8 values
done < data/SrcFiles/ETL_LOOKUP.dat
Now I want to ask: is there a more efficient way to do this? Wouldn't running multiple grep/cut pipelines be a performance concern, since I spawn new processes for every line I read?
You can run the greps on all the files at once, or, if there are many files and the file names do not need to appear in the result, you can concatenate the files first and run your greps on the combined stream. You may also be able to combine several greps into one.
This is a bit of a guess, because it is not yet clear what you need to do with the results, what the files and the output look like, and what results you are looking for.
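One common way to avoid a separate pipeline per value is a single awk pass: awk splits each comma-separated line once and can print every needed field in the same invocation. A minimal sketch, assuming the fields of interest are 7 and 10 (taken from the cut calls in the question; the sample line here is made up for illustration):

```shell
# One-pass extraction with awk: split on commas once and pull all the
# needed fields together, instead of one grep|cut pipeline per value.
# In practice the input would be the .dat file, e.g.:
#   awk -F',' '{ ... }' data/SrcFiles/ETL_LOOKUP.dat
line='a,b,c,d,e,f,/wf/dir,h,i,/sess/dir'
printf '%s\n' "$line" |
awk -F',' '{ print "wf_dir=" $7; print "sess_dir=" $10 }'
```

Because awk is a single process reading the file once, this scales much better over 5-20 MB files than forking grep and cut per line.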
The input files are logs from a production environment and are sensitive, so I cannot share them. Let me know if my requirements are not clear.