Shell script - parallel reading problem

I have a shell script that uses the awk utility. It is a single script that many other (Informatica) jobs use. Each job generates a log file, and the script reads that particular log file and produces output; the log file acts as the input file for the script. So a single script ends up reading many log files when we run the jobs in parallel.

When I run all the jobs in parallel, the script works fine and gives proper output for most of them. But for 4 to 8 jobs (out of 80), the script does not read the log file at all.

Can you please help me with this? Why does the script behave this way, and what should I do so that it works correctly even when the jobs run in parallel?

thank you
Gops

Did you check whether the log files were already created for those 4 to 8 jobs that failed to read them? Perhaps there was a timing issue: the script may have tried to read a log that had not been created yet. It would make sense to put a check in the script to verify that the log file exists before the script tries to read it. How is the script executed? How are the log files generated, and in what format?
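A minimal sketch of such a guard, assuming the log path is held in a variable; the file name below is a placeholder, and the sketch creates its own sample log only so it can run on its own:

```shell
#!/bin/sh
# Hypothetical log path -- substitute the real per-job log file.
LOGFILE="/tmp/job_demo.log"

# Created here only so this sketch is self-contained.
echo "sample log line" > "$LOGFILE"

# Guard: only hand the file to awk if it exists and is non-empty.
if [ -s "$LOGFILE" ]; then
    awk '{ print }' "$LOGFILE"
else
    echo "log file $LOGFILE missing or empty" >&2
    exit 1
fi
```

The `-s` test fails both when the file is absent and when it exists but is still empty, which covers the case where the job has created the log but not yet written to it.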

cheers,
Devaraj Takhellambam

Yes Devaraj,

The script looks for a temp file in which the path of the log file is given. It reads the temp file properly, but for those 4-8 jobs alone it cannot pick up the actual log file mentioned in the temp file. Yes, the log file is being created for those jobs. The execution sequence is:

          job ---> job log ---> script execution ---> output of script

But if I run the jobs one by one, all of them succeed.
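Since the log path comes from a temp file and the failures only appear under parallel runs, a short wait-and-retry loop around the read can rule out the race. A sketch, assuming the temp file contains just the log path; all file names and the timeout are placeholders, and the files are created inline only so the sketch runs standalone:

```shell
#!/bin/sh
# Hypothetical file names -- substitute the real temp file per job.
TMPFILE="/tmp/job_demo.tmp"
echo "/tmp/job_demo2.log" > "$TMPFILE"       # temp file holds the log path
echo "line from log" > /tmp/job_demo2.log    # created here only for the sketch

# Read the log path out of the temp file.
LOGFILE=$(cat "$TMPFILE")

# Retry for up to 30 seconds in case the log is still being written.
tries=0
while [ ! -s "$LOGFILE" ] && [ "$tries" -lt 30 ]; do
    sleep 1
    tries=$((tries + 1))
done

if [ -s "$LOGFILE" ]; then
    awk '{ print }' "$LOGFILE"
else
    echo "gave up waiting for $LOGFILE" >&2
    exit 1
fi
```

If the failing jobs succeed with this loop in place, the problem is timing; if they still fail, it would be worth checking whether the temp file for those jobs contains the path you expect (trailing whitespace or a wrong path would also make the read fail).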