Read a file from the nth line on

I have a script that reads a job file and executes the scripts listed in it in sequence.

#!/bin/ksh
set -x
while read line
do
    "$line".ksh
    if [ $? -ne 0 ]; then
        : # mail the team
    fi
done < "$file"

The job file will be like

abcd
efgh
ijkl
mnop
qrst

This is working fine. I need to add one piece of functionality: if the run fails at job ijkl, then when I run the script a second time after fixing the issue with ijkl, it should resume from ijkl.

If everything ran fine, the next run needs to start from the first job, abcd.

Any idea how to go about it?

Several ideas: if you know it failed on the nth line, you can use the tail command to chop off the earlier lines (tail -n +N starts output at line N; otherwise you can calculate the offset using wc -l on the whole file to find the total number of lines).
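For example, a minimal sketch of the tail idea, assuming the previous run failed on line 3 of the job file (the file name jobs.txt and the echo stand-in for running each job are assumptions for illustration):

```shell
#!/bin/sh
# Sketch: resume a job file from line N using tail.
file=jobs.txt
printf 'abcd\nefgh\nijkl\nmnop\nqrst\n' > "$file"

N=3                              # line where the previous run failed
tail -n +"$N" "$file" | while read -r line
do
    echo "would run: $line.ksh"  # stand-in for actually running the job
done
```

With POSIX `tail -n +N` the output starts directly at line N, so no arithmetic with `wc -l` is needed for this variant.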

Or you can have some logic in the loop that takes no action while the loop index is less than (or equal to, depending on implementation) the index of the last failure, taken from a record left by the previous run. When there is no previous failure, the same code will work with a starting value of 0 or 1.
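A sketch of that counter approach, assuming the previous run's failing index was saved to a file (last_fail.txt, the job file contents, and the echo stand-in are all illustrative assumptions):

```shell
#!/bin/sh
# Sketch: skip jobs whose index is below the index saved by the last failed run.
file=jobs.txt
printf 'abcd\nefgh\nijkl\nmnop\nqrst\n' > "$file"
echo 3 > last_fail.txt            # pretend the previous run failed on job 3

start=1                           # no record: start from the first job
[ -f last_fail.txt ] && start=$(cat last_fail.txt)

i=0
ran=""
while read -r line
do
    i=$((i + 1))
    [ "$i" -lt "$start" ] && continue   # already completed in a prior run
    ran="$ran $line"
    echo "running job $i: $line"        # stand-in for "$line".ksh
done < "$file"
```

Because the loop reads from a redirection rather than a pipe, it runs in the current shell, so the counter survives each iteration.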

Mike

Assuming the script does not run to completion if a job fails, something like the following might work:

At top of script, put:

touch jobs_done.txt

In the loop, before each job runs, put test condition:

grep -qx "$line" jobs_done.txt
if [ $? -eq 0 ]; then continue; fi

In the loop, after each job runs successfully, put the following line:

echo "$line" >> jobs_done.txt

At end of script, put:

rm jobs_done.txt

If the script runs to completion even when a job fails, the basic idea may still work, but 1) don't delete jobs_done.txt at the end, and 2) write a status code (FAIL or SUCCESS) next to each job name in jobs_done.txt, and grep for that code to decide whether to skip the job.
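Putting the pieces above together, here is a minimal end-to-end sketch of the done-file approach. The run_job function is a stand-in for invoking "$line".ksh, and the simulated one-time failure of ijkl (via the ijkl.fixed marker file) is purely an assumption to demonstrate the resume behavior:

```shell
#!/bin/sh
# Sketch: record finished jobs in jobs_done.txt, skip them on rerun,
# and clear the record only when every job has succeeded.
file=jobs.txt
printf 'abcd\nefgh\nijkl\nmnop\nqrst\n' > "$file"

run_job() {                       # stand-in for "$1".ksh; ijkl fails once
    if [ "$1" = ijkl ] && [ ! -f ijkl.fixed ]; then
        touch ijkl.fixed          # simulate fixing the job before the rerun
        return 1
    fi
    return 0
}

run_pass() {
    touch jobs_done.txt
    while read -r line
    do
        grep -qx "$line" jobs_done.txt && continue   # done in a prior run
        if run_job "$line"; then
            echo "$line" >> jobs_done.txt
        else
            echo "job $line failed; rerun to resume here"
            return 1
        fi
    done < "$file"
    rm jobs_done.txt              # clean slate: next run starts from abcd
    return 0
}

run_pass || run_pass              # first pass stops at ijkl, second resumes
```

The first pass records abcd and efgh, fails at ijkl, and leaves jobs_done.txt in place; the second pass skips the recorded jobs, finishes the rest, and removes the record so a later run starts from abcd again.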