Executing a batch of files within a shell script, with the option to re-run individual files in the batch

Hello everyone. I am new to shell scripting, and I am required to create a shell script whose purpose I will explain below.

I am on a Solaris server, by the way.

Before delving into the requirements, I will give you an overview of what is currently in place and its purpose.

 
#!/usr/bin/bash
./xmlprocshell.sh file1_import.xml    # Accepts an XML file as an argument, performs various tasks, and writes the result to an answer file
./xmlparse.sh file1_import_answer.xml # Parses the XML answer file and prints the results and any ERRORS
a=`echo 'cat //Guiroot/Errors/Errorcount/text()' | xmllint --shell file1_import_answer.xml | sed -n 2p` # extract the error count from the answer file
echo "THERE ARE $a ERRORS IN EXECUTION"
if [ "$a" -gt 0 ] ; then
        exit 1
else
        echo "execution successful"
fi
 
./xmlprocshell.sh file2_import.xml
./xmlparse.sh file2_import_answer.xml
b=`echo 'cat //Guiroot/Errors/Errorcount/text()' | xmllint --shell file2_import_answer.xml | sed -n 2p`
echo "THERE ARE $b ERRORS IN EXECUTION"
if [ "$b" -gt 0 ] ; then
        exit 1
else
        echo "execution successful"
fi
.
.
#And so on
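As an aside, the per-file block above could be driven by a helper function and a loop instead of being copied fifty times. A minimal sketch, assuming the answer file is always named by inserting _answer before .xml (matching the file1_import.xml / file1_import_answer.xml pair shown above):

```shell
#!/bin/sh
# Sketch only: assumes the answer file for NAME.xml is NAME_answer.xml,
# as in the file1_import.xml / file1_import_answer.xml pair above.

run_one()
{
        f=$1
        ans=${f%.xml}_answer.xml      # file1_import.xml -> file1_import_answer.xml
        ./xmlprocshell.sh "$f"
        ./xmlparse.sh "$ans"
        errs=`echo 'cat //Guiroot/Errors/Errorcount/text()' |
                xmllint --shell "$ans" | sed -n 2p`
        echo "THERE ARE $errs ERRORS IN EXECUTION"
        [ "$errs" -eq 0 ]             # exit status tells the caller pass/fail
}

for f in *_import.xml
do
        [ -f "$f" ] || continue       # skip if the glob matched nothing
        if run_one "$f"
        then
                echo "execution successful: $f"
        else
                echo "Aborting at $f"
                exit 1
        fi
done
```

This keeps the error-count check in one place, so fixing it later means fixing it once.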

There are over 50 of these XML files that need to be executed by the shell. If there are errors in the execution of an XML file, the shell script should exit so the error can be fixed.

The tricky part is: once the error is fixed, the script should present the user with an option to resume the execution from the xml file that failed or execute all the files from the beginning.

How do I go about designing a script that will do the following:

  1. Present the user with a list of 50 files to run.
  2. Whichever option the user chooses should pass that file as an argument to xmlprocshell.sh, which will do the deed.
  3. If the execution is successful the next file in the order is executed automatically and so on.

Do I put the above inside a function and pass the files as arguments, or do I go with loops? If anyone can provide me with an example template, I'd be extremely grateful. Thanks in advance.

Hmm. Like you say, the tricky bit is saving the state. Thinking on something.

---------- Post updated at 01:09 PM ---------- Previous update was at 12:47 PM ----------

This mockup works for me:

#!/bin/sh

fail()
{
        printf "Should %s succeed? " "$1"
        read VAR
        [ "$VAR" = "y" ] && return 0
        return 1
}

if [ -f .faillist ]
then
        set -- `cat .faillist`
        echo "Files succeeded:  `cat .succlist 2>/dev/null`"
        echo "Files remaining:  $*"
        echo "$1 failed previously."
        echo

        while [ "$VAR" != "r" -a "$VAR" != "s" -a "$VAR" != "q" ]
        do
                printf " Resume, Start Over, Quit [r/s/q]: "
                read VAR
        done
fi

case "$VAR" in
r)                      ;;      # Change nothing, everything's ready
q)      exit 1          ;;
s)      rm  -f .faillist .succlist
        set -- *.xml
        ;;
*)      set -- *.xml    ;;
esac

while [ "$#" -gt 0 ]
do
        echo "$@" > .faillist
        if ! fail "$1"
        then
                echo "Error in $1"
                echo "Aborting"
                exit 1
        fi

        printf " %s" "$1" >> .succlist
        shift
done

echo "Files succeeded:  `cat .succlist 2>/dev/null`"

rm .faillist
rm .succlist

It looks for XML files in the current directory but can be changed to look for them anywhere. It uses two files, .succlist and .faillist, to record what succeeded and what remains.

Replace the 'fail' function with whatever you want to do to process these XML files. Since nothing here needs bash or ksh, nearly any sh should do.
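For your case, one way to fill that slot is a function built from the processing steps in your first post. This is a sketch, not tested on Solaris; the _answer.xml naming is taken from your examples:

```shell
# Sketch: a drop-in replacement for the 'fail' placeholder, using the
# xmlprocshell.sh / xmlparse.sh steps from the first post.
# Returns 0 on success, nonzero on failure.
process()
{
        ./xmlprocshell.sh "$1" || return 1
        ans=${1%.xml}_answer.xml
        ./xmlparse.sh "$ans"
        n=`echo 'cat //Guiroot/Errors/Errorcount/text()' |
                xmllint --shell "$ans" | sed -n 2p`
        echo "THERE ARE $n ERRORS IN EXECUTION"
        [ "$n" -eq 0 ]
}
```

Then change the loop's test from `if ! fail "$1"` to `if ! process "$1"`.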


Thanks a mil, Corona. I will get cracking on the shell script as soon as I have access to a terminal and post the results.

I really appreciate your time and effort.