Help with shell script handling processes

Hello

I have a file which has around 120 lines of commands.

I am trying to write a shell script that reads the 'command' file and executes it line by line with an additional (common) argument, with a maximum of 6 commands active at a time. Each of these commands takes a variable amount of time to run, which cannot be controlled from within the script.

I have attached both the command_file and the shell script, which I am executing like this:

bash: $ nohup ./program01.sh > /usr/local/app/output.log &

Expected outcome, in the /usr/local/app/output.log is as below:

Running command for Documentation with name starting with A....
Running command for Documentation with name starting with B....
Running command for Documentation with name starting with C....
Running command for Documentation with name starting with D....
Running command for Documentation with name starting with E....
Running command for Documentation with name starting with F....
Running command for Documentation with name starting with G....
No of running is 7...waiting
Running command for Documentation with name starting with H....
No of running is 7...waiting
Running command for Documentation with name starting with I....
No of running is 7...waiting
Running command for Documentation with name starting with J....
No of running is 7...waiting
Running command for Documentation with name starting with K....
No of running is 7...waiting

The outcome I see, in the /usr/local/app/output.log is as below:

Running command for Documentation with name starting with A....
Running command for Documentation with name starting with B....
Running command for Documentation with name starting with C....
Running command for Documentation with name starting with D....
Running command for Documentation with name starting with E....
Running command for Documentation with name starting with F....
Running command for Documentation with name starting with G....
Running command for Documentation with name starting with H....
Running command for Documentation with name starting with I....
No of running is 7...waiting
No of running is 7...waiting
Running command for Documentation with name starting with M....
Running command for Documentation with name starting with N....
Running command for Documentation with name starting with O....

It is skipping J & K and starting to execute M, N, O and so on. How do I make the program wait until all the previous lines from the command_file have been executed, and only then proceed to the next line?

Maybe it's a simple trick I am missing. I am not a shell expert. Your help is highly welcome.

Regards
JS

If there is no precedence, why not simply split the 120-line file into 6 files of 20 lines each, and then start the 6 files? Remove the & from the end of each line. There will never be more than 6 processes.
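Under that no-precedence assumption, the split approach can be sketched as below. The file names are illustrative; `split -l 20` on a 120-line file yields 6 chunks of 20 lines each:

```shell
#!/bin/bash
# Split command_file.txt (120 lines assumed) into 6 chunks of 20 lines,
# then run each chunk sequentially in its own background worker shell.
split -l 20 command_file.txt chunk_   # creates chunk_aa .. chunk_af
for f in chunk_*; do
    sh "$f" &        # one worker per chunk; commands inside run one by one
done
wait                 # block until all 6 workers have finished
rm -f chunk_*
```

Since each worker runs its chunk sequentially, the process count never exceeds 6, at the cost of not balancing load between chunks.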

#!/bin/bash
# Reads commands from standard input, one per line, e.g.:
#   ./program01.sh < command_file.txt
# Keeps at most $maxnum commands running; slot PIDs are held in the array p[].

maxnum=6

# Initial load: fill all slots
c=1
while [ "$c" -le "$maxnum" ]; do
    read -r line || exit
    echo "$line" | sh &
    p[$c]=$!
    echo "${p[$c]}"
    ((c++))
done

# Loop through the remaining lines; when a slot's process has died,
# start the next command in that slot
while read -r line; do
    echo "$line"

    q=1
    while [ "$q" ]; do
        c=1
        while [ "$c" -le "$maxnum" ]; do
            if ! kill -0 "${p[$c]}" 2>/dev/null; then
                echo "$line" | sh &
                p[$c]=$!
                echo "run $c"
                q=""
                break
            fi
            ((c++))
        done
        [ "$q" ] && sleep 1   # all slots busy: pause before polling again
    done
done
# don't exit till remaining background jobs complete
wait
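On bash 4.3 or newer, the polling loop above can be replaced by the builtin `wait -n`, which blocks until any one background job exits. A minimal sketch of the same throttle, with the input file name assumed:

```shell
#!/bin/bash
# Throttled runner: keep at most 6 commands from command_file.txt running.
maxnum=6
while read -r line; do
    # At the limit? Block until any one background job exits (bash >= 4.3).
    while [ "$(jobs -pr | wc -l)" -ge "$maxnum" ]; do
        wait -n
    done
    sh -c "$line" &
done < command_file.txt
wait    # let the final batch drain before the script exits
```

`jobs -pr` lists the PIDs of jobs that are still running, so the count is exact at the moment of the check, and `wait -n` sleeps in the kernel instead of spinning.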

If I may, I'd propose a command provided by my visual toolset, TUI.
I cannot open the attached command_list.txt; for the code below, I expect each regular, valid command to be on a single line.

Once TUI is installed, place this where your other files are.

tmpdir=/tmp	# Dir for temp/work files
LIMIT=6 	# amount of background processes
C=0		# counter

# Create tempfiles, as the tool is based on 'scripts', not 'commands'
while read -r line
do
	echo "$line" > "$tmpdir/$$.$C"
	((C++))
done < command_file.txt		# note: '<', not '<<' (which starts a here-document)

# run all the commands/scripts, but only LIMIT processes
tui-bgjob-mgr -l $LIMIT $tmpdir/$$.*

# Remove the tempfiles/commands
rm $tmpdir/$$.*

Alternatively, you could try this:

while read -r cmd; do $cmd; done < command_file.txt

Which is basically the same as calling it like:

bash command_file.txt

:wink:

Hope this helps


Thanks everyone, but I was looking for something easier. With some more digging around, I found the best way to do it without too much complexity.

Adding the code below for reference.
#!/bin/bash
FILE2=/usr/local/app/command_file.log
while read -r line; do
    log=$(echo "$line" | awk -F'"' '{$0=$2}1')          # 2nd "-delimited field
    instance=$(echo "$line" | awk -F'"' '{$0=$4}1')     # 4th "-delimited field
    count=$(ps aux | grep -i "tscV6Connector" | grep -v "grep" | wc -l)
    if [ "$count" -le 6 ]; then
        echo "Running command for $log with name starting with $instance...."
        nohup "additional_command_parameters" > /usr/local/app/log/"$log"/"${log}_${instance}".log &
        sleep 10    # Sleep time for the extraction process.
        now1=$(date +'%d/%m/%Y %X')
        echo -e "*** TSO *** $now1: Started extraction for MAGIC object type $log with name starting $instance process PID = $!"
    else
        now2=$(date +'%d/%m/%Y %X')
        echo -e "*** TSO *** $now2: Total MQL extraction process reached $count....\n*** TSO *** $now2: waiting for previous process PID $! to complete"
        wait $!    # Main wait statement when count >= 7. Once wait is over, run the next command.
        nohup "additional_command_parameters" > /usr/local/app/log/"$log"/"${log}_${instance}".log &
        sleep 10    # Sleep time for the last MQL which was waiting before log files.
        now3=$(date +'%d/%m/%Y %X')
        echo -e "*** TSO *** $now3: Started extraction for MAGIC object type $log with name starting $instance process PID = $!\n*** TSO *** $now3: Total number of extraction MQLs on DB are = $count\n"
    fi
done < "$FILE2"

This way, when the 7th process tries to start, it waits until the 6th has completed, and so on.

Simple logic, and it works well enough :slight_smile:

Experts are welcome to comment.

BR/
JS

Not necessarily. You wait for the last background job you started to finish, but it might have already completed before you counted the number of running jobs. So you could have 7 jobs running concurrently instead of 6. Should we assume that for what you are doing the difference between running 6 or 7 jobs at once doesn't really matter?
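One way to narrow that window is to poll the actual process count until a slot frees up, rather than issuing a single `wait` on the last PID. A hedged sketch of that pattern (the tscV6Connector pattern and the limit of 6 are taken from the script above):

```shell
# Instead of:  wait $!     # races: the counted job may already have exited
# poll the real count until it drops below the limit:
while [ "$(pgrep -fc tscV6Connector || true)" -ge 6 ]; do
    sleep 5    # re-check every few seconds until a slot frees up
done
# ...now start the next extraction in the background
```

`pgrep -fc` matches against the full command line and prints the match count, so this also avoids the `ps | grep | grep -v grep | wc -l` pipeline.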

Please note the comments I have added to your code in red above. Maybe you will find them useful.

You might also want to consider changing:

log=$(echo $line | awk -F'"' '{$0=$2}1')
instance=$(echo $line | awk -F'"' '{$0=$4}1')

to:

IFS='"' read -r junk log junk instance junk <<-EOF
        $line
EOF

to get rid of two invocations of awk for every line read from your input file.
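The effect of that `read` can be checked on a sample line. The sample and its field layout are my assumption, mirroring the `-F'"'` awk calls above (field 2 is the log name, field 4 the instance letter):

```shell
#!/bin/bash
# Split a line on double quotes: field 2 -> log, field 4 -> instance.
line='extract "Documentation" name "A" rest'    # illustrative sample line
IFS='"' read -r junk log junk instance junk <<< "$line"
echo "$log"         # Documentation
echo "$instance"    # A
```

Reusing `junk` for every throwaway field is fine: each assignment simply overwrites the previous one, and the last variable soaks up the remainder of the line.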