while read list
do
    nohup ./process.sh $list &
    process_id=$!
    wait $process_id
done < sample.lst
command1..
command2..
I am calling a shell script within the loop and firing multiple processes concurrently.
I have to wait until all the above processes are finished before executing command1.
How do I capture each individual process's ID and wait on it?
If I try the above code, will the processes get triggered concurrently, or will it wait for the first process to finish before starting the second?
My requirement is to trigger all the processes to run in the background concurrently, wait for all of them to finish, and then execute the next commands in the main script.
The wait inside the loop waits on each child process in turn, not for all child processes at once. Put a single wait outside the loop instead.
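A minimal sketch of that fix. Since I don't have your process.sh or sample.lst, the loop body below just simulates them with sleep and a temporary file, but the structure is the point: every child is launched into the background, and the one wait after the loop blocks until all of them exit.

```shell
# Simulated input list and worker, standing in for sample.lst / process.sh.
printf '%s\n' one two three > /tmp/sample.lst
results=/tmp/results.txt
: > "$results"

while read list
do
    # stand-in for: ./process.sh "$list" &
    ( sleep 1; echo "done $list" >> "$results" ) &
done < /tmp/sample.lst

wait    # single wait outside the loop: blocks until ALL children exit
echo "all $(grep -c '^done ' "$results") children finished"
```

Because the children run concurrently, the whole thing takes about one second, not three.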
If your list is long, the whole thing can run into trouble when you spawn large numbers of child processes. The ulimit command is your friend here. I would give you a how-to, but I need to know which UNIX system and which shell you are using for the project.
Please include system and shell information in your reply.
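For example, on Linux with bash or ksh you can inspect the per-user process limit like this (the flag letters differ between shells and UNIX flavours, which is exactly why the system and shell info matters):

```shell
# Show the maximum number of processes the current user may run.
# -u is the bash/ksh flag on Linux; other shells/systems may differ.
ulimit -u
```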
In post #1, you start a list's process in the background and then immediately sit there waiting for it to finish before starting the next one. So: serial execution, nothing in parallel. Try dropping the individual waits and placing a single generic wait after the loop.
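And if you also care whether any of the children failed, you can collect the PIDs as you launch them and then wait on each one afterwards; a per-PID wait returns that child's exit status. A sketch with hypothetical stand-in commands (substitute your ./process.sh "$list" and the read loop over sample.lst):

```shell
# Launch everything first, collecting the PIDs; only then wait.
pids=""
for list in one two three       # stand-in for: while read list ... < sample.lst
do
    ( sleep 1; exit 0 ) &       # stand-in for: ./process.sh "$list" &
    pids="$pids $!"
done

status=0
for pid in $pids
do
    wait "$pid" || status=1     # wait on each child; record any failure
done
echo "overall status: $status"
```

All the children still run concurrently; the per-PID waits just let you tell success from failure.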
I will only have between 5 and 10 of these processes in total. Do I still need to bother about a count limit? Forgetting about limits for now: is my code effective?
To be honest, I don't get it: what exactly is the nohup for in this script? If you are waiting for the processes to finish anyway, you don't need any nohup, no?