Script that waits for process to complete

Hello,

I am in need of running an executable provided by a vendor that basically syncs files to a db. This tool can only be run against one folder at a time and it cannot have more than one instance running at a time. However, I need to run this tool against multiple folders. Each run of the tool takes anywhere from 5 to 15 minutes to complete. Running this executable takes the following form:

/path/to/exec/synctool <couple of switches> "/folder/to/scan"

I was wondering if it would be possible to script this with the intention of
A. Read a list of folders to scan (there may be up to 30 different folders that this tool has to scan). I can easily generate this text file.
B. Run one instance of the tool and monitor it to see when it completes.
C. Move on to the next folder, and so on, until all folders are scanned.

Thanks!

Try:

while read -r dir
do
    /path/to/exec/synctool <couple of switches> "$dir"
done < /path/to/file/containing/list/of/directories/to/process
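If you want to skip blank lines and get a note when a run fails, here is a slightly more defensive sketch of the same loop. The synctool path is the placeholder from the original post, the vendor switches are omitted (add them on the synctool line), and folders.txt stands in for your generated list file:

```shell
#!/bin/sh
# Process each folder listed (one per line) in folders.txt, one at a time.
# The shell blocks on synctool each iteration, so only one instance runs at once.
while IFS= read -r dir
do
    # Skip blank lines in the list file
    [ -z "$dir" ] && continue
    echo "Scanning: $dir"
    if ! /path/to/exec/synctool "$dir"    # insert the vendor switches here
    then
        echo "synctool failed for: $dir" >&2
    fi
done < folders.txt
```

The `IFS=` before `read` preserves leading/trailing whitespace in folder names, and `-r` stops backslashes from being mangled, which matters if any of the 30 paths contain unusual characters.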

Something like

while read -r folder_to_scan
do
    /path/to/exec/synctool <couple of switches> "$folder_to_scan"
done < list_of_folders_to_scan

?

I thought of that type of script, but will it actually wait for the first instance of the tool to finish before running the next one?

I guess I'm just being a tad paranoid...haha.

The script waits until synctool finishes before starting the next iteration.
If synctool returns immediately and hands the actual scan job off to a background process, though, then you're out of luck.
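If the tool did turn out to background itself, one workaround would be to poll until the process disappears, much like watching top by hand. A sketch, assuming the background process is also named synctool and that pgrep is available on the system:

```shell
#!/bin/sh
# Launch the scan, then poll until no process named synctool remains.
# This mimics the manual "watch top until it goes away" approach.
/path/to/exec/synctool "$dir"    # vendor switches omitted here
while pgrep -x synctool > /dev/null
do
    sleep 10    # re-check every 10 seconds
done
```

`pgrep -x` matches the process name exactly and exits nonzero once no match remains, so the loop ends as soon as the scan job is gone.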

I believe it actually stays in the foreground, because right now I run one folder at a time and watch top until the process goes away, then move to the next folder.

Thank you for the easy suggestion!