Running two commands at the same time

Is there any way I could run two commands at the same time? Say my script has a command that greps for a keyword in a very large file:

zgrep $KEYWORD $FILE

Because the file is large this takes a while to finish, so while zgrep is doing its job I would like a function that displays a progress indicator:

Grepping..

Grepping....

That is, the dots keep growing while zgrep is still running inside my script. Is that possible?

Run the command in the background with &.
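For example, a minimal sketch (result.out is just a placeholder output file):

zgrep "$KEYWORD" "$FILE" > result.out &   # run the search in the background
# ...do something else here, e.g. print progress dots...
wait                                      # block until the background job finishes
echo "Done."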

You didn't say which shell you were using, but something like this should work:

grep "$PAT" "$FILE" > foo.bar &
typeset PID=$!                      # PID of the background grep
echo "Processing...\c"
while ps $PID > /dev/null 2>&1; do  # loop while the grep is still alive
  echo ".\c" && sleep 1
done
echo "Done."

The "\c" in the echo command stops the CR/LF (ksh). Other shells probably use "echo -n" instead.

Of course, it just occurred to me that this is an endless loop if the background command never dies. And it's never good to program endless loops. Sooooooo:

grep "$PAT" "$FILE" > foo.bar &
typeset PID=$!
echo "Processing...\c"
typeset START_TIME=$SECONDS  TIMEOUT=1000
while ps $PID > /dev/null 2>&1; do
  echo ".\c" && sleep 1
  if [[ $((SECONDS - START_TIME)) -gt $TIMEOUT ]]; then
    echo "TIMEOUT! ($TIMEOUT seconds)"
    break
  fi
done
echo "Done."

This way, the loop will exit after 1000 seconds regardless. This probably isn't much of an issue for "grep", but it's a good practice.

But what if zgrep is still not done after the 1000 seconds?

I noticed that the loop is still running... while it was in progress I pressed CTRL+C to quit the script. Then I found out that foo.bar had already been created almost 2 minutes earlier, yet the progress dots were still being printed; see below:

Orbix_BOX:/tmp> ./testScript
Mon Nov 26 08:06:44 PST 2007
Processing...............................................................................................................................(CTRL+C)Orbix_BOX:/tmp> date
Mon Nov 26 08:08:57 PST 2007
Orbix_BOX:/tmp> ls -latr foo.bar
-rw-rw---- 1 orbix orbix 137065 Nov 26 08:06 foo.bar
Orbix_BOX:/tmp>

That's because the grep was running in the background. If you want to kill it, you can use "trap" to kill it when you press CTRL+C:

[...]

function ctrlc {
   kill $GREP_PID
}

trap ctrlc INT

[...]

grep "$PAT" "$FILE" > foo.bar &
GREP_PID=$!

[...]
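
Putting the pieces together, a complete sketch could look something like this (ksh, reusing foo.bar and the 1000-second timeout from the earlier posts):

#!/bin/ksh
PAT=$1
FILE=$2

function cleanup {
  kill $GREP_PID 2> /dev/null   # stop the background grep if it is still running
  echo "\nInterrupted."
  exit 1
}

grep "$PAT" "$FILE" > foo.bar &
GREP_PID=$!
trap cleanup INT

echo "Processing...\c"
typeset START_TIME=$SECONDS TIMEOUT=1000
while ps $GREP_PID > /dev/null 2>&1; do
  echo ".\c" && sleep 1
  if [[ $((SECONDS - START_TIME)) -gt $TIMEOUT ]]; then
    echo "\nTIMEOUT! ($TIMEOUT seconds)"
    kill $GREP_PID 2> /dev/null
    break
  fi
done
echo "\nDone."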

Regards.

The background job will continue to run.

I didn't script an error path, since that's a whole other conversation entirely. But as a rule, the background job should be killed.

echo "Terminating job '$PID'."
kill $PID && sleep 1
ps $PID && kill -9 $PID && sleep 1
ps $PID 2>&1 && echo "Job did not die!" || echo "Job terminated."

Of course, the error path from the loop should also clean up any temp files, stop any other background processes, log some diagnostic info, terminate the script, etc.

Ideally you would create a trap that would kill the background job(s) if the script exited or was killed. But that is a separate issue.
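
For example (a sketch; adjust the signal list to taste):

grep "$PAT" "$FILE" > foo.bar &
PID=$!
# kill the background grep on normal exit or on HUP/INT/TERM
trap 'kill $PID 2> /dev/null' EXIT HUP INT TERM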

On the Unixes I use, "ps $PID" will return non-zero when $PID is not in the process table. It may be that your Unix does not, in which case you must use some form of "ps $PID | grep" to determine if the background process has completed.
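
In that case something like this should do it (a sketch; the exact ps output format varies from system to system):

# "ps -p" prints only the header line once the process is gone,
# so the grep fails and the loop ends
while ps -p "$PID" | grep -q "^ *$PID "; do
  echo ".\c" && sleep 1
done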

It works well now. Thanks to everyone who helped me! :smiley:

Or

while kill -0 "$PID" 2> /dev/null
do
  echo "$PID still alive"
  sleep 1
done
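
For what it's worth, kill -0 doesn't actually send a signal; it only checks whether the process exists (and whether you're allowed to signal it), so it avoids parsing ps output entirely.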