Is there a way to make this find job run as one process, more efficiently?

Hi, unixers, I'm doing something like

find -type f -name '*.gif' | xargs rm &
find -type f -user root | xargs chown idealo &
wait
#do more stuff

on a very large dir. I'm thinking this would be more efficient if it were running as a single process that did one thing for the first condition and another for the second. I could do this with Perl (File::Find), but I'm wondering if there's an elegant solution in bash.

thanks for any insight!

You might be wrong in your thinking.

By creating two subprocesses you are probably making the overall run time of the job shorter. Both of those commands will spend much of their time waiting for disk I/O to complete, and while one process is waiting on I/O, the other one can run.
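
That said, if you really do want a single traversal, GNU find can group tests with parentheses and attach a different action to each branch, so one pass over the tree handles both jobs. A rough, untested sketch (assuming GNU find, starting from the current directory, and keeping the user name idealo from your example):

# one pass: delete *.gif files, chown remaining root-owned files to idealo
find . -type f \
    \( \( -name '*.gif' -exec rm {} + \) \
    -o \( -user root -exec chown idealo {} + \) \)

The outer parentheses matter because -o binds more loosely than the implied -a, and -exec ... {} + batches file names much like xargs does, so you still get only a handful of rm/chown invocations. Note that a .gif owned by root takes the first branch and is simply removed, and whether this single pass actually beats your two overlapped pipelines will depend on how much of the directory tree is already in the cache.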