error checking in bash

Could someone please advise on a good way to implement error checking in a script?
I am taking input from a file, inserting its values into two commands, and sending the output of each to separate log files. I need to check those log files for an occurrence of 'error', stop the command if 'error' exists, and if it does not, continue to the next command. I have this so far (using mkdir for testing purposes)....
...excerpt
for COMPONENT in $(awk '{print $1, $2}' "$1")
do
    printf "\n\nCreating components for %s\n" "$COMPONENT"
    mkdir "$COMPONENT" >> ~/scripts/"$COMPONENTERROR" 2>&1
    if grep -q error ~/scripts/"$COMPONENTERROR"; then
        echo "check error log"
        exit 1
    fi

    printf "Creating Island %s\n" "$2"
    mkdir "$2" >> ~/scripts/"$ISLANDERROR" 2>&1
    #if [ "$stop_this" -eq 1 ]; then echo "STOP"; exit 1; fi

done
....
Or should I take another approach?

Many commands and scripts return a nonzero exit code if an error occurred. That is a good place to start checking, but what you are attempting with grep is common when log files are the only means of communicating error messages.

mkdir "$2" 2> yourlog
if [[ $? -ne 0 ]]
then
    grep -i error yourlog    # start looking for errors in yourlog
fi

This redirects stderr to yourlog, and if the exit code from "mkdir" is nonzero, the log is checked. You can also capture the error messages into arrays or variables, whichever you prefer. I prefer arrays.

ERROR_ARRAY=( $(mkdir "$2" 2>&1) )    # bash syntax; in ksh: set -A ERROR_ARRAY $(mkdir "$2" 2>&1)
for i in "${ERROR_ARRAY[@]}"
do
    ...
done
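As a concrete sketch of the array idea in bash syntax (bash spells ksh's `set -A name words` as `name=( words )`; the target directory below is just a placeholder chosen so mkdir is guaranteed to complain):

```shell
#!/bin/bash
# Capture mkdir's stderr into an array, one word per element.
# /tmp already exists, so mkdir will fail and write to stderr.
ERROR_ARRAY=( $(mkdir /tmp 2>&1) )

# If the array is non-empty, the command produced an error message.
if (( ${#ERROR_ARRAY[@]} > 0 )); then
    echo "mkdir reported: ${ERROR_ARRAY[*]}"
fi
```

Note that word splitting means each element is one word of the message, not one line; that is usually fine for a simple "did anything land on stderr" check.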

Put this at the top of every script you write.

LOG=/tmp/mylog

##################
function if_error
##################
{
    rc=$?                                      # capture the exit code of the previous command
    if [[ $rc -ne 0 ]]; then
        echo "$1 TIME:$(date)" | tee -a "$LOG" # if rc > 0 then print error msg and quit
        exit $rc                               # use the saved code, not tee's exit status
    fi
}

Example below:

hostname
if_error "hostname command failed to return hostname"

chmod 775 /etc/hosts
if_error "chmoding /etc/hosts to 775 has failed"
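Put together, a minimal self-contained sketch of this pattern (the scratch directory is hypothetical, created with mktemp just for the demo; the function saves `$?` before the pipeline so tee cannot overwrite it):

```shell
#!/bin/bash
LOG=/tmp/mylog

function if_error
{
    rc=$?                          # exit code of the previous command
    if [[ $rc -ne 0 ]]; then
        echo "$1" | tee -a "$LOG"  # log the message
        exit $rc                   # quit with the original code
    fi
}

workdir=$(mktemp -d)               # scratch area for the demo
mkdir "$workdir/component"
if_error "mkdir $workdir/component failed"

echo "component created, continuing"
```

On the success path the function is a no-op and the script carries on; on any failure it logs the message and stops with the failed command's own exit code.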

-X

Thanks guys, I really appreciate it. Both work for me.