Check for exit status

Hi, I have the following code.
I want the script to return true if the whole thing executes successfully, and to print the error if any step fails.

I tried if [ "$?" == "0" ]; then
but this only checks the exit status of the command on the line just above it.

 
#!/bin/bash
PATH1=/var/log/mysql
PATH2=/home/ankur/log
FILE1=mysql-bin.index
mysqladmin -u root -p'test!@#' flush-logs
# lines that differ between the two index files (comm expects sorted input)
printf "%s\n" $(comm -3 "$PATH1/$FILE1" "$PATH2/.FILE2") > "$PATH2/.FILE3"
# drop the last line
sed -e '$d' "$PATH2/.FILE3" > "$PATH2/.FILE4"
# reverse the file line by line
awk '{ a[NR]=$0 } END { for(i=NR; i; --i) print a[i] }' "$PATH2/.FILE4" > "$PATH2/.FILE5"
sleep 1
echo "Today is $(date)"
echo "--------------------"
for i in $(cat "$PATH2/.FILE5")
do
    j=${i##*/}    # strip the directory part, keep the file name
    #echo "$j"
    if [ ! -f "$PATH2/$j" ]
    then
        cp -f "$PATH1/$j" "$PATH2/$j"
        gzip -9 "$PATH2/$j"
        echo "Copying binlogs : $j"
    #else
    #echo "ankur"
    #cat FILE1 > FILE2
    #exit 0
    fi
done
cat "$PATH1/$FILE1" > "$PATH2/.FILE2"
sed -e '$d' "$PATH2/.FILE2" > "$PATH2/.TEMP"
cat "$PATH2/.TEMP" > "$PATH2/.FILE2"
exit 0

Put the piece of code in a function, then check the status of the function.
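
A minimal sketch of what I mean (command1/command2/command3 are placeholders, not your actual commands): each step that can fail passes its failure straight up with || return 1, so the function's status reflects the first error.

#!/bin/bash
checkstatus () {
    command1 || return 1   # stop here and report the first failure
    command2 || return 1
    command3 || return 1
}
if checkstatus
then
    echo "STATUS=0"
else
    echo "failed with status $?"
fi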

Hi, I put the function in as follows:

 
#!/bin/bash
checkstatus () {
..
.
}
checkstatus
if [ "$?" == "0" ]; then
    echo STATUS=${?}
else
    echo "failed"
fi
exit 0

But if any error occurs anywhere except on the last line of the function, the exit status is always 0.
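
For example, this contrived function (not my real code) shows what happens:

f () { false; true; }   # 'false' fails in the middle of the function
f
echo $?                 # prints 0 -- only the last command's status survives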

I want the function to terminate as soon as any line fails, and to show the error from that point.

You want to check every line for errors without writing an explicit check after each line. The shell only remembers the status of the last command, so instead you have to chain the commands together.

Two ways:
1) Put your commands in nested if ... then ... fi statements.
2) Chain your commands with &&, like:

command1 && command2 && command3
# Is similar to:
if command1
then
    if command2
    then
        command3
    fi
fi

That will return 0 only if all three commands executed successfully. The other benefit is that command2 is only executed if command1 succeeded, and so on. It is almost always preferable to avoid executing a command when the previous step did not succeed.
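
Since you also want to print the error when something fails, one variation (a sketch; the message text is mine) is to follow the chain with ||:

command1 && command2 && command3 || {
    echo "one of the commands failed (status $?)" >&2
    exit 1
}

The { ... } block runs as soon as any command in the chain fails, and $? at that point still holds the failing command's status.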