Bash Script to Find the Status of a URL

#!/bin/bash
timevar=$(date +%F_%H_%M)      #--> Store the current date and time in a variable
get_contents=$(cat urls.txt)   #--> Read the URLs from the file. The file should not contain http:// as it is added below
######### Next Section Does all the processing #########
for i in $get_contents
do
    # Fetch only the headers, follow redirects, and print the field after "HTTP/1.1", i.e. the status code
    statuscode=$(curl --connect-timeout 30 -s -I -L "http://$i" | awk '{for(j = 1; j <= NF; j++) if($j == "HTTP/1.1") print $(j+1)}')
    case $statuscode in
        200) echo "$timevar $i $statuscode okay" ;;
        301) echo "$timevar $i $statuscode bad" ;;
    esac
done

This is a simple script I wrote to find the status of each URL. My doubt is how to generate a log file: inside the script, how can I give the option to store the output in a file?
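
One way to do that (a minimal sketch, assuming a log file named url_status_$timevar.log in the current directory is acceptable; the name is just a placeholder) is to redirect the whole for loop into the file, so every echo inside it is appended to the log:

logfile="url_status_$timevar.log"   # placeholder log file name
for i in $get_contents
do
    statuscode=$(curl --connect-timeout 30 -s -I -L "http://$i" | awk '{for(j = 1; j <= NF; j++) if($j == "HTTP/1.1") print $(j+1)}')
    case $statuscode in
        200) echo "$timevar $i $statuscode okay" ;;
        301) echo "$timevar $i $statuscode bad" ;;
    esac
done >> "$logfile"   # redirecting at the done line appends everything printed inside the loop to the log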

Right now the output looks like this:

This has to be stored in a file (like a log), and when a critical error code is returned the script should automatically send a mail to the admin with the error code.
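
For the mail part, one possibility (a minimal sketch, assuming the mail/mailx command is installed and a working MTA is configured on the box; admin@example.com is a placeholder address, and $logfile comes from the sketch above) is to treat anything other than 200 as critical inside the case and pipe a message to mail:

case $statuscode in
    200) echo "$timevar $i $statuscode okay" >> "$logfile" ;;
    *)   echo "$timevar $i $statuscode bad"  >> "$logfile"
         # send the error code to the admin; adjust the address and subject as needed
         echo "$timevar $i returned status $statuscode" | mail -s "URL check failed: $i" admin@example.com
         ;;
esac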

Is it possible to write a script like that?

With Regards
Anish Kumar.V