ksh to check url

I have a server that keeps going down (503 Service Unavailable). Until we find the root cause, I would like to set up a simple ksh script in cron that queries the URL and reports the status code, so we can get someone to restart the process.

Does anyone know a simple command I can call from ksh to report the status code of a URL?

Have a look at the output of:

wget -q -O /dev/null -S http://..../

It dumps the HTTP headers (to stderr), so with a little grepping you can pull out the status code.
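For example, a sketch of that grepping, with the URL as a placeholder: `wget -S` writes header lines like `  HTTP/1.1 200 OK` to stderr, and taking the last such line means a redirect chain reports the final status.

```shell
#!/bin/ksh
# Hypothetical URL; replace with the server being monitored.
URL="http://example.com/"

# wget -S prints the response headers on stderr; keep the status
# code ($2) from the last HTTP/ line so redirects report the
# final response, not the 301/302.
status=$(wget -q -O /dev/null -S "$URL" 2>&1 |
         awk '$1 ~ /^HTTP\// { code = $2 } END { print code }')

echo "HTTP status: $status"
```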

lynx -dump http://google.com > /dev/null 2>&1
if [ $? -eq 0 ]
then
    echo "website is up"
else
    echo "website is down. Trigger the email"
fi

(Note that `&>` is a bash-ism; in ksh, redirect stderr with `> /dev/null 2>&1`.)
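To run a check like this from cron, the crontab entry might look like the following. The script path is a placeholder, and the minutes are listed out explicitly because the `*/5` step syntax is a Vixie cron extension that older crons do not support:

```
# Run the URL check every five minutes (placeholder script path).
0,5,10,15,20,25,30,35,40,45,50,55 * * * * /usr/local/bin/checkurl.ksh
```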

Any idea how to get this to work with a secure https:// site?

I did find this; however, I am working on AIX, which does not have wget. Do you know if there is anything like this for AIX?

You can use curl - download from here.
With the command:

 curl -s -D - -o /dev/null ...url...
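curl can also report just the status code via its `-w`/`--write-out` format string, which fits the cron use case with no header grepping at all. A sketch, with the URL as a placeholder:

```shell
#!/bin/ksh
# Hypothetical URL; substitute the server being monitored.
URL="http://example.com/"

# -s silences progress output, -o /dev/null discards the body, and
# -w '%{http_code}' prints only the numeric status of the response
# (curl prints 000 when the connection itself fails).
status=$(curl -s -o /dev/null -w '%{http_code}' "$URL")

echo "HTTP status: $status"
```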