wget to check a URL

Hi all,
I wrote a script which starts a Weblogic server and waits until it's loaded before deploying several apps. The check looked something like this:

[...]
rc=1   # ensure the loop runs at least once
while [ $rc -ne 0 ]; do
   wget --spider <URL>:<port>/console > /dev/null 2>&1
   rc=$?
   sleep 5   # don't hammer the server while it boots
done
[...]

This works perfectly: the console is an HTML site, so once the server is started the page becomes accessible.
Now I need to do the same thing, but I also have to wait for the server instances, which listen at the same URL but on a different port than the server.
I can't use wget's return code because it always returns 1.
If the instance is up, wget responds:

If the instance is down:

The return code is always 1 because, even though the port is listening, there is no HTML page to fetch like the server has.
So how can I check whether an instance is available or not? Is there another bash command to check a URL and port?

Thanks a lot, and sorry for my English :wink:

wget only speaks the HTTP(S) and FTP protocols.
From what I understand, you want to check whether each service is listening on its port?
What about:

$ nmap -p[port] ip.ad.dr.ess | grep [port]

?
It outputs (with the FTP service, for example):

# invoke-rc.d vsftpd stop
Stopping FTP server: vsftpd.
$ nmap -p21 127.0.0.1 | grep '21'
21/tcp closed ftp

# invoke-rc.d vsftpd start
Starting FTP server: vsftpd.
$ nmap -p21 127.0.0.1 | grep '21'
21/tcp open  ftp
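If nmap is available, that check can be wrapped in a short poll loop. A sketch, not a definitive implementation: 127.0.0.1 mirrors the example above, and 7002 is a placeholder for the instance's port; adjust the host, port, retry count, and interval to your setup.

```shell
# Poll nmap a bounded number of times until the port reports "open".
# 127.0.0.1 and 7002 are placeholders for the instance's host/port.
up=0
for try in 1 2 3; do
    if nmap -p7002 127.0.0.1 2>/dev/null | grep -q '7002/tcp open'; then
        up=1
        break
    fi
    sleep 1
done
if [ "$up" -eq 1 ]; then
    echo "instance is up"
else
    echo "instance is down"
fi
```

Grepping for the port's line rather than bare "open" avoids false matches elsewhere in nmap's report.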

If wget is always going to print those messages, just look for "connected" or "failed" and act accordingly.

host='172.18.242.32:7001'
# wget writes its log (including "connected") to stderr, so capture that;
# --tries=1 avoids long retries, -O /dev/null discards any body
result=$(wget --tries=1 -O /dev/null "$host" 2>&1)

if echo "$result" | grep -q 'connected'
then
        echo success
else
        echo failure
fi
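Another option that needs no external tool at all: bash itself can attempt a TCP connection through its /dev/tcp pseudo-device. A minimal sketch, assuming GNU coreutils `timeout` is available; 127.0.0.1 and 7002 are placeholders for the instance's address and port.

```shell
#!/usr/bin/env bash
# Return 0 if a TCP connection to host:port succeeds, non-zero otherwise.
# Uses bash's built-in /dev/tcp redirection; `timeout` guards against a
# firewall that silently drops packets instead of refusing them.
port_open() {
    local host=$1 port=$2
    timeout 2 bash -c "exec 3<>/dev/tcp/${host}/${port}" 2>/dev/null
}

# 127.0.0.1 and 7002 are placeholders for the instance's address/port.
if port_open 127.0.0.1 7002; then
    echo "instance is up"
else
    echo "instance is down"
fi
```

The same predicate drops straight into the original wait loop: `until port_open 127.0.0.1 7002; do sleep 5; done`. Note that /dev/tcp is a bash feature, so the script must run under bash, not plain sh.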