Unix Script to read the XML file from Website

Hi Experts,

I need a Unix shell script which can copy the XML file from the website pasted below and place it in my Unix directory.

http://www.westpac.co.nz/olcontent/olcontent.nsf/fx.xml

Thanks in Advance...

wget -O /Your_Unix_DIR/fx.xml http://www.westpac.co.nz/olcontent/olcontent.nsf/fx.xml
wget http://www.westpac.co.nz/olcontent/olcontent.nsf/fx.xml -O "xml.txt"
awk 'BEGIN{FS="[ <>]"}{if($0 ~ "</rate>") print $6}' xml.txt >> out.txt 
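
If it helps, the two steps can go into one small script. This is only a sketch: the directory /Your_Unix_DIR is a placeholder, and the awk field number assumes the same rate layout as above, so adjust both for your setup.

Code:

#!/bin/sh
# Download the rates page; -O names the local copy (assumes wget is installed).
wget -O /Your_Unix_DIR/xml.txt "http://www.westpac.co.nz/olcontent/olcontent.nsf/fx.xml" || exit 1
# Split on spaces and angle brackets and print the 6th field of every line
# that contains a closing </rate> tag.
awk 'BEGIN{FS="[ <>]"}{if($0 ~ "</rate>") print $6}' /Your_Unix_DIR/xml.txt >> /Your_Unix_DIR/out.txt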

Hi All,

I have executed the get_Fx.sh script on HP-UX B.11.31 U ia64 3519845154.

get_Fx.sh Code:

#!/bin/sh
wget developer.westpac.co.nz/production -O "xml.txt"
awk 'BEGIN{FS="[ <>]"}{if($0 ~ "</rate>") print $6}' xml.txt >> /dev/exchangerates/fromsource/fx.xml

I am getting the errors below.

get_Fx.sh[2]: wget: not found.
get_Fx.sh[3]: /dev/exchangerates/fromsource/fx.xml: Cannot create the specified file.
{xyz@abc01:/dev/exchangerates/fromsource}

Then you need to install wget or curl; search the web for how to install it on your system.
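
If curl is already on the box, it can do the same download without installing anything extra (this assumes a standard curl build; adjust the output path):

Code:

curl -o /Your_Unix_DIR/fx.xml "http://www.westpac.co.nz/olcontent/olcontent.nsf/fx.xml"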

Permission denied? Try switching to "root".
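
To confirm whether it really is a permissions problem, you can check the target directory first and fall back to a directory you own; the paths below are just examples:

Code:

# Show owner and permissions of the target directory.
ls -ld /dev/exchangerates/fromsource
# Test whether the current user can create a file there.
touch /dev/exchangerates/fromsource/.write_test && echo writable || echo "not writable"
# If it is not writable, send the output to your home directory instead.
awk 'BEGIN{FS="[ <>]"}{if($0 ~ "</rate>") print $6}' xml.txt >> $HOME/fx.xml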

Hi,

I have checked with my Unix team and it seems they need a lot of approvals to install wget on the Unix system. Can I achieve the same using an FTP script?

You would have to find out if the FTP port is open on the other end, and whether there are any firewall restrictions for your IP address.
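
For what it is worth, a non-interactive ftp transfer usually looks like the sketch below. The host name, login, password and remote path are all placeholders, and this only works if the provider actually exposes the file over FTP; the address you posted is HTTP, so check that first.

Code:

#!/bin/sh
# Hypothetical FTP fetch; replace host, login and paths with real values.
ftp -n ftp.example.com <<'END_FTP'
user anonymous me@example.com
binary
cd /pub/rates
get fx.xml /dev/exchangerates/fromsource/fx.xml
bye
END_FTP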

Sorry, I am a Unix newbie and unfamiliar with FTP; I hope others can help you.