Extracting data from an HTTPS server with a unix shell script

There is a folder that can be accessed through a URL by supplying a particular username and password. Inside the folder there are a few Excel sheets. The Excel sheets (or the whole folder) need to be imported from there to a unix box with a unix shell script.

Can anyone help me? If anyone has done this before or has code for it, kindly share.

man wget

wget --user=user --password=password "URL/EXCEL_FILE"

That will work with HTTP Basic authentication on the folder, but not with a form-based login. Post the URL and I can be more specific.
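
If the server only needs HTTP Basic authentication over HTTPS, curl is an alternative to wget. A minimal sketch; the URL and credentials below are placeholders:

# -u passes HTTP Basic credentials, -O saves the file under its remote name,
# and --fail makes curl exit non-zero on an HTTP error such as 401 or 404.
curl --fail -u username:password -O "https://example.com/reports/file1.xls"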

If there are a bunch of files and you know their names, make a loop:

#!/bin/bash
url="SOME_URL"
user="username"
pass="getinthere"

# Download each known spreadsheet by name.
for xlsfile in "file1.xls" "file2.xls" ; do
  echo "Retrieving $xlsfile"
  wget --user="$user" --password="$pass" "$url/$xlsfile"
done
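
If you don't know the file names in advance and the server exposes a plain directory listing, wget can fetch everything matching a pattern instead. A rough sketch using the same placeholder variables as above; it only works if the server actually serves an index page for that folder:

# -r recurses through the listing, -np stays below the given directory,
# -nd writes files into the current directory instead of a mirrored tree,
# and -A restricts downloads to names matching *.xls.
wget -r -np -nd -A '*.xls' --user="$user" --password="$pass" "$url/"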