shell script to download files from a site?

Hey everyone, my wife purchased a bundle of images from a site, and now has to download each one of them manually. There are about 500 downloads, and it's quite a hassle to browse to each page and save them all individually.

I would like to write a shell script to just go out and grab all the links and do it all automatically, but one part of the script eludes me.

She has an email that contains the initial links to all the download pages. I'm pretty sure I could parse that email and get all those links out into a file. From there, I would loop through the file and have the script fetch each link and grab the source code from that page. At that point, I could pick the actual file link out of the source. This is where it gets a little tricky: I can't just tell it to download that link, because it's just a PHP page that initiates the download for you. So I would have to have the script hit the link and tell it where to save the file.
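Here's roughly what I have in mind, assuming the page links have already been dumped into a file called links.txt. The download.php pattern in the grep is just a guess; I'd have to view the source of one of the real pages and adjust it. The --content-disposition flag should let wget save the file under whatever name the server sends, even though the URL itself points at a PHP script:

#!/bin/sh
# links.txt: one download-page URL per line, pulled from the email.
# The grep pattern below is a placeholder -- check the source of a
# real page first and adjust it to match.

while read -r page; do
    # Fetch the page source and extract the first link that points
    # at the (hypothetical) download.php script.
    file_url=$(curl -s "$page" | grep -o 'http[^"]*download\.php[^"]*' | head -n 1)

    # Follow that link and save the file under the name the server
    # supplies in its Content-Disposition header.
    wget --content-disposition "$file_url"
done < links.txt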

If all I could do was get it to hit the link for me, that would still save a lot of time. Then we could just sit there while it runs, and click "Save" on every one.
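Worst case, something like this could at least feed the links to the browser one at a time, assuming the extracted file URLs are sitting in a file called file_links.txt. The sleep is just there so Firefox isn't flooded with 500 tabs at once:

#!/bin/sh
# Fallback: open each extracted file link in the browser and click
# "Save" as the dialogs come up.

while read -r file_url; do
    firefox "$file_url" &
    sleep 5
done < file_links.txt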

Anyone have any ideas?

Thanks.

If you have Firefox you can use the DownThemAll add-on. Excellent for this job!

Regards

Cool add-on.