curl

Isn't there a way to download files like below?
For example, I want to download all the .html files under the root directory of unix.com/

curl -O http://www.unix.com/*.html

This won't work, so please tell me how to do this.

Well, the best way would be to get the file list of the directory, but I don't think this is possible!

All URLs in the unix.com forum follow the pattern

http://www.unix.com/showthread.php?p=79349 and similar.

Why not keep incrementing the number that comes after p= and retrieve each page?
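
Something along these lines, perhaps, in a plain sh loop (the range here is just an example):

p=79349
while [ $p -le 79360 ]
do
    wget -O "p$p.html" "http://www.unix.com/showthread.php?p=$p"
    p=`expr $p + 1`
done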

vino

Thank you vino!
wget is perfect, I think.

Ah... actually, I don't want to download this forum's files.

As far as I know, wget can't do that, can it? It just says "Warning: wildcards not supported in HTTP." (this is wget 1.8.2)

Well, I have some info about the file names, so that's not an issue for me.
Also, wget can download a whole directory, so we can apply wildcards to the files after the download, I think.
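
For example (the somedir/ path is made up, and wget saves into a www.unix.com/ tree by default):

wget -r -l 1 -np http://www.unix.com/somedir/
ls www.unix.com/somedir/*.html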

Thank you.

Yes, it accepts wildcards when you download from an FTP site, but I'm not sure about HTTP.
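
For example, this works against an FTP server (host and path are placeholders), as long as the URL is quoted so the shell doesn't expand the * itself:

wget 'ftp://ftp.example.com/pub/*.html'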

Though wildcards are not supported for HTTP, you can still use -r and -p.
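
For example, -r with an accept list can approximate the wildcard (the depth of 1 is just an example):

wget -r -l 1 -A '*.html' http://www.unix.com/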
And btw, if this forum works like other database-driven phpBB-style boards, there are no static pages as such: there's showthread.php, and p seems to be the thread index variable, whose contents are fetched from the database when showthread.php is executed.