wget -r

I have noticed a lot of expensive books appearing online, so I have decided to copy them to CD.

I was going to write a program in Java to do this, but then I remembered the GNU wget program some of you guys were talking about.

Instead of spending two hours or so writing a program to do this, I am going to download wget and run a one-line command. But I want to make sure that one-liner is correct, or I will recursively use up all my hard drive. :eek:

The web site is, for example, http://bb.book.com/somebook

So I will run this command:

wget -r -Dbook.com http://bb.book.com/somebook/

And that should copy all the files linked to in this domain. The whole book! :smiley:
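
One thing I am not sure about: if the book pages link back up to the site's front page, I may also need --no-parent (-np) so wget stays under /somebook/ instead of wandering up the tree. Something like:

wget -r -np http://bb.book.com/somebook/

At least that is how I read the man page.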

Do you have any suggestions?

A lot of those books are probably Linux books like
'Advanced Linux Programming' or stuff at tldp.org.

What you will end up with is a big directory of files. These books are not going away anytime soon. :smiley:

And yes, your syntax is correct (strictly speaking, -D only matters together with -H for spanning hosts; plain -r stays on the starting host anyway), but you will create a directory tree full of (possibly) loads of files.
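
By default wget mirrors the site layout under a directory named after the host, so you would get something like this (file names made up for illustration):

bb.book.com/somebook/index.html
bb.book.com/somebook/ch01.html
bb.book.com/somebook/images/fig1.png

If you would rather have it all under one directory of your choosing, -nH drops the hostname directory and -P sets the download prefix:

wget -r -nH -P somebook http://bb.book.com/somebook/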

You should probably use something like

wget -r -k -l 5 http://bb.book.com/somebook/

So you can view the files offline: -k converts the links so they work locally, and -l limits the recursion depth (here to five levels of links).
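
Putting it together, something like this should give you a browsable offline copy (using your example URL; adjust the depth to taste):

wget -r -k -l 5 -p -np http://bb.book.com/somebook/

The -p pulls in page requisites (the images and stylesheets each page needs), and -np keeps wget from climbing above /somebook/ into the rest of the site. You might also throw in -w 1 to wait a second between requests and go easy on their server.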