$ wget -r -np -nd http://example.com/packages/
This little gem is probably my most used variation. It will download all files in the /packages/ directory on example.com — without traversing up to parent directories (-np), and without recreating the directory structure on your machine (-nd).
$ wget -r -np -nd --accept=iso http://example.com/centos-5/i386/
Adding the --accept argument with a comma-separated list of file extensions tells wget to keep only files ending in one of those extensions.
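For instance, to pull both the ISO images and their checksum files in a single pass (the URL and the sha256 extension here are just illustrative):

```shell
# Grab only .iso and .sha256 files from the directory listing,
# without climbing to the parent (-np) or recreating the remote
# directory tree locally (-nd)
wget -r -np -nd --accept=iso,sha256 http://example.com/centos-5/i386/
```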
Another way to grab just the files you want:
$ wget -i filename.txt
Put all the desired urls in filename.txt and run wget against it to download a list of files automatically.
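A quick sketch of that workflow, with placeholder URLs:

```shell
# Build the list, one URL per line
printf '%s\n' \
  'http://example.com/packages/foo.rpm' \
  'http://example.com/packages/bar.rpm' > filename.txt

# Fetch everything in the list; -nc skips files you already have
wget -nc -i filename.txt
```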
On a bad connection?
$ wget -c http://example.com/really-big-file.iso
The “-c” option tells wget to resume a partially downloaded file where it left off instead of starting over from scratch.
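On a really flaky link you can pair “-c” with wget’s retry options; the values below are arbitrary:

```shell
# Resume the partial download; retry up to 10 times, waiting
# up to 5 seconds between attempts
wget -c -t 10 --waitretry=5 http://example.com/really-big-file.iso
```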
$ wget -m -k (-H) http://www.example.com/
Mirror a site, converting its links to work locally, so that you can move the site to another server. Use the ‘-H’ option if images are loaded from another site.
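A fuller mirroring invocation might look like this — the extra flags are my additions, and the CDN hostname is hypothetical: ‘-p’ also fetches page requisites such as images and stylesheets, and ‘-D’ keeps host-spanning (‘-H’) limited to the domains you list instead of wandering the whole web:

```shell
# Mirror the site, rewrite links for local browsing, pull page
# requisites, and span hosts only into the listed domains
wget -m -k -p -H -Dexample.com,cdn.example.com http://www.example.com/
```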