You can use this in the following situations:
- Download all images from a website
- Download all videos from a website
- Download all PDF files from a website
$ wget -r -A.pdf http://url-to-webpage-with-pdfs/
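The same approach works for the other situations listed above by swapping the accept list given to -A. For instance, a sketch for downloading images might look like this (the URL is a placeholder):
$ wget -r -A jpg,jpeg,png,gif http://url-to-webpage-with-images/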
The following is the command to execute when you want to download a full website and make it available for local viewing.
$ wget --mirror -p --convert-links -P ./LOCAL-DIR WEBSITE-URL
- --mirror : turn on options suitable for mirroring.
- -p : download all files that are necessary to properly display a given HTML page.
- --convert-links : after the download, convert the links in the document for local viewing.
- -P ./LOCAL-DIR : save all the files and directories to the specified directory.
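Put together, a concrete run might look like this (example.com and ./local-copy are placeholders for your target site and output directory):
$ wget --mirror -p --convert-links -P ./local-copy http://example.com/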
First, store the URLs of all the files you want to download in a text file:
$ cat > download-file-list.txt
URL1
URL2
URL3
URL4
Next, pass download-file-list.txt to wget with the -i option, as shown below.
$ wget -i download-file-list.txt
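As an alternative to typing the list interactively (end the cat input with Ctrl-D), you could build the file from the shell in one step; the URLs below are placeholders:
$ printf '%s\n' http://example.com/file1.pdf http://example.com/file2.pdf > download-file-list.txt
wget then fetches each URL in the file, one after another.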