Bash function to save all images from a URL into the current directory; it takes the URL as its parameter.
I saved it as a bash function so that I don't have to worry about remembering the params for wget.
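A minimal sketch of what such a function can look like, built from the flags explained below (the function name `wget_images` and the extension list are assumptions, not the original code):

```bash
# Download all images from a URL into the current directory.
# Name and extension list are assumptions; adjust -A to taste.
wget_images() {
  # Usage: wget_images <URL>
  # -nd             no directories: save every file directly into -P
  # -r -l 2         recurse, at most 2 levels deep
  # -A ...          accept only files with these extensions
  # -H              span hosts (images are often served from another domain)
  # -p              also fetch page requisites such as inline images
  # -e robots=off   ignore robots.txt and robot meta tags
  # -P .            directory prefix: the current directory
  wget -nd -r -l 2 -A jpg,jpeg,png,gif -H -p -e robots=off -P . "$1"
}
```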
What the flags do:

-nd: no directories (save all files to the current directory; -P directory changes the target directory)
-r -l 2: recursive, level 2
-A: accepted extensions
-H: span hosts (wget doesn't download files from different domains or subdomains by default)
-p: page requisites (includes resources such as images on each page)
-e robots=off: execute the command robots=off as if it were part of the .wgetrc file. This turns off robot exclusion, which means robots.txt and the robot meta tags are ignored (you should know the implications this comes with, take care).
-P: sets the directory prefix where all files and directories are saved to.
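For example, calling the sketched function above on a page (hypothetical URL):

```bash
wget_images "https://example.com/gallery/"
# Images from that page, and pages it links to up to 2 levels deep,
# end up directly in the current directory.
```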
Original question: How do I use Wget to download all images into a single folder, from a URL?