@myfonj
Created February 18, 2020 09:34
GNU Wget 1.20.3, a non-interactive network retriever.
Usage: wget [OPTION]... [URL]...
Mandatory arguments to long options are mandatory for short options too.
Startup:
--version
(-V) display the version of Wget and exit
--help
(-h) print this help
--background
(-b) go to background after startup
--execute=COMMAND
(-e) execute a `.wgetrc'-style command
Logging and input file:
--output-file=FILE
(-o) log messages to FILE
--append-output=FILE
(-a) append messages to FILE
--debug
(-d) print lots of debugging information
--quiet
(-q) quiet (no output)
--verbose
(-v) be verbose (this is the default)
--no-verbose
(-nv) turn off verboseness, without being quiet
--report-speed=TYPE
output bandwidth as TYPE. TYPE can be bits
--input-file=FILE
(-i) download URLs found in local or external FILE
--input-metalink=FILE
download files covered in local Metalink FILE
--force-html
(-F) treat input file as HTML
--base=URL
(-B) resolves HTML input-file links (-i -F) relative to URL
--config=FILE
specify config file to use
--no-config
do not read any config file
--rejected-log=FILE
log reasons for URL rejection to FILE
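
The logging and input options above are often combined; as an illustrative sketch (the file names are placeholders, not from the original gist):

```shell
# Download every URL listed in urls.txt, suppressing per-file chatter
# but appending a summary log to wget.log for later inspection.
wget --input-file=urls.txt --append-output=wget.log --no-verbose
```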
Download:
--tries=NUMBER
(-t) set number of retries to NUMBER (0 unlimits)
--retry-connrefused
retry even if connection is refused
--retry-on-http-error=ER
comma-separated list of HTTP errors to retry
--output-document=FILE
(-O) write documents to FILE
--no-clobber
(-nc) skip downloads that would download to existing files (overwriting them)
--no-netrc
don't try to obtain credentials from .netrc
--continue
(-c) resume getting a partially-downloaded file
--start-pos=OFFSET
start downloading from zero-based position OFFSET
--progress=TYPE
select progress gauge type
--show-progress
display the progress bar in any verbosity mode
--timestamping
(-N) don't re-retrieve files unless newer than local
--no-if-modified-since
don't use conditional if-modified-since get requests in timestamping mode
--no-use-server-timestamps
don't set the local file's timestamp by the one on the server
--server-response
(-S) print server response
--spider
don't download anything
--timeout=SECONDS
(-T) set all timeout values to SECONDS
--dns-timeout=SECS
set the DNS lookup timeout to SECS
--connect-timeout=SECS
set the connect timeout to SECS
--read-timeout=SECS
set the read timeout to SECS
--wait=SECONDS
(-w) wait SECONDS between retrievals
--waitretry=SECONDS
wait 1..SECONDS between retries of a retrieval
--random-wait
wait from 0.5*WAIT...1.5*WAIT secs between retrievals
--no-proxy
explicitly turn off proxy
--quota=NUMBER
(-Q) set retrieval quota to NUMBER
--bind-address=ADDRESS
bind to ADDRESS (hostname or IP) on local host
--limit-rate=RATE
limit download rate to RATE
--no-dns-cache
disable caching DNS lookups
--restrict-file-names=OS
restrict chars in file names to ones OS allows
--ignore-case
ignore case when matching files/directories
--inet4-only
(-4) connect only to IPv4 addresses
--inet6-only
(-6) connect only to IPv6 addresses
--prefer-family=FAMILY
connect first to addresses of specified family, one of IPv6, IPv4, or none
--user=USER
set both ftp and http user to USER
--password=PASS
set both ftp and http password to PASS
--ask-password
prompt for passwords
--use-askpass=COMMAND
specify credential handler for requesting username and password. If no COMMAND is specified the WGET_ASKPASS or the SSH_ASKPASS environment variable is used.
--no-iri
turn off IRI support
--local-encoding=ENC
use ENC as the local encoding for IRIs
--remote-encoding=ENC
use ENC as the default remote encoding
--unlink
remove file before clobber
--keep-badhash
keep files with checksum mismatch (append .badhash)
--metalink-index=NUMBER
Metalink application/metalink4+xml metaurl ordinal NUMBER
--metalink-over-http
use Metalink metadata from HTTP response headers
--preferred-location
preferred location for Metalink resources
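
A typical combination of the download options above might look like this (the URL and numbers are placeholders chosen for illustration):

```shell
# Resume a partially downloaded file, cap bandwidth at 500 KB/s,
# and retry up to 5 times with growing waits between attempts.
wget --continue --limit-rate=500k --tries=5 --waitretry=10 \
     https://example.com/large-file.iso
```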
Directories:
--no-directories
(-nd) don't create directories
--force-directories
(-x) force creation of directories
--no-host-directories
(-nH) don't create host directories
--protocol-directories
use protocol name in directories
--directory-prefix=PREFIX
(-P) save files to PREFIX/..
--cut-dirs=NUMBER
ignore NUMBER remote directory components
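
The directory options above control where files land; as a sketch (URL and prefix are placeholders):

```shell
# Save into downloads/ without recreating the hostname directory,
# and drop the first two remote path components (here a/ and b/),
# so the file ends up as downloads/file.txt.
wget --directory-prefix=downloads --no-host-directories --cut-dirs=2 \
     https://example.com/a/b/file.txt
```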
HTTP options:
--http-user=USER
set http user to USER
--http-password=PASS
set http password to PASS
--no-cache
disallow server-cached data
--default-page=NAME
change the default page name (normally this is 'index.html'.)
--adjust-extension
(-E) save HTML/CSS documents with proper extensions
--ignore-length
ignore 'Content-Length' header field
--header=STRING
insert STRING among the headers
--compression=TYPE
choose compression, one of auto, gzip and none. (default: none)
--max-redirect
maximum redirections allowed per page
--proxy-user=USER
set USER as proxy username
--proxy-password=PASS
set PASS as proxy password
--referer=URL
include 'Referer: URL' header in HTTP request
--save-headers
save the HTTP headers to file
--user-agent=AGENT
(-U) identify as AGENT instead of Wget/VERSION
--no-http-keep-alive
disable HTTP keep-alive (persistent connections)
--no-cookies
don't use cookies
--load-cookies=FILE
load cookies from FILE before session
--save-cookies=FILE
save cookies to FILE after session
--keep-session-cookies
load and save session (non-permanent) cookies
--post-data=STRING
use the POST method; send STRING as the data
--post-file=FILE
use the POST method; send contents of FILE
--method=HTTPMethod
use method "HTTPMethod" in the request
--body-data=STRING
send STRING as data. --method MUST be set
--body-file=FILE
send contents of FILE. --method MUST be set
--content-disposition
honor the Content-Disposition header when choosing local file names (EXPERIMENTAL)
--content-on-error
output the received content on server errors
--auth-no-challenge
send Basic HTTP authentication information without first waiting for the server's challenge
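
Several of the HTTP options above are commonly used together, for instance when talking to an API. A minimal sketch, assuming a hypothetical endpoint and field names:

```shell
# POST form data (--post-data implies the POST method), add a custom
# header and user agent, and persist any cookies the server sets,
# including session cookies.
wget --post-data='name=value' \
     --header='Accept: application/json' \
     --user-agent='MyClient/1.0' \
     --save-cookies=cookies.txt --keep-session-cookies \
     https://example.com/api/endpoint
```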
HTTPS (SSL/TLS) options:
--secure-protocol=PR
choose secure protocol, one of auto, SSLv2, SSLv3, TLSv1, TLSv1_1, TLSv1_2 and PFS
--https-only
only follow secure HTTPS links
--no-check-certificate
don't validate the server's certificate
--certificate=FILE
client certificate file
--certificate-type=TYPE
client certificate type, PEM or DER
--private-key=FILE
private key file
--private-key-type=TYPE
private key type, PEM or DER
--ca-certificate=FILE
file with the bundle of CAs
--ca-directory=DIR
directory where hash list of CAs is stored
--crl-file=FILE
file with bundle of CRLs
--pinnedpubkey=FILE/HASHES
Public key (PEM/DER) file, or any number of base64 encoded sha256 hashes preceded by 'sha256//' and separated by ';', to verify peer against
--random-file=FILE
file with random data for seeding the SSL PRNG
--ciphers=STR
Set the priority string (GnuTLS) or cipher list string (OpenSSL) directly. Use with care. This option overrides --secure-protocol. The format and syntax of this string depend on the specific SSL/TLS engine.
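
The TLS options above can be combined for client-certificate authentication; a sketch with placeholder file names and URL:

```shell
# Connect with TLS 1.2 only, presenting a PEM client certificate and
# key, and validating the server against a specific CA bundle.
wget --secure-protocol=TLSv1_2 \
     --certificate=client.pem --private-key=client-key.pem \
     --ca-certificate=ca-bundle.pem \
     https://example.com/secure/resource
```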
HSTS options:
--no-hsts
disable HSTS
--hsts-file
path of HSTS database (will override default)
FTP options:
--ftp-user=USER
set ftp user to USER
--ftp-password=PASS
set ftp password to PASS
--no-remove-listing
don't remove '.listing' files
--no-glob
turn off FTP file name globbing
--no-passive-ftp
disable the "passive" transfer mode
--preserve-permissions
preserve remote file permissions
--retr-symlinks
when recursing, get linked-to files (not dir)
FTPS options:
--ftps-implicit
use implicit FTPS (default port is 990)
--ftps-resume-ssl
resume the SSL/TLS session started in the control connection when opening a data connection
--ftps-clear-data-connection
cipher the control channel only; all the data will be in plaintext
--ftps-fallback-to-ftp
fall back to FTP if FTPS is not supported in the target server
WARC options:
--warc-file=FILENAME
save request/response data to a .warc.gz file
--warc-header=STRING
insert STRING into the warcinfo record
--warc-max-size=NUMBER
set maximum size of WARC files to NUMBER
--warc-cdx
write CDX index files
--warc-dedup=FILENAME
do not store records listed in this CDX file
--no-warc-compression
do not compress WARC files with GZIP
--no-warc-digests
do not calculate SHA1 digests
--no-warc-keep-log
do not store the log file in a WARC record
--warc-tempdir=DIRECTORY
location for temporary files created by the WARC writer
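
The WARC options above are intended for web archiving; an illustrative sketch (the output name and URL are placeholders):

```shell
# Crawl one level deep and record all request/response traffic into
# compressed WARC files (crawl-*.warc.gz) of at most 1 GB each,
# writing a CDX index alongside them.
wget --warc-file=crawl --warc-max-size=1G --warc-cdx \
     --recursive --level=1 https://example.com/
```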
Recursive download:
--recursive
(-r) specify recursive download
--level=NUMBER
(-l) maximum recursion depth (inf or 0 for infinite)
--delete-after
delete files locally after downloading them
--convert-links
(-k) make links in downloaded HTML or CSS point to local files
--convert-file-only
convert the file part of the URLs only (usually known as the basename)
--backups=N
before writing file X, rotate up to N backup files
--backup-converted
(-K) before converting file X, back up as X.orig
--mirror
(-m) shortcut for -N -r -l inf --no-remove-listing
--page-requisites
(-p) get all images, etc. needed to display HTML page
--strict-comments
turn on strict (SGML) handling of HTML comments
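
The canonical use of the recursive options above is offline mirroring; as a sketch (URL is a placeholder):

```shell
# Mirror a site for offline viewing: --mirror is shorthand for
# -N -r -l inf --no-remove-listing; -p also fetches images/CSS needed
# to render pages, -k rewrites links to point at the local copies,
# and -E gives HTML/CSS files proper extensions.
wget --mirror --page-requisites --convert-links --adjust-extension \
     https://example.com/
```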
Recursive accept/reject:
--accept=LIST
(-A) comma-separated list of accepted extensions
--reject=LIST
(-R) comma-separated list of rejected extensions
--accept-regex=REGEX
regex matching accepted URLs
--reject-regex=REGEX
regex matching rejected URLs
--regex-type=TYPE
regex type (posix|pcre)
--domains=LIST
(-D) comma-separated list of accepted domains
--exclude-domains=LIST
comma-separated list of rejected domains
--follow-ftp
follow FTP links from HTML documents
--follow-tags=LIST
comma-separated list of followed HTML tags
--ignore-tags=LIST
comma-separated list of ignored HTML tags
--span-hosts
(-H) go to foreign hosts when recursive
--relative
(-L) follow relative links only
--include-directories=LIST
(-I) list of allowed directories
--trust-server-names
use the name specified by the redirection URL's last component
--exclude-directories=LIST
(-X) list of excluded directories
--no-parent
(-np) don't ascend to the parent directory
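
The accept/reject options above narrow a recursive crawl; a sketch with a placeholder domain and paths:

```shell
# Recursively fetch only PDF files from the docs/ subtree, never
# ascending above it, and skipping anything under an archive/ path.
wget --recursive --no-parent --accept=pdf \
     --reject-regex='.*/archive/.*' --regex-type=posix \
     https://example.com/docs/
```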
Email bug reports, questions, discussions to <[email protected]>
and/or open issues at https://savannah.gnu.org/bugs/?func=additem&group=wget.