Sometimes you need to download large files to a remote server (for example, datasets for Kaggle competitions), but the download is available only to authenticated users. You can do it as a background task, using cookies from your browser for authentication. This note largely follows the wget manual: https://www.gnu.org/software/wget/manual/wget.html
First, create a cookies.txt file containing a copy of all your cookies for the required site. If you use Chrome, this extension is useful: https://chrome.google.com/webstore/detail/cookiestxt/njabckikapfpffapmjgojcnbfjonfjfg
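For reference, cookies.txt uses the Netscape cookie file format: one cookie per line, seven tab-separated fields. A minimal sketch (the domain, expiry, cookie name, and value below are made up for illustration):

```
# Netscape HTTP Cookie File
# domain	subdomains	path	secure	expiry	name	value
.example.com	TRUE	/	TRUE	1999999999	sessionid	abc123
```

The fields must be separated by tabs, not spaces, or wget will ignore the line.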
Example:
wget --load-cookies ./cookies.txt -x -nH -bqc --cut-dirs=3 https://www.example.com/very-large.zip
-x
Force creation of directories. wget -x http://example.com/some/files.txt will save the downloaded file to example.com/some/files.txt
-nH
Disable generation of host-prefixed directories. By default, invoking Wget with -r http://example.com/ will create a structure of directories beginning with example.com. This option disables such behavior.
--cut-dirs=NUMBER
Ignore NUMBER remote directory components.
-b
Go to background after startup.
-q
Quiet (no output).
-c
Resume getting a partially-downloaded file.
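To see how the directory options combine, here is a sketch using a hypothetical URL (the host and path are illustrative; the save locations follow the wget manual's description of -x, -nH, and --cut-dirs):

```shell
# Where wget would save http://example.com/pub/data/train/set.zip:
#   -x                    -> example.com/pub/data/train/set.zip
#   -x -nH                -> pub/data/train/set.zip
#   -x -nH --cut-dirs=3   -> set.zip
# --cut-dirs=N is just "drop the first N remote path components":
path="pub/data/train/set.zip"
echo "$path" | cut -d/ -f4-    # prints: set.zip
```

So in the example command above, --cut-dirs=3 strips three leading directories from the remote path before saving.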
If you need just one file, the command can be shorter, since no directory options are required:
wget --load-cookies ./cookies.txt -bqc https://www.example.com/very-large.zip
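Because of -b, wget detaches from the terminal and by default writes its progress to a wget-log file in the current directory (wget-log.1, wget-log.2, and so on if the name is already taken). You can watch the download from the same shell:

```shell
# Follow the progress of a backgrounded wget download.
# wget-log is created by wget itself once the command above is running.
tail -f wget-log
```

If the download is interrupted, re-running the identical wget command is safe: thanks to -c, wget resumes the partial file instead of starting over.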