#!/bin/bash
# simple function to check http response code before downloading a remote file
# example usage:
# if [[ $(validate_url "$url") == "true" ]]; then dosomething; else echo "does not exist"; fi
function validate_url(){
    if [[ `wget -S --spider "$1" 2>&1 | grep 'HTTP/1.1 200 OK'` ]]; then echo "true"; fi
}
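As an aside (not part of the gist): if wget isn't available, a rough curl equivalent checks the status code of a HEAD request:
function validate_url(){
    # -s silences progress output, -I sends a HEAD request,
    # -o /dev/null discards the headers, -w prints just the status code
    if [[ $(curl -s -I -o /dev/null -w '%{http_code}' "$1") == "200" ]]; then echo "true"; fi
}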
awesome!!
Very nice solution, and useful. Here is how I used it in boolean format:
function validate_url(){
    if [[ `wget -S --spider "$1" 2>&1 | grep 'HTTP/1.1 200 OK'` ]]; then
        return 0
    else
        return 1
    fi
}
if validate_url "$endpoint"; then
    : # Do something when it exists
else
    : # Return or print some error
fi
Thanks, works great.
itz-azhar's example made this a two-second addition to my script.
ta.
Really cool, thanks!
Failed on Amazon servers:
wget -S --spider https://github.com/dscharrer/innoextract/releases/download/1.8/innoextract-1.8-linux.tar.xz
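A guess at the cause (not verified against those servers): --spider sends a HEAD request, and GitHub release downloads redirect to S3-style storage that can reject HEAD even when the file exists; also, grepping for the literal 'HTTP/1.1 200 OK' misses other protocol versions. A sketch that at least tolerates any HTTP version and any 2xx status (it won't help if the server rejects HEAD outright):
function validate_url(){
    # accept e.g. 'HTTP/1.1 200 OK' as well as 'HTTP/2 200'
    if wget -S --spider "$1" 2>&1 | grep -qE 'HTTP/[0-9.]+ 2[0-9][0-9]'; then echo "true"; fi
}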
Hello, and thanks for this.
Anyway, I simplified the code before using it: there is no need to grep, since wget returns 0 (OK) if the file exists and there is no error (an authentication error, for example):
function validate_url()
{
    wget --spider $1
    return $?
}
And if you want no output from wget (always hiding the output, instead of putting >/dev/null when calling validate_url):
function validate_url()
{
    wget --spider $1 >/dev/null 2>&1
    return $?
}
And call it with:
if validate_url http://www.google.com; then
    echo Code if OK here
else
    echo Code if NOK here
fi
And I thought I would have to do some extra dev since I needed to pass a user, a password, and another parameter, but it works too. For example (putting all the parameters inside "" is important, otherwise $1 in wget --spider $1 will only be the part before the first space, i.e. just the URL in the example below):
if validate_url "http://www.my_server_with_authentification.com/path/to/file --user=me --password=pwd --no-check-certificate"; then
echo Code if OK here
else
echo Code if NOK here
fi
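A variation (my own sketch, not from the comment above): instead of relying on word splitting of an unquoted $1, the function can forward all of its arguments with "$@", which also survives URLs containing spaces:
function validate_url()
{
    # every argument (URL plus any extra wget flags) is passed through as-is
    wget --spider "$@" >/dev/null 2>&1
}
if validate_url --user=me --password=pwd --no-check-certificate "http://www.my_server_with_authentification.com/path/to/file"; then
    echo Code if OK here
else
    echo Code if NOK here
fi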
Hello team,
Can you suggest a solution for a file with a random name?
We need to download a file daily from an HTTPS URL whose name includes the current date and time.
The start of the name is fixed, but the last part changes every day because of the date and time.
Currently we use a for loop for this, but it doesn't always succeed.
Thanks
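One approach, if the changing part is a date stamp (a sketch only; the prefix, extension, and stamp format below are assumptions, and GNU date is assumed for -d yesterday), is to generate candidate names and probe each one with the return-code variant of validate_url above:
base="https://example.com/reports/daily_"   # hypothetical fixed prefix
for stamp in "$(date +%Y%m%d)" "$(date -d yesterday +%Y%m%d)"; do   # assumed stamp format
    if validate_url "${base}${stamp}.csv"; then
        wget "${base}${stamp}.csv"
        break
    fi
done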
It works, but it's not reliable:
validate_url 'www.bing.com/az/hprichbg/rb/MaryLouWilliams_EN-US11937645356_1920x1200.jpg'
echoes true, yet the file doesn't exist. Disallow redirects with:
wget --max-redirect=0 -S --spider $url
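Combining that with the original function (a sketch of the suggested fix):
function validate_url(){
    # --max-redirect=0 stops wget at the first response, so a redirect to a
    # placeholder page no longer ends in a 200 that would count as success
    if wget --max-redirect=0 -S --spider "$1" 2>&1 | grep -q 'HTTP/1.1 200 OK'; then echo "true"; fi
}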