Save hrwgc/7455343 to your computer and use it in GitHub Desktop.
#!/bin/bash
# simple function to check http response code before downloading a remote file
# example usage:
# if `validate_url $url >/dev/null`; then dosomething; else echo "does not exist"; fi
function validate_url(){
    if [[ `wget -S --spider $1 2>&1 | grep 'HTTP/1.1 200 OK'` ]]; then echo "true"; fi
}
Ty <3
Yup still great code :-)
Thanks !
Thank you. The script is really helpful !
Thank you!
Thank you! Very helpful!
The function validate_url() was helpful for me, but the example usage didn't work as expected. I had to change
if `validate_url $url >/dev/null`; then dosomething; else echo "does not exist"; fi
to
if `validate_url $url`; then dosomething; else echo "does not exist"; fi
otherwise it always took the "else" branch, even for valid URLs.
Anyway, thanks for the example!
Really helpful. Thanks!!!
But if the remote URL is https, the function doesn't work?
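One possible cause (an assumption on my part, not verified against this gist): the literal pattern 'HTTP/1.1 200 OK' can miss valid responses, since some servers omit the "OK" reason phrase and some wget builds report a different protocol version. Matching any "HTTP/&lt;version&gt; 200" status line is more tolerant. A sketch, with hand-written header strings rather than real server output:

```shell
#!/bin/bash
# Sketch: match any "HTTP/<version> 200" status line instead of the
# literal 'HTTP/1.1 200 OK'. The header strings below are made-up
# examples, not real wget output.
has_ok_status() {
    grep -qE 'HTTP/[0-9.]+ 200' <<< "$1"
}

has_ok_status '  HTTP/1.1 200 OK' && echo "match: HTTP/1.1 200 OK"
has_ok_status '  HTTP/2 200' && echo "match: HTTP/2 200"
has_ok_status '  HTTP/1.1 404 Not Found' || echo "no match: 404"
```

The relaxed pattern still rejects 3xx/4xx/5xx status lines, because the version field cannot contain a space.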
Adapted the code to check FTP server availability, and whether a file exists on the server. For HTTP, replace '220' or '150' with 'HTTP/1.1 200 OK'.
#!/bin/bash
# Response code 220: Service ready for new user
# Response code 150: Here comes the directory listing / opening binary data connection
url="ftp://path/to/file.something"
function validate_url(){
    wget -S --spider "$url" 2>&1 | grep -q '220'
    exit_status=$?
    if [[ $exit_status == 0 ]]; then
        echo "FTP location exists"
    elif [[ $exit_status == 1 ]]; then
        echo "FTP location not available"
    fi
}
validate_url
It works, but it's not reliable: validate_url 'www.bing.com/az/hprichbg/rb/MaryLouWilliams_EN-US11937645356_1920x1200.jpg' echoes true, yet the file doesn't exist.
Disable redirect following in wget: wget --max-redirect=0 -S --spider $url
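To illustrate why disabling redirects helps (a sketch with hand-written header transcripts, not real wget output): with --max-redirect=0, wget reports the 3xx status line instead of following it to a final 200, so the 200 check correctly rejects URLs that only redirect.

```shell
#!/bin/bash
# Sketch: classify saved `wget -S --spider` header output. With
# --max-redirect=0 a redirecting URL shows a 3xx status line, so the
# 200 test fails as intended. Transcripts are made-up examples.
is_direct_200() {
    grep -q 'HTTP/1.1 200' <<< "$1"
}

redirect_headers='  HTTP/1.1 302 Found
  Location: http://www.bing.com/some/other/page'
direct_headers='  HTTP/1.1 200 OK
  Content-Type: image/jpeg'

is_direct_200 "$redirect_headers" || echo "redirect: rejected"
is_direct_200 "$direct_headers" && echo "direct 200: accepted"
```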
awesome!!
Very nice and useful solution. Here is how I used it in a boolean style:
function validate_url(){
    if [[ `wget -S --spider $1 2>&1 | grep 'HTTP/1.1 200 OK'` ]]; then
        return 0
    else
        return 1
    fi
}

if validate_url $endpoint; then
    : # Do something when it exists
else
    : # Return or print some error
fi
Thanks, works great.
itz-azhar's example made this a 2 second addition to my script.
ta.
Really cool, thanks!
Failed on Amazon servers:
wget -S --spider https://github.com/dscharrer/innoextract/releases/download/1.8/innoextract-1.8-linux.tar.xz
Hello, and thanks for this.
Anyway, I simplified the code before using it: there is no need to grep, since wget returns 0 (OK) when the file exists and there is no error (an authentication error, for example):
function validate_url()
{
    wget --spider $1
    return $?
}
And if you want no output from wget (always hide the output, instead of adding >/dev/null when calling validate_url):
function validate_url()
{
    wget --spider $1 >/dev/null 2>&1
    return $?
}
And call it with:
if validate_url http://www.google.com; then
    echo Code if OK here
else
    echo Code if NOK here
fi
And I thought I would have to do some extra development since I need to pass a user, a password, and another parameter, but that works too, as in the example below. Putting all the parameters inside "" is important; otherwise $1 in wget --spider $1 will only be the part before the first space (just the URL, in the example):
if validate_url "http://www.my_server_with_authentification.com/path/to/file --user=me --password=pwd --no-check-certificate"; then
    echo Code if OK here
else
    echo Code if NOK here
fi
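An alternative to packing everything into one quoted string (a sketch, not the gist author's method): accept the extra options as separate arguments and forward them with "$@", so each word reaches wget intact without quoting tricks.

```shell
#!/bin/bash
# Sketch: forward all arguments to wget with "$@" so options and the
# URL stay separate words -- no need to pack them into a single
# quoted string.
function validate_url()
{
    wget --spider "$@" >/dev/null 2>&1
}

# Called with the options as separate arguments, e.g.:
#   if validate_url --user=me --password=pwd --no-check-certificate \
#           "http://www.my_server_with_authentification.com/path/to/file"; then
#       echo Code if OK here
#   fi
```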
Hello team,
Can you suggest a solution for a randomly named file?
We need to download a file daily from an https URL whose name includes a part that changes with the current date and time. The start of the name is fixed, but the last part changes each day with the date and time.
Currently we use a for loop for this, but it doesn't always succeed.
Thanks
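If the changing part of the name is just a date stamp (an assumption on my part; the base URL and filename pattern below are made-up placeholders), you can build today's URL directly instead of looping:

```shell
#!/bin/bash
# Sketch: build today's URL from the fixed prefix plus a date stamp.
# The base URL and filename pattern are hypothetical placeholders.
base="https://example.com/daily"
stamp=$(date +%Y%m%d)               # e.g. 20240115
url="${base}/report_${stamp}.csv"
echo "$url"
# then: wget "$url"
```

If the time part is truly random, no client can guess it; the server would have to offer a directory listing (which wget -r -A can filter) or some predictable pattern.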
Really helpful. Thanks!!!