#!/bin/bash
## uploading to google
## rev: 22 Aug 2012 16:07
det=`date +%F`
browser="Mozilla/5.0 (X11; Ubuntu; Linux i686; rv:13.0) Gecko/20100101 Firefox/13.0.1"
username="[email protected]"
password="password"
accountype="HOSTED" #gooApps = HOSTED , gmail=GOOGLE
pewede="/tmp"
file="file-$det.tar"
tipe="application/x-tar"
/usr/bin/curl -v --data-urlencode Email=$username --data-urlencode Passwd=$password -d accountType=$accountype -d service=writely -d source=cURL "https://www.google.com/accounts/ClientLogin" > $pewede/login.txt
token=`cat $pewede/login.txt | grep Auth | cut -d \= -f 2`
uploadlink=`/usr/bin/curl -Sv -k --request POST -H "Content-Length: 0" -H "Authorization: GoogleLogin auth=${token}" -H "GData-Version: 3.0" -H "Content-Type: $tipe" -H "Slug: $file" "https://docs.google.com/feeds/upload/create-session/default/private/full?convert=false" -D /dev/stdout | grep "Location:" | sed s/"Location: "//`
/usr/bin/curl -Sv -k --request POST --data-binary "@$file" -H "Authorization: GoogleLogin auth=${token}" -H "GData-Version: 3.0" -H "Content-Type: $tipe" -H "Slug: $file" "$uploadlink" > $pewede/goolog.upload.txt |
Hi liderbug,
no problem with my environment; this run is from a fresh environment:
[deanet@lumbung ~]$ bash -x bin/google.sh
++ date +%F
+ det=2013-08-02
+ browser='Mozilla/5.0 (X11; Ubuntu; Linux i686; rv:13.0) Gecko/20100101 Firefox/13.0.1'
+ username=*snip*
+ password='*snip*'
+ accountype=GOOGLE
+ pewede=/tmp
+ file=icecast.tar
+ tipe=application/x-tar
+ /usr/bin/curl -v --data-urlencode Email=*snip* --data-urlencode 'Passwd=*snip*' -d accountType=GOOGLE -d service=writely -d source=cURL https://www.google.com/accounts/ClientLogin
* About to connect() to www.google.com port 443 (#0)
* Trying 2607:f8b0:4008:803::1014... connected
* Connected to www.google.com (2607:f8b0:4008:803::1014) port 443 (#0)
* Initializing NSS with certpath: sql:/etc/pki/nssdb
* CAfile: /etc/pki/tls/certs/ca-bundle.crt
CApath: none
* SSL connection using SSL_RSA_WITH_RC4_128_MD5
* Server certificate:
* subject: CN=www.google.com,O=Google Inc,L=Mountain View,ST=California,C=US
* start date: Jul 12 08:56:36 2013 GMT
* expire date: Oct 31 23:59:59 2013 GMT
* common name: www.google.com
* issuer: CN=Google Internet Authority,O=Google Inc,C=US
> POST /accounts/ClientLogin HTTP/1.1
> User-Agent: curl/7.19.7 (x86_64-redhat-linux-gnu) libcurl/7.19.7 NSS/3.14.0.0 zlib/1.2.3 libidn/1.18 libssh2/1.4.2
> Host: www.google.com
> Accept: */*
> Content-Length: 102
> Content-Type: application/x-www-form-urlencoded
>
} [data not shown]
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
0 0 0 0 0 102 0 423 --:--:-- --:--:-- --:--:-- 423< HTTP/1.1 200 OK
< Content-Type: text/plain
< Cache-control: no-cache, no-store
< Pragma: no-cache
< Expires: Mon, 01-Jan-1990 00:00:00 GMT
< Date: Fri, 02 Aug 2013 09:18:02 GMT
< X-Content-Type-Options: nosniff
< X-XSS-Protection: 1; mode=block
< Content-Length: 818
< Server: GSE
<
{ [data not shown]
115 818 102 818 0 102 2857 356 --:--:-- --:--:-- --:--:-- 15911* Connection #0 to host www.google.com left intact
* Closing connection #0
++ cut -d = -f 2
++ grep Auth
++ cat /tmp/login.txt
+ token=*snip*
++ sed 's/Location: //'
++ grep Location:
++ /usr/bin/curl -Sv -k --request POST -H 'Content-Length: 0' -H 'Authorization: GoogleLogin auth=*snip*' -H 'GData-Version: 3.0' -H 'Content-Type: application/x-tar' -H 'Slug: icecast.tar' 'https://docs.google.com/feeds/upload/create-session/default/private/full?convert=false' -D /dev/stdout
* About to connect() to docs.google.com port 443 (#0)
* Trying 2607:f8b0:4008:803::1006... connected
* Connected to docs.google.com (2607:f8b0:4008:803::1006) port 443 (#0)
* Initializing NSS with certpath: sql:/etc/pki/nssdb
* warning: ignoring value of ssl.verifyhost
* skipping SSL peer certificate verification
* SSL connection using SSL_RSA_WITH_RC4_128_MD5
* Server certificate:
* subject: CN=*.google.com,O=Google Inc,L=Mountain View,ST=California,C=US
* start date: Jul 12 09:00:30 2013 GMT
* expire date: Oct 31 23:59:59 2013 GMT
* common name: *.google.com
* issuer: CN=Google Internet Authority,O=Google Inc,C=US
> POST /feeds/upload/create-session/default/private/full?convert=false HTTP/1.1
> User-Agent: curl/7.19.7 (x86_64-redhat-linux-gnu) libcurl/7.19.7 NSS/3.14.0.0 zlib/1.2.3 libidn/1.18 libssh2/1.4.2
> Host: docs.google.com
> Accept: */*
> Content-Length: 0
> Authorization: GoogleLogin auth=*snip*
> GData-Version: 3.0
> Content-Type: application/x-tar
> Slug: icecast.tar
>
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0< HTTP/1.1 200 OK
< Location: https://docs.google.com/feeds/upload/create-session/default/private/full?convert=false&upload_id=AEnB2UoISsv6UkzrzkUs5MwLTobAYKQPPWFn0nVJdas9SVtUTOMk6rks66kuGTZ4kfixu05jlRf_x0ijmccdLcWgEj3vp3WlLw
< Date: Fri, 02 Aug 2013 09:18:03 GMT
< Server: HTTP Upload Server Built on Jul 24 2013 17:20:01 (1374711601)
< Content-Length: 0
< Content-Type: text/html; charset=UTF-8
<
0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0* Connection #0 to host docs.google.com left intact
* Closing connection #0
+ uploadlink='https://docs.google.com/feeds/upload/create-session/default/private/full?convert=false&upload_id=AEnB2UoISsv6UkzrzkUs5MwLTobAYKQPPWFn0nVJdas9SVtUTOMk6rks66kuGTZ4kfixu05jlRf_x0ijmccdLcWgEj3vp3WlLw'
+ /usr/bin/curl -Sv -k --request POST --data-binary @icecast.tar -H 'Authorization: GoogleLogin auth=*snip*' -H 'GData-Version: 3.0' -H 'Content-Type: application/x-tar' -H 'Slug: icecast.tar' 'https://docs.google.com/feeds/upload/create-session/default/private/full?convert=false&upload_id=AEnB2UoISsv6UkzrzkUs5MwLTobAYKQPPWFn0nVJdas9SVtUTOMk6rks66kuGTZ4kfixu05jlRf_x0ijmccdLcWgEj3vp3WlLw'
* About to connect() to docs.google.com port 443 (#0)
* Trying 2607:f8b0:4008:803::1006... connected
* Connected to docs.google.com (2607:f8b0:4008:803::1006) port 443 (#0)
* Initializing NSS with certpath: sql:/etc/pki/nssdb
* warning: ignoring value of ssl.verifyhost
* skipping SSL peer certificate verification
* SSL connection using SSL_RSA_WITH_RC4_128_MD5
* Server certificate:
* subject: CN=*.google.com,O=Google Inc,L=Mountain View,ST=California,C=US
* start date: Jul 12 09:00:30 2013 GMT
* expire date: Oct 31 23:59:59 2013 GMT
* common name: *.google.com
* issuer: CN=Google Internet Authority,O=Google Inc,C=US
> POST /feeds/upload/create-session/default/private/full?convert=false&upload_id=AEnB2UoISsv6UkzrzkUs5MwLTobAYKQPPWFn0nVJdas9SVtUTOMk6rks66kuGTZ4kfixu05jlRf_x0ijmccdLcWgEj3vp3WlLw HTTP/1.1
> User-Agent: curl/7.19.7 (x86_64-redhat-linux-gnu) libcurl/7.19.7 NSS/3.14.0.0 zlib/1.2.3 libidn/1.18 libssh2/1.4.2
> Host: docs.google.com
> Accept: */*
> Authorization: GoogleLogin auth=*snip*
> GData-Version: 3.0
> Content-Type: application/x-tar
> Slug: icecast.tar
> Content-Length: 64307200
> Expect: 100-continue
>
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
0 61.3M 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0< HTTP/1.1 100 Continue
} [data not shown]
100 61.3M 0 0 100 61.3M 0 2231k 0:00:28 0:00:28 --:--:-- 1883k< HTTP/1.1 201 Created
< Content-Type: application/atom+xml; charset=UTF-8; type=entry
< Expires: Fri, 02 Aug 2013 09:18:33 GMT
< Date: Fri, 02 Aug 2013 09:18:33 GMT
< Cache-Control: private, max-age=0, must-revalidate, no-transform
< Vary: Accept, X-GData-Authorization, GData-Version
< GData-Version: 3.0
< ETag: "A1BVSgpOHCt7ImBk"
< Location: https://docs.google.com/feeds/default/private/full/file%3A0BzpPZf0yXmAsa0dpRFBEU2o3MWs
< Content-Location: https://docs.google.com/feeds/default/private/full/file%3A0BzpPZf0yXmAsa0dpRFBEU2o3MWs
< Content-Length: 3223
< Server: HTTP Upload Server Built on Jul 24 2013 17:20:01 (1374711601)
<
{ [data not shown]
100 61.3M 100 3223 100 61.3M 112 2188k 0:00:28 0:00:28 --:--:-- 1561k* Connection #0 to host docs.google.com left intact
* Closing connection #0
[deanet@lumbung ~]$
Maybe you need to make sure /etc/pki/tls/certs/ is readable.
Thanks
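A quick way to check that, using the CAfile path shown in the verbose output above:
ls -l /etc/pki/tls/certs/ca-bundle.crt   # the CA bundle should be readable by the user running the script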
How can I overwrite the file if there is already an existing one?
Hi,
Thanks a lot for sharing this, guys, it helped me a lot.
I had a hard time making it work, so I'll share my experience:
As /dev/stdout was not my default output, the script wasn't working for me. I'm not very experienced with bash scripts, but I found a way that works for me and I guess it will work in all cases. While I was updating the script, I also changed it to avoid file input/output on the system. You can see the new script below (the variables it expects are sketched right after it).
# get the ClientLogin auth token straight from the response (no temp file needed)
token=$(curl --data-urlencode Email=$username --data-urlencode Passwd=$password -d accountType=$accountype -d service=writely -d source=cURL "https://www.google.com/accounts/ClientLogin" | grep Auth | cut -d \= -f 2)
# request an upload session in the target folder; -D - writes the response headers to stdout
link=$(curl -X POST -H "Content-Length: 0" -H "Authorization: GoogleLogin auth=${token}" -H "GData-Version: 3.0" -H "Content-Type: $mimetype" -H "Slug: $file" "https://docs.google.com/feeds/upload/create-session/default/private/full/folder:$folderid/contents?convert=false" -D - | grep "Location:" | sed s/"Location: "//)
# send the file body to the session URL from the Location header; discard the response
curl -X POST --data-binary "@$file" -H "Authorization: GoogleLogin auth=${token}" -H "GData-Version: 3.0" -H "Content-Type: $mimetype" -H "Slug: $file" "$link" > /dev/null
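For completeness, those three commands expect a few variables to be set first; here is a minimal sketch (all values are placeholders, and folderid is only needed because the session request above targets a folder):
username="you@example.com"       # Google account email (placeholder)
password="your-password"         # or an application password if two-factor auth is on
accountype="GOOGLE"              # GOOGLE for Gmail accounts, HOSTED for Google Apps
file="backup.tar"                # file to upload (placeholder)
mimetype="application/x-tar"     # MIME type matching the file
folderid="0B_exampleFolderId"    # target Drive folder id (placeholder)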
evolution happens:
- automatically gleans MIME type from file
- uploads multiple files
- removes directory prefix from filename
- works with filenames with spaces
- uses dotfile for configuration and token
- configures interactively
- uploads to target folder if last argument looks like a folder id
- quieter output
- uses longer command-line flags for readability
- throttle by adding curl_args="--limit-rate 500K" to $HOME/.gdrive.conf (see the example config just below)
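For reference, a complete $HOME/.gdrive.conf might look something like this; it is only a sketch with placeholder values, using the variable names the script below reads:
# $HOME/.gdrive.conf
username=someone@example.com
account_type=GOOGLE
curl_args="--limit-rate 500K --progress-bar"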
#!/bin/bash
# based on https://gist.github.com/deanet/3427090
#
# useful $HOME/.gdrive.conf options:
# curl_args="--limit-rate 500K --progress-bar"
browser="Mozilla/5.0 (Macintosh; Intel Mac OS X 10_9_3) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/35.0.1916.153 Safari/537.36"
destination_folder_id=${@: -1}
if expr "$destination_folder_id" : '^[A-Za-z0-9]\{28\}$' > /dev/null
then
# all but last word
set -- "${@:0:$#}"
else
# upload to root
unset destination_folder_id
fi
if [ -e $HOME/.gdrive.conf ]
then
. $HOME/.gdrive.conf
fi
old_umask=`umask`
umask 0077
if [ -z "$username" ]
then
read -p "username: " username
unset token
echo "username=$username" >> $HOME/.gdrive.conf
fi
if [ -z "$account_type" ]
then
if expr "$username" : '^[^@]*$' > /dev/null || expr "$username" : '.*@gmail.com$' > /dev/null
then
account_type=GOOGLE
else
account_type=HOSTED
fi
fi
if [ -z "$password$token" ]
then
read -s -p "password: " password
unset token
echo
fi
if [ -z "$token" ]
then
token=`curl --silent --data-urlencode Email=$username --data-urlencode Passwd="$password" --data accountType=$account_type --data service=writely --data source=cURL "https://www.google.com/accounts/ClientLogin" | sed -ne s/Auth=//p`
sed -ie '/^token=/d' $HOME/.gdrive.conf
echo "token=$token" >> $HOME/.gdrive.conf
fi
umask $old_umask
for file in "$@"
do
slug=`basename "$file"`
mime_type=`file --brief --mime-type "$file"`
upload_link=`curl --silent --show-error --insecure --request POST --header "Content-Length: 0" --header "Authorization: GoogleLogin auth=${token}" --header "GData-Version: 3.0" --header "Content-Type: $mime_type" --header "Slug: $slug" "https://docs.google.com/feeds/upload/create-session/default/private/full${destination_folder_id+/folder:$destination_folder_id/contents}?convert=false" --dump-header - | sed -ne s/"Location: "//p`
echo "$file:"
curl --request POST --output /dev/null --data-binary "@$file" --header "Authorization: GoogleLogin auth=${token}" --header "GData-Version: 3.0" --header "Content-Type: $mime_type" --header "Slug: $slug" "$upload_link" $curl_args
done
Great work, idiomatic, it works like a charm!
Since I am trying to update the same file with a CSV file every time, I need to find a way to avoid a new file being created. Any tips would be appreciated.
Axel
I would like to acknowledge the excellent work of all the people before me. Here are my small modifications; maybe someone will find them useful too:
- a parameter to set Google Docs file conversion (very useful to me for big XLS files). Set it to "false" if not needed.
- a small check to exclude the upload of the script itself (it happened to me when I set a destination folder, which is my common use case)
Unfortunately I am still trying to figure out how to avoid the file duplication; if someone discovers how to solve this small issue I will be very grateful :)
#!/bin/bash
# based on https://gist.github.com/deanet/3427090
#
# useful $HOME/.gdrive.conf options:
# curl_args="--limit-rate 500K --progress-bar"
browser="Mozilla/5.0 (Macintosh; Intel Mac OS X 10_9_3) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/35.0.1916.153 Safari/537.36"
convert_to_googledocs="true"
destination_folder_id=${@: -1}
if expr "$destination_folder_id" : '^[A-Za-z0-9]\{28\}$' > /dev/null
then
# all but last word
set -- "${@:0:$#}"
else
# upload to root
unset destination_folder_id
fi
if [ -e $HOME/.gdrive.conf ]
then
. $HOME/.gdrive.conf
fi
old_umask=`umask`
umask 0077
if [ -z "$username" ]
then
read -p "username: " username
unset token
echo "username=$username" >> $HOME/.gdrive.conf
fi
if [ -z "$account_type" ]
then
if expr "$username" : '^[^@]*$' > /dev/null || expr "$username" : '.*@gmail.com$' > /dev/null
then
account_type=GOOGLE
else
account_type=HOSTED
fi
fi
if [ -z "$password$token" ]
then
read -s -p "password: " password
unset token
echo
fi
if [ -z "$token" ]
then
token=`curl --silent --data-urlencode Email=$username --data-urlencode Passwd="$password" --data accountType=$account_type --data service=writely --data source=cURL "https://www.google.com/accounts/ClientLogin" | sed -ne s/Auth=//p`
sed -ie '/^token=/d' $HOME/.gdrive.conf
echo "token=$token" >> $HOME/.gdrive.conf
fi
umask $old_umask
for file in "$@"
do
if [ "$file" != "$0" ]
then
slug=`basename "$file"`
mime_type=`file --brief --mime-type "$file"`
upload_link=`curl --silent --show-error --insecure --request POST --header "Content-Length: 0" --header "Authorization: GoogleLogin auth=${token}" --header "GData-Version: 3.0" --header "Content-Type: $mime_type" --header "Slug: $slug" "https://docs.google.com/feeds/upload/create-session/default/private/full${destination_folder_id+/folder:$destination_folder_id/contents}?convert=$convert_to_googledocs" --dump-header - | sed -ne s/"Location: "//p`
echo "$file:"
curl --request POST --output /dev/null --data-binary "@$file" --header "Authorization: GoogleLogin auth=${token}" --header "GData-Version: 3.0" --header "Content-Type: $mime_type" --header "Slug: $slug" "$upload_link" $curl_args
fi
done
curl: (3) malformed
Drocsid: I had the same issue. When I looked at what was being returned, I was getting an authentication error because I have two-factor authentication turned on. To get around this I opted to create an application password for this script. Using that password, it all works fine again.
Updated the version by @andreapergola to use Zenity for the username/password prompts and upload progress, making it cleaner to integrate via a Nautilus action or similar. I had to use a trap to kill the curl command if the Zenity progress dialog is cancelled; if anyone has a better solution, please update.
#!/bin/bash
# based on https://gist.github.com/deanet/3427090
#
# useful $HOME/.gdrive.conf options:
# curl_args="--limit-rate 500K --progress-bar"
trap 'kill $(jobs -p)' EXIT
browser="Mozilla/5.0 (Macintosh; Intel Mac OS X 10_9_3) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/35.0.1916.153 Safari/537.36"
convert_to_googledocs="true"
destination_folder_id=${@: -1}
if expr "$destination_folder_id" : '^[A-Za-z0-9]\{28\}$' > /dev/null
then
# all but last word
set -- "${@:0:$#}"
else
# upload to root
unset destination_folder_id
fi
if [ -e $HOME/.gdrive.conf ]
then
. $HOME/.gdrive.conf
fi
old_umask=`umask`
umask 0077
if [ -z "$username" ]
then
ENTRY=`zenity --password --username --title="Drive Login"`
case $? in
1)
exit
;;
-1)
exit
;;
esac
username=`echo $ENTRY | cut -d'|' -f1`
password=`echo $ENTRY | cut -d'|' -f2`
unset token
echo "username=$username" >> $HOME/.gdrive.conf
fi
if [ -z "$account_type" ]
then
if expr "$username" : '^[^@]*$' > /dev/null || expr "$username" : '.*@gmail.com$' > /dev/null
then
account_type=GOOGLE
else
account_type=HOSTED
fi
fi
if [ -z "$password$token" ]
then
ENTRY=`zenity --password --title="Password" --text="Enter your Google Drive password"`
# check zenity's exit status before the assignment overwrites $?
case $? in
1)
exit
;;
-1)
exit
;;
esac
password=$ENTRY
unset token
fi
if [ -z "$token" ]
then
token=`curl --silent --data-urlencode Email=$username --data-urlencode Passwd="$password" --data accountType=$account_type --data service=writely --data source=cURL "https://www.google.com/accounts/ClientLogin" | sed -ne s/Auth=//p`
sed -ie '/^token=/d' $HOME/.gdrive.conf
echo "token=$token" >> $HOME/.gdrive.conf
fi
umask $old_umask
for file in "$@"
do
if [ "$file" != "$0" ]
then
slug=`basename "$file"`
mime_type=`file --brief --mime-type "$file"`
upload_link=`curl --silent --show-error --insecure --request POST --header "Content-Length: 0" --header "Authorization: GoogleLogin auth=${token}" --header "GData-Version: 3.0" --header "Content-Type: $mime_type" --header "Slug: $slug" "https://docs.google.com/feeds/upload/create-session/default/private/full${destination_folder_id+/folder:$destination_folder_id/contents}?convert=$convert_to_googledocs" --dump-header - | sed -ne s/"Location: "//p`
echo "$file:"
curl --request POST --output /dev/null --data-binary "@$file" --header "Authorization: GoogleLogin auth=${token}" --header "GData-Version: 3.0" --header "Content-Type: $mime_type" --header "Slug: $slug" "$upload_link" $curl_args -# 2>&1 | stdbuf -oL tr '\r' '\n' | grep -o --line-buffered '[0-9]*\.[0-9]' | zenity --progress --title="Uploading..." --text="${file##*/}" --auto-close --auto-kill
case $? in
1)
exit
;;
-1)
exit
;;
esac
fi
done
I've adapted (improved? You judge!) this script still further by allowing the destination_folder_id to be stored in the ~/.gdrive.conf file. I have written a HOWTO which describes how to use this tool as part of backing up medium-resolution family photos to Google Drive:
One more thing that might help get this working:
You may need to enable access to less secure apps at this page.
Also, there's a problem with the various enhanced scripts on this page: to determine whether the account type is HOSTED or GOOGLE, they check the username (email) to see if it's a Gmail address. This is not reliable, because you can now use a non-Gmail address to set up a Google account; that's what I'm doing. I have a non-Gmail account with email hosted somewhere else, but I'm using Google's other services (Drive).
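Since those scripts only guess when account_type is empty, and the config file is sourced before that check, one workaround is to pin the type explicitly in $HOME/.gdrive.conf instead of relying on the detection:
# in $HOME/.gdrive.conf: force the account type instead of guessing from the email address
account_type=GOOGLE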
I think this is the one I am looking for, but I don't understand how to use this script. Can anyone explain?
Edit: OK, I got it now. Great work.
Great script, but I'm having a problem getting it to run properly on a Raspberry Pi.
Weird problem: when I run it I get "curl: (3) Illegal characters found in URL" from the final curl, the one that uses $upload_link.
If I then edit the script and replace the $upload_link variable with the actual $upload_link string, it runs fine.
For example:
curl --request POST --output /dev/null --data-binary "@$file" --header "Authorization: GoogleLogin auth=${token}" --header "GData-Version: 3.0" --header "Content-Type: $mime_type" --header "Slug: $slug" "$upload_link" $curl_args
replaced with:
curl --request POST --output /dev/null --data-binary "@$file" --header "Authorization: GoogleLogin auth=${token}" --header "GData-Version: 3.0" --header "Content-Type: $mime_type" --header "Slug: $slug" "https://docs.google.com/feeds/upload/create-session/default/private/full?convert=true&upload_id=######" $curl_args
works fine.
Any ideas?
@nzbaxterman I ran into this problem too. The upload URL returned by the session-creation request now seems to have a line break at the end. Stripping that works for me:
upload_link=`curl ... | tr -d '\r\n'`
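Applied to the upload_link assignment from the scripts above, that might look like this (one long line, the same command with the tr filter appended):
upload_link=`curl --silent --show-error --insecure --request POST --header "Content-Length: 0" --header "Authorization: GoogleLogin auth=${token}" --header "GData-Version: 3.0" --header "Content-Type: $mime_type" --header "Slug: $slug" "https://docs.google.com/feeds/upload/create-session/default/private/full${destination_folder_id+/folder:$destination_folder_id/contents}?convert=false" --dump-header - | sed -ne s/"Location: "//p | tr -d '\r\n'`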
@kane-c Thanks, works for me.
Nice work!
Does anyone know how to avoid the file duplication?
Here is some info about managing folders; it could be nice to get the list of folder names and overwrite the files to avoid duplication:
https://developers.google.com/drive/web/manage-uploads
Here is info about how to make an update in case the file already exists (a rough sketch follows at the end of this comment):
https://developers.google.com/drive/v2/reference/files/update
Here the author is using the new Google Drive API, where you can specify the uploadType:
http://codeseekah.com/tag/curl/
https://github.com/soulseekah/bash-utils/blob/master/google-drive-upload/upload.sh
I think we have elements to make a complete sync client in bash!
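On the update link above, a rough sketch of replacing an existing file's content via the newer Drive API; $access_token (an OAuth 2.0 bearer token) and $file_id are assumptions here, since the ClientLogin scripts on this page don't produce either, and report.csv is a placeholder filename:
# sketch: update an existing Drive file in place (Drive API v2 media upload)
curl --request PUT \
--data-binary "@report.csv" \
--header "Authorization: Bearer $access_token" \
--header "Content-Type: text/csv" \
"https://www.googleapis.com/upload/drive/v2/files/$file_id?uploadType=media"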
Does this still work? I always get
https://developers.google.com/accounts/docs/AuthForInstalledApps
when authenticating. It seems like Google has deprecated this kind of authentication.
@nicolabeghin it doesn't work anymore.
This code doesn't work anymore because Google has enforced OAuth 2.0.
Grive doesn't depend on Qt or Xorg.
With Google enforcing OAuth, this script doesn't work any more;
you will get a 404 on "https://www.google.com/accounts/ClientLogin" when requesting the "${token}".
This one works fine with OAuth 2: https://github.com/labbots/google-drive-upload
You will need an API client and secret: go to https://console.developers.google.com/apis/ and create a "Google Drive" credential of type "OAuth client ID", sub-type "other". A rough sketch of how those credentials are then used follows below.
(Adding that here as this gist still pops up first in Google search results.)
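For orientation, a very rough sketch of how a client ID and secret obtained that way are exchanged for a token with plain curl; this is the generic OAuth 2.0 installed-app flow, not the labbots script itself, and $client_id, $client_secret and $auth_code are assumptions:
# 1. open this URL in a browser, approve access, and copy the code it displays:
#    https://accounts.google.com/o/oauth2/auth?client_id=$client_id&redirect_uri=urn:ietf:wg:oauth:2.0:oob&response_type=code&scope=https://www.googleapis.com/auth/drive
# 2. exchange that code for an access token (and refresh token):
curl --silent \
--data "code=$auth_code" \
--data "client_id=$client_id" \
--data "client_secret=$client_secret" \
--data "redirect_uri=urn:ietf:wg:oauth:2.0:oob" \
--data "grant_type=authorization_code" \
"https://accounts.google.com/o/oauth2/token"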
Still working, great!
Stumbled across "Google will block access from the location(s) you don't normally use - like home and you can't use the coffee shop - but if you go to https://accounts.google.com/DisplayUnlockCaptcha it give you a few minutes to login from there.