Back up Google Photos, Google Drive and Dropbox locally
To make this work you have to:
a) install rclone: https://rclone.org/
b) configure a Google Photos remote (called my_gphotos here); to do this run `rclone config`, create a new remote, etc.: https://rclone.org/googlephotos/
c) do the same for my_gdrive and my_dropbox if needed
d) optionally, configure outgoing mail to receive email reports: https://raspberry-projects.com/pi/software_utilities/email/ssmtp-to-send-emails
e) test that everything works from the command line (see the sketch below)
f) add this script to cron with `crontab -e` (see the crontab file)
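A minimal sketch of steps b), c) and e); the remote names are the ones used in this gist, and the mail address is a placeholder:

# create the remotes interactively (types "google photos", "drive", "dropbox")
rclone config
# sanity checks: each command should list something without errors
rclone lsd my_gphotos:album   # Google Photos albums
rclone lsd my_gdrive:         # top-level Google Drive folders
rclone lsd my_dropbox:        # top-level Dropbox folders
# optional, for step d): check that outgoing mail works
printf 'Subject: backup test\n\nit works\n' | ssmtp you@example.com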
A few things to note:
1. It is recommended to create your own client id (see here: https://rclone.org/drive/#making-your-own-client-id ).
I think it dramatically improves performance. The problem is that google changed something on their side
related to oauth, so you have to check whether this recipe still works (a config sketch follows after this list).
2. Google photos sometimes returns duplicate names. When I checked, they usually corresponded to the same photo/video,
which is why I just filter out the warning, but check yourself if unsure. Also, sometimes the google
photos api returns photos/videos which cannot be opened at all. I do not know why. The majority of the photos/
videos are backed up successfully.
3. The google photos remote returns the same photos sorted by various parameters. I highly recommend something similar to
the include file (attached). Otherwise the backup will take very long (by-day is very slow, I think).
In my config the same photos are still backed up twice: once by album (if they are in an album) and once by month.
You can use only by-month to save space.
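For note 1, this is roughly where your own client id and secret end up in ~/.config/rclone/rclone.conf after you paste them into the `rclone config` wizard; all values below are placeholders:

[my_gdrive]
type = drive
client_id = <your client id>.apps.googleusercontent.com
client_secret = <your client secret>
token = <filled in automatically by rclone during authorization>

[my_gphotos]
type = google photos
client_id = <your client id>.apps.googleusercontent.com
client_secret = <your client secret>
token = <filled in automatically by rclone during authorization>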
#!/bin/bash
# enable bash "safe" mode
set -euf -o pipefail

# mybu <remote> [extra rclone flags]
# copies <remote>:/ into $BU_PATH/<remote>; files that changed or disappeared
# on the remote are moved into $BU_PATH/<remote>-bu with a ~YYYY-MM-DD suffix
mybu () {
  echo =="$1"==
  ODIR=$BU_PATH/$1
  BUDIR=${ODIR}-bu
  TODAY=$(date +%F)
  (
    # lowest CPU priority; bwlimit/tpslimit keep bandwidth and API usage modest
    nice -n 19 \
    rclone copy \
      --log-level NOTICE \
      --stats-log-level NOTICE \
      --stats 60m \
      --bwlimit 10M \
      --tpslimit 10 \
      --backup-dir "$BUDIR" \
      --suffix "~$TODAY" \
      ${2:-} \
      "$1:/" "$ODIR" \
      2>&1 \
      || true
    # ${2:-} stays unquoted on purpose: it vanishes when no extra flag is given;
    # "|| true" keeps set -e from aborting the script when rclone reports errors
  ) | grep -Ev ": Duplicate (directory|object) found in source - ignoring"
}
BU_PATH="<backup path or rclone destination, e.g. mybackuplocation:>"
# newly created backup files/directories are readable by the owner only
umask 077
# show how much space is used before backup
df -h "<path to external drive>"
mybu my_gphotos --include-from=$HOME/gphotos_include.txt
mybu my_gdrive
mybu my_dropbox
# show how much space is used after backup
df -h "<path to external drive>"
# For more information see the manual pages of crontab(5) and cron(8)
#
# m h dom mon dow command
# you might need to configure outgoing mail on your machine
# to receive reports (not necessary)
MAILTO=[email protected]
# 11:00 every Wednesday
0 11 * * 3 /bin/bash <path_to>/backup.sh
# gphotos_include.txt: rclone filter rules for the google photos remote (see note 3)
/media/by-month/**
/album/**
/shared-album/**
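The three patterns above back up the by-month view, albums and shared albums, and skip the other (redundant) views of the same photos. To preview what a filter file matches before the first real run, something like this should work (listing the whole google photos remote can still take a while):

rclone lsf -R --include-from "$HOME/gphotos_include.txt" my_gphotos: | head -n 50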