Shell script to back up the database and site files to AWS S3 (using s3cmd) and mirror them to a second server. Call it from a cronjob at set intervals.
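For example, a crontab entry along the following lines (the script path and log file are assumptions) would run the backup nightly at 2 AM:

0 2 * * * /root/scripts/backup.sh >> /root/scripts/cron.log 2>&1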
#!/bin/sh
TEMP_DIR=/root/scripts/temp
DATE=$(date +"%F.%T");
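# yields timestamps like 2013-10-16.04:30:00; note that %T embeds colons,
# which is presumably why tar is invoked with --force-local below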
# Files Directories
LOCAL_DIR=/var/www/sites/...
REMOTE_DIR=/var/www/sites/...
# Database Info
LOCAL_DB_NAME='';
LOCAL_DB_USER='';
LOCAL_DB_PASS='';
REMOTE_SSH_ADDR='';
REMOTE_SSH_USER='';
REMOTE_SSH_PASS='';
REMOTE_DB_NAME='';
REMOTE_DB_USER='';
REMOTE_DB_PASS='';
# AWS S3
S3_BUCKET=s3://
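# e.g. S3_BUCKET=s3://my-backup-bucket (hypothetical name; fill in your own)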
# make temporary directory
echo "Preparing backup and sync...";
mkdir -p $TEMP_DIR;
cd $TEMP_DIR;
# sync files to staging/backup server
echo "Syncing files to backup/staging server...";
rsync -avzhr -e ssh $LOCAL_DIR/ $REMOTE_SSH_USER@$REMOTE_SSH_ADDR:$REMOTE_DIR/
# create tarball of site files
echo "Creating gzipped tarball of site files..."
tar --force-local -czpf - $LOCAL_DIR/ | split -d -b 512M - www.$DATE.tar.gz.part
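# the parts can be reassembled later with, e.g.:
#   cat www.$DATE.tar.gz.part* | tar -xzpf -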
# export database
echo "Exporting database...";
mysqldump --quick -u $LOCAL_DB_USER -p$LOCAL_DB_PASS $LOCAL_DB_NAME > $TEMP_DIR/"db.$DATE.sql"
# send db to staging server
echo "Copying database to backup/staging server...";
scp $TEMP_DIR/"db.$DATE.sql" $REMOTE_SSH_USER@$REMOTE_SSH_ADDR:$REMOTE_DIR/../private/backups/
# Connect to remote server, drop tables in db, import latest db
echo "Importing database into backup/staging server...";
ssh $REMOTE_SSH_USER@$REMOTE_SSH_ADDR "mysqldump -u $REMOTE_DB_USER -p$REMOTE_DB_PASS --add-drop-table $REMOTE_DB_NAME | grep ^DROP | mysql -u $REMOTE_DB_USER -p$REMOTE_DB_PASS $REMOTE_DB_NAME; mysql -u $REMOTE_DB_USER -p$REMOTE_DB_PASS $REMOTE_DB_NAME < $REMOTE_DIR/../private/backups/db.$DATE.sql; rm -rf $REMOTE_DIR/../private/backups/*; exit;"
# compress database backup
echo "Gzipping database backup for transport to AWS...";
tar --force-local -czpvf - db.$DATE.sql | split -d -b 512M - db.$DATE.sql.tar.gz.part
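# as above, restore with: cat db.$DATE.sql.tar.gz.part* | tar -xzf -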
# log sync
echo "Sync to staging completed: $(date +"%F %T");" >> $TEMP_DIR/../backups.log
# move backups to amazon
echo "Connecting to AWS...";
echo "Mirroring content to AWS...";
s3cmd sync --recursive --delete-removed --force $LOCAL_DIR $S3_BUCKET
echo "Copying DB backup to AWS...";
s3cmd put --recursive db.$DATE.sql.tar.gz.part* $S3_BUCKET/compressed/db/$DATE/
echo "Copying Site files backup to AWS..."
s3cmd put --recursive www.$DATE.tar.gz.part* $S3_BUCKET/compressed/www/$DATE/
# log aws backup
echo "Sync to AWS completed: $(date +"%F %T");" >> $TEMP_DIR/../backups.log
# delete temp directory
echo "Cleaning up...";
cd $TEMP_DIR/..;
rm -rf $TEMP_DIR;
# sendemail -f "FROM" -t "TO" -u "SUBJECT" -m "MESSAGE";
echo "-> Done!";
echo