Auto backup MySQL to S3
#!/bin/bash
# Prerequisites (run once before using this script):
#   - install s3cmd
#   - configure s3cmd first with: s3cmd --configure
#   - create the backup and log directories: mkdir -p /backup/mysql_dump/logs
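# A concrete version of the setup above, as a sketch assuming a Debian/Ubuntu
# host (the package name and paths are assumptions; adjust them for your
# system). Run these once, outside this script:
#   sudo apt-get install -y s3cmd      # install the s3cmd CLI
#   s3cmd --configure                  # supply the AWS access key and secret key
#   mkdir -p /backup/mysql_dump/logs   # create the dump and log directories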
# I use this little bash script to back up the database at regular intervals
# (see the sample crontab entry at the end of this script), delete backups
# older than 15 days, and move the dump file to an S3 bucket.
# Create a few variables to hold the database credentials.
# Database credentials
USER="root"
PASSWORD="root"
HOST="localhost"
DB_NAME="yourdatabasename"
# Backup directory locations
BACKUPROOT="/backup/mysql_dump"
TSTAMP=$(date +"%d-%b-%Y-%H-%M-%S")
S3BUCKET="s3://backup-db-pr/axe/mysql/"
# Logging
LOG_ROOT="/backup/mysql_dump/logs/dump.log"
# Dump the MySQL database, compress it, and send it to S3
echo "Creating backup of database, started at $TSTAMP" >> "$LOG_ROOT"
# Example: mysqldump -h <HOST> -u <USER> -p"<PASSWORD>" --databases <DB_NAME> > $BACKUPROOT/$DB_NAME-$TSTAMP.sql
mysqldump -h "$HOST" -u "$USER" -p"$PASSWORD" --databases "$DB_NAME" | gzip -c > "$BACKUPROOT/$DB_NAME-$TSTAMP.sql.gz"
echo "Finished backup of database; sending it to the S3 bucket at $TSTAMP" >> "$LOG_ROOT"
# Delete local backup files older than 15 days (files only, so the logs
# directory is left intact)
find "$BACKUPROOT" -maxdepth 1 -type f -mtime +15 -exec rm -f {} \;
# Upload the new dump to the S3 bucket
s3cmd put "$BACKUPROOT/$DB_NAME-$TSTAMP.sql.gz" "$S3BUCKET"
echo "Copied the backup file from local disk to the S3 bucket at $TSTAMP" >> "$LOG_ROOT"
echo "Cool! Script has been executed successfully at $TSTAMP" >> "$LOG_ROOT"