Shell script for daily backup of EC2 (Ubuntu) - separate website & MySQL database zips to S3
#!/bin/bash
PATH=/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin

# Variables
USER="root"
PASSWORD=""
OUTPUT="/home/ubuntu/backups"
WWW="/var/www/*"
BUCKET="Backup-Bucket"

databases=$(mysql -u $USER -p$PASSWORD -e "SHOW DATABASES;" | tr -d "| " | grep -v Database)

# backup all databases
for db in $databases; do
    if [[ "$db" != "information_schema" ]] && [[ "$db" != "performance_schema" ]] && [[ "$db" != "mysql" ]] && [[ "$db" != _* ]]; then
        echo "Dumping database: $db"
        FILE="$OUTPUT/db_${db}_$(date '+%m-%d-%Y').sql.gz"
        mysqldump -u $USER -p$PASSWORD --databases $db | gzip > $FILE
        # upload to s3
        aws s3 cp $FILE s3://$BUCKET
    fi
done

# backup all websites
for f in $WWW; do
    if [[ -d $f ]]; then
        echo "Dumping Dir: $f"
        VAR="$(basename $f)"
        FILE="$OUTPUT/file_${VAR}_$(date '+%m-%d-%Y').zip"
        zip -r -q $FILE $f
        # upload to s3
        aws s3 cp $FILE s3://$BUCKET
    fi
done

# remove old db files > 7 days
find $OUTPUT -type f -name '*.sql.gz' -mtime +7 | sort -r | tail -n +2 | xargs -n50 /bin/rm -f

# remove old website files > 7 days
find $OUTPUT -type f -name '*.zip' -mtime +7 | sort -r | tail -n +2 | xargs -n50 /bin/rm -f

echo "Backup Completed"
Introduction:
Simple shell script to take a daily backup of all websites and MySQL databases to S3.
It loops through all databases, dumps each one as a compressed .sql.gz file in the output directory,
and uploads it to the S3 bucket.
Similarly, it loops through all websites in /var/www, creates a separate zip of each
website, and uploads it to S3.
Finally, it deletes local archives more than 7 days old from the EBS volume.
An S3 bucket lifecycle policy keeps the backups in S3 for 30 days.
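To see which local archives the cleanup step would remove, without actually deleting anything, the age test can be run on its own first (the path assumes the default OUTPUT directory from the script):
$ find /home/ubuntu/backups -type f \( -name '*.sql.gz' -o -name '*.zip' \) -mtime +7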
Prerequisites and Notes:
AWS CLI
$ wget https://s3.amazonaws.com/aws-cli/awscli-bundle.zip
$ unzip awscli-bundle.zip
$ sudo ./awscli-bundle/install -i /usr/local/aws -b /usr/local/bin/aws
$ aws configure
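aws configure prompts for the access key ID, secret access key, default region and output format. For an unattended setup, the same values can be set non-interactively (the key values and region below are placeholders):
$ aws configure set aws_access_key_id AKIAXXXXXXXXXXXXXXXX
$ aws configure set aws_secret_access_key YOUR_SECRET_KEY
$ aws configure set default.region us-east-1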
To get the access key ID and secret access key:
Create an IAM user with a policy granting full access to the S3 bucket.
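One way to do this from the CLI, run under an account that already has IAM permissions (the user name and policy name are only examples; the inline policy restricts access to the backup bucket):
$ aws iam create-user --user-name backup-user
$ aws iam put-user-policy --user-name backup-user --policy-name s3-backup-bucket --policy-document '{"Version":"2012-10-17","Statement":[{"Effect":"Allow","Action":"s3:*","Resource":["arn:aws:s3:::Backup-Bucket","arn:aws:s3:::Backup-Bucket/*"]}]}'
$ aws iam create-access-key --user-name backup-user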
ZIP
$ sudo apt-get install zip unzip
S3 Bucket Lifecycle
http://docs.aws.amazon.com/AmazonS3/latest/UG/lifecycle-configuration-bucket-no-versioning.html
Set objects to auto-delete after 30 days
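The same rule can also be set from the CLI instead of the console; something along these lines should work with a reasonably recent AWS CLI (bucket name matches the example above; the rule expires every object 30 days after creation):
$ aws s3api put-bucket-lifecycle-configuration --bucket Backup-Bucket --lifecycle-configuration '{"Rules":[{"ID":"expire-backups","Status":"Enabled","Filter":{"Prefix":""},"Expiration":{"Days":30}}]}'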
Set crontab to run the script daily at midnight
$ crontab -e
0 0 * * * bash /home/ubuntu/ec2_mysql_public_html_backup.sh >/dev/null 2>&1
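Sending output to /dev/null keeps cron silent; to keep a record of failed runs, a variant that appends to a log file (the log path is just an example) can be used instead, and the script can be run once by hand to confirm the backups land in the bucket:
0 0 * * * bash /home/ubuntu/ec2_mysql_public_html_backup.sh >> /home/ubuntu/backup.log 2>&1
$ bash /home/ubuntu/ec2_mysql_public_html_backup.sh
$ aws s3 ls s3://Backup-Bucket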