
@oodavid
Created March 26, 2012 17:05

Backup MySQL to Amazon S3

This is a simple way to back up your MySQL tables to Amazon S3 for a nightly backup - this is all to be done on your server :-)

Sister Document - Restore MySQL from Amazon S3 - read that next

1 - Install s3cmd

This is for CentOS 5.6; see http://s3tools.org/repositories for other systems such as Ubuntu.

# Install s3cmd
cd /etc/yum.repos.d/
wget http://s3tools.org/repo/CentOS_5/s3tools.repo
yum install s3cmd
# Setup s3cmd
s3cmd --configure
    # You'll need to enter your AWS access key and secret key here; everything else is optional and can be ignored :-)
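Before wiring the script up, it's worth a quick sanity check that s3cmd can talk to S3 - a minimal sketch, assuming a bucket named s3://bucketname (substitute your own):

# Create the bucket if it doesn't exist yet (the name is an example)
s3cmd mb s3://bucketname
# List your buckets - this should run without errors
s3cmd ls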

2 - Add your script

Upload a copy of s3mysqlbackup.sh (it will need some tweaks for your setup), make it executable, and test it:

# Add the executable bit
chmod +x s3mysqlbackup.sh
# Run the script to make sure it's all tickety boo
./s3mysqlbackup.sh
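The tweaks in question are chiefly the two variables at the top of the script - a sketch, where both values are placeholders to replace with your own:

# Placeholder values - replace with your own
mysqlpass="ROOTPASSWORD"   # the MySQL root password
bucket="s3://bucketname"   # the target S3 bucket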

3 - Run it every night with CRON

Assuming the backup script is stored at /var/www/s3mysqlbackup.sh, we need to add a cron task to run it automatically:

# Edit the crontab
env EDITOR=nano crontab -e
    # Add the following lines:
    # Run the database backup script at 3am
    0 3 * * * bash /var/www/s3mysqlbackup.sh >/dev/null 2>&1
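If you'd rather keep a record of each run than discard the output, a variation is to append stdout and stderr to a log file instead (the log path here is an example):

    # Variation: log each run instead of discarding the output
    0 3 * * * bash /var/www/s3mysqlbackup.sh >> /var/log/s3mysqlbackup.log 2>&1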

4 - Don't expose the script!

If for some reason you put this script in a public folder (not sure why you would do this), you should add the following to your .htaccess or httpd.conf file to prevent public access to the files:

### Deny public access to shell files
<Files *.sh>
    Order allow,deny
    Deny from all
</Files>
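Note that Order/Deny is Apache 2.2 syntax; on Apache 2.4 the equivalent block would be:

### Deny public access to shell files (Apache 2.4)
<Files *.sh>
    Require all denied
</Files>

The script itself, s3mysqlbackup.sh: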
#!/bin/bash
# Based on https://gist.github.com/2206527
# Be pretty
echo -e " "
echo -e " . ____ . ______________________________"
echo -e " |/ \| | |"
echo -e "[| \e[1;31m♥ ♥\e[00m |] | S3 MySQL Backup Script v.0.1 |"
echo -e " |___==___| / © oodavid 2012 |"
echo -e " |______________________________|"
echo -e " "
# Basic variables
mysqlpass="ROOTPASSWORD"
bucket="s3://bucketname"
# Timestamp (sortable AND readable)
stamp=`date +"%s - %A %d %B %Y @ %H%M"`
# List all the databases
databases=`mysql -u root -p"$mysqlpass" -e "SHOW DATABASES;" | tr -d "| " | grep -v "\(Database\|information_schema\|performance_schema\|mysql\|test\)"`
# Feedback
echo -e "Dumping to \e[1;32m$bucket/$stamp/\e[00m"
# Loop the databases
for db in $databases; do
    # Define our filenames
    filename="$stamp - $db.sql.gz"
    tmpfile="/tmp/$filename"
    object="$bucket/$stamp/$filename"
    # Feedback
    echo -e "\e[1;34m$db\e[00m"
    # Dump and zip
    echo -e "  creating \e[0;35m$tmpfile\e[00m"
    mysqldump -u root -p"$mysqlpass" --force --opt --databases "$db" | gzip -c > "$tmpfile"
    # Upload
    echo -e "  uploading..."
    s3cmd put "$tmpfile" "$object"
    # Delete
    rm -f "$tmpfile"
done
# Jobs a goodun
echo -e "\e[1;32mJobs a goodun\e[00m"
@jalama

jalama commented Apr 6, 2016

Modified to include the concept of a host (i.e. MySQL on another server): https://gist.github.com/jalama/b237e020884aa5c874e284ccf0f52095

@marcolang

It isn't working for me:

line 20: syntax error near unexpected token `do'
line 20: `for db in $databases; do'

What am I doing wrong?

@MrOlek

MrOlek commented Jun 16, 2016

The easiest way to back up your MySQL database to Amazon S3 automatically is to use this tool, MySQLBackupFTP (http://mysqlbackupftp.com/).

@mbrooks

mbrooks commented Jun 20, 2016

It really annoys me that this isn't an actual git project, mainly because there isn't really a workflow for reporting and fixing issues.

@overint

overint commented Jul 8, 2016

Works well, thanks!

@gnr5

gnr5 commented Aug 2, 2016

Just tried it and it works. However, what if I want to keep only the latest day in S3 instead of a file for each day? Any solution for this?

@MrRaph

MrRaph commented Aug 5, 2016

Thanks a lot ! :-)

@gnr5: You can create a deletion rule in your bucket's properties. You'll be able to have files deleted a set number of days after they were uploaded ;)
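For what it's worth, s3cmd can set the same sort of lifecycle rule from the command line - a sketch, where the bucket name and day count are examples:

# Expire objects 30 days after upload (bucket name and day count are examples)
s3cmd expire s3://bucketname --expiry-days=30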

@koldstar

This works perfectly.

However, if you need to back up multiple servers, you can consider using our newly launched tool: https://www.backupbird.com/

@hlebarovcom

hlebarovcom commented Nov 5, 2016

Great script! I was inspired by it to create a simple script to back up website files to Amazon S3. I've uploaded it here for everyone to use - https://github.com/hlebarov-com/s3websitesbackup

Cheers!

@prasoon-sipl

This path is no longer valid: wget http://s3tools.org/repo/CentOS_5/s3tools.repo
Update it to: wget http://s3tools.org/repo/RHEL_5/s3tools.repo

@inodecloud

Awesome script! Works amazingly well.

@craigderington

craigderington commented Feb 15, 2017

Great job. This script rocks! Edited 5-11-2017: I finally resolved the "Connection Reset by Peer" error by changing the s3cmd command from put to sync (~line 44). Now it works flawlessly. Thanks very much!
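For reference, that change amounts to swapping the upload line in the script, roughly:

# Upload via sync rather than put, per the comment above
s3cmd sync "$tmpfile" "$object"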

@1sudo

1sudo commented Mar 14, 2017

The grep was returning nothing for me... on the bright side, I need to back up ALL of my databases, so I changed:

databases=`mysql -u root -p$mysqlpass -e "SHOW DATABASES;" | tr -d "| " | grep -v "\(Database\|information_schema\|performance_schema\|mysql\|test\)"`

to

databases=`mysql -u root -p$mysqlpass -e "SHOW DATABASES;" | tr -d "| "`

And everything is peachy.

@amrutaDesai

amrutaDesai commented Apr 12, 2017

Very nice script.
I had a problem running the script through crontab: on my virtual box I wasn't allowed to edit the root-level crontab, and without -u in the command below nothing happened. You need to edit the crontab for the specific user the script should run as:

sudo crontab -u specificUser -e

After editing that user's crontab, it worked for me.

@reinvanhaaren

Great script! Exactly what I needed. Thank you very much!

@cdtweb

cdtweb commented May 13, 2017

This idea will save me a bunch of money every month vs. using AWS RDS for my MySQL server.

@arpit1997

Awesome !!!

@thecatontheflat

Saver! Thank you! 👍

@karthik-ey

Thank you. Keep up the good work!!

@buluxan

buluxan commented Oct 31, 2017

I got this error when I tried to run the command:

A client error (RequestTimeTooSkewed) occurred when calling the ListBuckets operation: The difference between the request time and the current time is too large

@thapakazi

@buluxan check your system time; it seems to have drifted from the time S3 uses during that API call.
You can use ntpd or systemd's timesyncd to keep your clock in sync with NTP servers.
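As a sketch, on a systemd machine checking and fixing the drift might look like this:

# Show the current clock status, including whether NTP sync is active
timedatectl status
# Enable time synchronisation (assumes systemd-timesyncd is available)
timedatectl set-ntp true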

@fatelgit

Just wanted to add a quick note: In case your password contains any special characters, you have to use single quotes - otherwise you will get an error.
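A minimal sketch of the point above, with a made-up password:

# Single quotes stop the shell from expanding special characters (example value)
mysqlpass='p@$$w0rd!'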

@francescobbo

Besides the script, which looks great, it's worth noting that you may want to properly protect the target S3 bucket, e.g.: https://francescoboffa.com/using-s3-to-store-your-mysql-or-postgresql-backups

@simbo1905

You probably want to encrypt the file that you are uploading into S3. Here is a version of the script that uses gpg for symmetric encryption of the file before it is uploaded:

https://github.com/simbo1905/s3-mysql-backup/blob/e13145720d98e2456f5393c6e5fab0418e77bb89/s3mysqlbackup.sh#L46
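As a rough sketch of that approach, the dump-and-zip line in the script could pipe through gpg before writing the temp file. The $gpgpass variable is a hypothetical stand-in for your passphrase handling (see the linked script for the real thing), and on GnuPG 2.1+ you may also need --pinentry-mode loopback:

# Sketch: symmetrically encrypt the gzipped dump before upload
# ($gpgpass is a hypothetical variable holding the passphrase)
mysqldump -u root -p"$mysqlpass" --force --opt --databases "$db" | gzip -c | gpg --batch --symmetric --cipher-algo AES256 --passphrase "$gpgpass" -o "$tmpfile"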

@parmarthgit

Nice one, but

mysqldump --host=$HOST --user=$USER --password=$PASSWORD $DB_NAME --routines --single-transaction | gzip -9 | aws s3 cp - s3://bucket/database/filename.sql.gz

will stream the file directly to S3.

@tobidsn

tobidsn commented Sep 24, 2018

Backup MySQL to Amazon S3 Specific Table : https://gist.github.com/tobidsn/1abe9e75095a67f194c1cf6c00aac141

@NitsPatel1

I have used the above script, but I don't get an automatic backup at 3 a.m.; the backup only works when I run the script manually.
Also, how do I keep only the last 30 days of backups? Backups older than 30 days should be deleted automatically by the script.

@bsalim

bsalim commented Jun 30, 2020

Does this tool still rock in 2020? Haha, I used it 7 years ago by the way.

@CoolCold

CoolCold commented Jul 3, 2020

Nice one, but

mysqldump --host=$HOST --user=$USER --password=$PASSWORD $DB_NAME --routines --single-transaction | gzip -9 | aws s3 cp - s3://bucket/database/filename.sql.gz

will stream the file directly to S3.

Finally someone noted the --single-transaction option, which makes the backup consistent rather than just a random set of tables!

@CoolCold

CoolCold commented Jul 3, 2020

@oodavid - won't you make a repo with the script? :) A lot of updates here!
