Backup MySQL database to Digital Ocean Spaces

Making the switch to Digital Ocean Spaces is very similar to an S3 setup. There are a few prerequisites to cover before we start, however.

Prerequisites

  • Some flavour of Linux, Ubuntu will do fine
  • MySQL 5.7+
  • Digital Ocean Spaces API key and secret
  • s3cmd setup and configured to use your spaces API credentials

I'll assume you already have a server and MySQL set up. If not, check out Digital Ocean's tutorials on how to do this; they are very good!

You can also head over to a blog post I've done on setting up a Digital Ocean Spaces account.

Install and configure s3cmd on Ubuntu

Now, moving on to s3cmd, I'll do a quick guide on how to install and configure it.

My Ubuntu installation (v18.04) had s3cmd v2.0.1 in the package manager; anything version 2 or greater will do for this tutorial.
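If you want to check which version your package manager will give you before installing, apt can tell you:

apt-cache policy s3cmd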

First start off by installing it:

sudo apt-get install s3cmd

Once that's finished, let's configure it.

s3cmd --configure

You'll be presented with a series of options. When prompted, replace the details with your own. For prompts where it says "Leave Blank", just hit return and move on to the next step.

Access Key []: YOUR_ACCESS_KEY
Secret Key []: YOUR_SECRET_KEY
Default Region [US]: Leave Blank
S3 Endpoint [s3.amazonaws.com]: ams3.digitaloceanspaces.com
DNS-style bucket+hostname:port template for accessing a bucket []: %(bucket).ams3.digitaloceanspaces.com
Encryption password: Leave Blank or type a password - up to you
Path to GPG program [/usr/bin/gpg]: Leave Blank
Use HTTPS protocol [Yes]: Yes
HTTP Proxy server name: Leave Blank
Test access with supplied credentials? [Y/n] Y
Save settings? [y/N] Y

This will store a .s3cfg file in your home directory. You can edit this file directly, or re-run the --configure command if you ever need to change the details again.
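Before moving on, it's worth a quick sanity check that s3cmd can talk to Spaces. If you've already created a bucket (the bucket name below is just a placeholder), listing it should work without errors:

s3cmd ls
s3cmd ls s3://your-bucket-name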

With the Spaces setup completed, let's configure MySQL to log in without a password and write our backup script.

Log in to MySQL without a password

Usually when you connect to your database, you'd pass the --password flag, which would then prompt for your password. When running this from a bash script, however, it's not practical (or safe) to put the password in plain text in the script. Luckily there is an easy way around this.

mysql_config_editor set --login-path=client --host=localhost --user=db_user --password

Replace the user with your MySQL user. This command will prompt you for your password and then create a '.mylogin.cnf' file containing an encrypted copy of it.

You can now type just 'mysql' and you should be logged in, just like that! You can check your config with the following command:

mysql_config_editor print --login-path=client

You should see your password as asterisks *****
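As a final sanity check, you should now be able to run a query without being prompted for a password (the exact output will depend on what's on your server):

mysql -e "SHOW DATABASES;"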

MySQL backup script

SSH into your webserver and create a new bash file. You can call it whatever you like; I called mine 'mysqlbackup.sh'.

vim mysqlbackup.sh

Copy and paste the following code into it:

#!/usr/bin/env bash

# Backup directory, timestamp and destination bucket
BACKUPPATH="/home/daniel/backups"
TSTAMP=$(date +"%d-%m-%Y-%H-%M-%S")
S3BUCKET="s3://YOUR_BUCKET_PATH.ams3.digitaloceanspaces.com"

# Dump the MySQL database, gzipped, into the backup directory
echo "creating backup of database start at $TSTAMP"
mysqldump dbname | gzip > "$BACKUPPATH/dbname-$TSTAMP.sql.gz"
echo "MySQL dump completed, uploading to DigitalOcean Spaces at: $TSTAMP"

# Delete local backups older than 15 days
find "$BACKUPPATH"/* -maxdepth 0 -type f -mtime +15 -exec rm {} \;

# Upload the backup(s) to DigitalOcean Spaces
s3cmd put --recursive "$BACKUPPATH" "$S3BUCKET"

echo "MySQL dump transferred to DigitalOcean Spaces at $TSTAMP"
echo "Backup completed at $TSTAMP"

There are a few sections that you'll need to change.

  • BACKUPPATH - This was just a new folder inside my home directory to store the backups. Change this to anything you want.
  • S3BUCKET - This is your bucket path
  • In the mysqldump command, replace 'dbname' with the name of your database
  • You can also change the number of days after which older local backups are deleted (see the example below)
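For example, if you wanted to keep a month of local backups instead of 15 days, you'd just bump the -mtime value in the find line of the script:

find "$BACKUPPATH"/* -maxdepth 0 -type f -mtime +30 -exec rm {} \;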

A brief summary of what this script does is as follows:

  1. Creates a gzipped MySQL dump of your database and stores it in your BACKUPPATH
  2. Cleans up any existing files which are older than 15 days
  3. Uses the s3cmd package to upload the backup(s) in the BACKUPPATH to DigitalOcean Spaces

The last thing to do before we test the script is to make it executable.

chmod +x mysqlbackup.sh

Now run your script!

./mysqlbackup.sh

You should see some output from the script and some progress bars as the file(s) upload. One final check is to look on DigitalOcean for your new backups.

To further automate this process, you could create a cronjob to run it nightly!
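For example, adding something like the line below via 'crontab -e' would run the script every night at 2am and append its output to a log file. The paths here are just examples; point them at wherever you saved the script and your backups:

0 2 * * * /home/daniel/mysqlbackup.sh >> /home/daniel/backups/backup.log 2>&1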
