Linux web server backup script

At home I have a custom-built Debian web server. It runs Debian arm64 as a virtual machine on my Raspberry Pi, which runs VMware ESXi. On this server I run:

  • Apache 2 (Web server with a PHP backend)
  • Nginx (Reverse Proxy)
  • MariaDB (Database)

Most of these services are common to most web servers, but what is not so clear is how best to back up your crucial sites.

What should be backed up

I keep all my web content in /var/www/vhosts/. This contains the site-specific Nginx and Apache config files as well as the htdocs for each site. These config files are then symlinked into the various locations that the services expect. This makes it easier to lift sites from one server to another, and it also makes backups a lot easier.
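
As an illustration, a vhost directory might look like the sketch below; the site name and target paths here are hypothetical, so adjust them to your own layout.

/var/www/vhosts/example.com/apache2.conf
/var/www/vhosts/example.com/nginx.conf
/var/www/vhosts/example.com/htdocs/

#: sudo ln -s /var/www/vhosts/example.com/nginx.conf /etc/nginx/sites-enabled/example.com.conf
#: sudo ln -s /var/www/vhosts/example.com/apache2.conf /etc/apache2/sites-enabled/example.com.conf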

As well as the file structure, we have to back up the databases. This can be a bit tricky if you're not experienced with MySQL or MariaDB.

The goal is to back up each database as a compressed SQL dump and each site's files as an archive, both at a per-site level.
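
With the script below and its default settings, one night's run produces files like these (the site and database names here are made up):

/mnt/backup/mysql/example_db_2021-03-23.sql.gz
/mnt/backup/sites/example.com_2021-03-23.tgz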

Backup location

I use an SMB file share to store my backups; this keeps them off the server itself and makes them easy to find. The share is mounted via /etc/fstab at /mnt/backup/.
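
For reference, the fstab entry for a CIFS share looks something like the line below; the NAS hostname, share name, and credentials file are placeholders for your own.

//nas.local/backups /mnt/backup cifs credentials=/root/.smbcredentials,uid=root,iocharset=utf8 0 0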

The Script

I use a bash script to back up my sites. Below is a complete copy of it; there was no sensitive information to remove. Take note of the variables at the top, which let you easily customize its parameters.

I cannot take the credit for this script, though I can't remember where I got it from, and I have modified it since. It also supports rsync and S3 storage, though I don't use those myself as they cost money.

For the databases, make sure root can access them without a password. You can check this with the command below.

#: sudo mysql
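
If that drops you straight to a MariaDB prompt, you're set. If it asks for a password instead, MariaDB lets you switch root to socket authentication; a minimal sketch, assuming root@localhost and the unix_socket plugin:

#: sudo mysql -e "ALTER USER 'root'@'localhost' IDENTIFIED VIA unix_socket;"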

So without any more explanation, here is the script. I store it in /home/sadmin/full_backup.sh, but you can put it wherever you want. I recommend keeping it in a home directory though.

#!/bin/bash

# BEGIN CONFIGURATION ==========================================================

BACKUP_DIR="/mnt/backup"  # The directory in which you want backups placed
DUMP_MYSQL=true
TAR_SITES=true
SYNC="none" # Either 's3sync', 'rsync', or 'none'
KEEP_MYSQL="60" # How many days worth of mysql dumps to keep
KEEP_SITES="30" # How many days worth of site tarballs to keep

MYSQL_HOST="localhost"
MYSQL_USER="root"
MYSQL_PASS=""
MYSQL_BACKUP_DIR="$BACKUP_DIR/mysql/"

SITES_DIR="/var/www/vhosts/"
SITES_BACKUP_DIR="$BACKUP_DIR/sites/"


# See s3sync info in README
S3SYNC_PATH="/usr/local/s3sync/s3sync.rb"
S3_BUCKET="my-fancy-bucket"
AWS_ACCESS_KEY_ID="YourAWSAccessKey" # Log in to your Amazon AWS account to get this
AWS_SECRET_ACCESS_KEY="YourAWSSecretAccessKey" # Log in to your Amazon AWS account to get this
USE_SSL="true"
SSL_CERT_DIR="/etc/ssl/certs" # Where your Cert Authority keys live; for verification
SSL_CERT_FILE="" # If you have just one PEM file for CA verification

# If you don't want to use S3, you can rsync to another server
RSYNC_USER="user"
RSYNC_SERVER="other.server.com"
RSYNC_DIR="web_site_backups"
RSYNC_PORT="22" # Change this if you've customized the SSH port of your backup system

# You probably won't have to change these
THE_DATE="$(date '+%Y-%m-%d')"

MYSQL_PATH="$(which mysql)"
MYSQLDUMP_PATH="$(which mysqldump)"
FIND_PATH="$(which find)"
TAR_PATH="$(which tar)"
RSYNC_PATH="$(which rsync)"

# END CONFIGURATION ============================================================



# Announce the backup time
echo "Backup Started: $(date)"

# Create the backup dirs if they don't exist
if [[ ! -d $BACKUP_DIR ]]
  then
  mkdir -p "$BACKUP_DIR"
fi
if [[ ! -d $MYSQL_BACKUP_DIR ]]
  then
  mkdir -p "$MYSQL_BACKUP_DIR"
fi
if [[ ! -d $SITES_BACKUP_DIR ]]
  then
  mkdir -p "$SITES_BACKUP_DIR"
fi

if [ "$DUMP_MYSQL" = "true" ]
  then

  # Get a list of mysql databases and dump them one by one
  echo "------------------------------------"
  DBS="$($MYSQL_PATH -h $MYSQL_HOST -u$MYSQL_USER  -Bse 'show databases')"
  for db in $DBS
  do
    if [[ $db != "information_schema" && $db != "mysql" && $db != "performance_schema" ]]
      then
      echo "Dumping: $db..."
      $MYSQLDUMP_PATH --opt --skip-add-locks -h $MYSQL_HOST -u$MYSQL_USER $db | gzip > "$MYSQL_BACKUP_DIR${db}_$THE_DATE.sql.gz"
    fi
  done

  # Delete old dumps
  echo "------------------------------------"
  echo "Deleting old backups..."
  # List dumps to be deleted to stdout (for report)
  $FIND_PATH $MYSQL_BACKUP_DIR*.sql.gz -mtime +$KEEP_MYSQL
  # Delete dumps older than specified number of days
  $FIND_PATH $MYSQL_BACKUP_DIR*.sql.gz -mtime +$KEEP_MYSQL -exec rm {} +

fi

if [ "$TAR_SITES" == "true" ]
  then

  # Get a list of files in the sites directory and tar them one by one
  echo "------------------------------------"
  cd $SITES_DIR
  for d in *
  do
    echo "Archiving $d..."
    $TAR_PATH --exclude="*/log" -C $SITES_DIR -czf "$SITES_BACKUP_DIR${d}_$THE_DATE.tgz" $d
  done

  # Delete old site backups
  echo "------------------------------------"
  echo "Deleting old backups..."
  # List files to be deleted to stdout (for report)
  $FIND_PATH $SITES_BACKUP_DIR*.tgz -mtime +$KEEP_SITES
  # Delete files older than specified number of days
  $FIND_PATH $SITES_BACKUP_DIR*.tgz -mtime +$KEEP_SITES -exec rm {} +

fi

# Rsync everything with another server
if [[ "$SYNC" == "rsync" ]]
  then
  echo "------------------------------------"
  echo "Sending backups to backup server..."
  $RSYNC_PATH --del -vaze "ssh -p $RSYNC_PORT" $BACKUP_DIR/ $RSYNC_USER@$RSYNC_SERVER:$RSYNC_DIR

# OR s3sync everything with Amazon S3
elif [[ "$SYNC" == "s3sync" ]]
  then
  export AWS_ACCESS_KEY_ID
  export AWS_SECRET_ACCESS_KEY
  export SSL_CERT_DIR
  export SSL_CERT_FILE
  if [[ $USE_SSL == "true" ]]
    then
    SSL_OPTION=' --ssl '
    else
    SSL_OPTION=''
  fi
  echo "------------------------------------"
  echo "Sending backups to s3..."
  $S3SYNC_PATH --delete -v $SSL_OPTION -r $BACKUP_DIR/ $S3_BUCKET:backups
fi

# Announce the completion time
echo "------------------------------------"
echo "Backup Completed: $(date)"
touch /home/sadmin/backup.txt
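
Before scheduling it, it's worth running the script once by hand to confirm everything works:

#: sudo bash /home/sadmin/full_backup.sh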

Running the script automatically

To run this script automatically, I make use of crontab. It can be a hard tool to master, so I have copied my entry for you. I strongly recommend running this script as root to avoid permission problems.

To access the root crontab file:

#: sudo crontab -e

Then add the following line to the bottom of the file:

0 0 * * * /bin/bash /home/sadmin/full_backup.sh

This will make the backup run at midnight.
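
If you want a record of each run, you can redirect the output to a log file; the log path here is just a suggestion:

0 0 * * * /bin/bash /home/sadmin/full_backup.sh >> /var/log/full_backup.log 2>&1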

Testing your backup

Apart from checking that content appears in your backup share, you may notice that at the end of the script we touch the file /home/sadmin/backup.txt.

This lets us see both whether the backup ran and how long it took. For example, when I run "ls -l /home/sadmin/", the file's timestamp is normally 00:02, telling me that the script took two minutes to run.
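
The real test, of course, is a restore. A quick sanity check might look like the commands below, using the hypothetical file names from the example layout earlier:

#: sudo tar -tzf /mnt/backup/sites/example.com_2021-03-23.tgz | head
#: sudo mysql -e "CREATE DATABASE restore_test;"
#: gunzip < /mnt/backup/mysql/example_db_2021-03-23.sql.gz | sudo mysql restore_test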


Linux, Software, Tech
March 23, 2021
Author: John Hart
