Like all system admins, we need to ensure that we have good backups. Our systems do not have tape backup devices, and it's easy to rebuild the OS, so we're just backing up our data and config files across the network via rsync tunneled through SSH.
We're also backing up locally to tarballs for easier access, and to have multiple versions of backups.
Create directories to store our backup files, and make sure they're not readable by anyone but root:
mkdir -p /var/backups /var/backups/etc /var/backups/home /var/backups/web /var/backups/mysql /var/backups/usr_local
chmod -R 700 /var/backups
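To double-check, ls should show drwx------ owned by root on each of these directories:

ls -ld /var/backups /var/backups/*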
Create the backup script in /etc/cron.daily/backup-etc:
#!/bin/sh
SRCDIR=/etc
BACKUPDIR=/var/backups/etc
DATE=`date +%Y%m%d`
BACKUP_FILE=$BACKUPDIR/etc-$DATE.tar.gz
HOSTNAME=`hostname`
MAILTO="craig@boochtek.com"
MAILFROM="$SRCDIR backups <root@boochtek.com>"
OUTPUT=`tar cfz $BACKUP_FILE -P $SRCDIR 2>&1`
if [ ! -z "$OUTPUT" ]
then
    mail -a "From: $MAILFROM" -s "$SRCDIR backup report for $HOSTNAME" $MAILTO <<EOF
$SRCDIR backup progress:
$OUTPUT
EOF
fi
exit 0
Make the script executable:
chmod 755 /etc/cron.daily/backup-etc
Run the script to test that it works and saves a file with the correct name in /var/backups/etc/. Run tar tfz on the resulting backup file to make sure it contains the expected files.
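For example, a quick manual test (assuming the script was just run, so the date stamp matches today):

sh /etc/cron.daily/backup-etc
ls -l /var/backups/etc/
tar tfz /var/backups/etc/etc-`date +%Y%m%d`.tar.gz | head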
Create the backup script in /etc/cron.daily/backup-home:
#!/bin/sh
SRCDIR=/home
BACKUPDIR=/var/backups/home
DATE=`date +%Y%m%d`
BACKUP_FILE=$BACKUPDIR/home-$DATE.tar.gz
HOSTNAME=`hostname`
MAILTO="craig@boochtek.com"
MAILFROM="$SRCDIR backups <root@boochtek.com>"
OUTPUT=`tar cfz $BACKUP_FILE -P $SRCDIR 2>&1`
if [ ! -z "$OUTPUT" ]
then
    mail -a "From: $MAILFROM" -s "$SRCDIR backup report for $HOSTNAME" $MAILTO <<EOF
$SRCDIR backup progress:
$OUTPUT
EOF
fi
exit 0
Make the script executable:
chmod 755 /etc/cron.daily/backup-home
Run the script to test that it works and saves a file with the correct name in /var/backups/home/. Run tar tfz on the resulting backup file to make sure it contains the expected files.
Create the backup script in /etc/cron.daily/backup-usr_local:
#!/bin/sh
SRCDIR=/usr/local
BACKUPDIR=/var/backups/usr_local
DATE=`date +%Y%m%d`
BACKUP_FILE=$BACKUPDIR/usr_local-$DATE.tar.gz
HOSTNAME=`hostname`
MAILTO="craig@boochtek.com"
MAILFROM="$SRCDIR backups <root@boochtek.com>"
OUTPUT=`tar cfz $BACKUP_FILE -P $SRCDIR 2>&1`
if [ ! -z "$OUTPUT" ]
then
    mail -a "From: $MAILFROM" -s "$SRCDIR backup report for $HOSTNAME" $MAILTO <<EOF
$SRCDIR backup progress:
$OUTPUT
EOF
fi
exit 0
Make the script executable:
chmod 755 /etc/cron.daily/backup-usr_local
Run the script to test that it works and saves a file with the correct name in /var/backups/usr_local/. Run tar tfz on the resulting backup file to make sure it contains the expected files.
Create the backup script in /etc/cron.daily/backup-web:
#!/bin/sh
SRCDIR=/var/www
BACKUPDIR=/var/backups/web
DATE=`date +%Y%m%d`
BACKUP_FILE=$BACKUPDIR/web-$DATE.tar.gz
HOSTNAME=`hostname`
MAILTO="craig@boochtek.com"
MAILFROM="$SRCDIR backups <root@boochtek.com>"
OUTPUT=`tar cfz $BACKUP_FILE -P $SRCDIR 2>&1`
if [ ! -z "$OUTPUT" ]
then
    mail -a "From: $MAILFROM" -s "$SRCDIR backup report for $HOSTNAME" $MAILTO <<EOF
$SRCDIR backup progress:
$OUTPUT
EOF
fi
exit 0
Make the script executable:
chmod 755 /etc/cron.daily/backup-web
Run the script to test that it works and saves a file with the correct name in /var/backups/web/. Run tar tfz on the resulting backup file to make sure it contains the expected files.
We decided not to use mysqlhotcopy, as it does not handle InnoDB databases, which some of our apps may require. So we're using mysqldump instead. The --add-locks option makes loading the resulting dump faster. The --allow-keywords option allows fields to have the same names as SQL keywords. The --create-options option includes MySQL-specific options in CREATE TABLE statements.
Make sure that there's a [mysqldump] section in /root/.my.cnf, which is a copy of the [mysql] section:
[mysqldump]
user = 'root'
password = '$PASSWORD_GOES_HERE'
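With that in place, mysqldump should run as root without prompting for a password. One quick way to sanity-check the credentials (this just dumps the table definitions of the built-in mysql database and throws them away) is something like:

mysqldump --no-data mysql > /dev/null && echo "mysqldump credentials OK"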
Create the backup script in /etc/cron.daily/backup-mysql:
#!/bin/sh
BACKUPDIR=/var/backups/mysql
DATE=`date +%Y%m%d`
BACKUP_FILE=$BACKUPDIR/mysql-$DATE.sql
HOSTNAME=`hostname`
MAILTO="craig@boochtek.com"
MAILFROM="MySQL backups <root@boochtek.com>"
OUTPUT=`(mysqldump --all-databases --add-locks --allow-keywords --create-options > $BACKUP_FILE && gzip $BACKUP_FILE) 2>&1`
if [ ! -z "$OUTPUT" ]
then
    mail -a "From: $MAILFROM" -s "MySQL backup report for $HOSTNAME" $MAILTO <<EOF
MySQL backup progress:
$OUTPUT
EOF
fi
exit 0
Make the script executable:
chmod 755 /etc/cron.daily/backup-mysql
Run the script to test that it works and saves a file with the correct name in /var/backups/mysql/. Un-zip the resulting backup file to make sure it contains the expected content.
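For example, zcat can peek at the start of the compressed dump without extracting it; the first lines should include the mysqldump header and CREATE statements:

zcat /var/backups/mysql/mysql-`date +%Y%m%d`.sql.gz | head -20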
Copy the files in /var/backups off the server to another server.
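A minimal sketch of that rsync-over-SSH copy mentioned at the top; the destination host and path here (backup.example.com, /srv/backups) are placeholders, not our real backup server:

rsync -az -e ssh /var/backups/ root@backup.example.com:/srv/backups/`hostname`/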
Take a look at mysqlsnapshot to see if it might work better for MySQL backups. Note that it has not been updated for MySQL 4.x or 5.x, but it's a Perl script, so it might still work.
Consider using bzip2 compression instead of gzip.
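If we do switch, the change to each script is roughly just the tar compression flag (j instead of z) and the file extension, e.g. for the /etc script:

BACKUP_FILE=$BACKUPDIR/etc-$DATE.tar.bz2
OUTPUT=`tar cfj $BACKUP_FILE -P $SRCDIR 2>&1`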
TODO: We need to delete tarballs that get old, so we don't keep too many around. Would be nice to keep: 1 backup per day for 1 week (or 2 weeks); 1 backup per week for a year (or 1-6 months); 1 backup per month forever.
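A minimal sketch of the simplest piece of that, pruning anything older than two weeks; the script name and the 14-day cutoff are just placeholders, and the tiered weekly/monthly retention above would need more logic on top of this:

#!/bin/sh
# Hypothetical /etc/cron.daily/prune-backups: delete backup files older than 14 days.
for DIR in etc home web mysql usr_local
do
    find /var/backups/$DIR -type f -mtime +14 -print -delete
done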