
Incremental backups with Rsync (+ Weekly full backups).

I received an email from the VPS provider telling me I had used 90% of my allocated bandwidth… After scratching my head trying to figure out how on earth that happened, I noticed my backup script was broken and backing up 9GB every day… oops.

The backup script was a throw-together meant as a temporary thing anyway. Time to try to do things properly.

I have two servers that require backing up, so the idea is for them to take nightly incremental backups of each other, then push a weekly full backup up to Amazon S3 every Friday night.

This is rather simple to do with rsync – you can even hard-link against the previous backup directory, so you get nice incremental backups where the whole set only takes up roughly the disk space of one full backup, plus whatever has changed between runs. Pretty cool.
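In its simplest form it's just two rsync runs, with the second pointing --link-dest at the first (the host and paths below are placeholders, not my real setup):

# First run: a normal full copy
rsync -a remote:/var/www/html/ backup.1/

# Second run: files that haven't changed are hard-linked into backup.1,
# so backup.0 only consumes disk space for what actually changed
rsync -a --link-dest=../backup.1 remote:/var/www/html/ backup.0/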

First I created a new backup account and generated an SSH public/private key pair for it so I can do the backups over SSH.
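The key setup was something along these lines, with an empty passphrase so cron can use it unattended (ssh-copy-id assumes password authentication is still enabled on the remote end at this point):

# On the server that pulls the backups, as the backup user
ssh-keygen -t rsa -b 4096 -N "" -f ~/.ssh/id_rsa

# Authorise the key for the backup account on the other server
ssh-copy-id backup@001-au.tribalchicken.net

With the keys in place, I threw together this script: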

#!/bin/bash

## NIGHTLY INCREMENTAL BACKUPS ##

# The host to backup
REMOTE_HOST='001-au.tribalchicken.net'
REMOTE_USER='backup'
REMOTE_PATH='/var/www/html/'

# Where to keep the backups
BACKUP_DIR='/backup/001-au/web'

# rsync options
OPTIONS="-v"

# Who to notify
NOTIFY=

# Days to keep
DAYS=7

# Name of backup folder
FOLDER_NAME=backup

## Meat of the script
echo 'Changing to backup directory '$BACKUP_DIR
cd $BACKUP_DIR || exit 1   # bail out here rather than risk the rm -rf below in the wrong place

# Remove the oldest backup
echo 'Removing oldest backup'
rm -rf $FOLDER_NAME.$DAYS

# Shift each backup folder along by one day
echo 'Cycling backups'
for i in $(eval echo "{$DAYS..1}")
do
    mv $FOLDER_NAME.$(($i-1)) $FOLDER_NAME.${i}
done

# Run the backup, hard-linking unchanged files against last night's copy
/usr/bin/rsync -a -e ssh -z --delete --link-dest=../$FOLDER_NAME.1 $REMOTE_USER@$REMOTE_HOST:$REMOTE_PATH $FOLDER_NAME.0 $OPTIONS

Essentially, it keeps 7 days' worth of backups (or however many days you set DAYS to), deleting the oldest one each night to make room for the new backup. On the very first run the older directories don't exist yet, so the mv commands will complain and rsync will simply do a full copy into backup.0 (a missing --link-dest target is just skipped with a warning); every run after that hard-links unchanged files against the previous night's copy.
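You can see the hard-linking at work after a couple of runs: du counts each hard-linked file only once, so the newest directory adds almost nothing on top of the previous one, and stat shows one link per backup directory a file appears in (the file name below is just an example):

# Combined on-disk usage of two nightly runs: backup.0 only adds the files
# that actually changed, since everything else is a hard link into backup.1
du -shc backup.1 backup.0

# An unchanged file shows one hard link per backup directory it appears in
stat -c '%h %n' backup.0/index.html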

Created a nightly cron job for it (something like the entry below) – now to see if it works (I'm going to add email notification later). Next I'll set up the same script on the other server so the two back up to each other, and just whack together a simple script to roll a tarball every Friday night and copy it up to S3 (roughly sketched at the end of this post).
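The cron entry is just a standard nightly job; the time and script path here are placeholders rather than the real ones:

# Run the incremental backup at 2am every night (path and time are examples)
0 2 * * * /home/backup/bin/nightly-backup.sh >> $HOME/nightly-backup.log 2>&1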

Another throw-together, but hopefully this one will actually work properly.
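The weekly S3 piece will probably end up being something like the following minimal sketch. It assumes the AWS CLI is installed and configured for the backup user, and the bucket name and paths are made up:

#!/bin/bash

## WEEKLY FULL BACKUP TO S3 (sketch) ##

BACKUP_DIR='/backup/001-au/web'
BUCKET='s3://my-backup-bucket'   # placeholder bucket name
STAMP=$(date +%Y-%m-%d)

# Roll the most recent nightly backup into a compressed tarball
tar -czf /tmp/weekly-$STAMP.tar.gz -C $BACKUP_DIR backup.0

# Copy it up to S3, then clean up the local tarball
aws s3 cp /tmp/weekly-$STAMP.tar.gz $BUCKET/weekly-$STAMP.tar.gz && rm -f /tmp/weekly-$STAMP.tar.gz

# Driven by a Friday-night cron entry, e.g.:
# 0 3 * * 5 /home/backup/bin/weekly-s3-backup.sh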