rdiff-backup script using sshfs for larger backups

We have had a few customers who have much more data to back up, and taking a full copy of it every day uses a lot of disk space and can be time consuming. The answer is rdiff-backup. After an initial complete backup, each run only transfers the differences since the last one, which saves you, and us, a lot of cost in disk and backup space.

Since we have enabled sftp on backupspace.rimuhosting.com you can now use rdiff-backup. The catch is that you can't point rdiff-backup directly at the server, because only a limited set of commands is allowed. You can, however, mount the backup space using sshfs (aka sshmnt) and then run rdiff-backup against the mount.

With this in mind, I have written a script which will do all of this automatically. It is tested and working on Debian/Ubuntu and CentOS.
Steps to get this going

  1. Install sshfs and rdiff-backup (see below for install instructions).
  2. Check you have /dev/fuse - if not, create it with mknod /dev/fuse -m 0666 c 10 229
  3. Download this script http://b.ri.mu/files/rdiff-backup-sshfs.sh
  4. Add your ssh key to backupspace so the script can log in without a password; the commands to do this are in the script comments. You also need your backupspace.rimuhosting.com account to be sftp enabled, which is done by emailing support and asking.
  5. Edit the script; it has several variables to set.
  6. Set the USERNAME variable at the top - this is your backupspace username.
  7. Set the INCLUDES variable - the configs and directories you want to back up, separated by spaces. Escape any spaces and odd characters.
  8. Set the EXCLUDES variable for larger files you don't want backed up. This uses standard regex.
  9. Create yourself a mysql user that can SELECT and LOCK TABLES if you want to back up databases (details in the script comments).
  10. Insert the user/pass variables into the script and set MYSQLBACKUP=1
  11. Set the OLDERTHAN variable - an integer followed by the character s, m, h, D, W, M, or Y, indicating seconds, minutes, hours, days, weeks, months, or years respectively, e.g. 4W is 4 weeks.
  12. If you have any special arguments you want to pass, add them to the ARGS variable.
  13. chmod +x backup.sh (or whatever you named it)
  14. Do a manual test run with ./backup.sh and check for errors.
  15. If all goes well, copy the backup script into /etc/cron.daily/backup or some similar name and you're done.
  16. Edit /etc/ssh/ssh_config and add the following lines to prevent sshfs timing out:
    TCPKeepAlive yes
    ServerAliveInterval 15
    ServerAliveCountMax 3
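Once configured, the variable section at the top of the script ends up looking something like the sketch below. The variable names follow the steps above, but the values are examples for illustration only, not the script's real defaults:

```shell
#!/bin/sh
# Illustrative settings for rdiff-backup-sshfs.sh - adjust for your own account.
USERNAME=vpsname                       # your backupspace username (step 6)
INCLUDES="/etc /home /var/www"         # what to back up, space separated (step 7)
EXCLUDES=".*\.iso"                     # regex of larger files to skip (step 8)
MYSQLBACKUP=1                          # 1 = dump databases too (step 10)
MYSQLUSER=backup                       # a user with SELECT and LOCK TABLES,
MYSQLPASS=secret                       # e.g. created with a GRANT like:
# GRANT SELECT, LOCK TABLES ON *.* TO 'backup'@'localhost' IDENTIFIED BY 'secret';
OLDERTHAN=4W                           # time interval, e.g. 4 weeks (step 11)
ARGS=""                                # extra rdiff-backup arguments (step 12)
```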

Things to note: the first run will take a while to sync everything, so it pays to test with a smaller directory of files first. If you get any errors at all, have a look at the code and uncomment some of the debugging to check what it is doing. If you get really stuck, just drop an email to support and let us know.

To restore files, or to list what backups you have, first mount the backup space:

sshmnt username@backupspace.rimuhosting.com /mnt/

List the backups

rdiff-backup -l /mnt/vpsname

Restore from the backups

rdiff-backup -r now /mnt/vpsname/file /local/file # restore the latest version; you can set up a new vps using current backups
rdiff-backup -r 10D /mnt/vpsname/file /tmp/file # restore the version from 10 days ago, etc
rdiff-backup /mnt/hostname/rdiff-backup-data/increments/file.2003-03-05T12:21:41-07:00.diff.gz /local/file # restore directly from a specific increment
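The restore cycle above (mount, restore, unmount) can be wrapped in a small helper script. This is only a sketch: the username, file paths, and age are placeholders, and by default it just prints the commands it would run (set DRYRUN=0 to actually execute them):

```shell
#!/bin/sh
# Sketch: mount backupspace, restore one file, unmount again.
USERNAME=${USERNAME:-vpsname}          # placeholder backupspace username
MOUNTPOINT=${MOUNTPOINT:-/mnt}
AGE=${AGE:-now}                        # "now", "10D", "4W", ...
DRYRUN=${DRYRUN:-1}                    # 1 = just print the commands

# Print the command in dry-run mode, otherwise execute it
run() { if [ "$DRYRUN" = 1 ]; then echo "$@"; else "$@"; fi; }

run sshmnt "$USERNAME@backupspace.rimuhosting.com" "$MOUNTPOINT"
run rdiff-backup -r "$AGE" "$MOUNTPOINT/$USERNAME/etc/fstab" /tmp/fstab.restored
run fusermount -u "$MOUNTPOINT"
```

Remember to unmount (fusermount -u) when you are done, whether restoring by hand or in a script.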


If your database is not backing up, or you see an error similar to this:

./rdiff-backup-sshfs.sh: line 61: : No such file or directory

Chances are you downloaded a buggy version which was briefly released. You just need to add the following variable setting near the top of the script:




Installing sshfs and rdiff-backup


On Debian/Ubuntu:

apt-get install sshfs rdiff-backup

On CentOS, you will need to enable the DAG/rpmforge repos:

rpm -Uhv http://apt.sw.be/redhat/el5/en/i386/rpmforge/RPMS/rpmforge-release-0.3.6-1.el5.rf.i386.rpm

yum update

yum install sshfs rdiff-backup

OR download the rpms manually from http://dag.wieers.com/rpm/packages/rdiff-backup/ & http://dag.wieers.com/rpm/packages/fuse-sshfs/

This entry was posted in Featured, Rimuhosting. Bookmark the permalink.

3 Responses to rdiff-backup script using sshfs for larger backups

  1. Liz Quilty says:

It's important to note that files put in /etc/cron.daily/ should not have a suffix (i.e. no .sh).

  2. Liz Quilty says:

    If you want to use Amazon s3 for backup, have a look at this http://code.google.com/p/s3fs/wiki/FuseOverAmazon

    Once you have that installed, just change the sshmnt command to the mount command for the S3 :)

  3. Pingback: Quick and Easy Automatic Backup Script « RimuHosting Blog