We have had a few customers with much more data to back up, and taking a full copy of it every day uses a lot of disk space and can be time consuming. The answer is rdiff-backup. After an initial complete backup it only backs up what has changed on each run, which saves you, and us, a lot of disk and backup space.
Since we have enabled sftp on backupspace.rimuhosting.com, you can now use rdiff-backup. The catch is that you cannot point rdiff-backup directly at the server, because only a limited set of commands is allowed; you can, however, mount the backup space with sshfs (aka sshmnt) and then run rdiff-backup against the mount.
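The core flow is just mount, back up, unmount. Here is a dry-run sketch of it; the account name and mount point are placeholder assumptions, and RUN=echo makes the script only print what it would do:

```shell
#!/bin/sh
# Sketch of the sshfs + rdiff-backup flow described above.
# USERNAME and MOUNTPOINT are illustrative placeholders, not real values.
USERNAME="myaccount@backupspace.rimuhosting.com"
MOUNTPOINT="/mnt/backupspace"
RUN="echo"   # dry run: print commands instead of running them; set RUN="" to execute

$RUN mkdir -p "$MOUNTPOINT"
$RUN sshfs "$USERNAME:" "$MOUNTPOINT"              # mount the backup space locally
$RUN rdiff-backup /etc "$MOUNTPOINT/$(hostname)"   # incremental backup of /etc
$RUN fusermount -u "$MOUNTPOINT"                   # unmount when done
```

The script described below wraps this same idea with configurable includes/excludes, database dumps, and pruning of old increments.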
With this in mind, I have written a script which does this automatically. It is tested and working on Debian/Ubuntu and CentOS.
Steps to get this going
- Install sshfs and rdiff-backup (see below for install instructions)
- Check you have /dev/fuse - otherwise create it with: mknod -m 0666 /dev/fuse c 10 229
- Download this script http://b.ri.mu/files/rdiff-backup-sshfs.sh
- Add your ssh key to backupspace (the commands to do this are in the script comments); this lets the script log in without a password. You also need your backupspace.rimuhosting.com account to be sftp enabled - email support and ask.
- Edit the script; it has several variables:
- Set the USERNAME variable at the top - this is your backupspace username
- Set the INCLUDES variable: the configs and directories you want to back up, separated by spaces. Escape any spaces and odd characters in the paths.
- Set the EXCLUDES variable for larger files you don't want backed up. This uses standard regex.
- Create yourself a mysql user with SELECT and LOCK TABLES privileges if you want to back up databases (details are in the script comments)
- Insert the user/pass variables into the script and set MYSQLBACKUP=1
- Set the OLDERTHAN variable. The time interval is an integer followed by the character s, m, h, D, W, M, or Y, indicating seconds, minutes, hours, days, weeks, months, or years respectively, e.g. 4W is 4 weeks
- If you have any special arguments you want to pass, add them to the ARGS variable
- chmod +x backup.sh (or whatever you named it)
- Do a test run manually with ./backup.sh and check for errors
- If all goes well, copy the script to /etc/cron.daily/backup (or some similar name) and you're done
- Edit /etc/ssh/ssh_config and add keepalive settings to prevent the sshfs mount timing out, for example:
  ServerAliveInterval 15
  ServerAliveCountMax 3
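Pulling the steps above together, the settings block at the top of the script ends up looking something like this. The names mirror the variables listed above, but the values are purely illustrative; the downloaded script itself is the authority:

```shell
# Illustrative settings block for the backup script; values are examples only.
USERNAME="myvps"                 # your backupspace.rimuhosting.com username
INCLUDES="/etc /home /var/www"   # space-separated paths to back up
EXCLUDES="**.iso"                # pattern for large files to skip
MYSQLBACKUP=1                    # 1 = dump databases before backing up
MYSQLUSER="backup"               # mysql user with SELECT and LOCK TABLES
MYSQLPASS="secret"
OLDERTHAN="4W"                   # time interval, e.g. 4W = 4 weeks
ARGS=""                          # any extra rdiff-backup arguments
```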
Things to note: the first run will take a while to sync everything, so it pays to test with a smaller directory of files first. If you hit any errors, have a look at the code and uncomment some of the debugging to see what it is doing. If you get really stuck, just drop an email to support and let us know.
To restore, or to list what backups you have, first mount the backup space
sshmnt email@example.com /mnt/
List the backups
rdiff-backup -l /mnt/vpsname
Restore from the backups
rdiff-backup -r now /mnt/file /local/file # you can set up a new vps using current backups
rdiff-backup -r 10D /mnt/file /tmp/file # restore the 10 day old version, etc
rdiff-backup -r /mnt/hostname/rdiff-backup-data/increments/file.2003-03-05T12:21:41-07:00.diff.gz /local/file
If your database is not backing up, or you see an error similar to this
./rdiff-backup-sshfs.sh: line 61: : No such file or directory
Chances are you downloaded a buggy version which was briefly released. The fix is to add the missing variable setting near the top of the script.
Installing sshfs and rdiff-backup
On Debian/Ubuntu:
apt-get install sshfs rdiff-backup
On CentOS you will need to enable the DAG/RPMforge repos first, then:
yum install sshfs rdiff-backup
OR download the rpms manually from http://dag.wieers.com/rpm/packages/rdiff-backup/ and http://dag.wieers.com/rpm/packages/fuse-sshfs/