Automated disclaimer: This post was written more than 15 years ago and I may not have looked at it since.
Older posts may not align with who I am today and how I would think or write, and may have been written in reaction to a cultural context that no longer applies. Some of my high school or college posts are just embarrassing. However, I have left them public because I believe in keeping old web pages alive—and it's interesting to see how I've changed.
Update: I haven't really been using this, since the bandwidth required is a bit... excessive. I think I'll stick to duplicity + external hard drive.
Duplicity is a backup program that only backs up the files (and parts of files) that have been modified since the last backup. Built on FLOSS (rsync, GnuPG, tar, and rdiff), it allows efficient, locally encrypted, remote backups.
Amazon S3 is a web service that provides cheap, distributed, redundant, web-accessible storage. S3 currently charges only $0.15 per GB-month storage and $0.10 per GB upload. The API is based on HTTP requests such as GET, POST, PUT, and DELETE.
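At those rates it's easy to estimate what a backup costs. A quick back-of-the-envelope sketch (the 20 GB figure is just an example, not my actual backup size):

```shell
# Hypothetical example: cost of an initial 20 GB backup at S3's rates
# ($0.15 per GB-month storage, $0.10 per GB upload), computed in cents
# to avoid floating point in the shell.
SIZE_GB=20
STORAGE_CENTS=$((SIZE_GB * 15))  # monthly storage cost in cents
UPLOAD_CENTS=$((SIZE_GB * 10))   # one-time upload cost in cents
printf 'Storage: $%d.%02d per month\n' $((STORAGE_CENTS / 100)) $((STORAGE_CENTS % 100))
printf 'Upload:  $%d.%02d one-time\n' $((UPLOAD_CENTS / 100)) $((UPLOAD_CENTS % 100))
```

So 20 GB runs about $3.00/month to keep, plus $2.00 to upload the first full backup — incrementals after that only upload what changed.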
The following is a description of how I made use of these to back up my laptop, which runs Ubuntu Feisty Fawn.
These packages are in the Ubuntu repositories:
- duplicity: Remote encrypted incremental backup
- python-boto: Allow Python to talk to S3
(Edit: Packages now available in repository, so no building necessary.)
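Since both are in the repositories, a single apt-get pulls them in:

```shell
# Install duplicity and the Python S3 bindings (boto) from the Ubuntu repositories
sudo apt-get install duplicity python-boto
```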
Sign up for S3
Make sure to generate and write down your Secret Access Key, along with your Access ID Key. You'll need those for any application that interacts with S3.
You should encrypt your files so that they are safe from prying eyes in transit and in storage. Signing them protects the files from alteration in storage or transit.
Decide on a GPG key to use for encryption and signing. (The one I use is 3BBF4E12, my main key.) Make sure your encryption/signing key is in your GPG keyring. (You can use separate keys for encryption and signing, but I haven't in this case.)
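To confirm the key is actually in your keyring, something like the following (substitute your own key ID for mine):

```shell
# Check that the key is present; signing requires the secret key as well
gpg --list-keys 3BBF4E12
gpg --list-secret-keys 3BBF4E12
```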
Optional: Visual S3 interface
Unzip the download to a permanent location and find the bin/cockpit.sh file. Make it executable (chmod u+x cockpit.sh), then edit it to set the environment variables it needs.
When you run cockpit.sh, you will be presented with a somewhat complicated login screen. The simplest option is the Direct Login tab; Local Folder login instead stores your authentication info locally in a password-protected file.
Learn and configure
After scanning over man duplicity and playing with commands like duplicity /home/myusername/junk file:///home/myusername/junkBackup, create a shell script to run duplicity backups automatically. Here's what mine looks like:
export AWS_ACCESS_KEY_ID=myAccessKeyID
export AWS_SECRET_ACCESS_KEY=mySecretAccessKey
duplicity --encrypt-key=myEncryptionKeyFingerprint \
  --sign-key=mySigningKeyFingerprint \
  --include=/mnt/media \
  --exclude=/sys --exclude=/dev --exclude=/proc --exclude=/tmp \
  --exclude=/mnt --exclude=/media \
  / s3+http://myUniqueBucketName
export AWS_ACCESS_KEY_ID=
export AWS_SECRET_ACCESS_KEY=
You'll generally want to exclude /proc and /tmp from backups, since they contain volatile runtime data that is wiped every time you shut down. /dev is full of file representations of your hardware, and /sys exposes kernel and driver state — neither belongs in a backup, and /sys in particular screws mine up. I exclude /mnt and /media because I don't want to back up my external drives, and I re-include /mnt/media because that's my photos, music, and video partition. Note that duplicity applies file-selection options in the order given, first match wins, so the --include=/mnt/media must come before --exclude=/mnt. Duplicity supports rsync-style file selection, including fancy wildcards/globbing and a same-filesystem directive.
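Once a backup has run, duplicity can report on what's in the archive. A couple of handy invocations (the bucket name is my placeholder, and the AWS keys need to be exported as in the script above):

```shell
# List the files present in the most recent backup
duplicity list-current-files s3+http://myUniqueBucketName

# Show the chain of full and incremental backup sets
duplicity collection-status s3+http://myUniqueBucketName
```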
Notes on duplicity
- The bucket name must be unique among all S3 buckets owned by all S3 users. It does not need to exist before use.
- Duplicity supports a number of protocols: file, ftp, scp, and s3+http are the most relevant here.
- Switch the file path and the URL, and you're doing a restore.
- When using cryptographic signing, duplicity will ask you to type your key's passphrase twice at the command line — somewhat annoying if yours is 30-odd characters long.
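The last two notes combine into one sketch: swapping the URL and the local path turns a backup into a restore, and duplicity will read the passphrase from the PASSPHRASE environment variable instead of prompting for it. All the names here are placeholders of mine:

```shell
# Restore: source URL comes first, local destination second
export AWS_ACCESS_KEY_ID=myAccessKeyID
export AWS_SECRET_ACCESS_KEY=mySecretAccessKey

# duplicity checks the PASSPHRASE environment variable before prompting,
# which avoids typing a long passphrase twice
export PASSPHRASE=myGpgPassphrase

duplicity s3+http://myUniqueBucketName /home/myusername/restored

# Don't leave secrets sitting in the environment afterward
unset PASSPHRASE AWS_ACCESS_KEY_ID AWS_SECRET_ACCESS_KEY
```

Putting the passphrase in a script trades some security for convenience, so keep any such script readable only by you (chmod 700).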