Backup with duplicity on Rackspace CloudFiles (including UK) script.

It seems that my post about using duplicity to back up your data on Rackspace CloudFiles has become popular, and people may be interested in using it with the newly released (beta) Rackspace Cloud UK. You just need an environment variable exported at the top of your backup script like this:
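A minimal sketch of that export, assuming duplicity's CloudFiles backend, which reads the CLOUDFILES_AUTHURL environment variable (lon.auth.api.rackspacecloud.com is the Rackspace UK endpoint):

```shell
# Point duplicity's CloudFiles backend at the Rackspace Cloud UK
# authentication endpoint instead of the default US one.
export CLOUDFILES_AUTHURL="https://lon.auth.api.rackspacecloud.com/v1.0"
```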


and it will use the UK auth server (the same goes for OpenStack auth server if you have your own Swift install).

To make things easier, I have taken this script from:

and adapted it to make it work with Rackspace Cloud Files.

This is available here:

You need to make sure that you have python-cloudfiles installed. On a Debian or Ubuntu system you can do it like this:

sudo apt-get -y install python-stdeb 
sudo pypi-install python-cloudfiles

Check your operating system's documentation for how to install python-cloudfiles; it is usually easy to do via pip (pip install python-cloudfiles).
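As a quick sanity check (assuming the interpreter is available as "python", as it was on the Python 2 systems this post targets), you can try importing the module:

```shell
# Report whether the cloudfiles module can be imported; the "python"
# interpreter name is an assumption based on the install steps above.
if python -c "import cloudfiles" 2>/dev/null; then
    STATUS="installed"
else
    STATUS="missing"
fi
echo "python-cloudfiles is $STATUS"
```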

When you have installed duplicity and checked out the script (see the GitHub page for documentation on how to do it), you can start configuring it.

At the top there is a detailed explanation of the different variables that need to be configured. You can set them in the script, or you can put them in an external configuration file in your home directory called ~/.dt-cf-backup.conf. This is an example:

export DEST="cf+http://duplicity_backup"
INCLIST=( /home/chmouel/ )
EXCLIST=( "/home/chmouel/tmp" "/**.DS_Store" "/**Icon?" "/**.AppleDouble" )
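The wrapper presumably just sources that file when present, so its values override the defaults set in the script. A minimal sketch of the idea (the CONFIG variable name here is mine, not necessarily the script's):

```shell
# Load user overrides from ~/.dt-cf-backup.conf if the file exists;
# values defined there replace the defaults set earlier in the script.
CONFIG="$HOME/.dt-cf-backup.conf"
if [ -f "$CONFIG" ]; then
    . "$CONFIG"
fi
```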

You can then just run:

./ --backup  

to do your backup.
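Restores do not strictly need the wrapper: plain duplicity can restore to a point in time with its -t/--time option (intervals like 3D mean "three days ago"). A hedged sketch, assuming the same container name as in the example above and placeholder credentials:

```shell
# Placeholder credentials -- substitute your own Rackspace account details.
export CLOUDFILES_USERNAME="myuser"
export CLOUDFILES_APIKEY="mykey"
# Restore the backup state from three days ago (-t 3D) into /tmp/restore.
# Without real credentials this is expected to fail, hence the fallback.
duplicity restore -t 3D cf+http://duplicity_backup /tmp/restore \
    || echo "restore failed (expected without real credentials)"
```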

There is much more documentation in the README.txt.

I would like to thank the author of dt-s3-backup again for this script; I have only made a few modifications for Rackspace Cloud Files.

6 thoughts on “Backup with duplicity on Rackspace CloudFiles (including UK) script.”

  1. Hi,

    I’m trying to use duplicity with openstack without result, any hint?

  2. Hello, is it possible to restore an old backup giving a date?

  3. Just a quick comment to say thanks :)

    I have wasted a couple days trying to backup my servers using the same script but with S3, and it’s been a nightmare. Poor S3 performance both with uploads and downloads, and I was basically unable to fully test a restore consistently due to read timeouts etc. Disabling SSL transfer helped a little, but not much. 

    So I found this post and tried your version of the script after activating an account for Cloud Files. It’s so much better! UL/DL performance is better, and I seem to be able to reliably backup and restore without problems.

    I am not sure what went wrong with S3, but after two days I am happy to have found a better (for me) solution. The S3 region I was using is the one in Europe (Ireland), so perhaps it would be better with some other region if that one is having problems…dunno. Anyway, no longer an issue.


  4. Line 192 (“if not all…”) requires Python 2.6 and does not work on CentOS 5.x, which comes with Python 2.4.

    You will notice the following error if you run it using Python 2.4:
    Traceback (most recent call last):
    File “”, line 6, in ?
    NameError: name ‘all’ is not defined
    TEST RUN ONLY: Check the logfile for command output.

    You can get around this dependency by modifying the script (line 192) to read:

    if not (api_username and api_key and authurl and container):
