FTP server for Cloud Files

I have just committed an experimental FTP server front-end for Cloud Files. It acts completely transparently, so you can use any FTP client to connect to Cloud Files.

There are probably a couple of bugs in it, but the basics seem to be working; please let me know if you find any problems.


By default it binds to localhost on port 2021 so it can be launched by an unprivileged user; the port can be changed via the command-line option -p. The FTP username and password are your API username and key.
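For example (a sketch; the alternate port and the client used are arbitrary choices):

```shell
# Start the server on localhost, overriding the default port 2021.
ftpcloudfs -p 2121

# In another terminal, connect with any FTP client, e.g.:
#   ftp localhost 2121
# and log in with your API username and key.
```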

Manual Install

FTP-Cloudfs requires pyftpdlib, which can be installed from here:


and python-cloudfiles :


You can then check out FTP-Cloudfs from here:


Installing a Python package is pretty simple: just run python
setup.py install after uncompressing the downloaded tarball.
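Put together, the manual install looks roughly like this (a sketch; the tarball names and versions are illustrative, taken from versions mentioned in the comments):

```shell
# Install the two dependencies, then ftp-cloudfs itself.
tar xzf pyftpdlib-0.5.1.tar.gz
cd pyftpdlib-0.5.1 && python setup.py install && cd ..

tar xzf python-cloudfiles-1.5.tar.gz
cd python-cloudfiles-1.5 && python setup.py install && cd ..

tar xzf ftp-cloudfs.tar.gz
cd ftp-cloudfs && python setup.py install
```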

Automatic Install

You can generate a Debian package directly from the source if you have
dpkg-buildpackage installed on your system. It also gives you a nice
init script to start the ftp-cloudfs process automatically.
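A sketch of the package build (the resulting package filename is an assumption):

```shell
# From the ftp-cloudfs source tree; -us -uc skips package signing.
dpkg-buildpackage -rfakeroot -us -uc
# The .deb lands in the parent directory:
sudo dpkg -i ../ftp-cloudfs_*.deb
```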


Although I work for Rackspace Cloud, this is not supported by
Rackspace, but please feel free to leave a comment here if you have any issues.

32 thoughts on “FTP server for Cloud Files”

  1. I think this is a great idea, as I’m frustrated with the FF plug-in for uploading a large number of files.

    I think I have it installed correctly on my Windows 2008 server, but server.py doesn’t return anything and I can’t connect to localhost:2021.

    Any ideas how I’ve broken it?


    1. I haven’t tested it on Windows 2008 yet, to be honest. Did you install the dependencies? Which version of Python are you using, the ActiveState one or Cygwin’s?

  2. Is there a quick HowTo on connecting this to RS Cloud Files and using an FTP client with it? I think this is great; it just needs a guide on how to actually use it.

    1. I take it back. It’s easy to use if you install it on the server you want to copy files from (the source). However, my ncftp client keeps getting disconnected… any ideas?

      1. tried setting some variables such as timeouts from the ncftp side…hope it works…;)

        1. using plain ftp and ncftp client gets me a “connection closed” error! :( any ideas?

  3. Hi,
    Can you just clarify for me: does this mean that if I set this up at
    subdomain.domain.tld and then used http://ftp.subdomain.tld, it would act like normal FTP and behind the scenes interact with Cloud Files, i.e. create containers for folders and objects for files, edit and delete, etc., like an FTP server normally would on a local machine?

    I’m sort of after something of that nature, because my site was built to use FTP to put and get files, so it’ll be a pain making changes to work with Cloud Files.

    I tried implementing something like this on my own using the php api but it was too buggy. So this would work wonders for me.
    Thanks for any response; it’d be appreciated.

  4. I think I’ve got it installed. Thanks, nice work.

    How do I make it run as a long-running process, i.e. so the server doesn’t shut down, and how can I make it serve FTP via the public IP or domain name so that domain.tld:2021 is accessible?

    thanks in advance

  5. This is awesome. I would LOVE a yum install or RPM for CentOS if you can swing it. It’d save a ton of time installing and getting it working on our servers.

  6. @Will Sorry, I don’t have such infrastructure yet. I have done PPA packages, but I don’t think there is such a thing as Launchpad for RPM-based distros (I wouldn’t mind providing a spec file if that helps, but I don’t have a Fedora box to test on)…

  7. @Courtney Are you using the Debian-based init script? Then change the option in /etc/default/ftp-cloudfs. If not, you have to pass the right arguments to the ftpcloudfs binary:

    chmouel@lutece:/etc/default$ ftpcloudfs --help
    Usage: ftpcloudfs [OPTIONS]...

    -h, --help show this help message and exit
    -p PORT, --port=PORT Port to bind the server default: 2021.
    -b BIND_ADDRESS, --bind-address=BIND_ADDRESS
    Address to bind default:
    -l LOG_FILE, --log-file=LOG_FILE
    Log File: Default stdout
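    Combining those options, an invocation might look like this (the log path is just an example):

```shell
ftpcloudfs --port=2121 --bind-address=127.0.0.1 --log-file=/tmp/ftpcloudfs.log
```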

  8. Hello!

    I installed this utility on my server but I’m getting this error:

    2010-03-18 12:29:32,160 - INFO - Serving FTP on
    2010-03-18 12:30:27,430 - INFO - [] Connected.
    2010-03-18 12:30:27,430 - DEBUG - ==> 220 Rackspace Cloud Files 0.2 using pyftpdlib 0.5.1 ready.
    2010-03-18 12:30:27,500 - DEBUG - 331 Username ok, send password.
    2010-03-18 12:30:27,559 - DEBUG - 230 Welcome username
    2010-03-18 12:30:27,700 - INFO - [username]@ User username on logged in.
    2010-03-18 12:30:27,769 - DEBUG - 211 End FEAT.
    2010-03-18 12:30:28,109 - DEBUG - 257 "/" is the current directory.
    2010-03-18 12:30:28,220 - DEBUG - 200 Type set to: ASCII.
    2010-03-18 12:30:28,289 - DEBUG - 227 Entering passive mode (67,23,19,162,225,107).
    2010-03-18 12:30:28,349 - DEBUG - <== MLSD
    2010-03-18 12:30:28,539 - ERROR - Traceback (most recent call last):
    File "/usr/lib64/python2.4/asyncore.py", line 69, in read
    File "/usr/lib64/python2.4/asyncore.py", line 391, in handle_read_event
    File "/usr/lib64/python2.4/asynchat.py", line 137, in handle_read
    File "/usr/lib/python2.4/site-packages/pyftpdlib/ftpserver.py", line 1801, in found_terminator
    File "/usr/lib/python2.4/site-packages/pyftpdlib/ftpserver.py", line 2310, in ftp_MLSD
    File "/usr/lib/python2.4/site-packages/ftpcloudfs/server.py", line 322, in format_mlsx
    raise OSError(40, 'unsupported')
    OSError: [Errno 40] unsupported

    2010-03-18 12:30:28,539 - INFO - [username]@ Disconnected.

    Do you have any idea? I installed python-cloudfiles 1.5 and pyftpdlib 0.5.1 with Python 2.4.3 on CentOS.


  9. Hi using Webmin, I’ve created a bootup action with the following script


    case "$1" in
    ftpcloudfs -p 21 -b my.actual.ip.address
    echo "Usage: $0 { start | stop }"
    exit 0

    It doesn’t appear to work, though: FTP clients can’t connect. Port 21 is definitely available, and I am able to FTP in if I run this on the command line:

    ftpcloudfs -p 21 -b my.actual.ip.address

    What did I miss? Thanks in advance.

    1. Hi,

      The server has not been tested much when binding to a non-local IP, I apologize; it probably needs more work in that area.


  10. Is there a way to get access to RS CF using FTP on a windows machine?

    I use ExpanDrive for local FTP drive access, and being able to get directly into Cloud Files this way would be just great.

  11. Hi, I followed the installation, but when I try to use the program it gives me an error:

    sudo ftpcloudfs -h

    Traceback (most recent call last):
    File "/usr/local/bin/ftpcloudfs", line 4, in
    import pkg_resources
    File "/usr/local/lib/python2.6/dist-packages/setuptools-0.6c11-py2.6.egg/pkg_resources.py", line 2603, in
    return cls.__mro__
    File "/usr/local/lib/python2.6/dist-packages/setuptools-0.6c11-py2.6.egg/pkg_resources.py", line 666, in require
    File "/usr/local/lib/python2.6/dist-packages/setuptools-0.6c11-py2.6.egg/pkg_resources.py", line 565, in resolve
    """Find all activatable distributions in `plugin_env`
    pkg_resources.DistributionNotFound: cloudfiles

    Can you help? Also, the plugin for Nautilus does not work; can you tell me how to use it?

    Best regards.
    Massimo P.

  12. Install python-cloudfiles first. What distribution are you using?

  13. Hi Chmouel,

    It certainly sounds like a great package!

    I can’t seem to get it right though…

    I run it and it gives me:
    2011-03-31 22:18:19,057 - INFO - Serving FTP on

    Then I try to connect to it from a remote machine but all I get is a “connection refused” message.

    Should I plain-text my username/api-key over this FTP?

    Any thoughts?

  14. Ahum, foggedaboudid…
    I was listening on a wrong port

    It actually worked like a charm from the moment I figured that out!

    Thanks :-)

  15. Hey Chmouel,

    Thanks for this! I ran into a problem, though — python-cloudfiles expects programs that use it to implement markers and limits to handle more than 10,000 containers, but ftp-cloudfs doesn’t. That means that operations that get a list of containers only give the first 10,000.

    (The same problem would happen with containers containing more than 10,000 files, but we don’t have that situation so I didn’t run into it!)

    I’m insufficiently good at Python to fix that myself, unfortunately, or I’d give you a patch, so I hope you don’t mind a bug report instead. :-)

    (I do wish python-cloudfiles handled that limit itself — get_all_containers NOT returning ALL containers is silly.)
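    For what it’s worth, the marker/limit loop the commenter describes can be sketched as a small helper (list_all_containers and list_page are hypothetical names; the limit/marker keywords mirror python-cloudfiles’ Connection.list_containers signature):

```python
# Sketch only: collect every container name by following markers, so
# listings are not capped at the server-side page size (10,000 by default).
# `list_page` stands in for something like conn.list_containers.
def list_all_containers(list_page, page_size=10000):
    names = []
    marker = None
    while True:
        page = list_page(limit=page_size, marker=marker)
        names.extend(page)
        if len(page) < page_size:
            # A short page means we have reached the end of the listing.
            return names
        # Ask for the next page starting after the last name we saw.
        marker = page[-1]
```

    With a real connection this would presumably be called as list_all_containers(conn.list_containers); the same loop applies to listing objects inside a container.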

  16. Hello Chmouel,

    Thanks for the blog, but I have a question: how can I connect an FTP client like FileZilla to cloudfs?

  17. Your account:username (with the colon URL-encoded) as the FTP username and your key as the password should work…

  18. Alright, but I’m a little confused. So do you mean that I open FileZilla and put the Swift username in the username field and do the same with the password? If not, could you please give me more details?

    Also, before doing that, I am just wondering whether there are any configurations to do on the cloudfs side so that it will be able to “exchange” data with FileZilla, for example. The real point is that I need to create my own FTP client and connect it to cloudfs to communicate with Swift (upload, download, etc.), so that’s why I want to understand how it works with FileZilla.

    I hope it’s clear …thanks….:)

  19. Hi. This is an awesome project. I’m wondering if one can connect to a Heroku app with this? Is there any way of accessing Heroku apps’ file directories?
    Or does Heroku not work with hierarchies?

