This post is heavily inspired by a presentation I saw at the last Paris.py meetup.
The speaker (Nicolas Mussat, CTO at MeilleursAgent.com) was kind enough to allow me to publish this on my blog.
You can find the original slides (in French) on slideshare.
There are plenty of cases when using a PaaS is a good idea, but let’s assume that for various reasons you’d like to use your own server.
The really easy way (too easy :P)
You could be tempted to use something like:
# in requirements.txt
flask
path.py

# on the server:
$ cd my-app
$ git pull
$ pip install -r requirements.txt
$ systemctl restart uwsgi
This has several problems:
- What about when flask gets updated? You may end up with a version that is no longer compatible with the rest of your source code.
- The same issue may occur if one of the dependencies of flask gets updated in a non-backward-compatible way.
- It's complicated to roll back to a previous version.
- Bad things will happen if github or pypi is down.
- How do you make sure all tests have been run and that they pass?
There has to be a better way! 1
Separate build and deployment
A good idea would be to separate building and testing the application from deploying it. 2
If you think about it, you can separate the files used by your code in two parts:
- The runtime, that is the Python interpreter itself, the uwsgi service, and so on.
- The application, which is all the source code and all its dependencies (which could be written in Python or be compiled C extensions).
If you build your package on the same linux distribution as your server, then you can assume that all the files of the application you build on one machine will run on the other.
In the worst-case scenario, you'll build a C extension which will crash if the postgresql libraries are not installed, but this is easy to fix and should not happen too often. 3
So, to build the application package, we have to:
- Handle the versions of all the dependencies, recursively
- Bundle all the dependencies with the rest of the code.
Building the package
This can be done with three different requirements files:
- requirements-dev.txt: used by developers and continuous integration servers to run the tests, the linters, build the documentation and what not.
- requirements-app.txt: the dependencies of the application itself.
- requirements.txt: the list of the "frozen" dependencies, generated by something like:
$ virtualenv ~/my-app
$ source ~/my-app/bin/activate
$ pip install -r requirements-app.txt
$ pip freeze > requirements.txt
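For illustration, here is what the last two files might contain for a tiny Flask app (the exact versions are only an example):

```
# requirements-app.txt -- direct dependencies, unpinned
flask

# requirements.txt -- generated by pip freeze: everything pinned,
# including transitive dependencies
click==6.6
Flask==0.11.1
itsdangerous==0.24
Jinja2==2.8
MarkupSafe==0.23
Werkzeug==0.11.11
```

Because the frozen file pins every dependency, recursively, two builds from the same commit install exactly the same versions.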
If you are concerned about security here, you could:
- Use pip’s new hash-checking mode.
- Create a local pypi index. (There are lots of solutions, such as devpi-server, pip2pi, or even bandersnatch.)
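For reference, hash-checking mode works by adding a --hash option to each pinned line in requirements.txt; the values below are placeholders, not real hashes:

```
# requirements.txt with hashes (placeholder values, for illustration)
Flask==0.11.1 --hash=sha256:<hash of the Flask archive>
Werkzeug==0.11.11 --hash=sha256:<hash of the Werkzeug archive>
```

Running pip install --require-hashes -r requirements.txt will then refuse any package whose hash does not match, and the pip hash command can compute the values for you.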
Using a target directory for pip
The first trick is to tell pip to install the dependencies not in a virtualenv or in your system, but rather in a special directory (here, vendors):

$ pip install --target vendors -r requirements.txt

.
├── app.py
├── requirements.txt
└── vendors
    ├── click
    ├── click-6.6.dist-info
    ├── flask
    ├── Flask-0.11.1.dist-info
    ├── itsdangerous-0.24.dist-info
    ├── itsdangerous.py
    ├── jinja2
    ├── Jinja2-2.8.dist-info
    ├── markupsafe
    ├── MarkupSafe-0.23.dist-info
    ├── werkzeug
    └── Werkzeug-0.11.11.dist-info
Configuring the interpreter
The second trick is to use the site module and call addsitedir before importing anything installed in vendors:

import site
site.addsitedir('vendors')

from flask import Flask

app = Flask(__name__)

@app.route('/')
def index():
    return '\o/'
You could get away with setting PYTHONPATH or using sys.path.append('vendors'), but using addsitedir will make sure Python sees the .pth files, which may be required by some modules to work.
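To see the difference, here is a small self-contained demo (it uses a throwaway temp directory standing in for a real vendors folder):

```python
import pathlib
import site
import sys
import tempfile

# Build a throwaway "vendors" directory containing a package
# directory and a .pth file pointing to it.
vendors = pathlib.Path(tempfile.mkdtemp()) / "vendors"
extra = vendors / "extra"
extra.mkdir(parents=True)
(vendors / "extra.pth").write_text("extra\n")

# Appending to sys.path only adds the directory itself:
sys.path.append(str(vendors))
print(str(extra) in sys.path)  # False: the .pth file was ignored

# addsitedir() also processes the .pth files it finds there:
site.addsitedir(str(vendors))
print(str(extra) in sys.path)  # True
```

So if any vendored package ships a .pth file, addsitedir is the safer choice.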
Using setup.py to build the archive
To build the package, no need to re-invent the wheel, you can use
# setup.py from setuptools import setup setup( name="myapp", version="0.1", py_modules=["app"], ... )
# MANIFEST.in
include myapp.conf
...
graft vendors
Here we make sure the configuration file for the application and the whole vendors directory are included in the final package. 5
With this in place, you can now build the package like this:
$ python setup.py sdist
$ ls dist/
myapp-0.1.tar.gz
Now that we know how to generate the package, here’s an example on how we can implement continuous delivery:
- A developer pushes a tag on the git repository
- A Jenkins job is triggered, and the tests are run
- If they pass, a package is made and uploaded to a local storage
- The latest package is fetched from the storage
- The whole archive gets uploaded to the server
- The uwsgi service is restarted on the server
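The unpack part of this pipeline can be sketched in a few lines of Python (the file names and paths are invented for the demo, and a throwaway archive stands in for a real one fetched from storage):

```python
import pathlib
import tarfile
import tempfile

def deploy(archive_path, dest):
    """Unpack a release archive into its own directory, so rolling
    back is just a matter of switching back to the previous one."""
    dest = pathlib.Path(dest)
    dest.mkdir(parents=True, exist_ok=True)
    with tarfile.open(archive_path) as archive:
        archive.extractall(dest)
    return dest

# Demo with a throwaway archive standing in for myapp-0.1.tar.gz
tmp = pathlib.Path(tempfile.mkdtemp())
(tmp / "app.py").write_text("print('hello')\n")
with tarfile.open(tmp / "myapp-0.1.tar.gz", "w:gz") as archive:
    archive.add(tmp / "app.py", arcname="myapp-0.1/app.py")

release = deploy(tmp / "myapp-0.1.tar.gz", tmp / "releases" / "0.1")
print((release / "myapp-0.1" / "app.py").exists())  # True
```

Keeping each release in its own directory also solves the rollback problem from the first section: reverting is just pointing the server back at the previous directory.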
Another solution
You might prefer this one.
Personally, I’m still using the “really easy” method for my pet projects, so, as always, try and experiment for yourself before trusting a random stranger on the Internet :)
- Famous quote from Raymond Hettinger, I suggest you go watch his many Python talks. [return]
- For more about this, feel free to read Release Management Done Right, by Alex Papadimoulis [return]
- Using a tool like ansible might help. [return]
- IMHO, using docker for simple Python applications is overkill anyway, and may not work in production as well as you’d think [return]
- graft is one of the many available commands in a MANIFEST.in file. [return]