Python packaging in March 2016

Hooray! I finally finished porting my lolslacktest code over to be a proper Python module, installable with pip, and with a virtual environment. What a pain! Some notes on what I learned. But first, some reflection.

This project has been one of the least fun things I’ve done in a long time. I can tell by the way I’ve been foot-dragging. It’s confusing and hard to debug packages. The docs are inconsistent and suffer from years of accretion. Some things are genuinely confusing, like Python’s surprise lack of support for circular imports. And the end result is my product works just like it did before. No user-visible changes. Behind the scenes things are better; the install is cleaner, I have a virtual environment for proper external dependencies, etc. But nothing fun, just slightly less crappy ops.

The state of the art for Python Packaging

As of March 2016 the Python 3 state of the art for package installation is pip, pyvenv, and setuptools. pyvenv and pip come with Python 3.4 but setuptools is still extra. (On Ubuntu you have to install python3.4-venv with apt.)

The state of the art for managing Python packages keeps changing. This Stack Overflow answer explains the landscape as of September 2014. This packaging user guide is updated as of September 2015 and is mostly very good, but even it references things that don’t work such as “python bdist_wheel”. I also found this guide useful but it was mostly written in 2012 (although occasionally updated).

The pip installed by Python 3.4 (Ubuntu) is version 1.5.4; you want version 8.0 or later if you want to do things like install gevent with a precompiled wheel file. setuptools is old too. So the first thing you should do after setting up a pyvenv is “pip install -U pip setuptools”.

The setuptools docs are not very good. They assume you know how distutils works. Also they spend a lot of time talking about easy_install and none at all talking about pip.

Honestly, all this Python packaging stuff is a big mess; I feel like you have to understand the history to be able to use the current state of the art. It’s a shame Distutils2 didn’t work out for Python 3.3 and rationalize it all. To be fair, packaging in almost every system sucks: npm, rvm, homebrew, they’re all a mess. All but my beloved apt, and that’s because the Ubuntu package management team works super hard.

Some hacks and code things

I had to do a few hacks and other things to make my project work.

The biggest code change was just wholesale replacing absolute imports like “import lib” with relative imports like “from . import lib”. I think that’s actually correct, but I was doing it kind of blind for every single file.
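A minimal sketch of why the change matters, using a throwaway package built in a temp directory (the package and module names here are invented, not the real project’s):

```python
# Build a tiny package on disk: under Python 3, a bare "import lib" inside
# mypkg/main.py would fail, while "from . import lib" finds the sibling module.
import os
import sys
import tempfile

pkgroot = tempfile.mkdtemp()
os.makedirs(os.path.join(pkgroot, "mypkg"))
open(os.path.join(pkgroot, "mypkg", "__init__.py"), "w").close()
with open(os.path.join(pkgroot, "mypkg", "lib.py"), "w") as f:
    f.write("VALUE = 42\n")
with open(os.path.join(pkgroot, "mypkg", "main.py"), "w") as f:
    # Explicit relative import: resolves lib.py relative to the package.
    f.write("from . import lib\n")

sys.path.insert(0, pkgroot)
from mypkg import main
print(main.lib.VALUE)  # prints 42
```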

I had to modify my code because Python really doesn’t support circular import dependencies. You don’t really notice until you are using relative imports. The accepted solution for circular imports is to break the circularity, refactor code. Screw that. My workaround was to move some of the import statements inside functions, so they execute at run time and not import time. That’s bad and inefficient but expedient.
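The workaround looks roughly like this, again with invented module names: mod_b imports mod_a at import time, and mod_a needs mod_b too, so mod_a defers its import into the function body where it runs at call time.

```python
# Two modules that depend on each other; the deferred import breaks the
# cycle because by the time greet() runs, mod_b is fully loaded.
import os
import sys
import tempfile
import textwrap

d = tempfile.mkdtemp()
with open(os.path.join(d, "mod_a.py"), "w") as f:
    f.write(textwrap.dedent("""\
        def greet():
            # Deferred import: executes at run time, not import time,
            # so the circular dependency never bites during module load.
            import mod_b
            return "a sees " + mod_b.NAME
    """))
with open(os.path.join(d, "mod_b.py"), "w") as f:
    f.write("import mod_a\nNAME = 'b'\n")

sys.path.insert(0, d)
import mod_b
print(mod_b.mod_a.greet())  # prints: a sees b
```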

There are two similar-looking ways for a Python package to specify dependencies. You can add an install_requires stanza to your setuptools setup() method, or you can use pip to install a bunch of stuff listed in a requirements.txt file. The setup() dependencies are installed automatically when a package is installed, but nothing installs requirements.txt automatically. OTOH the requirements.txt option is more powerful, for instance pip can install things from github URLs whereas setuptools can’t. (The setuptools docs say you can do this with dependency_links, but I couldn’t make it work.) I’ve ended up using a mix of both, preferring setuptools where I can.
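The setuptools side of that split looks roughly like this sketch of a setup.py (the project name, version, and dependency pins are placeholders, not the real lolslacktest metadata):

```python
# Dependencies in install_requires are resolved automatically when the
# package itself is installed with "pip install".
from setuptools import setup, find_packages

setup(
    name="lolslacktest",        # placeholder project name
    version="0.1.0",            # placeholder version
    packages=find_packages(),
    install_requires=[
        "requests>=2.9",        # ordinary PyPI dependencies go here...
    ],
    # ...but a dependency on a github URL has to go in requirements.txt
    # instead, installed separately with "pip install -r requirements.txt".
)
```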

I have some shell scripts named in the “scripts” section of my setup.py, so they are installed in the virtualenv bin directory. But I want them to work even if the virtualenv isn’t activated, so those scripts have to source their own virtualenv. Mostly for execution in cron; getting cron to activate a virtualenv is not easy. The hack I did for this was this shim of code at the top of each script:

VENVDIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
source "$VENVDIR/activate"

The bash magic in the first line sets VENVDIR to be the directory where the bash script itself is. Conveniently, that’s the same directory that has the activate script.

I have no idea how to put version numbers on my program. It’s a private thing only I’m installing, so for now I’m going with the dbschema version. Part of me wants to just put the git commit hash there.

My deploy script used to be rsync from dev to prod. Now it’s sshing into prod and having it do a “git pull” followed by “pip install -U”. I took this approach at migurski’s suggestion. It means I’m not using any of the fancy distribution builds and versioning stuff that setuptools/pip enables. But I don’t really need those right now; they make more sense for public code hosted on PyPI.

Note to self: you can’t move a pyvenv environment once it’s created. They have paths hard coded.



6 thoughts on “Python packaging in March 2016”

  1. What’s the intersection between pip and friends and virtual env, etc, for when you’re packaging for pip? Or did you just need virtual env to test your packaging is working properly?

  2. I run absolutely everything from the virtualenv: pip, python, all python packages are all coming from my venv. Everywhere, all the time.

  3. I think I asked this horribly unclearly, sorry. I meant when you were packaging, was there anything extra you needed to do so that everything is happy with virtualenv and the like? Asking for, uh, a friend who’s a virtualenv conscientious objector/luddite.

  4. No, nothing really special for virtualenv. Using other packages in virtualenv is as easy as you could want. Creating a package for a virtualenv is really no different than creating a package for setuptools. All virtualenv (pyvenv) does is have a private install directory.

  5. Couple of things.

    1. The packaging user guide is the canonical documentation, as it’s maintained by the PyPA (Python Packaging Authority); nothing has really happened since Sep 2015, hence no newer date of having read through it (but the docs are on GitHub if you have something to change)
    2. `bdist_wheel` doesn’t work if you don’t install the `wheel` package (indirectly listed as a step in the packaging user guide)
    3. pip installs setuptools by default because it realizes everyone just assumes it’s there
    4. setuptools talks about easy_install because that’s setuptools’ custom solution predating pip, not because it’s the right answer for anyone at this point
    5. requirements.txt is for apps, install_requires is for libraries; they are different things for different purposes.

    1. Thank you very much for the detailed information. On point 2, my impression was the command got renamed from “bdist_wheel” to just plain “wheel”.

Comments are closed.