Being one of the most senior Python developers at work means I often catch flak for Python's packaging practices.

> Why does Python have to reinvent the wheel every few years?

> How do I install a package in Python?

Jokes like these come up constantly as superficial pokes at Python's package ecosystem.
But beyond the superficial laughs, there are some deeper problems with pip. At work we have had serious problems delivering non-Python files that a dependency relied on across local development, CI, and server deployments. We also have trouble agreeing on where a project's dependency requirements should live:

- a flat `requirements.txt`, usually generated with `pip freeze > requirements.txt`
- `extras_require` in `setup.py`
- a `requirements` directory with a `dev.txt`, `common.txt`, and `prod.txt`

Each of these has its pluses and minuses.
`requirements.txt` has basically become an accepted standard in the Python community. However, it makes no distinction between install and development dependencies. You get some safety when you use `pip freeze > requirements.txt`, in that everything is pinned to the versions you have running; however, removing dependencies becomes a mess, since nothing tells you which of the frozen packages were only there as dependencies of the one you removed.
`extras_require` is a nice touch, but it involves using `setuptools`, figuring out Python paths, and teaching developers how to set it all up. It is definitely tooling built for distribution, not for development, and it requires the most hacks to get into a comfortable and stable place for package management.
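For reference, a minimal sketch of the pattern (the package name and dependency lists here are placeholders, not a real project):

```python
# setup.py -- dev dependencies declared via extras_require
from setuptools import setup

setup(
    name="myapp",                   # hypothetical package name
    install_requires=["requests"],  # runtime dependencies
    extras_require={
        # installed with: pip install -e ".[dev]"
        "dev": ["pytest", "flake8"],
    },
)
```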
I have used the `requirements` directory approach a lot before. Being full stack and bouncing between npm and pip, I really like the concept of dev dependencies. Having a `requirements/common.txt` that holds a project's base requirements, and a `requirements/dev.txt` that looks something like this:
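```
# requirements/dev.txt
-r common.txt

# dev-only tooling (illustrative examples)
pytest
flake8
```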
to allow me to independently change the requirements for common and dev is a really nice pattern. However, it involves a lot of explaining to devs about what is going on.
When I heard that Python had yet another package management tool, I internally groaned, knowing that the Python packaging jokes would start to fly again. However, after checking out pipenv, it turned out to have plenty worth bragging about.
I have long leaned on `virtualenvwrapper`; its `workon` command has saved me so much time and hassle. It is a bit of a pain to set up and get new developers running with, but the ease after that initial work is unmatched. The idea is to make it super easy to get into your virtualenv, install your packages, and run.
`pipenv` takes a different approach. Since the package manager is handling the virtualenv for you, you don't need to enter it unless you absolutely have to. `pipenv run` allows you to run commands as if you were inside your virtualenv, letting it handle all of the context switching for you.
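For example, with a hypothetical `app.py` entry point:

```
$ pipenv run python app.py
```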
It is also smart: if you are on Python 3.4+, it will automatically use the built-in `venv` module, and if a virtualenv already exists, it will continue to use that.
The `Pipfile` is a beautiful format compared with `requirements.txt`. It is not meant to be a flat list of requirements; instead, it behaves closer to an INI file, with sections and parameters. Their website gives an example along these lines (the packages shown here are representative, not their exact snippet):
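```toml
[[source]]
url = "https://pypi.org/simple"
verify_ssl = true
name = "pypi"

[packages]
requests = "*"

[dev-packages]
pytest = "*"
```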
`Pipfile` allows us to define packages and dev packages and to specify that the versions are unpinned. It is nice and readable, clearly sectioned off, and very self-explanatory. When you install a new package with `pipenv install package`, it will automatically save your new package to `[packages]`. If you want to save it to `[dev-packages]`, add the `--dev` flag to your install. Note that it will only add the package you specified to the `Pipfile`, not all of its dependencies. This means that if you remove it, pipenv can easily remove all the orphaned packages as well, keeping your requirements as lean as possible.
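A quick sketch of that workflow (package names are just examples):

```
$ pipenv install requests       # saved to [packages]
$ pipenv install pytest --dev   # saved to [dev-packages]
$ pipenv uninstall requests     # removed from [packages]
$ pipenv clean                  # uninstalls anything no longer in the lockfile
```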
`Pipfile.lock` allows us to go from general packages to pinned packages and their dependencies. It is a large JSON structure specifying platform information, package versions, dependency versions, and SHA-256 hashes of the resolved dependencies.
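Heavily trimmed, the shape is something like this (version numbers and hashes are placeholders):

```json
{
    "_meta": {
        "requires": {"python_version": "3.6"},
        "sources": [{"name": "pypi", "url": "https://pypi.org/simple"}]
    },
    "default": {
        "requests": {"hashes": ["sha256:..."], "version": "==2.18.4"}
    },
    "develop": {
        "pytest": {"hashes": ["sha256:..."], "version": "==3.2.5"}
    }
}
```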
If you specify a Python version in your `Pipfile`, like so:
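```toml
[requires]
python_version = "3.6"
```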
`pipenv` will try to ensure that version of Python is installed in the virtualenv it creates. And if you have `pyenv` installed, it will automatically install that version of Python and use it.
`pipenv` also automatically honors a `.env` file. Inside it, you can set a list of environment variables and their values. When you run your application through pipenv, it will load these environment variables before executing. This makes it very easy to follow the 12-factor-app pattern of keeping your app environment agnostic and loading environment-specific configuration via env vars instead. With `pipenv`, you put your environment-specific configuration in a `.env` file for local development, keep it out of source control, and now your app has no need for separate config files for separate environments.
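For example (the variable names here are just illustrations):

```
# .env -- local development only, kept out of source control
DATABASE_URL=postgres://localhost:5432/myapp_dev
DEBUG=true
```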
As mentioned before, `pipenv` offers the `pipenv run` command, which allows you to run a command from outside of the virtualenv as if you were inside it. It also loads your `.env` environment variables before running. If you want more access, you can use `pipenv shell`, which opens a new shell inside your virtualenv with the environment variables loaded.
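And when you want the full shell:

```
$ pipenv shell     # spawns a subshell with the virtualenv activated
$ which python     # now resolves to the virtualenv's interpreter
```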
If you are on macOS, I highly suggest installing via `brew install pipenv`, as it gives pipenv a sandboxed area to play in, and if you have done `brew install pyenv`, the two play well together that way. If not, Windows is a first-class citizen, and `pip install pipenv --user` is also a perfectly acceptable way to install it. After installing, pipenv can easily migrate your `requirements.txt` to a `Pipfile` and reuse your current virtualenv.
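Assuming an existing project with a `requirements.txt`, the migration is roughly:

```
$ pip install pipenv --user
$ cd myproject                        # hypothetical existing project
$ pipenv install -r requirements.txt  # imports it into a new Pipfile
```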
Overall, I think `pipenv` is a giant step in the right direction. It doesn't solve some of the problems that `pip` has, like non-Python distribution; however, for the majority of cases it solves most of the big pain points we experience with Python packaging. I am just not looking forward to:

> How do I install a Python package?