Recently, on one of my projects at work, we had to help our clients update their codebase to be Python 3 compatible. The first step is to make sure your dependencies are compatible with Python 3. That's the moment we decided to use pip-tools, after considering other options such as pipenv.

pip-tools is a Python package that gives you two programs: pip-compile and pip-sync.

This post is going to be focused more on pip-compile and is based on a post by James Cooke.

What is pip-compile and how do I use it in my project?

As the post by James Cooke puts it:

pip-compile consults the PyPI index for each top level package required, looking up the package versions available, outputting a specific list of pinned packages in a .txt file. This extra layer of abstraction (.in files containing top level requirements rather than just outputting .txt files with pip freeze) is very helpful for managing requirements, but does create some complications which mean that a solid workflow is essential for stable package management.

Let's imagine the following case. You have three types of requirements:

  • production requirements that you need to run the application/use the library
  • testing requirements that you need to run the test suite for your project
  • development requirements that will be useful to the developers of the project

What you would do then is create a requirements/ directory with three files:

requirements/production.in
requirements/testing.in
requirements/development.in

In production.in you would specify your core, high-level dependencies. Most of them should go unpinned, although some may be pinned, for example, Django.

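A minimal production.in could look like this (the exact contents here are just an illustration):

# requirements/production.in
django<2.0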

Let's compile this file with pip-compile (the pinned versions below are illustrative):

$ pip-compile production.in
# This file is autogenerated by pip-compile
# To update, run:
#    pip-compile --output-file production.txt production.in
django==1.11.7
pytz==2017.3              # via django

As you can see, pip-compile generates pinned dependencies that help you build reproducible environments.

In testing.in and in development.in you should declare a dependency on production.txt and testing.txt respectively. The reason for depending on .txt rather than .in files is that you need to make sure that the testing requirements are a strict superset of the production requirements, and the development requirements are a strict superset of the testing requirements. Therefore, testing.in pulls in the pinned production requirements and adds the testing packages on top:

-r production.txt
pytest
tox

And development.in pulls in the testing requirements and adds the development tools:

-r testing.txt
ipython
pip-tools
pylint


Compiling testing.in now pins the whole testing stack on top of the production pins (output abridged):

$ pip-compile testing.in
# This file is autogenerated by pip-compile
# To update, run:
#    pip-compile --output-file testing.txt testing.in
pluggy==0.4.0             # via tox
py==1.4.34                # via pytest, tox
virtualenv==15.1.0        # via tox

The same goes for development.in (output abridged as well):

$ pip-compile development.in
# This file is autogenerated by pip-compile
# To update, run:
#    pip-compile --output-file development.txt development.in
astroid==1.5.3            # via pylint
click==6.7                # via pip-tools
decorator==4.0.11         # via ipython, traitlets
first==2.0.1              # via pip-tools
ipython-genutils==0.2.0   # via traitlets
isort==4.2.15             # via pylint
jedi==0.10.2              # via ipython
lazy-object-proxy==1.3.1  # via astroid
mccabe==0.6.1             # via pylint
pexpect==4.2.1            # via ipython
pickleshare==0.7.4        # via ipython
prompt-toolkit==1.0.14    # via ipython
ptyprocess==0.5.2         # via pexpect
pygments==2.2.0           # via ipython
simplegeneric==0.8.1      # via ipython
six==1.10.0               # via astroid, pip-tools, prompt-toolkit, pylint, traitlets
traitlets==4.3.2          # via ipython
wcwidth==0.1.7            # via prompt-toolkit
wrapt==1.10.10            # via astroid

You should store both the .in and the .txt files in version control. This allows shipping the compiled .txt files for installation, but more importantly, it gives you the opportunity to review the diff of the .txt files when upgrading packages. Also, it's a good idea to keep your requirements sorted alphabetically.
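
For example, after bumping Django in production.in, the diff of production.txt might look like this (versions illustrative):

-django==1.11.6
+django==1.11.7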

Automate the workflow using a Makefile

As you may have noticed, our files depend on the compiled output of one another. It would make sense to express that dependency using some tool, and make is exactly the tool we need. Let's express the dependencies in a Makefile:

.PHONY: all check clean update

objects = $(wildcard *.in)
outputs := $(objects:.in=.txt)

PIPC = pip-compile -v

all: $(outputs)

%.txt: %.in FORCE
	$(PIPC) $<

testing.txt: production.txt
development.txt: testing.txt

# upgrade a single package in every requirements file, in dependency order
# (a sketch using pip-compile's --upgrade-package flag)
update: check
	$(PIPC) --upgrade-package $(PACKAGE) production.in
	$(PIPC) --upgrade-package $(PACKAGE) testing.in
	$(PIPC) --upgrade-package $(PACKAGE) development.in

check:
	@which pip-compile > /dev/null

clean: check
	- rm *.txt

# a special target to force recompilation on each invocation of make
FORCE:

The structure of the Makefile is explained here.

Build one or more requirements files

To update all requirements, use the default all recipe:

$ make all

or just

$ make

To update a particular file, ask for it by name:

$ make production.txt
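
Note that, with the Makefile above, asking for a downstream file first recompiles everything it depends on, so you would see something like:

$ make development.txt
pip-compile -v production.in
pip-compile -v testing.in
pip-compile -v development.in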

Add a dependency

To add a dependency, locate the appropriate .in file and add the new package name there. A version specifier is only needed if you require a particular version of the library; otherwise the latest available version will be chosen when compiling.
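
For example, to add a (hypothetical) requests dependency, put it in production.in and rebuild from the requirements/ directory:

$ echo "requests" >> production.in
$ make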

Update a package

$ make update PACKAGE=<package_name>
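
With the update recipe sketched in the Makefile above, this runs pip-compile's --upgrade-package flag against each file in dependency order, for example:

$ make update PACKAGE=django
pip-compile -v --upgrade-package django production.in
pip-compile -v --upgrade-package django testing.in
pip-compile -v --upgrade-package django development.in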

Update all requirements

A full update of all requirements to their latest versions (including the packages that are not pinned to a particular version in the .in files) can be achieved with:

$ make clean all

The clean recipe will remove all the compiled *.txt files, provided you have pip-tools installed (that is what the check recipe it depends on verifies). Then the all recipe will rebuild them all in dependency order.


What is pip-sync?

pip-sync is a command that makes your environment reflect the contents of the specified .txt file: it installs, upgrades, and uninstalls packages as necessary.

To install the new packages and remove the unnecessary or outdated ones, run:

$ pip-sync <requirements_file>

For development that would be:

$ pip-sync requirements/development.txt
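
If you like the Makefile approach, you could also add a (hypothetical) sync target for this, for example:

# sync the development virtualenv with the compiled requirements
sync: development.txt
	pip-sync development.txt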