pip-tools keeps your pinned dependencies fresh.
pip-tools = pip-compile + pip-sync
A set of command line tools to help you keep your pip-based packages fresh, even when you’ve pinned them. You do pin them, right? (In building your Python application and its dependencies for production, you want to make sure that your builds are predictable and deterministic.)
Installation
Similar to pip, pip-tools must be installed in each of your project’s virtual environments:
$ source /path/to/venv/bin/activate
(venv)$ python -m pip install pip-tools
Note: all of the remaining example commands assume you’ve activated your project’s virtual environment.
Example usage for pip-compile
The pip-compile command lets you compile a requirements.txt file from your dependencies, specified in either setup.py or requirements.in.
Run it with pip-compile or python -m piptools compile. If you use multiple Python versions, you can run pip-compile as py -X.Y -m piptools compile on Windows and pythonX.Y -m piptools compile on other systems.
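For example, with a hypothetical Python 3.8 installation (substitute your own version for X.Y):
$ py -3.8 -m piptools compile          # Windows
$ python3.8 -m piptools compile        # other systems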
pip-compile should be run from the same virtual environment as your project so conditional dependencies that require a specific Python version, or other environment markers, resolve relative to your project’s environment.
Note: if you are compiling setup.py or requirements.in from scratch, make sure there is no existing requirements.txt; otherwise it might interfere with the compilation.
Requirements from setup.py
Suppose you have a Django project and want to pin its dependencies for production. If you have a setup.py with install_requires=['django'], then run pip-compile without any arguments:
$ pip-compile
#
# This file is autogenerated by pip-compile
# To update, run:
#
# pip-compile
#
asgiref==3.2.3 # via django
django==3.0.3 # via my_django_project (setup.py)
pytz==2019.3 # via django
sqlparse==0.3.0 # via django
pip-compile will produce your requirements.txt, with all the Django dependencies (and all underlying dependencies) pinned.
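For reference, a minimal setup.py behind the example above might look like the following sketch (only install_requires matters for pip-compile; the name and version simply mirror the my_django_project seen in the compiled output):
# setup.py
from setuptools import setup

setup(
    name="my_django_project",  # name as it appears in the compiled output above
    version="0.0.0",           # illustrative version
    install_requires=["django"],
)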
Without setup.py
If you don’t use setup.py (it’s easy to write one), you can create a requirements.in file to declare the Django dependency:
# requirements.in
django
Now, run pip-compile requirements.in:
$ pip-compile requirements.in
#
# This file is autogenerated by pip-compile
# To update, run:
#
# pip-compile requirements.in
#
asgiref==3.2.3 # via django
django==3.0.3 # via -r requirements.in
pytz==2019.3 # via django
sqlparse==0.3.0 # via django
And it will produce your requirements.txt, with all the Django dependencies (and all underlying dependencies) pinned.
Using hashes
If you would like to use Hash-Checking Mode, available in pip since version 8.0, pip-compile offers the --generate-hashes flag:
$ pip-compile --generate-hashes requirements.in
#
# This file is autogenerated by pip-compile
# To update, run:
#
# pip-compile --generate-hashes requirements.in
#
asgiref==3.2.3 \
    --hash=sha256:7e06d934a7718bf3975acbf87780ba678957b87c7adc056f13b6215d610695a0 \
    --hash=sha256:ea448f92fc35a0ef4b1508f53a04c4670255a3f33d22a81c8fc9c872036adbe5 \
    # via django
django==3.0.3 \
    --hash=sha256:2f1ba1db8648484dd5c238fb62504777b7ad090c81c5f1fd8d5eb5ec21b5f283 \
    --hash=sha256:c91c91a7ad6ef67a874a4f76f58ba534f9208412692a840e1d125eb5c279cb0a \
    # via -r requirements.in
pytz==2019.3 \
    --hash=sha256:1c557d7d0e871de1f5ccd5833f60fb2550652da6be2693c1e02300743d21500d \
    --hash=sha256:b02c06db6cf09c12dd25137e563b31700d3b80fcc4ad23abb7a315f2789819be \
    # via django
sqlparse==0.3.0 \
    --hash=sha256:40afe6b8d4b1117e7dff5504d7a8ce07d9a1b15aeeade8a2d10f130a834f8177 \
    --hash=sha256:7c3dca29c022744e95b547e867cee89f4fce4373f3549ccd8797d8eb52cdb873 \
    # via django
Updating requirements
To update all packages, periodically re-run pip-compile --upgrade.
To update a specific package to the latest version, or to a specific version, use the --upgrade-package or -P flag:
# only update the django package
$ pip-compile --upgrade-package django
# update both the django and requests packages
$ pip-compile --upgrade-package django --upgrade-package requests
# update the django package to the latest, and requests to v2.0.0
$ pip-compile --upgrade-package django --upgrade-package requests==2.0.0
You can combine --upgrade and --upgrade-package in one command, to provide constraints on the allowed upgrades. For example, to upgrade all packages while constraining requests to the latest version less than 3.0:
$ pip-compile --upgrade --upgrade-package 'requests<3.0'
Output File
To output the pinned requirements in a filename other than requirements.txt, use --output-file. This might be useful for compiling multiple files, for example with different constraints on django to test a library with both versions using tox:
$ pip-compile --upgrade-package 'django<1.0' --output-file requirements-django0x.txt
$ pip-compile --upgrade-package 'django<2.0' --output-file requirements-django1x.txt
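A minimal tox.ini wiring up those two output files might look like the following sketch (the environment names and the pytest test command are assumptions, not part of pip-tools):
# tox.ini
[tox]
envlist = django0x, django1x

[testenv:django0x]
deps = -rrequirements-django0x.txt
commands = pytest

[testenv:django1x]
deps = -rrequirements-django1x.txt
commands = pytest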
To output to standard output instead, use --output-file=-:
$ pip-compile --output-file=- > requirements.txt
$ pip-compile - --output-file=- < requirements.in > requirements.txt
Forwarding options to pip
Any valid pip flags or arguments may be passed on with pip-compile’s --pip-args option, e.g.
$ pip-compile requirements.in --pip-args '--retries 10 --timeout 30'
Configuration
You might be wrapping the pip-compile command in another script. To avoid confusing consumers of your custom script you can override the update command generated at the top of requirements files by setting the CUSTOM_COMPILE_COMMAND environment variable.
$ CUSTOM_COMPILE_COMMAND="./pipcompilewrapper" pip-compile requirements.in
#
# This file is autogenerated by pip-compile
# To update, run:
#
# ./pipcompilewrapper
#
asgiref==3.2.3 # via django
django==3.0.3 # via -r requirements.in
pytz==2019.3 # via django
sqlparse==0.3.0 # via django
Workflow for layered requirements
If you have different environments that need different but compatible sets of packages, you can create layered requirements files and use one layer to constrain the other.
For example, if you have a Django project where you want the newest 2.1 release in production, and you also want the Django debug toolbar while developing, you can create two *.in files, one for each layer:
# requirements.in
django<2.2
At the top of the development requirements file, dev-requirements.in, use -c requirements.txt to constrain the dev requirements to packages already selected for production in requirements.txt.
# dev-requirements.in
-c requirements.txt
django-debug-toolbar
First, compile requirements.txt as usual:
$ pip-compile
#
# This file is autogenerated by pip-compile
# To update, run:
#
# pip-compile
#
django==2.1.15 # via -r requirements.in
pytz==2019.3 # via django
Now compile the dev requirements; the requirements.txt file is used as a constraint:
$ pip-compile dev-requirements.in
#
# This file is autogenerated by pip-compile
# To update, run:
#
# pip-compile dev-requirements.in
#
django-debug-toolbar==2.2 # via -r dev-requirements.in
django==2.1.15 # via -c requirements.txt, django-debug-toolbar
pytz==2019.3 # via -c requirements.txt, django
sqlparse==0.3.0 # via django-debug-toolbar
As you can see above, even though a 2.2 release of Django is available, the dev requirements only include a 2.1 version of Django because they were constrained. Now both compiled requirements files can be installed safely in the dev environment.
To install the requirements in production, use:
$ pip-sync
To install the requirements in development, use:
$ pip-sync requirements.txt dev-requirements.txt
Version control integration
You might use pip-compile as a hook for pre-commit. See the pre-commit docs for instructions. Sample .pre-commit-config.yaml:
repos:
  - repo: https://github.com/jazzband/pip-tools
    rev: 5.0.0
    hooks:
      - id: pip-compile
You might want to customize pip-compile args by configuring args and/or files, for example:
repos:
  - repo: https://github.com/jazzband/pip-tools
    rev: 5.0.0
    hooks:
      - id: pip-compile
        files: ^requirements/production\.(in|txt)$
        args: [--index-url=https://example.com, requirements/production.in]
Example usage for pip-sync
Now that you have a requirements.txt, you can use pip-sync to update your virtual environment to reflect exactly what’s in there. This will install/upgrade/uninstall everything necessary to match the requirements.txt contents.
Run it with pip-sync or python -m piptools sync. If you use multiple Python versions, you can also run py -X.Y -m piptools sync on Windows and pythonX.Y -m piptools sync on other systems.
pip-sync must be installed into and run from the same virtual environment as your project to identify which packages to install or upgrade.
Be careful: pip-sync is meant to be used only with a requirements.txt generated by pip-compile.
$ pip-sync
Uninstalling flake8-2.4.1:
  Successfully uninstalled flake8-2.4.1
Collecting click==4.1
  Downloading click-4.1-py2.py3-none-any.whl (62kB)
    100% |................................| 65kB 1.8MB/s
Found existing installation: click 4.0
Uninstalling click-4.0:
  Successfully uninstalled click-4.0
Successfully installed click-4.1
To sync multiple *.txt dependency lists, just pass them in via command line arguments, e.g.
$ pip-sync dev-requirements.txt requirements.txt
If no files are passed, pip-sync defaults to requirements.txt.
Any valid pip install flags or arguments may be passed with pip-sync’s --pip-args option, e.g.
$ pip-sync requirements.txt --pip-args '--no-cache-dir --no-deps'
Note: pip-sync will not upgrade or uninstall packaging tools like setuptools, pip, or pip-tools itself. Use python -m pip install --upgrade to upgrade those packages.
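For example, to upgrade pip and pip-tools themselves inside the active virtual environment:
(venv)$ python -m pip install --upgrade pip pip-tools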
Should I commit requirements.in and requirements.txt to source control?
Generally, yes. If you want a reproducible environment installation available from your source control, then yes, you should commit both requirements.in and requirements.txt to source control.
Note that if you are deploying on multiple Python environments (read the section below), then you must commit a separate output file for each Python environment. We suggest using the {env}-requirements.txt format (e.g. win32-py2.7-requirements.txt, macos-py3.6-requirements.txt, etc.).
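For example, on a macOS machine running Python 3.6, you might compile one of these environment-specific files like so (the file name simply follows the suggested convention):
(venv)$ pip-compile --output-file macos-py3.6-requirements.txt requirements.in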
Cross-environment usage of requirements.in/requirements.txt and pip-compile
The dependencies of a package can change depending on the Python environment in which it is installed. Here, we define a Python environment as the combination of Operating System, Python version (2.7, 3.6, etc.), and Python implementation (CPython, PyPy, etc.). For an exact definition, refer to the possible combinations of PEP 508 environment markers.
As the resulting requirements.txt can differ between environments, users must run pip-compile on each Python environment separately to generate a requirements.txt valid for that environment. The same requirements.in can be used as the source file for all environments, using PEP 508 environment markers as needed, the same way it would be done for regular pip cross-environment usage.
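For example, a requirements.in using an environment marker might look like this (the colorama dependency is purely illustrative):
# requirements.in
django
colorama ; sys_platform == "win32"  # illustrative: resolved only on Windows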
If the generated requirements.txt remains exactly the same for all Python environments, then it can be used across them safely. But users should be careful, as any package update can introduce environment-dependent dependencies, making any newly generated requirements.txt environment-dependent too. As a general rule, users should still run pip-compile on each targeted Python environment to avoid issues.
Other useful tools
pipdeptree to print the dependency tree of the installed packages.
requirements.in/requirements.txt syntax highlighting:
requirements.txt.vim for Vim.
Python extension for VS Code.
Deprecations
This section lists pip-tools features that are currently deprecated.
--index/--no-index command-line options: use --emit-index-url/--no-emit-index-url instead (since 5.2.0).
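For example, to keep the index URL out of the generated file with the newer options (assuming pip-tools 5.2.0 or later):
$ pip-compile --no-emit-index-url requirements.in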
Versions and compatibility
The table below summarizes the latest pip-tools versions and the pip versions they require.
pip-tools | pip
----------|-----------------
4.5.x     | 8.1.3 - 20.0.x
5.x       | 20.0.x - 20.1.x