For each candidate item, pip needs to know the project name and version. For
wheels (identified by the .whl file extension) this can be obtained from
the filename, as per the Wheel spec. For local directories, or explicitly
specified sdist files, the setup.py egg_info command is used to determine
the project metadata. For sdists located via an index, the filename is parsed
for the name and project version (in theory this is slightly less reliable
than using the egg_info command, but it avoids downloading and processing
unnecessarily many files).
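Parsing a name and version out of a wheel filename, as described above, can be sketched like this (a simplified illustration of the wheel naming convention, not pip's actual code):

```python
def parse_wheel_filename(filename):
    """Extract (name, version) from a wheel filename.

    Wheel filenames follow the pattern
    {distribution}-{version}(-{build})?-{python}-{abi}-{platform}.whl,
    so the first two hyphen-separated fields are name and version
    (the spec requires hyphens in the name itself to be escaped).
    """
    if not filename.endswith(".whl"):
        raise ValueError("not a wheel filename: %r" % filename)
    parts = filename[:-len(".whl")].split("-")
    if len(parts) not in (5, 6):  # 6 when the optional build tag is present
        raise ValueError("unexpected wheel filename: %r" % filename)
    return parts[0], parts[1]

print(parse_wheel_filename("numpy-1.9.2-cp34-none-win32.whl"))
# -> ('numpy', '1.9.2')
```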

Any URL may use the #egg=name syntax (see VCS Support) to
explicitly state the project name.

Once pip has the set of requirements to satisfy, it chooses which version of
each requirement to install using the simple rule that the latest version that
satisfies the given constraints will be installed (but see here
for an exception regarding pre-release versions). Where more than one source of
the chosen version is available, it is assumed that any source is acceptable
(as otherwise the versions would differ).

As of v6.1.0, pip installs dependencies before their dependents, i.e. in
"topological order". This is the only commitment pip currently makes related
to order. While it may be coincidentally true that pip will install things in
the order of the install arguments or in the order of the items in a
requirements file, this is not a promise.

In the event of a dependency cycle (aka "circular dependency"), the current
implementation (which might possibly change later) installs the first
encountered member of the cycle last.

For instance, suppose quux depends on foo, which depends on bar, which depends
on baz, which in turn depends on foo.
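The ordering rule can be sketched as a depth-first walk that skips cycle back-edges, so the first-encountered member of a cycle ends up installed last (an illustration of the documented behaviour, not pip's actual resolver code):

```python
def install_order(root, deps):
    """Dependencies before dependents; cycle back-edges are skipped,
    so the first-encountered member of a cycle is installed last."""
    order = []
    visiting = set()

    def visit(pkg):
        if pkg in order or pkg in visiting:
            return  # already scheduled, or a cycle back-edge
        visiting.add(pkg)
        for dep in deps.get(pkg, []):
            visit(dep)
        visiting.discard(pkg)
        order.append(pkg)

    visit(root)
    return order

deps = {"quux": ["foo"], "foo": ["bar"], "bar": ["baz"], "baz": ["foo"]}
print(install_order("quux", deps))  # -> ['baz', 'bar', 'foo', 'quux']
```

Here foo is the first-encountered member of the foo/bar/baz cycle, so it lands after baz and bar, with the dependent quux last of all.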

The decision to install topologically is based on the principle that
installations should proceed in a way that leaves the environment usable at each
step. This has two main practical benefits:

Concurrent use of the environment during the install is more likely to work.

A failed install is less likely to leave a broken environment. Although pip
would like to support failure rollbacks eventually, in the mean time, this is
an improvement.

Although the new install order is not intended to replace (and does not replace)
the use of setup_requires to declare build dependencies, it may help certain
projects (that might previously have failed) install from sdist, if they fit
the following profile:

They have build dependencies that are also declared as install dependencies
using install_requires.

python setup.py egg_info works without their build dependencies being
installed.

For whatever reason, they don't or won't declare their build dependencies using
setup_requires.

Since version 10, pip supports the use of environment variables inside the
requirements file. You can now store sensitive data (tokens, keys, etc.) in
environment variables and only specify the variable name in your requirements,
letting pip look up the value at runtime. This approach aligns with the commonly
used 12-factor configuration pattern.

You have to use the POSIX format for variable names, including curly braces
around the uppercase name, as shown in this example: ${API_TOKEN}. pip will attempt
to find the corresponding environment variable defined on the host system at
runtime.

Note

There is no support for other variable expansion syntaxes such as
$VARIABLE and %VARIABLE%.
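For instance, a requirements file could pull a token from the environment like this (the index URL and variable name here are made up for illustration):

```text
--extra-index-url https://${API_TOKEN}@pypi.example.com/simple
SomeProject
```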

####### example-requirements.txt #######
###### Requirements without Version Specifiers ######
nose
nose-cov
beautifulsoup4
###### Requirements with Version Specifiers ######
#   See https://www.python.org/dev/peps/pep-0440/#version-specifiers
docopt == 0.6.1             # Version Matching. Must be version 0.6.1
keyring >= 4.1.1            # Minimum version 4.1.1
coverage != 3.5             # Version Exclusion. Anything except version 3.5
Mopidy-Dirble ~= 1.1        # Compatible release. Same as >= 1.1, == 1.*
###### Refer to other requirements files ######
-r other-requirements.txt
###### A particular file ######
./downloads/numpy-1.9.2-cp34-none-win32.whl
http://wxpython.org/Phoenix/snapshot-builds/wxPython_Phoenix-3.0.3.dev1820+49a8884-cp34-none-win_amd64.whl
###### Additional Requirements without Version Specifiers ######
#   Same as 1st section, just here to show that you can put things in any order.
rejected
green
#

pip supports installing from a package index using a requirement
specifier. Generally speaking, a requirement
specifier is composed of a project name followed by optional version
specifiers. PEP 508 contains a full specification
of the format of a requirement (pip does not support the url_req form
of specifier at this time).

Since version 7.0, pip supports controlling the command line options given to
setup.py via requirements files. Using these options disables the use of wheels
(cached or otherwise) for that package, as setup.py does not exist for wheels.

The --global-option and --install-option options are used to pass
options to setup.py. For example:
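Such a requirements-file line might look like the following (FooProject and the option values are hypothetical):

```text
FooProject >= 1.2 --global-option="--no-user-cfg" \
                  --install-option="--prefix='/usr/local'" \
                  --install-option="--no-compile"
```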

Note that the only way of giving more than one option to setup.py
is through multiple --global-option and --install-option
options, as shown in the example above. The value of each option is
passed as a single argument to the setup.py script. Therefore, a
line such as the following is invalid and would result in an
installation error.
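A hypothetical instance of such an invalid line, cramming two setup.py options into a single --install-option value:

```text
FooProject >= 1.2 --install-option="--prefix=/usr/local --no-compile"
```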

Starting with v1.4, pip will only install stable versions as specified by
PEP 426 by default. If a version cannot be parsed as a compliant PEP 426
version, then it is assumed to be a pre-release.

If a requirement specifier includes a pre-release or development version
(e.g. >=0.0.dev0) then pip will allow pre-release and development versions
for that requirement. A != specifier does not count as such an inclusion.

The pip install command also supports a --pre flag
that will enable installing pre-releases and development releases.

For editable installs, the clone location by default is "<venv
path>/src/SomeProject" in virtual environments, and "<cwd>/src/SomeProject"
for global installs. The --src option can be used to
modify this location.

For non-editable installs, the project is built locally in a temp dir and then
installed normally. Note that if a satisfactory version of the package is
already installed, the VCS source will not overwrite it without an --upgrade
flag. VCS requirements pin the package version (specified in the setup.py
file) of the target commit, not necessarily the commit itself.
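As an illustration, a non-editable VCS requirement in a requirements file might look like this (the repository URL is hypothetical):

```text
git+https://git.example.com/MyProject.git@master#egg=MyProject
```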

The pip freeze subcommand will record the VCS requirement specifier
(referencing a specific commit) if and only if the install is done using the
editable option.

The "project name" component of the url suffix "egg=<project name>-<version>"
is used by pip in its dependency logic to identify the project prior
to pip downloading and analyzing the metadata. The optional "version"
component of the egg name is not functionally important. It merely
provides a human-readable clue as to what version is in use. For projects
where setup.py is not in the root of the project, the "subdirectory" component
is used. The value of the "subdirectory" component should be a path from the
root of the project to where setup.py is located.

So if your repository layout is:

pkg_dir/
    setup.py      # setup.py for package pkg
    some_module.py
other_dir/
    some_file
some_other_file

You'll need to use pip install -e "vcs+protocol://repo_url/#egg=pkg&subdirectory=pkg_dir".

Since version 10, pip also supports environment variables here, which makes it
possible to reference private repositories without storing access tokens in the
requirements file. For example, a private git repository allowing Basic Auth
for authentication can be referenced like this:
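The credentials come from environment variables at runtime (the host and variable names are illustrative):

```text
git+https://${AUTH_USER}:${AUTH_PASSWORD}@git.example.com/MyProject#egg=MyProject
```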

pip offers a number of Package Index Options for modifying how packages are found.

pip looks for packages in a number of places: on PyPI (if not disabled via
--no-index), in the local filesystem, and in any additional repositories
specified via --find-links or --index-url. There is no priority ordering among
these locations; rather, they are all checked, and the "best"
match for the requirements (in terms of version number - see PEP 440 for
details) is selected.

Starting with v6.0, pip provides an on-by-default cache which functions
similarly to that of a web browser. While the cache is on by default and is
designed to do the right thing, you can disable it and always
access PyPI by using the --no-cache-dir option.

When making any HTTP request pip will first check its local cache to determine
if it has a suitable response stored for that request which has not expired. If
it does then it simply returns that response and doesn't make the request.

If it has a response stored but it has expired, then it will make a
conditional request to refresh the cache, which will either return an empty
response (telling pip to simply use the cached item and refresh the expiration
timer) or a whole new response, which pip then stores in the
cache.
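The decision flow above can be sketched as follows (a toy model of the cache logic, not pip's actual CacheControl-based implementation; the 60-second lifetime is an arbitrary stand-in for the server-provided expiry):

```python
import time

class ToyHttpCache:
    """Fresh entries are served locally; stale ones are revalidated."""

    TTL = 60  # stand-in for a server-provided expiry

    def __init__(self):
        self._store = {}  # url -> (response, expires_at)

    def get(self, url, fetch, revalidate):
        entry = self._store.get(url)
        if entry is not None:
            response, expires_at = entry
            if time.time() < expires_at:
                return response  # fresh: no request made at all
            # Stale: conditional request. None models "304 Not Modified".
            fresh = revalidate(url)
            if fresh is None:
                # Reuse the cached body and refresh the expiration timer.
                self._store[url] = (response, time.time() + self.TTL)
                return response
            response = fresh
        else:
            response = fetch(url)  # nothing cached: full request
        self._store[url] = (response, time.time() + self.TTL)
        return response
```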

When storing items in the cache, pip will respect the Cache-Control header
if it exists, or it will fall back to the Expires header if that exists.
This allows pip to function as a browser would, and allows the index server
to communicate to pip how long it is reasonable to cache any particular item.

While this cache attempts to minimize network activity, it does not prevent
network access altogether. If you want a local install solution that
circumvents accessing PyPI, see Installing from local packages.

The default location for the cache directory depends on the Operating System:

Pip will read from the subdirectory wheels within the pip cache directory
and use any packages found there. This is disabled via the same
--no-cache-dir option that disables the HTTP cache. The internal structure
of that cache is not part of the pip API. As of 7.0, pip makes a subdirectory
for each sdist that wheels are built from and places the resulting wheels
inside.

Pip attempts to choose the best wheels from those built in preference to
building a new wheel. Note that this means that, when a package has optional
C extensions and builds py-tagged wheels when the C extension can't be built,
pip will not attempt to build a better wheel for Pythons that would have
supported it once any generic wheel is built. To correct this, make sure that
the wheels are built with Python-specific tags - e.g. pp on PyPy.

When no wheels are found for an sdist, pip will attempt to build a wheel
automatically and insert it into the wheel cache.

(The ability to use multiple hashes is important when a package has both
binary and source distributions or when it offers binary distributions for a
variety of platforms.)
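In a requirements file, supplying several acceptable hashes for one requirement looks like this (the project name and the digest placeholders are illustrative):

```text
SomeProject == 1.3 --hash=sha256:<hash of the sdist> \
                   --hash=sha256:<hash of the wheel>
```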

The recommended hash algorithm at the moment is sha256, but stronger ones are
allowed, including all those supported by hashlib. However, weaker ones
such as md5, sha1, and sha224 are excluded to avoid giving a false sense of
security.

Hash verification is an all-or-nothing proposition. Specifying a --hash
against any requirement not only checks that hash but also activates a global
hash-checking mode, which imposes several other security restrictions:

Hashes are required for all requirements. This is because a partially-hashed
requirements file is of little use and thus likely an error: a malicious
actor could slip bad code into the installation via one of the unhashed
requirements. Note that hashes embedded in URL-style requirements via the
#md5=... syntax suffice to satisfy this rule (regardless of hash
strength, for legacy reasons), though you should use a stronger
hash like sha256 whenever possible.

Hashes are required for all dependencies. An error results if there is a
dependency that is not spelled out and hashed in the requirements file.

Requirements that take the form of project names (rather than URLs or local
filesystem paths) must be pinned to a specific version using ==. This
prevents a surprising hash mismatch upon the release of a new version
that matches the requirement specifier.

--egg is disallowed, because it delegates installation of dependencies
to setuptools, giving up pip's ability to enforce any of the above.

Hash-checking mode can be forced on with the --require-hashes command-line
option:

$ pip install --require-hashes -r requirements.txt
...
Hashes are required in --require-hashes mode (implicitly on when a hash is
specified for any package). These requirements were missing hashes,
leaving them open to tampering. These are the hashes the downloaded
archives actually had. You can add lines like these to your requirements
files to prevent tampering.
pyelasticsearch==1.0 --hash=sha256:44ddfb1225054d7d6b1d02e9338e7d4809be94edbe9929a2ec0807d38df993fa
more-itertools==2.2 --hash=sha256:93e62e05c7ad3da1a233def6731e8285156701e3419a5fe279017c429ec67ce0

This can be useful in deploy scripts, to ensure that the author of the
requirements file provided hashes. It is also a convenient way to bootstrap
your list of hashes, since it shows the hashes of the packages it fetched. It
fetches only the preferred archive for each package, so you may still need to
add hashes for alternative archives using pip hash: for instance, if
there is both a binary and a source distribution.
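The digest that pip hash reports can be sketched with the standard hashlib module (a simplified stand-in for the actual command):

```python
import hashlib

def hash_file(path, algorithm="sha256", chunk_size=8192):
    """Hex digest of a file, read in chunks to bound memory use."""
    digest = hashlib.new(algorithm)
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()
```

The resulting digest is what goes after --hash=sha256: in a requirements file.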

The wheel cache is disabled in hash-checking mode to
prevent spurious hash mismatch errors. These would otherwise occur while
installing sdists that had already been automatically built into cached wheels:
those wheels would be selected for installation, but their hashes would not
match the sdist ones from the requirements file. A further complication is that
locally built wheels are nondeterministic: contemporary modification times make
their way into the archive, making hashes unpredictable across machines and
cache flushes. Compilation of C code adds further nondeterminism, as many
compilers include random-seeded values in their output. However, wheels fetched
from index servers are the same every time. They land in pip's HTTP cache, not
its wheel cache, and are used normally in hash-checking mode. The only downside
of having the wheel cache disabled is thus extra build time for sdists, and
this can be solved by making sure pre-built wheels are available from the index
server.

Beware of the setup_requires keyword arg in setup.py. The
(rare) packages that use it will cause those dependencies to be downloaded
by setuptools directly, skipping pip's hash-checking. If you need to use
such a package, see Controlling
setup_requires.

Warning

Be careful not to nullify all your security work when you install your
actual project by using setuptools directly: for example, by calling
python setup.py install, python setup.py develop, or
easy_install. Setuptools will happily go out and download, unchecked,
anything you missed in your requirements file—and it’s easy to miss things
as your project evolves. To be safe, install your project using pip and
--no-deps.

PyPI provides an MD5 hash in the fragment portion of each package download URL,
like #md5=123..., which pip checks as a protection against download
corruption. Other hash algorithms that have guaranteed support from hashlib
are also supported here: sha1, sha224, sha384, sha256, and sha512. Since this
hash originates remotely, it is not a useful guard against tampering and thus
does not satisfy the --require-hashes demand that every package have a
local hash.

(See the VCS Support section above for more information on VCS-related syntax.)

For local projects, the "SomeProject.egg-info" directory is created relative to
the project path. This is one advantage over just using setup.py develop,
which creates the "egg-info" directory relative to the current working
directory.

Setuptools offers the setup_requires keyword to setup()
for specifying dependencies that need to be present in order for the setup.py
script to run. Internally, Setuptools uses easy_install to fulfill these
dependencies.

pip has no way to control how these dependencies are located. None of the
Package Index Options have an effect.

Directory to unpack packages into and build in. Note that an initial build still takes place in a temporary directory. The location of temporary directories can be controlled by setting the TMPDIR environment variable (TEMP on Windows) appropriately.

Determines how dependency upgrading should be handled. "eager" - dependencies are upgraded regardless of whether the currently installed version satisfies the requirements of the upgraded package(s). "only-if-needed" - dependencies are upgraded only when they do not satisfy the requirements of the upgraded package(s).

Extra arguments to be supplied to the setup.py install command (for example, --install-option="--install-scripts=/usr/local/bin"). Use multiple --install-option options to pass multiple options to setup.py install. If you are using an option with a directory path, be sure to use an absolute path.

Do not use binary packages. Can be supplied multiple times, and each time adds to the existing value. Accepts either :all: to disable all binary packages, :none: to empty the set, or one or more package names with commas between them. Note that some packages are tricky to compile and may fail to install when this option is used on them.

Do not use source packages. Can be supplied multiple times, and each time adds to the existing value. Accepts either :all: to disable all source packages, :none: to empty the set, or one or more package names with commas between them. Packages without binary distributions will fail to install when this option is used on them.

Base URL of Python Package Index (default https://pypi.python.org/simple). This should point to a repository compliant with PEP 503 (the simple repository API) or a local directory laid out in the same format.