How to install Python packages with pip
# pip: PyPI Package Manager

pip is the most widely-used package manager for the Python Package Index, installed by default with recent versions of Python.

# Install Packages

To install the latest version of a package named SomePackage :
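For example:

```bash
pip install SomePackage
```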

To install a specific version of a package:
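A typical form (the version number here is only illustrative):

```bash
pip install SomePackage==1.0.4
```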

To specify a minimum version to install for a package:
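For example (again with an illustrative version; the quotes keep the shell from interpreting `>`):

```bash
pip install 'SomePackage>=1.0.4'
```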

If a command fails with a permission denied error on Linux/Unix, prefix it with sudo.

# Install from requirements files

Each line of a requirements file indicates something to be installed, expressed like arguments to pip install. Details on the format of the files are given in the Requirements File Format section below.
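To install everything listed in a requirements file:

```bash
pip install -r requirements.txt
```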

After installing packages, you can check what is installed using the freeze command.
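For example:

```bash
pip freeze
```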

# To list all packages installed using pip

To list installed packages:
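For example:

```bash
pip list
```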

To list outdated packages, and show the latest version available:
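For example:

```bash
pip list --outdated
```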

# Upgrade Packages
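To upgrade an already-installed package:

```bash
pip install --upgrade SomePackage
```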

This will upgrade SomePackage and all its dependencies. pip also automatically removes the older version of the package before upgrading.

To upgrade pip itself, use the same install command on the pip package.
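That is:

```bash
pip install --upgrade pip
```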

On Windows machines you may need to invoke pip through the Python interpreter instead.
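That is:

```bash
python -m pip install --upgrade pip
```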

# Uninstall Packages

To uninstall a package:
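For example:

```bash
pip uninstall SomePackage
```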

# Updating all outdated packages on Linux

pip doesn't currently provide a flag to update all outdated packages in one shot. However, this can be accomplished by piping commands together in a Linux environment.
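A commonly used pipeline (a sketch; adjust the flags to your environment):

```bash
pip list --local --outdated --format=freeze | grep -v '^\-e' | cut -d = -f 1 | xargs -n1 pip install -U
```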

This command takes all packages in the local virtualenv and checks if they are outdated. From that list, it gets the package name and then pipes that to a pip install -U command. At the end of this process, all local packages should be updated.

# Updating all outdated packages on Windows

pip doesn't currently provide a flag to update all outdated packages in one shot. However, this can be accomplished by piping commands together in a Windows environment.
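A possible PowerShell equivalent (a sketch, not an official pip feature):

```powershell
pip list --local --outdated --format=freeze | ForEach-Object { pip install -U $_.Split('=')[0] }
```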

This command takes all packages in the local virtualenv and checks if they are outdated. From that list, it gets the package name and then pipes that to a pip install -U command. At the end of this process, all local packages should be updated.

# Create a requirements.txt file of all packages on the system

pip assists in creating requirements.txt files by providing the freeze command.
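For example:

```bash
pip freeze > requirements.txt
```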

This will save a list of all packages and their versions installed on the system to a file named requirements.txt in the current folder.

# Create a requirements.txt file of packages only in the current virtualenv

pip assists in creating requirements.txt files by providing the freeze command; its --local parameter will only output a list of packages and versions that are installed locally in a virtualenv. Global packages will not be listed.
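For example, from inside an activated virtualenv:

```bash
pip freeze --local > requirements.txt
```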

# Using a certain Python version with pip

If you have both Python 3 and Python 2 installed, you can specify which version of Python you would like pip to use. This is useful when packages only support Python 2 or 3 or when you wish to test with both.

If you want to install packages for Python 2, run either:
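Either of the following, assuming a Python 2 pip is installed and on your PATH:

```bash
pip2 install SomePackage
python2 -m pip install SomePackage
```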

If you would like to install packages for Python 3, do:
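Correspondingly, assuming a Python 3 pip is on your PATH:

```bash
pip3 install SomePackage
python3 -m pip install SomePackage
```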

You can also invoke installation of a package to a specific python installation with:
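A hedged illustration, where the path points at whichever interpreter you want to target:

```bash
/path/to/python -m pip install SomePackage
```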

On OS X/Linux/Unix platforms it is important to be aware of the distinction between the system version of Python (which upgrading may render your system inoperable) and the user version(s) of Python. Depending on which you are trying to upgrade, you may need to prefix these commands with sudo and enter a password.

Likewise on Windows, some Python installations, especially those that are part of another package, can end up installed in system directories; those you will have to upgrade from a command window running in Admin mode. If it looks like you need to do this, it is a very good idea to check which Python installation you are trying to upgrade with a command such as `python -c "import sys; print(sys.path)"` or `py -3.5 -c "import sys; print(sys.path)"`. You can also check which pip you are trying to run with `pip --version`.

On Windows, if you have both python 2 and python 3 installed, and on your path and your python 3 is greater than 3.4 then you will probably also have the python launcher py on your system path. You can then do tricks like:
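For example (the version suffix is whatever you have installed):

```bash
py -2 -m pip install SomePackage
py -3.5 -m pip install SomePackage
```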

If you are running and maintaining multiple versions of Python, I would strongly recommend reading up on the Python virtualenv or venv virtual environments, which allow you to isolate both the version of Python and which packages are present.

# Installing packages not yet on pip as wheels

Many pure-Python packages are not yet available on the Python Package Index as wheels but still install fine. However, some packages on Windows give the dreaded vcvarsall.bat not found error.

The problem is that the package that you are trying to install contains a C or C++ extension and is not currently available as a pre-built wheel from the python package index, pypi, and on windows you do not have the tool chain needed to build such items.

The simplest answer is to go to Christoph Gohlke's excellent site of unofficial Windows binaries and locate the appropriate version of the libraries that you need. "Appropriate" means that, within the package name, the -cpNN- tag has to match your version of Python and the platform tag must match your interpreter: if you are using 32-bit Python (even on 64-bit Windows) the name must include -win32-, if you are using 64-bit Python it must include -win_amd64-, and the Python version must match as well, i.e. for Python 3.4 the filename must include -cp34-, etc. This is basically the magic that pip does for you on the PyPI site.

Alternatively, you need to get the appropriate Windows development kit for the version of Python that you are using, the headers for any library that the package you are trying to build interfaces to, possibly the Python headers for your version of Python, etc.

Python 2.7 used Visual Studio 2008, Python 3.3 and 3.4 used Visual Studio 2010, and Python 3.5+ uses Visual Studio 2015.

Then you may need to locate the header files, at the matching revision, for any libraries that your desired package links to, and download those to appropriate locations.

Finally you can let pip do your build — of course if the package has dependencies that you don’t yet have you may also need to find the header files for them as well.

Alternatives: it is also worth looking, both on PyPI and on Christoph's site, for a slightly earlier version of the package you want that is either pure Python or pre-built for your platform and Python version, and possibly using that until your package becomes available. Likewise, if you are using the very latest version of Python, it may take the package maintainers a little time to catch up, so for projects that really need a specific package you may have to use a slightly older Python for the moment. You can also check the package's source site to see if there is a forked version that is available pre-built or as pure Python, and search for alternative packages that provide the functionality you require but are available. One example that springs to mind is Pillow, an actively maintained, drop-in replacement for PIL, which at the time of writing had not been updated in 6 years and was not available for Python 3.

As an afterword, I would encourage anybody who is having this problem to go to the bug tracker for the package and add to, or raise if there isn't one already, a ticket politely requesting that the package maintainers provide a wheel on PyPI for your specific combination of platform and Python. If this is done, things will normally get better with time; some package maintainers simply don't realise that they have missed a combination that people may be using.

# Note on Installing Pre-Releases

pip follows the rules of Semantic Versioning and by default prefers released packages over pre-releases. So if a given package has been released as V0.98 and there is also a release candidate V1.0-rc1, the default behaviour of pip install will be to install V0.98. If you wish to install the release candidate (you are advised to test it in a virtual environment first), you can do so with pip install --pre package-name or pip install --pre --upgrade package-name. In many cases pre-releases or release candidates do not have wheels built for all platform and version combinations, so you are more likely to encounter the issues above.

# Note on Installing Development Versions

You can also use pip to install development versions of packages from GitHub and other locations. Since such code is in flux, it is very unlikely to have wheels built for it, so any impure packages will require the presence of the build tools, and they may be broken at any time. The user is therefore strongly encouraged to install such packages only in a virtual environment.

Three options exist for such installations:

  1. Download a compressed snapshot. Most online version control systems have the option to download a compressed snapshot of the code. This can be downloaded manually and then installed with pip install path/to/downloaded/file; note that for most compression formats pip will handle unpacking to a cache area, etc.
  2. Let pip handle the download and install for you with pip install URL/of/package/repository; you may also need to use the --trusted-host, --client-cert and/or --proxy flags for this to work correctly, especially in a corporate environment.
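A hedged illustration (the repository URL is a placeholder):

```bash
pip install git+https://github.com/someuser/someproject.git
```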

Note the git+ prefix to the URL.

  3. Clone the repository using git, mercurial or another acceptable tool, preferably a DVCS tool, and use pip install path/to/cloned/repo. This will both process any requirements file and perform the build and setup steps; you can instead change directory into your cloned repository manually and run pip install -r requirements.txt followed by python setup.py install to get the same effect. The big advantage of this approach is that, while the initial clone operation may take longer than the snapshot download, you can update to the latest with, in the case of git, git pull origin master, and if the current version contains errors you can use pip uninstall package-name, then use git checkout commands to move back through the repository history to earlier version(s), and retry.
# Syntax
  • pip install — Install packages
  • pip freeze — Output installed packages in requirements format
  • pip list — List installed packages
  • pip show — Show information about installed packages
  • pip search — Search PyPI for packages
  • pip wheel — Build wheels from your requirements
  • pip zip — Zip individual packages (deprecated)
  • pip unzip — Unzip individual packages (deprecated)
  • pip bundle — Create pybundles (deprecated)
  • pip help — Show help for commands

# Remarks

Sometimes pip will perform a manual compilation of native code. On Linux, Python will automatically choose an available C compiler on your system. On Windows, the required Visual Studio/Visual C++ version is the one matching your Python version, as listed above (newer versions will not work).

# pip User Guide (from pip's user_guide.rst)

      pip is a command line program. When you install pip, a pip command is added to your system, which can be run from the command prompt as follows:
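In its most general form:

```bash
pip <pip arguments>
```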

      pip supports installing from PyPI, version control, local projects, and directly from distribution files.

      The most common scenario is to install from PyPI using :ref:`Requirement Specifiers`
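For example (version numbers are illustrative):

```bash
pip install SomePackage               # latest version
pip install SomePackage==1.0.4        # specific version
pip install 'SomePackage>=1.0.4'      # minimum version
```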

      For more information and examples, see the :ref:`pip install` reference.

      Basic Authentication Credentials

      Using a Proxy Server

      When installing packages from PyPI, pip requires internet access, which in many corporate environments requires an outbound HTTP proxy server.

      pip can be configured to connect through a proxy server in various ways:

• using the --proxy command-line option to specify a proxy in the form scheme://[user:passwd@]proxy.server:port
• using proxy in a :ref:`config-file`
• by setting the standard environment variables http_proxy , https_proxy and no_proxy .
• using the environment variable PIP_USER_AGENT_USER_DATA to include a JSON-encoded string in the user-agent variable used in pip's requests.

"Requirements files" are files containing a list of items to be installed using :ref:`pip install` .
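They are used like this:

```bash
pip install -r requirements.txt
```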

      Details on the format of the files are here: :ref:`requirements-file-format` .

      Logically, a Requirements file is just a list of :ref:`pip install` arguments placed in a file. Note that you should not rely on the items in the file being installed by pip in any particular order.

Requirements files can also be served via a URL, e.g. http://example.com/requirements.txt, in addition to being stored as local files, so that they can be stored and served in a centralized place.

      In practice, there are 4 common uses of Requirements files:

      Requirements files are used to hold the result from :ref:`pip freeze` for the purpose of achieving :doc:`topics/repeatable-installs` . In this case, your requirement file contains a pinned version of everything that was installed when pip freeze was run.

      Requirements files are used to force pip to properly resolve dependencies. pip 20.2 and earlier doesn’t have true dependency resolution, but instead simply uses the first specification it finds for a project. E.g. if pkg1 requires pkg3>=1.0 and pkg2 requires pkg3>=1.0,<=2.0 , and if pkg1 is resolved first, pip will only use pkg3>=1.0 , and could easily end up installing a version of pkg3 that conflicts with the needs of pkg2 . To solve this problem, you can place pkg3>=1.0,<=2.0 (i.e. the correct specification) into your requirements file directly along with the other top level requirements. Like so:
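Such a requirements file might look like this (using the package names from the example above):

```
pkg1
pkg2
pkg3>=1.0,<=2.0
```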

      Requirements files are used to force pip to install an alternate version of a sub-dependency. For example, suppose ProjectA in your requirements file requires ProjectB , but the latest version (v1.3) has a bug, you can force pip to accept earlier versions like so:
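For example:

```
ProjectA
ProjectB<1.3
```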

      Requirements files are used to override a dependency with a local patch that lives in version control. For example, suppose a dependency SomeDependency from PyPI has a bug, and you can’t wait for an upstream fix. You could clone/copy the src, make the fix, and place it in VCS with the tag sometag . You’d reference it in your requirements file with a line like so:
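An illustrative line (the VCS host is a placeholder):

```
git+https://myvcs.example.com/some_dependency@sometag#egg=SomeDependency
```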

      If SomeDependency was previously a top-level requirement in your requirements file, then replace that line with the new line. If SomeDependency is a sub-dependency, then add the new line.

      It’s important to be clear that pip determines package dependencies using install_requires metadata, not by discovering requirements.txt files embedded in projects.

Constraints files are requirements files that only control which version of a requirement is installed, not whether it is installed or not. Their syntax and contents are a subset of :ref:`Requirements Files` , with several kinds of syntax not allowed: constraints must have a name, they cannot be editable, and they cannot specify extras. In terms of semantics, there is one key difference: including a package in a constraints file does not trigger installation of the package.

      Use a constraints file like so:
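For example:

```bash
pip install -c constraints.txt
```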

Constraints files are used for exactly the same reason as requirements files when you don't know exactly what things you want to install. For instance, say that the "helloworld" package doesn't work in your environment, so you have a local patched version. Some things you install depend on "helloworld", and some don't.

One way to ensure that the patched version is used consistently is to manually audit the dependencies of everything you install, and if "helloworld" is present, write a requirements file to use when installing that thing.

Constraints files offer a better way: write a single constraints file for your organisation and use that everywhere. If the thing being installed requires "helloworld" to be installed, your fixed version specified in your constraints file will be used.

      Constraints file support was added in pip 7.1. In :ref:`Resolver changes 2020` we did a fairly comprehensive overhaul, removing several undocumented and unsupported quirks from the previous implementation, and stripped constraints files down to being purely a way to specify global (version) limits for packages.

      Same as requirements files, constraints files can also be served via a URL, e.g. http://example.com/constraints.txt, so that your organization can store and serve them in a centralized place.

      Installing from Wheels

"Wheel" is a built, archive format that can greatly speed installation compared to building and installing from source archives. For more information, see the Wheel docs, PEP 427, and PEP 425.

pip prefers Wheels where they are available. To disable this, use the :ref:`--no-binary <install_--no-binary>` flag for :ref:`pip install` .

      If no satisfactory wheels are found, pip will default to finding source archives.

      To install directly from a wheel archive:
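For example (the wheel filename is illustrative):

```bash
pip install SomePackage-1.0-py2.py3-none-any.whl
```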

      To include optional dependencies provided in the provides_extras metadata in the wheel, you must add quotes around the install target name:
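For example (filename and extra name are illustrative):

```bash
pip install './somepackage-1.0-py2.py3-none-any.whl[my-extras]'
```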

      In the future, the path[extras] syntax may become deprecated. It is recommended to use PEP 508 syntax wherever possible.

      For the cases where wheels are not available, pip offers :ref:`pip wheel` as a convenience, to build wheels for all your requirements and dependencies.

:ref:`pip wheel` requires the wheel package to be installed, which provides the "bdist_wheel" setuptools extension that it uses.

      To build wheels for your requirements and all their dependencies to a local directory:
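For example:

```bash
pip wheel --wheel-dir=/local/wheels -r requirements.txt
```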

      And then to install those requirements just using your local directory of wheels (and not from PyPI):
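For example:

```bash
pip install --no-index --find-links=/local/wheels -r requirements.txt
```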

      pip is able to uninstall most packages like so:
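For example:

```bash
pip uninstall SomePackage
```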

      pip also performs an automatic uninstall of an old version of a package before upgrading to a newer version.

      For more information and examples, see the :ref:`pip uninstall` reference.

      To list installed packages:
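That is:

```bash
pip list
```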

      To list outdated packages, and show the latest version available:
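Similarly:

```bash
pip list --outdated
```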

      To show details about an installed package:
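For example:

```bash
pip show SomePackage
```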

      For more information and examples, see the :ref:`pip list` and :ref:`pip show` reference pages.

      Searching for Packages

      pip can search PyPI for packages using the pip search command:
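For example:

```bash
pip search "query"
```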

      The query will be used to search the names and summaries of all packages.

      For more information and examples, see the :ref:`pip search` reference.

      pip comes with support for command line completion in bash, zsh and fish.

      To setup for bash:

      To setup for zsh:

      To setup for fish:

      To setup for powershell:
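The corresponding commands, roughly as given in pip's documentation (adapt the target files to your setup):

```
# bash
python -m pip completion --bash >> ~/.profile

# zsh
python -m pip completion --zsh >> ~/.zprofile

# fish
python -m pip completion --fish > ~/.config/fish/completions/pip.fish

# powershell
python -m pip completion --powershell | Out-File -Encoding default -Append $PROFILE
```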

      Alternatively, you can use the result of the completion command directly with the eval function of your shell, e.g. by adding the following to your startup file:
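For example, for bash:

```bash
eval "$(pip completion --bash)"
```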

      Installing from local packages

      In some cases, you may want to install from local packages only, with no traffic to PyPI.

      First, download the archives that fulfill your requirements:
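For example (DIR is a local directory of your choosing):

```bash
pip download -d DIR -r requirements.txt
```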

      Note that pip download will look in your wheel cache first, before trying to download from PyPI. If you’ve never installed your requirements before, you won’t have a wheel cache for those items. In that case, if some of your requirements don’t come as wheels from PyPI, and you want wheels, then run this instead:
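That is, build the wheels yourself into the same directory:

```bash
pip wheel --wheel-dir DIR -r requirements.txt
```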

"Only if needed" Recursive Upgrade

pip install --upgrade now has a --upgrade-strategy option which controls how pip handles upgrading of dependencies. There are 2 upgrade strategies supported:

      • eager : upgrades all dependencies regardless of whether they still satisfy the new parent requirements
      • only-if-needed : upgrades a dependency only if it does not satisfy the new parent requirements

      The default strategy is only-if-needed . This was changed in pip 10.0 due to the breaking nature of eager when upgrading conflicting dependencies.

It is important to note that --upgrade affects direct requirements (e.g. those specified on the command-line or via a requirements file) while --upgrade-strategy affects indirect requirements (dependencies of direct requirements).

      As an example, say SomePackage has a dependency, SomeDependency , and both of them are already installed but are not the latest available versions:

• pip install SomePackage : will not upgrade the existing SomePackage or SomeDependency.
• pip install --upgrade SomePackage : will upgrade SomePackage, but not SomeDependency (unless a minimum requirement is not met).
• pip install --upgrade SomePackage --upgrade-strategy=eager : upgrades both SomePackage and SomeDependency.

As a historic note, an earlier "fix" for getting the only-if-needed behaviour was a two-step invocation.
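Namely:

```bash
pip install --upgrade --no-deps SomePackage
pip install SomePackage
```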

      A proposal for an upgrade-all command is being considered as a safer alternative to the behaviour of eager upgrading.

With Python 2.6 came the "user scheme" for installation, which means that all Python distributions support an alternative install location that is specific to a user. The default location for each OS is explained in the python documentation for the site.USER_BASE variable. This mode of installation can be turned on by specifying the :ref:`--user <install_--user>` option to pip install .

Moreover, the "user scheme" can be customized by setting the PYTHONUSERBASE environment variable, which updates the value of site.USER_BASE .

To install "SomePackage" into an environment with site.USER_BASE customized to /myappenv, do the following.
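For example:

```bash
export PYTHONUSERBASE=/myappenv
pip install --user SomePackage
```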

pip install --user follows four rules:

1. When globally installed packages are on the python path, and they conflict with the installation requirements, they are ignored, and not uninstalled.
2. When globally installed packages are on the python path, and they satisfy the installation requirements, pip does nothing, and reports that requirement is satisfied (similar to how global packages can satisfy requirements when installing packages in a --system-site-packages virtualenv).
3. pip will not perform a --user install in a --no-site-packages virtualenv (i.e. the default kind of virtualenv), due to the user site not being on the python path. The installation would be pointless.
4. In a --system-site-packages virtualenv, pip will not install a package that conflicts with a package in the virtualenv site-packages. The --user installation would lack sys.path precedence and be pointless.

      To make the rules clearer, here are some examples:

From within a --no-site-packages virtualenv (i.e. the default kind):

From within a --system-site-packages virtualenv where SomePackage==0.3 is already installed in the virtualenv:

      From within a real python, where SomePackage is not installed globally:

      From within a real python, where SomePackage is installed globally, but is not the latest version:

      From within a real python, where SomePackage is installed globally, and is the latest version:

      Fixing conflicting dependencies

      Using pip from your program

      As noted previously, pip is a command line program. While it is implemented in Python, and so is available from your Python code via import pip , you must not use pip’s internal APIs in this way. There are a number of reasons for this:

      1. The pip code assumes that it is in sole control of the global state of the program. pip manages things like the logging system configuration, or the values of the standard IO streams, without considering the possibility that user code might be affected.
      2. pip’s code is not thread safe. If you were to run pip in a thread, there is no guarantee that either your code or pip’s would work as you expect.
      3. pip assumes that once it has finished its work, the process will terminate. It doesn’t need to handle the possibility that other code will continue to run after that point, so (for example) calling pip twice in the same process is likely to have issues.

      This does not mean that the pip developers are opposed in principle to the idea that pip could be used as a library — it’s just that this isn’t how it was written, and it would be a lot of work to redesign the internals for use as a library, handling all of the above issues, and designing a usable, robust and stable API that we could guarantee would remain available across multiple releases of pip. And we simply don’t currently have the resources to even consider such a task.

      What this means in practice is that everything inside of pip is considered an implementation detail. Even the fact that the import name is pip is subject to change without notice. While we do try not to break things as much as possible, all the internal APIs can change at any time, for any reason. It also means that we generally won’t fix issues that are a result of using pip in an unsupported way.

      It should also be noted that installing packages into sys.path in a running Python process is something that should only be done with care. The import system caches certain data, and installing new packages while a program is running may not always behave as expected. In practice, there is rarely an issue, but it is something to be aware of.

      Having said all of the above, it is worth covering the options available if you decide that you do want to run pip from within your program. The most reliable approach, and the one that is fully supported, is to run pip in a subprocess. This is easily done using the standard subprocess module:

If you want to process the output further, use one of the other APIs in the module. We are using freeze here, which outputs installed packages in requirements format.

      If you don’t want to use pip’s command line functionality, but are rather trying to implement code that works with Python packages, their metadata, or PyPI, then you should consider other, supported, packages that offer this type of ability. Some examples that you could consider include:

      • packaging — Utilities to work with standard package metadata (versions, requirements, etc.)
      • setuptools (specifically pkg_resources ) — Functions for querying what packages the user has installed on their system.
      • distlib — Packaging and distribution utilities (including functions for interacting with PyPI).

      Changes to the pip dependency resolver in 20.3 (2020)

      pip 20.3 has a new dependency resolver, on by default for Python 3 users. (pip 20.1 and 20.2 included pre-release versions of the new dependency resolver, hidden behind optional user flags.) Read below for a migration guide, how to invoke the legacy resolver, and the deprecation timeline. We also made a two-minute video explanation you can watch.

      We will continue to improve the pip dependency resolver in response to testers’ feedback. Please give us feedback through the resolver testing survey.

      The big change in this release is to the pip dependency resolver within pip.

Computers need to know the right order to install pieces of software ("to install x , you need to install y first"). So, when Python programmers share software as packages, they have to precisely describe those installation prerequisites, and pip needs to navigate tricky situations where it's getting conflicting instructions. This new dependency resolver will make pip better at handling that tricky logic, and make pip easier for you to use and troubleshoot.

      The most significant changes to the resolver are:

• It will reduce inconsistency: it will no longer install a combination of packages that is mutually inconsistent. In older versions of pip, it is possible for pip to install a package which does not satisfy the declared requirements of another installed package. For example, in pip 20.0, pip install "six<1.12" "virtualenv==20.0.2" does the wrong thing, "successfully" installing six==1.11, even though virtualenv==20.0.2 requires six>=1.12.0,<2. The new resolver, instead, outright rejects installing anything if it gets that input.
• It will be stricter: if you ask pip to install two packages with incompatible requirements, it will refuse (rather than installing a broken combination, like it did in previous versions).

      So, if you have been using workarounds to force pip to deal with incompatible or inconsistent requirements combinations, now’s a good time to fix the underlying problem in the packages, because pip will be stricter from here on out.

      This also means that, when you run a pip install command, pip only considers the packages you are installing in that command, and may break already-installed packages. It will not guarantee that your environment will be consistent all the time. If you pip install x and then pip install y , it’s possible that the version of y you get will be different than it would be if you had run pip install x y in a single command. We are considering changing this behavior (per :issue:`7744` ) and would like your thoughts on what pip’s behavior should be; please answer our survey on upgrades that create conflicts.

      We are also changing our support for :ref:`Constraints Files` , editable installs, and related functionality. We did a fairly comprehensive overhaul and stripped constraints files down to being purely a way to specify global (version) limits for packages, and so some combinations that used to be allowed will now cause errors. Specifically:

      • Constraints don’t override the existing requirements; they simply constrain what versions are visible as input to the resolver (see :issue:`9020` )
      • Providing an editable requirement ( -e . ) does not cause pip to ignore version specifiers or constraints (see :issue:`8076` ), and if you have a conflict between a pinned requirement and a local directory then pip will indicate that it cannot find a version satisfying both (see :issue:`8307` )
      • Hash-checking mode requires that all requirements are specified as a == match on a version and may not work well in combination with constraints (see :issue:`9020` and :issue:`8792` )
      • If necessary to satisfy constraints, pip will happily reinstall packages, upgrading or downgrading, without needing any additional command-line options (see :issue:`8115` and :doc:`development/architecture/upgrade-options` )
      • Unnamed requirements are not allowed as constraints (see :issue:`6628` and :issue:`8210` )
      • Links are not allowed as constraints (see :issue:`8253` )
      • Constraints cannot have extras (see :issue:`6628` )

      Per our :ref:`Python 2 Support` policy, pip 20.3 users who are using Python 2 will use the legacy resolver by default. Python 2 users should upgrade to Python 3 as soon as possible, since in pip 21.0 in January 2021, pip dropped support for Python 2 altogether.

      How to upgrade and migrate

Install pip 20.3 with python -m pip install --upgrade pip .

      Validate your current environment by running pip check . This will report if you have any inconsistencies in your set of installed packages. Having a clean installation will make it much less likely that you will hit issues with the new resolver (and may address hidden problems in your current environment!). If you run pip check and run into stuff you can’t figure out, please ask for help in our issue tracker or chat.

      Test the new version of pip.

      While we have tried to make sure that pip’s test suite covers as many cases as we can, we are very aware that there are people using pip with many different workflows and build processes, and we will not be able to cover all of those without your help.

      • If you use pip to install your software, try out the new resolver and let us know if it works for you with pip install . Try:
        • installing several packages simultaneously
        • re-creating an environment using a requirements.txt file
  • using pip install --force-reinstall to check whether it does what you think it should
  • using constraints files
  • the "Setups to test with special attention" and "Examples to try" below

        Troubleshoot and try these workarounds if necessary.

  • If pip is taking longer to install packages, read :doc:`Dependency resolution backtracking <topics/dependency-resolution>` for ways to reduce the time pip spends backtracking due to dependency conflicts.
  • If you don't want pip to actually resolve dependencies, use the --no-deps option. This is useful when you have a set of package versions that work together in reality, even though their metadata says that they conflict. For guidance on a long-term fix, read :ref:`Fixing conflicting dependencies` .
  • If you run into resolution errors and need a workaround while you're fixing their root causes, you can choose the old resolver behavior using the flag --use-deprecated=legacy-resolver . This will work until we release pip 21.0 (see :ref:`Deprecation timeline for 2020 resolver changes` ).

        Please report bugs through the resolver testing survey.

        Setups to test with special attention

        • Requirements files with 100+ packages
        • Installation workflows that involve multiple requirements files
        • Requirements files that include hashes ( :ref:`hash-checking mode` ) or pinned dependencies (perhaps as output from pip-compile within pip-tools )
        • Using :ref:`Constraints Files`
        • Continuous integration/continuous deployment setups
        • Installing from any kind of version control systems (i.e., Git, Subversion, Mercurial, or CVS), per :doc:`topics/vcs-support`
        • Installing from source code held in local directories

        Examples to try

        • pip install
        • pip uninstall
        • pip check
        • pip cache

        Specific things we’d love to get feedback on:

        • Cases where the new resolver produces the wrong result, obviously. We hope there won’t be too many of these, but we’d like to trap such bugs before we remove the legacy resolver.
        • Cases where the resolver produced an error when you believe it should have been able to work out what to do.
        • Cases where the resolver gives an error because there’s a problem with your requirements, but you need better information to work out what’s wrong.
        • If you have workarounds to address issues with the current resolver, does the new resolver let you remove those workarounds? Tell us!

        Please let us know through the resolver testing survey.

        We plan for the resolver changeover to proceed as follows, using :ref:`Feature Flags` and following our :ref:`Release Cadence` :

  • pip 20.1: an alpha version of the new resolver was available, opt-in, using the optional flag --unstable-feature=resolver . pip defaulted to legacy behavior.
  • pip 20.2: a beta of the new resolver was available, opt-in, using the flag --use-feature=2020-resolver . pip defaulted to legacy behavior. Users of pip 20.2 who want pip to default to using the new resolver can run pip config set global.use-feature 2020-resolver (for more on that and the alternate PIP_USE_FEATURE environment variable option, see issue 8661).
  • pip 20.3: pip defaults to the new resolver in Python 3 environments, but a user can opt out and choose the old resolver behavior, using the flag --use-deprecated=legacy-resolver . In Python 2 environments, pip defaults to the old resolver, and the new one is available using the flag --use-feature=2020-resolver .
  • pip 21.0: pip uses the new resolver by default, and the old resolver is no longer supported. It will be removed after a currently undecided amount of time, as the removal is dependent on pip's volunteer maintainers' availability. Python 2 support is removed per our :ref:`Python 2 Support` policy.

        Since this work will not change user-visible behavior described in the pip documentation, this change is not covered by the :ref:`Deprecation Policy` .

        Context and followup

As discussed in our announcement on the PSF blog, the pip team are in the process of developing a new "dependency resolver" (the part of pip that works out what to install based on your requirements).

        We’re tracking our rollout in :issue:`6536` and you can watch for announcements on the low-traffic packaging announcements list and the official Python blog.

Creating virtual environments and installing libraries for Python 3 in the PyCharm IDE

The Python programming language is considered fairly simple. Programs are written in it more easily and quickly than in compiled programming languages. There are many libraries for Python that let you solve practically any task. There are, of course, downsides and other nuances, but that is a separate topic.

Quite often I see friends and acquaintances start learning Python and run into problems installing and using third-party libraries. They can spend several hours installing a library, and may even fail and give up on it, while in most cases it could have been done in a few minutes.

The article starts with the basics: installing Python 3, the development tools Pip and Virtualenv, and the PyCharm IDE on Windows and on Ubuntu. For many people this presents no difficulty, and everything may already be installed.

After that comes what the article was really written for: I will show how to create and use virtual environments in PyCharm and install libraries into them with Pip.

Installing Python and Pip

Pip is a package manager for Python. It is what is normally used to install modules/libraries for development in the form of packages. On Windows, Pip can be installed through the standard Python installer. On Ubuntu, Pip is installed separately.

Installing Python and Pip on Windows

For Windows, go to the official download page and from there to the download page of the specific Python version. I use Python 3.6.8, because LLVM 9 requires Python 3.6 to be installed.

Then, in the table of files, choose "Windows x86-64 executable installer" for a 64-bit system or "Windows x86 executable installer" for a 32-bit one, and run the downloaded installer; for Python 3.8.1, for example, it is called python-3.8.1-amd64.exe.

During installation, tick the Add Python 3.x to PATH checkbox and click Install Now:

Installing Python 3 on Windows 10 (screenshot)

Installing Python and Pip on Ubuntu

On Ubuntu, Python 3 can be installed from the terminal. Open it and enter the installation command; the second command prints the Python version.
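A sketch of the intended commands (assuming the standard apt package names):

```bash
sudo apt install python3
python3 --version
```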

Next, install Pip and upgrade it. After the upgrade you need to restart the current session (or reboot the computer), otherwise an error will occur when invoking Pip.
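Likely the commands were along these lines (assuming the python3-pip apt package):

```bash
sudo apt install python3-pip
pip3 install --upgrade pip
```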

Basic Pip commands

Let's look at the basic commands for working with Pip at the Windows command prompt and in the Ubuntu terminal.

| Command | Description |
| --- | --- |
| pip help | Help on commands |
| pip search package_name | Search for a package |
| pip show package_name | Show information about a package |
| pip install package_name | Install a package (or packages) |
| pip uninstall package_name | Remove a package (or packages) |
| pip list | List installed packages |
| pip install -U package_name | Upgrade a package (or packages) |

If virtual environments are not being used, it is useful to add the --user flag when installing packages, so that they are installed locally for the current user only.

Installing VirtualEnv and VirtualEnvWrapper

VirtualEnv is used to create virtual environments for Python programs. This is needed to avoid conflicts, letting you install one version of a library for one program and another version for a second one. The full convenience of VirtualEnv becomes apparent in practice.

Installing VirtualEnv and VirtualEnvWrapper on Windows

At the command prompt, run the following commands.
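The article's original commands were stripped; presumably they install virtualenv together with the usual Windows wrapper package, something like:

```bash
pip install virtualenv
pip install virtualenvwrapper-win
```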

Installing VirtualEnv and VirtualEnvWrapper on Ubuntu

For Ubuntu, the installation command is along the following lines.
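Most likely (a hedged reconstruction):

```bash
pip3 install virtualenv virtualenvwrapper   # may need sudo or --user depending on your setup
```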

After that, add the virtualenvwrapper initialization lines to the end of your ~/.bashrc file.
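A hedged sketch of the usual lines (exact paths differ between systems):

```bash
export WORKON_HOME=$HOME/.virtualenvs
export VIRTUALENVWRAPPER_PYTHON=/usr/bin/python3
source ~/.local/bin/virtualenvwrapper.sh   # path depends on how virtualenvwrapper was installed
```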

When you open a new terminal, you should see messages beginning with virtualenvwrapper.user_scripts creating, which indicates that the installation completed successfully.

Working with VirtualEnv virtual environments

Let's look at the basic commands for working with VirtualEnv at the Windows command prompt and in the Ubuntu terminal.

| Command | Description |
| --- | --- |
| mkvirtualenv env-name | Create a new environment |
| workon | List the environments |
| workon env-name | Switch to an environment |
| deactivate | Leave the environment |
| rmvirtualenv env-name | Delete an environment |

While inside one of the environments you can install packages with Pip as usual, and there is no need to add the --user flag.
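For example:

```bash
pip install package_name
```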

On Windows you can set the WORKON_HOME environment variable to override the path where virtual environments are stored. By default the path %USERPROFILE%\Envs is used.

Installing PyCharm

PyCharm is an integrated development environment for the Python programming language. It has all the basics needed for development. In our case, what matters most is PyCharm's good integration with VirtualEnv and Pip, and that is what we will be using.

Installing PyCharm on Windows

Download the PyCharm Community installer for Windows from the official JetBrains site. If you know how to verify checksums of downloaded files, don't forget to do so.

There is nothing special about the installation itself: essentially you just click Next and finally Install. The only thing worth doing is removing the version number from the name of the installation folder, since PyCharm is constantly updated and the stated version will eventually become wrong.

Installing PyCharm on Ubuntu

Download the PyCharm Community archive for Linux from the official JetBrains site. Verifying checksums is very good practice, so if you know how, don't skip the check.

Unpack the PyCharm archive and rename the program folder to pycharm-community, removing the version from the name.

Now, in the ~/.local directory (Ctrl + H shows hidden files), create an opt folder and move pycharm-community into it. As a result, the bin, help and other folders should end up under ~/.local/opt/pycharm-community. This way PyCharm sits in its own modest place and does not get in anyone's way.

Next, run the commands in the terminal.
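The article's exact commands were lost; presumably they launch PyCharm from the unpacked folder for the first time, for example:

```bash
cd ~/.local/opt/pycharm-community/bin
./pycharm.sh
```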

Complete the first-run setup. At the end it is very important not to forget to create a desktop file for launching PyCharm: in the Welcome window, click Configure → Create Desktop Entry in the lower right corner.

Creating the desktop file (screenshot)

Installing PyCharm on Ubuntu from a snap package

PyCharm can now also be installed from a snap package. If you are using Ubuntu 16.04 or later, you can install PyCharm from the command line.
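Presumably with the standard snap command:

```bash
sudo snap install pycharm-community --classic
```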

Using VirtualEnv and Pip in PyCharm

Support for Pip and Virtualenv appeared in PyCharm quite a long time ago. Problems do come up occasionally, of course, but the integration is mostly stable.

Let's look at two ways of working with virtual environments:

1. Create a project with its own virtual environment, into which the required libraries are then installed;
2. Create a virtual environment beforehand and install the required libraries into it. Then, when creating projects in PyCharm, that environment can be selected, i.e. used for several projects.

First example: using the project's own virtual environment

We will create a program that generates an image with three plots of the Gaussian normal distribution. For this, the matplotlib and numpy libraries will be used, installed into a virtual environment created specifically for this program.

Start PyCharm and choose Create New Project in the welcome window.

In the project creation wizard, specify the path of the new project in the Location field. The name of the final directory is also the project name; in this example the directory is called 'first_program'.

Next, expand the environment settings by clicking Project Interpreter and choose New environment using Virtualenv. The environment location path is generated automatically. On Windows you can change the venv folder in the path to Envs so that the workon command finds the environments created in PyCharm. There is no need to tick any additional checkboxes. Click Create.

Setting up the first program in PyCharm (screenshot)

Now let's install the libraries we will use in the program. From the main menu, open the settings via File → Settings, then go to Project: project_name → Project Interpreter.

The project's clean environment (screenshot)

Here we see a table listing the installed packages. Initially only two packages are installed: pip and setuptools.

To the right of the table there is a control panel with four buttons:

• The plus button adds a package to the environment;
• The minus button removes a package from the environment;
• The triangle button upgrades a package;
• The eye button toggles showing pre-release versions of packages.

To add (install) a library into the environment, click the plus button and enter the library name in the search field; in this example we will install matplotlib. Additionally, you can set the version of the package to install via Specify version and pass parameters via Options. For matplotlib no extra parameters are needed right now. Click Install Package to install.

Installing the matplotlib library (screenshot)

After the installation, close the package-adding window: you will see that the matplotlib package has been added to the project environment together with its dependencies, including the numpy package. Exit the settings.

Now we can create a code file in the project, for example first.py, containing the program code.

To run the program, you need to create a run configuration. To do this, click the Add Configuration... button in the upper right corner. The Run/Debug Configurations window opens; click the plus button (Add New Configuration) in its upper right corner and choose Python.

Then enter the configuration name in the Name field and the location of the Python file with the program code in the Script path field. Leave the other parameters alone. Finally click Apply, then OK.

Creating a run configuration for the Python program (screenshot)

Now you can run the program, and a gauss.png file will appear in the program's directory:

Plots of the Gaussian normal distribution (screenshot)

Second example: using a pre-created virtual environment

This approach is handy while learning a library. Say we are studying PySide2 and will have to create many projects. Creating a separate environment for every project is rather wasteful: the packages have to be downloaded every time, and free space on local disks is limited.

It is more practical to prepare an environment with the required libraries installed in advance and use that environment when creating projects.

In this example we will create a pyside2 virtual environment and install that library into it. Then we will create a program that uses the PySide2 library from the pre-created virtual environment. The program will show a label displaying the version of the installed PySide2 library.

Start from the PyCharm welcome screen (you need to close the current project to get there). In the lower right corner of the welcome screen, open the settings via Configure → Settings, then go to the Project Interpreter section. In the upper right corner there is a gear button; click it and choose Add... to create a new environment, and specify its location. The name of the final directory will also be the name of the environment itself, in this example pyside2. On Windows you can change the venv folder in the path to Envs so that the workon command finds environments created in PyCharm. Click OK.

Creating the environment for PySide2 (screenshot)

Next, install the PySide2 package into the newly created environment, just as we installed matplotlib earlier, and exit the settings.

Now we can create a new project that uses the PySide2 library. In the welcome window choose Create New Project.

In the project creation wizard, specify the project location in the Location field. Expand the environment settings by clicking Project Interpreter, choose Existing interpreter and point it at the pyside2 environment we prepared.

Creating a new project that uses the PySide2 library (screenshot)

To check that the library works, create a second.py file with the test code.

Then create a run configuration for the program, just as in the first example, after which you can run it.

Conclusion

I do not have extensive experience programming in Python, and I am not familiar with other Python IDEs, so those IDEs may well be able to work with Pip and Virtualenv too. Pip and Virtualenv can also be used directly from the command prompt or terminal. Installing a library via Pip can fail with an error; there are ways to install libraries without Pip, and virtual environments can be created with tools other than Virtualenv.

In short, I have shared only a small part of my experience in this area. But, without going too deep into the weeds, it is quite enough to let you write simple Python programs that use third-party libraries.

pip install

        pip also supports installing from "requirements files", which provide an easy way to specify a whole environment to be installed.

Overview

        Pip install has several stages:

        1. Identify the base requirements. The user supplied arguments are processed here.
        2. Resolve dependencies. What will be installed is determined here.
        3. Build wheels. All the dependencies that can be are built into wheels.
        4. Install the packages (and uninstall anything being upgraded/replaced).

Argument Handling

        When looking at the items to be installed, pip checks what type of item each is, in the following order:

        1. Project or archive URL.
        2. Local directory (which must contain a setup.py , or pip will report an error).
        3. Local file (a sdist or wheel format archive, following the naming conventions for those formats).
        4. A requirement, as specified in PEP 440.

        Each item identified is added to the set of requirements to be satisfied by the install.

Working Out the Name and Version

        For each candidate item, pip needs to know the project name and version. For wheels (identified by the .whl file extension) this can be obtained from the filename, as per the Wheel spec. For local directories, or explicitly specified sdist files, the setup.py egg_info command is used to determine the project metadata. For sdists located via an index, the filename is parsed for the name and project version (this is in theory slightly less reliable than using the egg_info command, but avoids downloading and processing unnecessary numbers of files).

        Any URL may use the #egg=name syntax (see VCS Support ) to explicitly state the project name.

Satisfying Requirements

        Once pip has the set of requirements to satisfy, it chooses which version of each requirement to install using the simple rule that the latest version that satisfies the given constraints will be installed (but see here for an exception regarding pre-release versions). Where more than one source of the chosen version is available, it is assumed that any source is acceptable (as otherwise the versions would differ).

Installation Order

        As of v6.1.0, pip installs dependencies before their dependents, i.e. in "topological order". This is the only commitment pip currently makes related to order. While it may be coincidentally true that pip will install things in the order of the install arguments or in the order of the items in a requirements file, this is not a promise.

        In the event of a dependency cycle (aka "circular dependency"), the current implementation (which might possibly change later) has it such that the first encountered member of the cycle is installed last.

        For instance, if quux depends on foo which depends on bar which depends on baz, which depends on foo:

        Prior to v6.1.0, pip made no commitments about install order.

        The decision to install topologically is based on the principle that installations should proceed in a way that leaves the environment usable at each step. This has two main practical benefits:

        1. Concurrent use of the environment during the install is more likely to work.
        2. A failed install is less likely to leave a broken environment. Although pip would like to support failure rollbacks eventually, in the mean time, this is an improvement.

        Although the new install order is not intended to replace (and does not replace) the use of setup_requires to declare build dependencies, it may help certain projects install from sdist (that might previously fail) that fit the following profile:

        1. They have build dependencies that are also declared as install dependencies using install_requires .
        2. python setup.py egg_info works without their build dependencies being installed.
        3. For whatever reason, they don’t or won’t declare their build dependencies using setup_requires .

Requirements File Format

        Each line of the requirements file indicates something to be installed, and like arguments to pip install , the following forms are supported:
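Paraphrasing pip's documentation, the supported forms are roughly:

```
[[--option]...]
<requirement specifier> [; markers] [[--option]...]
<archive url/path>
[-e] <local project path>
[-e] <vcs project url>
```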

        For details on requirement specifiers, see Requirement Specifiers .

        See the pip install Examples for examples of all these forms.

        A line that begins with # is treated as a comment and ignored. Whitespace followed by a # causes the # and the remainder of the line to be treated as a comment.

        A line ending in an unescaped \ is treated as a line continuation and the newline following it is effectively ignored.

        Comments are stripped before line continuations are processed.

        The following options are supported:

For example, to specify --no-index and 2 --find-links locations:
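For instance (the paths and URL are illustrative):

```
--no-index
--find-links /my/local/archives
--find-links http://some.archives.com/archives
```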

        If you wish, you can refer to other requirements files, like this:
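For example:

```
-r more_requirements.txt
```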

        You can also refer to constraints files , like this:
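For example:

```
-c some_constraints.txt
```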

Example Requirements File

        Use pip install -r example-requirements.txt to install:
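A hedged, abbreviated illustration of such a file (package names are only examples):

```
###### Requirements without version specifiers ######
nose
beautifulsoup4

###### Requirements with version specifiers ######
docopt == 0.6.1        # version matching
keyring >= 4.1.1       # minimum version
coverage != 3.5        # version exclusion

###### Refer to other requirements files ######
-r other-requirements.txt
```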

Requirement Specifiers

        pip supports installing from a package index using a requirement specifier . Generally speaking, a requirement specifier is composed of a project name followed by optional version specifiers . PEP508 contains a full specification of the format of a requirement ( pip does not support the url_req form of specifier at this time).

        Since version 6.0, pip also supports specifiers containing environment markers like so:
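For example:

```
SomeProject ==5.4 ; python_version < '2.7'
SomeProject; sys_platform == 'win32'
```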

        Environment markers are supported in the command line and in requirements files.

Use quotes around specifiers in the shell when using > , < , or when using environment markers. Don't use quotes in requirements files.

Per-requirement Overrides

        Since version 7.0 pip supports controlling the command line options given to setup.py via requirements files. This disables the use of wheels (cached or otherwise) for that package, as setup.py does not exist for wheels.

The --global-option and --install-option options are used to pass options to setup.py . For example:
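For example:

```
FooProject >= 1.2 --global-option="--no-user-cfg" \
                  --install-option="--prefix='/usr/local'" \
                  --install-option="--no-compile"
```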

        The above translates roughly into running FooProject’s setup.py script as:
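That is, roughly:

```bash
python setup.py --no-user-cfg install --prefix='/usr/local' --no-compile
```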

Note that the only way of giving more than one option to setup.py is through multiple --global-option and --install-option options, as shown in the example above. The value of each option is passed as a single argument to the setup.py script. Therefore, a line such as the following is invalid and would result in an installation error.
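An illustration of such an invalid line:

```
# Invalid: use '--install-option' twice, as shown above.
FooProject >= 1.2 --install-option="--prefix=/usr/local --no-compile"
```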

Pre-release Versions

        Starting with v1.4, pip will only install stable versions as specified by PEP426 by default. If a version cannot be parsed as a compliant PEP426 version then it is assumed to be a pre-release.

        If a Requirement specifier includes a pre-release or development version (e.g. >=0.0.dev0 ) then pip will allow pre-release and development versions for that requirement. This does not include the != flag.

The pip install command also supports a --pre flag that will enable installing pre-releases and development releases.

# VCS Support

        pip supports installing from Git, Mercurial, Subversion and Bazaar, and detects the type of VCS using url prefixes: "git+", "hg+", "bzr+", "svn+".

        pip requires a working VCS command on your path: git, hg, svn, or bzr.

VCS projects can be installed in editable mode (using the --editable option) or not.

• For editable installs, the clone location by default is "<venv path>/src/SomeProject" in virtual environments, and "<cwd>/src/SomeProject" for global installs. The --src option can be used to modify this location.
• For non-editable installs, the project is built locally in a temp dir and then installed normally. Note that if a satisfactory version of the package is already installed, the VCS source will not overwrite it without an --upgrade flag. VCS requirements pin the package version (specified in the setup.py file) of the target commit, not necessarily the commit itself.

The "project name" component of the URL suffix "egg=<project name>-<version>" is used by pip in its dependency logic to identify the project before downloading and analyzing the metadata. The optional "version" component of the egg name is not functionally important; it merely provides a human-readable clue as to what version is in use. For projects where setup.py is not in the root of the project, the "subdirectory" component is used. Its value should be a path from the root of the project to where setup.py is located.

        So if your repository layout is:

        • pkg_dir/
          • setup.py # setup.py for package pkg
          • some_module.py
          • some_file

You'll need to use pip install -e "vcs+protocol://repo_url/#egg=pkg&subdirectory=pkg_dir" (the URL is quoted so the shell does not interpret the & character).

          pip currently supports cloning over git , git+http , git+https , git+ssh , git+git and git+file :

          Here are the supported forms:
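For instance (MyProject and git.example.com are placeholders; the optional -e makes the install editable):

```
[-e] git+git://git.example.com/MyProject#egg=MyProject
[-e] git+https://git.example.com/MyProject#egg=MyProject
[-e] git+ssh://git.example.com/MyProject#egg=MyProject
[-e] git+git@git.example.com:MyProject#egg=MyProject
[-e] git+file:///home/user/projects/MyProject#egg=MyProject
```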

          Passing branch names, a commit hash or a tag name is possible like so:
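For example (the branch, tag and commit hash shown are placeholders):

```
[-e] git+https://git.example.com/MyProject.git@master#egg=MyProject
[-e] git+https://git.example.com/MyProject.git@v1.0#egg=MyProject
[-e] git+https://git.example.com/MyProject.git@da39a3ee5e6b4b0d3255bfef95601890afd80709#egg=MyProject
```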

# Mercurial

          The supported schemes are: hg+http , hg+https , hg+static-http and hg+ssh .

          Here are the supported forms:
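For instance (hg.example.com and MyProject are placeholders):

```
[-e] hg+http://hg.example.com/MyProject#egg=MyProject
[-e] hg+https://hg.example.com/MyProject#egg=MyProject
[-e] hg+ssh://hg.example.com/MyProject#egg=MyProject
```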

          You can also specify a revision number, a revision hash, a tag name or a local branch name like so:
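For example (the revisions shown are placeholders):

```
[-e] hg+http://hg.example.com/MyProject@da39a3ee5e6b#egg=MyProject     # revision hash
[-e] hg+http://hg.example.com/MyProject@2019#egg=MyProject             # revision number
[-e] hg+http://hg.example.com/MyProject@v1.0#egg=MyProject             # tag
[-e] hg+http://hg.example.com/MyProject@special_feature#egg=MyProject  # branch
```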

# Subversion

          pip supports the URL schemes svn , svn+svn , svn+http , svn+https , svn+ssh .

          You can also give specific revisions to an SVN URL, like so:
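For instance (svn.example.com and MyProject are placeholders):

```
-e svn+http://svn.example.com/svn/MyProject/trunk@2019#egg=MyProject
-e svn+http://svn.example.com/svn/MyProject/trunk@{20080101}#egg=MyProject
```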

which will check out revision 2019. @{20080101} would also check out the revision from 2008-01-01. You can only check out specific revisions using -e svn+... .

# Bazaar

          pip supports Bazaar using the bzr+http , bzr+https , bzr+ssh , bzr+sftp , bzr+ftp and bzr+lp schemes.

          Here are the supported forms:
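For instance (bzr.example.com, user and MyProject are placeholders):

```
[-e] bzr+http://bzr.example.com/MyProject/trunk#egg=MyProject
[-e] bzr+sftp://user@example.com/MyProject/trunk#egg=MyProject
[-e] bzr+ssh://user@example.com/MyProject/trunk#egg=MyProject
[-e] bzr+lp:MyProject#egg=MyProject
```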

          Tags or revisions can be installed like so:
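For example:

```
[-e] bzr+https://bzr.example.com/MyProject/trunk@2019#egg=MyProject
[-e] bzr+http://bzr.example.com/MyProject/trunk@v1.0#egg=MyProject
```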

# Finding Packages

pip searches for packages on PyPI using the HTTP simple interface (the simple repository API described in PEP 503).

          pip offers a number of Package Index Options for modifying how packages are found.

pip looks for packages in a number of places: on PyPI (if not disabled via `--no-index`), in the local filesystem, and in any additional repositories specified via `--find-links` or `--index-url`. There is no ordering in the locations that are searched; rather they are all checked, and the "best" match for the requirements (in terms of version number; see PEP 440 for details) is selected.

# SSL Certificate Verification

          Starting with v1.3, pip provides SSL certificate verification over https, to prevent man-in-the-middle attacks against PyPI downloads.

# Caching

Starting with v6.0, pip provides an on-by-default cache which functions similarly to that of a web browser. While the cache is on by default and is designed to do the right thing, you can disable the cache and always access PyPI by using the --no-cache-dir option.

          When making any HTTP request pip will first check its local cache to determine if it has a suitable response stored for that request which has not expired. If it does then it simply returns that response and doesn’t make the request.

          If it has a response stored, but it has expired, then it will attempt to make a conditional request to refresh the cache which will either return an empty response telling pip to simply use the cached item (and refresh the expiration timer) or it will return a whole new response which pip can then store in the cache.

          When storing items in the cache, pip will respect the CacheControl header if it exists, or it will fall back to the Expires header if that exists. This allows pip to function as a browser would, and allows the index server to communicate to pip how long it is reasonable to cache any particular item.

          While this cache attempts to minimize network activity, it does not prevent network access altogether. If you want a local install solution that circumvents accessing PyPI, see Installing from local packages .

The default location for the cache directory depends on the operating system:

• Unix: ~/.cache/pip (respects the XDG_CACHE_HOME environment variable)
• macOS: ~/Library/Caches/pip
• Windows: <CSIDL_LOCAL_APPDATA>\pip\Cache

# Wheel Cache

Pip will read from the subdirectory wheels within the pip cache directory and use any packages found there. This is disabled via the same --no-cache-dir option that disables the HTTP cache. The internal structure of that cache is not part of the pip API. As of 7.0, pip makes a subdirectory for each sdist that wheels are built from and places the resulting wheels inside.

Pip attempts to choose the best wheels from those built in preference to building a new wheel. Note that this means that when a package has optional C extensions and builds py-tagged wheels when the C extension can't be built, pip will not attempt to build a better wheel for Pythons that would have supported it once any generic wheel is built. To correct this, make sure that the wheels are built with Python-specific tags (e.g. pp on PyPy).

          When no wheels are found for an sdist, pip will attempt to build a wheel automatically and insert it into the wheel cache.

# Hash-Checking Mode

          Since version 8.0, pip can check downloaded package archives against local hashes to protect against remote tampering. To verify a package against one or more hashes, add them to the end of the line:
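A hashed requirement line might look like this (the digests below are placeholders, not hashes of a real package):

```
FooProject == 1.2 --hash=sha256:2cf24dba5fb0a30e26e83b2ac5b9e29e1b161e5c1fa7425e73043362938b9824 \
                  --hash=sha256:486ea46224d1bb4fb680f34f7c9ad96a8f24ec88be73ea8e5a6c65260e9cb8a7
```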

          (The ability to use multiple hashes is important when a package has both binary and source distributions or when it offers binary distributions for a variety of platforms.)

          The recommended hash algorithm at the moment is sha256, but stronger ones are allowed, including all those supported by hashlib . However, weaker ones such as md5, sha1, and sha224 are excluded to avoid giving a false sense of security.

Hash verification is an all-or-nothing proposition. Specifying a --hash against any requirement not only checks that hash but also activates a global hash-checking mode, which imposes several other security restrictions:

          • Hashes are required for all requirements. This is because a partially-hashed requirements file is of little use and thus likely an error: a malicious actor could slip bad code into the installation via one of the unhashed requirements. Note that hashes embedded in URL-style requirements via the #md5=. syntax suffice to satisfy this rule (regardless of hash strength, for legacy reasons), though you should use a stronger hash like sha256 whenever possible.
          • Hashes are required for all dependencies. An error results if there is a dependency that is not spelled out and hashed in the requirements file.
          • Requirements that take the form of project names (rather than URLs or local filesystem paths) must be pinned to a specific version using == . This prevents a surprising hash mismatch upon the release of a new version that matches the requirement specifier.
• --egg is disallowed, because it delegates installation of dependencies to setuptools, giving up pip's ability to enforce any of the above.

Hash-checking mode can be forced on with the --require-hashes command-line option:
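For example:

```
pip install --require-hashes -r requirements.txt
```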

This can be useful in deploy scripts, to ensure that the author of the requirements file provided hashes. It is also a convenient way to bootstrap your list of hashes, since it shows the hashes of the packages it fetched. It fetches only the preferred archive for each package, so you may still need to add hashes for alternative archives using pip hash: for instance if there is both a binary and a source distribution.

          The wheel cache is disabled in hash-checking mode to prevent spurious hash mismatch errors. These would otherwise occur while installing sdists that had already been automatically built into cached wheels: those wheels would be selected for installation, but their hashes would not match the sdist ones from the requirements file. A further complication is that locally built wheels are nondeterministic: contemporary modification times make their way into the archive, making hashes unpredictable across machines and cache flushes. Compilation of C code adds further nondeterminism, as many compilers include random-seeded values in their output. However, wheels fetched from index servers are the same every time. They land in pip’s HTTP cache, not its wheel cache, and are used normally in hash-checking mode. The only downside of having the wheel cache disabled is thus extra build time for sdists, and this can be solved by making sure pre-built wheels are available from the index server.

          Hash-checking mode also works with pip download and pip wheel . A comparison of hash-checking mode with other repeatability strategies is available in the User Guide.

          Beware of the setup_requires keyword arg in setup.py . The (rare) packages that use it will cause those dependencies to be downloaded by setuptools directly, skipping pip’s hash-checking. If you need to use such a package, see Controlling setup_requires .

Be careful not to nullify all your security work when you install your actual project by using setuptools directly: for example, by calling python setup.py install, python setup.py develop, or easy_install. Setuptools will happily go out and download, unchecked, anything you missed in your requirements file, and it's easy to miss things as your project evolves. To be safe, install your project using pip and --no-deps.

Instead of python setup.py develop, use:
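For instance, from the project's root directory:

```
pip install --no-deps -e .
```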

Instead of python setup.py install, use:
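Likewise, from the project's root directory:

```
pip install --no-deps .
```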

# Hashes from PyPI

PyPI provides an MD5 hash in the fragment portion of each package download URL, like #md5=123. , which pip checks as a protection against download corruption. Other hash algorithms that have guaranteed support from hashlib are also supported here: sha1, sha224, sha384, sha256, and sha512. Since this hash originates remotely, it is not a useful guard against tampering and thus does not satisfy the --require-hashes demand that every package have a local hash.

# "Editable" Installs

          You can install local projects or VCS projects in "editable" mode:
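For instance (the path and repository URL below are placeholders):

```
pip install -e path/to/SomeProject
pip install -e git+https://git.example.com/SomeProject.git#egg=SomeProject
```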

          (See the VCS Support section above for more information on VCS-related syntax.)

For local projects, the "SomeProject.egg-info" directory is created relative to the project path. This is one advantage over just using setup.py develop, which creates the "egg-info" directory relative to the current working directory.

# Controlling setup_requires

          Setuptools offers the setup_requires setup() keyword for specifying dependencies that need to be present in order for the setup.py script to run. Internally, Setuptools uses easy_install to fulfill these dependencies.

          pip has no way to control how these dependencies are located. None of the Package Index Options have an effect.

          The solution is to configure a "system" or "personal" Distutils configuration file to manage the fulfillment.

          For example, to have the dependency located at an alternate index, add this:
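A sketch of such a distutils configuration (the index URL is a placeholder):

```
[easy_install]
index_url = https://my.index-mirror.example.com/simple/
```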

          To have the dependency located from a local directory and not crawl PyPI, add this:
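A sketch, assuming the archives live in /path/to/local/archives:

```
[easy_install]
allow_hosts = ''
find_links = file:///path/to/local/archives
```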

# Build System Interface

          In order for pip to install a package from source, setup.py must implement the following commands:
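Roughly, the expected invocations look like this (XXX stands for paths that pip fills in; the exact optional flags may vary between pip versions):

```
setup.py egg_info [--egg-base XXX]
setup.py install --record XXX [--single-version-externally-managed] [--root XXX] [--compile|--no-compile] [--install-headers XXX]
```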

          The egg_info command should create egg metadata for the package, as described in the setuptools documentation at https://setuptools.readthedocs.io/en/latest/setuptools.html#egg-info-create-egg-metadata-and-set-build-tags

          The install command should implement the complete process of installing the package to the target directory XXX.

          To install a package in "editable" mode ( pip install -e ), setup.py must implement the following command:
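Namely, in simplified form:

```
setup.py develop
```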

          This should implement the complete process of installing the package in "editable" mode.

pip will attempt to build all packages into wheels:
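That is, pip runs something along the lines of:

```
setup.py bdist_wheel -d XXX   # XXX is a temporary directory chosen by pip
```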

          One further setup.py command is invoked by pip install :
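Namely, a cleanup step along the lines of:

```
setup.py clean
```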

This command is invoked to clean up temporary files from the build. (TODO: Investigate in more detail when this command is required.)

          No other build system commands are invoked by the pip install command.

          Installing a package from a wheel does not invoke the build system at all.

# Options

-c, --constraint <file>

          Constrain versions using the given constraints file. This option can be used multiple times.

-e, --editable <path/url>

          Install a project in editable mode (i.e. setuptools "develop mode") from a local project path or a VCS url.

-r, --requirement <file>

          Install from the given requirements file. This option can be used multiple times.

          Directory to unpack packages into and build in.

Install packages into <dir>. By default this will not replace existing files/folders in <dir>. Use --upgrade to replace existing packages in <dir> with new versions.

          Download packages into <dir> instead of installing them, regardless of what’s already installed.

          Directory to check out editable projects into. The default in a virtualenv is "<venv path>/src". The default for global installs is "<current dir>/src".

          Upgrade all specified packages to the newest available version. The handling of dependencies depends on the upgrade-strategy used.

Determines how dependency upgrading should be handled. "eager": dependencies are upgraded regardless of whether the currently installed version satisfies the requirements of the upgraded package(s). "only-if-needed": dependencies are upgraded only when they do not satisfy the requirements of the upgraded package(s).

          When upgrading, reinstall all packages even if they are already up-to-date.

          Ignore the installed packages (reinstalling instead).

          Ignore the Requires-Python information.

          Don’t install package dependencies.

Extra arguments to be supplied to the setup.py install command (use like --install-option="--install-scripts=/usr/local/bin"). Use multiple --install-option options to pass multiple options to setup.py install. If you are using an option with a directory path, be sure to use an absolute path.

          Extra global options to be supplied to the setup.py call before the install command.

Install to the Python user install directory for your platform. Typically ~/.local/, or %APPDATA%\Python on Windows. (See the Python documentation for site.USER_BASE for full details.)

          Install packages as eggs, not ‘flat’, like pip normally does. This option is not about installing from eggs. (WARNING: Because this option overrides pip’s normal install logic, requirements files may not behave as expected.)

          Install everything relative to this alternate root directory.

          Installation prefix where lib, bin and other top-level folders are placed

          Compile py files to pyc

          Do not compile py files to pyc

Do not find and prefer wheel archives when searching indexes and find-links locations. DEPRECATED in favour of --no-binary.

          Do not use binary packages. Can be supplied multiple times, and each time adds to the existing value. Accepts either :all: to disable all binary packages, :none: to empty the set, or one or more package names with commas between them. Note that some packages are tricky to compile and may fail to install when this option is used on them.

          Do not use source packages. Can be supplied multiple times, and each time adds to the existing value. Accepts either :all: to disable all source packages, :none: to empty the set, or one or more package names with commas between them. Packages without binary distributions will fail to install when this option is used on them.

          Include pre-release and development versions. By default, pip only finds stable versions.

          Don’t clean up build directories.

Require a hash to check each requirement against, for repeatable installs. This option is implied when any package in a requirements file has a --hash option.

          Base URL of Python Package Index (default https://pypi.python.org/simple). This should point to a repository compliant with PEP 503 (the simple repository API) or a local directory laid out in the same format.

          Extra URLs of package indexes to use in addition to —index-url. Should follow the same rules as —index-url.

Ignore package index (only looking at --find-links URLs instead).

          If a url or path to an html file, then parse for links to archives. If a local path or file:// url that’s a directory, then look for archives in the directory listing.

          Enable the processing of dependency links.

# Examples

          Install SomePackage and its dependencies from PyPI using Requirement Specifiers
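For example (SomePackage is a placeholder name):

```
pip install SomePackage              # latest version
pip install 'SomePackage==1.0.4'     # specific version
pip install 'SomePackage>=1.0.4'     # minimum version
```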

          Install a list of requirements specified in a file. See the Requirements files .
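For example:

```
pip install -r requirements.txt
```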

          Upgrade an already installed SomePackage to the latest from PyPI.
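For example:

```
pip install --upgrade SomePackage
```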

          Install a local project in "editable" mode. See the section on Editable Installs .
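For example, from the project directory (or with an explicit path):

```
pip install -e .
pip install -e path/to/project
```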

          Install a project from VCS in "editable" mode. See the sections on VCS Support and Editable Installs .
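For example (the repository URL is a placeholder):

```
pip install -e git+https://git.example.com/SomeProject.git#egg=SomeProject
```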

          Install a package with setuptools extras.
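For example, installing a hypothetical PDF extra of SomePackage:

```
pip install 'SomePackage[PDF]'
```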

          Install a particular source archive file.
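For example (the path and URL are placeholders):

```
pip install ./downloads/SomePackage-1.0.4.tar.gz
pip install http://my.package.repo/SomePackage-1.0.4.zip
```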

          Install from alternative package repositories.

          Install from a different index, and not PyPI
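For example (my.package.repo is a placeholder index host):

```
pip install --index-url http://my.package.repo/simple/ SomePackage
```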

          Search an additional index during install, in addition to PyPI
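For example:

```
pip install --extra-index-url http://my.package.repo/simple SomePackage
```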

          Install from a local flat directory containing archives (and don’t scan indexes):
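For example:

```
pip install --no-index --find-links=file:///local/dir/ SomePackage
pip install --no-index --find-links=/local/dir/ SomePackage
```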

          Find pre-release and development versions, in addition to stable versions. By default, pip only finds stable versions.
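For example:

```
pip install --pre SomePackage
```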
