pipenv: pipenv install is very slow

Running pipenv install after changing a single dependency takes roughly five minutes for me on a Windows 10 machine with an SSD.

The vast majority of that time is spent inside Locking [packages] dependencies...

It seems like there might be some quadratic-or-worse complexity in this step?

I’ve included most of our Pipfile below, but I had to remove some of our private repo dependencies:

[[source]]
url = "https://pypi.python.org/simple"
verify_ssl = true

[packages]
alembic = "==0.8.4"
amqp = "==1.4.7"
analytics-python = "==1.2.5"
anyjson = "==0.3.3"
billiard = "==3.3.0.20"
braintree = "==3.20.0"
celery = "==3.1.18"
coverage = "==4.0.3"
docopt = "==0.4.0"
eventlet = "==0.19.0"
flake8 = "==3.0.4"
Flask-Cors = "==2.1.2"
Flask-Login = "==0.3.2"
Flask = "==0.12.1"
funcsigs = "==0.4"
fuzzywuzzy = "==0.12.0"
gcloud = "==0.14.0"
html2text = "==2016.9.19"
itsdangerous = "==0.24"
Jinja2 = "==2.8"
jsonpatch = "==1.15"
jsonschema = "==2.5.1"
PyJWT = "==1.4.2"
kombu = "==3.0.30"
LayerClient = "==0.1.9"
MarkupSafe = "==0.23"
mixpanel = "==4.3.0"
mock = "==1.3.0"
nose-exclude = "==0.4.1"
nose = "==1.3.7"
numpy = "==1.12.1"
pdfrw = "==0.3"
Pillow = "==4.1.0"
pusher = "==1.6"
pycountry = "==1.20"
pycryptodome = "==3.4.5"
pymongo = "==3.2"
PyMySQL = "==0.7.4"
python-dateutil = "<=2.5.1"
python-Levenshtein = "==0.12.0"
python-magic = "==0.4.6"
python-coveralls = "==2.9.0"
pytz = "==2015.6"
raygun4py = "==3.1.2"
"repoze.retry" = "==1.3"
requests = "==2.8.1"
sendgrid = "==2.2.1"
slacker = "==0.7.3"
SQLAlchemy-Enum34 = "==1.0.1"
SQLAlchemy-Utils = "==0.31.6"
SQLAlchemy = "==1.1.9"
typing = "==3.5.2.2"
twilio = "==5.6.0"
Unidecode = "==0.4.19"
voluptuous = "==0.8.11"
Wand = "==0.4.4"
watchdog = "==0.8.3"
Werkzeug = "==0.12.1"
wheel = "==0.24.0"
WTForms = "==2.0.2"
xmltodict = "==0.9.2"
zeep = "==0.24.0"
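
For anyone profiling this, the slow step can be timed in isolation; a rough sketch (assuming a pipenv version that supports the --clear flag, which discards pipenv’s dependency cache so the run reflects a cold resolution):

# Time only the resolution/locking step, without installing anything.
time pipenv lock --verbose --clear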

About this issue

  • State: closed
  • Created 7 years ago
  • Reactions: 69
  • Comments: 108 (53 by maintainers)

Most upvoted comments

Why are all the issues on this topic closed? I can’t pipenv install a single thing due to the lock-step hang.

Right now for me it is not slow, it is freezing…

A pipenv install my_package or a simple pipenv install gives me no output after 20 minutes.

EDIT: To confirm, still nothing after a few hours. Is it the same problem? It used to merely be slow, but it would finish after 5 to 10 minutes.

Note for future visitors: Please refrain from posting your slow installation result. We know it can be slow. We know why it is slow. Your result does not add anything to this topic.

Given that most of the packages are going to stay the same over time, would it be possible to cache the downloaded packages?
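
For what it’s worth, downloaded distributions are already reused between runs via pip’s download cache; the expensive part is the resolution itself. A hedged sketch for keeping that cache somewhere persistent, assuming the PIPENV_CACHE_DIR environment variable documented by pipenv (the path is illustrative):

# Point pipenv's package/hash cache at a persistent location, e.g. a CI cache volume.
export PIPENV_CACHE_DIR=/var/cache/pipenv
pipenv lock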

Same here. Pipenv is very slow; it takes an hour to lock and install.

I know this issue is closed, but for me installing pandas takes a lot of time. The verbose output is this:

pipenv install pandas --verbose
Installing pandas…
⠋ Installing...Installing 'pandas'
$ ['/Users/sinscary/.local/share/virtualenvs/signzy-MSzur11z/bin/pip', 'install', '--verbose', '--upgrade', 'pandas', '-i', 'https://pypi.org/simple']
Adding pandas to Pipfile's [packages]…
✔ Installation Succeeded
Pipfile.lock not found, creating…
Locking [dev-packages] dependencies…
Locking [packages] dependencies…
⠦ Locking...

It’s been stuck at Locking for more than 30 minutes. I am using Python 3.7.0 on macOS Mojave. Any help with that would be appreciated.

pipenv is awesome, but this issue still seems to exist. I would be glad to see any progress. --skip-lock did not work.

Would it be possible to show some information or a progress bar like apt-get or wget does (download speed, amount downloaded, total size) during the library downloads? I suspect that was the issue for me: pipenv seemed slow, but it was just downloading the libraries, and I had to open a system monitor to understand that pipenv was downloading files, how much had already been downloaded, at what speed, and so on.

@jkp Allow me to assure you that each and every core developer of this project (not that there are many of us to begin with) is very well aware of this issue, and is as troubled by it as you are, if not more. This is, however, by no means an easy problem, and we have already thrown everything we could at it to make it as usable as possible, without having to tear everything down in Python packaging. We also have a lot on our plate at the moment, and need to focus on those other issues as well. The inevitable decision we must make, then, is to prioritise issues we actually know how to solve, and only start thinking about our next steps after those are done, to maximise the effect of our effort.

Now, I fully acknowledge that your priority may be different from ours. This performance issue may be the single largest problem in your workflow, and you want to put it up as the most important thing in this project. Please bear in mind, however, that you are not the only user of this tool, and we need to prioritise the needs of all, even our own, ahead of yours. I acknowledge that. I urge, therefore, anyone sharing the situation to join minds on this issue and try to think of a way to solve it. Once we know what to actually do, we can do it.

The issue is kept closed because we do not know what we can do, and it would only serve as noise in the issue tracker while we try to manage it. There is no point, at least in our workflow, in keeping an issue open that nobody can work on.

This is definitely still a problem (5+ minutes) with the latest Python 3.6, pip, and pipenv versions, installing a simple package like torch. I don’t think this issue should be marked as closed.

I have the same issue: Locking [packages] dependencies... hangs forever. My environment:

  • macOS High Sierra 10.13.6
  • Python: Python 3.6.4
  • pipenv: version 2018.7.1

We sprinted on this together at PyCon. It’ll be faster soon.

Please, please provide useful output. I just upgraded my pipenv from 9.1.0 to 11.10.0 in order to resolve the Invalid Marker failure at the package lock step (see, e.g., #1622). Now I have a Pipfile with ipykernel, pandas, jupyter, numpy, and matplotlib in it, and on my latest attempt to use pipenv install to generate the lock file, I have been sitting at Locking [packages] dependencies… for upwards of 10 minutes.

Because there’s no output, I can’t tell whether something is actually happening (like building numpy from source) or whether it’s just hanging. The best I can do is squint at top and conclude that maybe it is doing something, because a python process seems to be hanging around, but I’m going to have to trash this virtualenv and start fresh if something doesn’t move soon.

I’m happy to contribute to some work on this if needed.

I also vote for this issue to be reopened. It is far from solved: it takes somewhere between 10 and 20 minutes to lock my project within a Docker container. It also uses an insane amount of memory, to the point that I had to increase Docker’s memory allocation to stop it from killing the process.

The following Docker image takes more than 30 minutes to build on my laptop (i7/16 GB); the pipenv install ... command runs for ages…

Dockerfile

FROM python:3.7-alpine

# Update package list.
RUN set -ex && apk update

# Install apk dependencies.
RUN set -ex && apk add --no-cache musl-dev gcc libffi-dev openssl-dev make

# Install Pipenv.
RUN set -ex && pip install pipenv --upgrade

# Copy Pipfiles.
RUN mkdir /website
COPY Pipfile /website

# Install Pipenv dependencies.
WORKDIR /website
RUN set -ex && pipenv install --system --skip-lock --verbose

Pipfile

[[source]]
url = "https://pypi.python.org/simple"
verify_ssl = true
name = "pypi"


[requires]
python_version = "3.7"

[packages]
sanic = "*"
jinja2 = "*"
asyncpg = "*"
uvloop = "*"
munch = "*"

[dev-packages]

[pipenv]
allow_prereleases = true

Is this normal? Can someone reproduce?

Update: Be careful with Alpine Linux

I realized that the issue is not on pipenv’s side…

I replaced the Alpine base Docker image with one built on Debian Slim, and now pipenv install finishes within seconds.

The issue in my example is that Alpine Linux always builds packages that contain Cython or C extensions from source, which can take forever in a Docker container, whereas on Debian Linux they are installed from prebuilt wheels, which is a lot faster (seconds).

More on this: https://stackoverflow.com/questions/49037742/why-does-it-take-ages-to-install-pandas-on-alpine-linux
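
For reference, a minimal sketch of the Debian-based variant of the Dockerfile above (python:3.7-slim as the base tag is an assumption; on a glibc base most C-extension packages install from prebuilt manylinux wheels instead of compiling):

FROM python:3.7-slim

# No compiler toolchain is needed for most packages here, since manylinux wheels apply;
# add build tools only if a dependency genuinely ships no wheel.
RUN set -ex && pip install pipenv --upgrade

# Copy the Pipfile and install, mirroring the Alpine example above.
RUN mkdir /website
COPY Pipfile /website
WORKDIR /website
RUN set -ex && pipenv install --system --skip-lock --verbose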

@crifan there is no need to post the same message on every issue, open or closed, that mentions locking speed. We will see your comment no matter how many times you say the same thing. If you want to be helpful, you will need to provide a reproducible example case. Chiming in to say “me too” simply doesn’t add anything besides extra traffic on the issue tracker. Please be mindful of that.

I appreciate your workflow, so if that is how you manage issues, that’s fine. I’ll try to add any information I can to help track down the problem.

I did some debugging by installing mitmproxy between pipenv and the net to trace the requests being made. I found a couple of interesting things.

  1. We are using a private PyPI index that doesn’t support the JSON API yet. This slows things down significantly, since it looks like the fallback is to brute-force: download everything listed in the HTTP index in order to extract metadata and so on. One suggestion here would be to add some simple logging to warn that this fallback method is being used; it might save someone else having to dig deeper to figure this out. (A quick check for the JSON endpoint is sketched below.)

  2. Using the brute-force method, it seems the code downloads packages that aren’t relevant to the architecture in use. For example, on a Linux machine it was downloading win32- or macOS-specific wheel packages. This feels like something that could be detected and avoided, since binary packages built for other architectures are clearly not going to be of any use.

I will continue to debug and report back any useful info as I find it.
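
A quick way to check whether an index exposes the JSON metadata endpoint (the Warehouse convention is /pypi/<name>/json) before the resolver falls back to downloading archives; a sketch, with pypi.org standing in for whatever index you actually use:

# A 200 response means package metadata (including file hashes) is available
# without downloading the distributions themselves.
curl -sS -o /dev/null -w "%{http_code}\n" https://pypi.org/pypi/requests/json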

Here I am… 15 mins later

Oh, thank you. I think that was the only thing holding me back from pushing everyone over to pipenv at work.

We have done some performance enhancements to pipenv in recent times, including a big install optimization released in 2022.8.31. For an independent comparison of benchmarks, please have a look at: lincolnloop.github.io/python-package-manager-shootout

Please use virtualenv for the time being, until a better solution is available.

I think this issue has been answered well here: https://github.com/pypa/pipenv/issues/1914#issuecomment-378846926

Python dependencies require us to fully download and execute the setup files of each package in order to resolve and compute the dependency graph. That’s just the reality; it’s a bit slow. If you can’t wait two minutes, or you feel it’s not worth the tradeoff, you can always pass --skip-lock.
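
In practice that tradeoff looks roughly like this (a sketch; requests is just an example package):

# Skip resolution for now; Pipfile and the virtualenv are updated, Pipfile.lock is not.
pipenv install requests --skip-lock

# Regenerate Pipfile.lock later, when the wait is acceptable (e.g. in CI).
pipenv lock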

We should probably move the discussion to #2284. It is actually the locking part that is slow (install is essentially TOML manipulation + lock + sync), not installing.

Hi, I’m getting the same problem. Here is the verbose output:

pipenv lock --verbose
Locking [dev-packages] dependencies…
Using pip: -i https://pypi.python.org/simple

                          ROUND 1                           
Current constraints:
  pylint

Finding the best candidates:
  found candidate pylint==1.9.1 (constraint was <any>)

Finding secondary dependencies:
  pylint==1.9.1             requires astroid<2.0,>=1.6, isort>=4.2.5, mccabe, six

New dependencies found in this round:
  adding ['astroid', '<2.0,>=1.6', '[]']
  adding ['isort', '>=4.2.5', '[]']
  adding ['mccabe', '', '[]']
  adding ['six', '', '[]']
Removed dependencies in this round:
Unsafe dependencies in this round:
------------------------------------------------------------
Result of round 1: not stable

                          ROUND 2                           
Current constraints:
  astroid<2.0,>=1.6
  isort>=4.2.5
  mccabe
  pylint
  six

Finding the best candidates:
  found candidate astroid==1.6.4 (constraint was >=1.6,<2.0)
  found candidate isort==4.3.4 (constraint was >=4.2.5)
  found candidate mccabe==0.6.1 (constraint was <any>)
  found candidate pylint==1.9.1 (constraint was <any>)
  found candidate six==1.11.0 (constraint was <any>)

Finding secondary dependencies:
  pylint==1.9.1             requires astroid<2.0,>=1.6, isort>=4.2.5, mccabe, six
  astroid==1.6.4            requires lazy-object-proxy, six, wrapt
  isort==4.3.4              requires -
  mccabe==0.6.1             requires -
  six==1.11.0               requires -

New dependencies found in this round:
  adding ['lazy-object-proxy', '', '[]']
  adding ['wrapt', '', '[]']
Removed dependencies in this round:
Unsafe dependencies in this round:
------------------------------------------------------------
Result of round 2: not stable

                          ROUND 3                           
Current constraints:
  astroid<2.0,>=1.6
  isort>=4.2.5
  lazy-object-proxy
  mccabe
  pylint
  six
  wrapt

Finding the best candidates:
  found candidate astroid==1.6.4 (constraint was >=1.6,<2.0)
  found candidate isort==4.3.4 (constraint was >=4.2.5)
  found candidate lazy-object-proxy==1.3.1 (constraint was <any>)
  found candidate mccabe==0.6.1 (constraint was <any>)
  found candidate pylint==1.9.1 (constraint was <any>)
  found candidate six==1.11.0 (constraint was <any>)
  found candidate wrapt==1.10.11 (constraint was <any>)

Finding secondary dependencies:
  astroid==1.6.4            requires lazy-object-proxy, six, wrapt
  wrapt==1.10.11            requires -
  lazy-object-proxy==1.3.1  requires -
  six==1.11.0               requires -
  pylint==1.9.1             requires astroid<2.0,>=1.6, isort>=4.2.5, mccabe, six
  mccabe==0.6.1             requires -
  isort==4.3.4              requires -
------------------------------------------------------------
Result of round 3: stable, done

Locking [packages] dependencies…
  

Here is the pipenv --version: pipenv, version 2018.05.18

Also, I don’t know why this is happening; no specific reason or error is reported. In my case, when I run pipenv lock it starts but never ends as far as I can tell. I have been waiting for 2 hours now with still no sign of completion. It has given me a ReadTimeoutError twice; this is the third time I am running it. Using Python 3.6.4.

Any help would be greatly appreciated, as my project’s due date is getting close.

Let me know if there’s any info I can provide to help diagnose. For the moment I’ll just go back to pip + virtualenv + pip-tools 😕

It takes quite a long time to run on my machine: macOS 10.13.4, pipenv version 11.10.0.

The download finishes almost immediately, but then it gets stuck on Locking [packages] dependencies…. Below, it takes half a minute for two dependencies and then 6 minutes for another 3 dependencies; if I throw all of my project’s dependencies at it, it just hangs indefinitely at the locking step.

pablo@batman scanr (develop) $ time pipenv install flask flask_pymongo
Installing flask…
Looking in indexes: https://pypi.python.org/simple
Collecting flask
  Using cached https://files.pythonhosted.org/packages/77/32/e3597cb19ffffe724ad4bf0beca4153419918e7fa4ba6a34b04ee4da3371/Flask-0.12.2-py2.py3-none-any.whl
Collecting Werkzeug>=0.7 (from flask)
  Using cached https://files.pythonhosted.org/packages/20/c4/12e3e56473e52375aa29c4764e70d1b8f3efa6682bef8d0aae04fe335243/Werkzeug-0.14.1-py2.py3-none-any.whl
Collecting itsdangerous>=0.21 (from flask)
Collecting Jinja2>=2.4 (from flask)
  Using cached https://files.pythonhosted.org/packages/7f/ff/ae64bacdfc95f27a016a7bed8e8686763ba4d277a78ca76f32659220a731/Jinja2-2.10-py2.py3-none-any.whl
Collecting click>=2.0 (from flask)
  Using cached https://files.pythonhosted.org/packages/34/c1/8806f99713ddb993c5366c362b2f908f18269f8d792aff1abfd700775a77/click-6.7-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from Jinja2>=2.4->flask)
Installing collected packages: Werkzeug, itsdangerous, MarkupSafe, Jinja2, click, flask
Successfully installed Jinja2-2.10 MarkupSafe-1.0 Werkzeug-0.14.1 click-6.7 flask-0.12.2 itsdangerous-0.24

Adding flask to Pipfile's [packages]…
Installing flask_pymongo…
Looking in indexes: https://pypi.python.org/simple
Collecting flask_pymongo
  Using cached https://files.pythonhosted.org/packages/fa/71/ab920741dedd605ef4adbd57d0c7d9f43a6b6fe4068604fffbc6f64b2c9c/Flask_PyMongo-0.5.1-py3-none-any.whl
Requirement already satisfied: Flask>=0.8 in /Users/pablo/.local/share/virtualenvs/scanr-2m6AW0PB/lib/python3.6/site-packages (from flask_pymongo) (0.12.2)
Collecting PyMongo>=2.5 (from flask_pymongo)
  Using cached https://files.pythonhosted.org/packages/5c/7f/1f7240883ec3fa768d7e066c9cbd42ceb42d699ba1a0fb9d231c098a542d/pymongo-3.6.1-cp36-cp36m-macosx_10_6_intel.whl
Requirement already satisfied: click>=2.0 in /Users/pablo/.local/share/virtualenvs/scanr-2m6AW0PB/lib/python3.6/site-packages (from Flask>=0.8->flask_pymongo) (6.7)
Requirement already satisfied: itsdangerous>=0.21 in /Users/pablo/.local/share/virtualenvs/scanr-2m6AW0PB/lib/python3.6/site-packages (from Flask>=0.8->flask_pymongo) (0.24)
Requirement already satisfied: Werkzeug>=0.7 in /Users/pablo/.local/share/virtualenvs/scanr-2m6AW0PB/lib/python3.6/site-packages (from Flask>=0.8->flask_pymongo) (0.14.1)
Requirement already satisfied: Jinja2>=2.4 in /Users/pablo/.local/share/virtualenvs/scanr-2m6AW0PB/lib/python3.6/site-packages (from Flask>=0.8->flask_pymongo) (2.10)
Requirement already satisfied: MarkupSafe>=0.23 in /Users/pablo/.local/share/virtualenvs/scanr-2m6AW0PB/lib/python3.6/site-packages (from Jinja2>=2.4->Flask>=0.8->flask_pymongo) (1.0)
Installing collected packages: PyMongo, flask-pymongo
Successfully installed PyMongo-3.6.1 flask-pymongo-0.5.1

Adding flask_pymongo to Pipfile's [packages]…
Pipfile.lock (c202d3) out of date, updating to (3068be)…
Locking [dev-packages] dependencies…
Locking [packages] dependencies…
Updated Pipfile.lock (3068be)!
Installing dependencies from Pipfile.lock (3068be)…
  🐍   ▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉ 32/32 — 00:00:03
To activate this project's virtualenv, run the following:
 $ pipenv shell

real	0m37.816s
user	0m34.556s
sys	0m4.517s
pablo@batman scanr (develop) $ time pipenv install gunicorn h5py joblib
Installing gunicorn…
Looking in indexes: https://pypi.python.org/simple
Collecting gunicorn
  Using cached https://files.pythonhosted.org/packages/64/32/becbd4089a4c06f0f9f538a76e9fe0b19a08f010bcb47dcdbfbc640cdf7d/gunicorn-19.7.1-py2.py3-none-any.whl
Installing collected packages: gunicorn
Successfully installed gunicorn-19.7.1

Adding gunicorn to Pipfile's [packages]…
Installing h5py…
Looking in indexes: https://pypi.python.org/simple
Collecting h5py
  Using cached https://files.pythonhosted.org/packages/69/d5/dff2a8f7658fd87ab3330a0ab47e4363681d8bdf734a495add65a347f5e3/h5py-2.7.1-cp36-cp36m-macosx_10_6_intel.macosx_10_9_intel.macosx_10_9_x86_64.macosx_10_10_intel.macosx_10_10_x86_64.whl
Requirement already satisfied: six in /Users/pablo/.local/share/virtualenvs/scanr-2m6AW0PB/lib/python3.6/site-packages (from h5py) (1.11.0)
Collecting numpy>=1.7 (from h5py)
  Using cached https://files.pythonhosted.org/packages/a0/df/fa637677800e6702a57ef09e1d62e42aec3f598fb235f897155d146f2f59/numpy-1.14.2-cp36-cp36m-macosx_10_6_intel.macosx_10_9_intel.macosx_10_9_x86_64.macosx_10_10_intel.macosx_10_10_x86_64.whl
Installing collected packages: numpy, h5py
Successfully installed h5py-2.7.1 numpy-1.14.2

Adding h5py to Pipfile's [packages]…
Installing joblib…
Looking in indexes: https://pypi.python.org/simple
Collecting joblib
  Using cached https://files.pythonhosted.org/packages/4f/51/870b2ec270fc29c5d89f85353da420606a9cb39fba4747127e7c7d7eb25d/joblib-0.11-py2.py3-none-any.whl
Installing collected packages: joblib
Successfully installed joblib-0.11

Adding joblib to Pipfile's [packages]…
Pipfile.lock (0d514f) out of date, updating to (a4d15f)…
Locking [dev-packages] dependencies…
Locking [packages] dependencies…
Updated Pipfile.lock (a4d15f)!
Installing dependencies from Pipfile.lock (a4d15f)…
  🐍   ▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉ 36/36 — 00:00:03
To activate this project's virtualenv, run the following:
 $ pipenv shell

real	6m31.572s
user	1m1.986s
sys	0m11.047s

Wow, nice, that was literally more than a 100x speedup for me, and it also caught a dependency conflict that the previous version didn’t catch!

What would be useful is a verbose flag for pipenv lock - I was only able to diagnose the dependency conflict by editing piptools/logging.py to enable verbose logging, but once I did that it gave a very clear indication of what was going on.

99% of the time I do this, the dependencies resolve to the same versions already in my lock file, because this is part of my dev pipeline.

In the case where there are no new upstream packages since the last run, surely the process could be skipped?

Would it be possible to get the list of hashes from the PyPI API, rather than compute them ourselves?
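
For what it’s worth, the JSON API already publishes per-file digests, so in principle the hashes would not need to be recomputed locally whenever that endpoint is available. A quick illustration (requests==2.8.1 is just an example pin from the Pipfile above; python3 on PATH is assumed):

# Print each released file and its SHA-256 digest straight from the index metadata.
curl -s https://pypi.org/pypi/requests/2.8.1/json | python3 -c 'import json, sys; [print(f["filename"], f["digests"]["sha256"]) for f in json.load(sys.stdin)["urls"]]'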

@keshavkaul PySpark is very large, and can take quite some time just to download. Give it some time, it will be much better afterwards (because Pipenv caches the result).

(Or you can urge the developers to release a wheel distribution. That would help a bit.)

It seems that even with the JSON interface in use, pipenv makes unnecessary requests for wheel files that target other architectures. The implementation is currently pretty naive, in that it checks every file listed for a given release, regardless of the target platform and architecture.

Minimal test-case:

On a Linux host: pipenv install grpcio

Produced the following requests (captured using mitmproxy):

$ mitmdump -w dump.out
Proxy server listening at http://*:8080
127.0.0.1:62303: clientconnect
127.0.0.1:62303: GET https://pypi.org/simple/setuptools/
              << 200 OK 79.82k
127.0.0.1:62305: clientconnect
127.0.0.1:62305: GET https://files.pythonhosted.org/packages/7f/e1/820d941153923aac1d49d7fc37e17b6e73bfbd2904959fffbad77900cf92/setuptools-39.2.0-py2.py3-none-any.whl
              << 200 OK 554.25k
127.0.0.1:62303: GET https://pypi.org/simple/pip/
              << 200 OK 9.56k
127.0.0.1:62303: GET https://pypi.org/simple/wheel/
              << 200 OK 6.91k
127.0.0.1:62303: clientdisconnect
127.0.0.1:62305: clientdisconnect
127.0.0.1:62307: clientconnect
127.0.0.1:62307: GET https://pypi.org/simple/grpcio/
              << 200 OK 112.03k
127.0.0.1:62309: clientconnect
127.0.0.1:62309: GET https://files.pythonhosted.org/packages/1f/ea/664c589ec41b9e9ac6e20cc1fe9016f3913332d0dc5498a5d7771e2835af/grpcio-1.12.1-cp36-cp36m-manylinux1_x86_64.whl
              << 200 OK 8.57m
127.0.0.1:62307: GET https://pypi.org/simple/six/
              << 200 OK 3.34k
127.0.0.1:62309: GET https://files.pythonhosted.org/packages/67/4b/141a581104b1f6397bfa78ac9d43d8ad29a7ca43ea90a2d863fe3056e86a/six-1.11.0-py2.py3-none-any.whl
              << 200 OK 10.45k
127.0.0.1:62309: clientdisconnect
127.0.0.1:62307: clientdisconnect
127.0.0.1:62311: clientconnect
127.0.0.1:62311: GET https://pypi.org/simple/grpcio/
              << 200 OK 112.03k
127.0.0.1:62313: clientconnect
127.0.0.1:62313: GET https://files.pythonhosted.org/packages/1f/ea/664c589ec41b9e9ac6e20cc1fe9016f3913332d0dc5498a5d7771e2835af/grpcio-1.12.1-cp36-cp36m-manylinux1_x86_64.whl
              << 200 OK 8.57m
127.0.0.1:62311: GET https://pypi.org/simple/six/
              << 200 OK 3.34k
127.0.0.1:62313: GET https://files.pythonhosted.org/packages/67/4b/141a581104b1f6397bfa78ac9d43d8ad29a7ca43ea90a2d863fe3056e86a/six-1.11.0-py2.py3-none-any.whl
              << 200 OK 10.45k
127.0.0.1:62315: clientconnect
127.0.0.1:62315: GET https://pypi.org/pypi/six/json
              << 200 OK 5.94k
127.0.0.1:62313: GET https://files.pythonhosted.org/packages/16/d8/bc6316cf98419719bd59c91742194c111b6f2e85abac88e496adefaf7afe/six-1.11.0.tar.gz
              << 200 OK 29.16k
127.0.0.1:62315: GET https://pypi.org/pypi/grpcio/json
              << 200 OK 164.26k
127.0.0.1:62313: GET https://files.pythonhosted.org/packages/5c/73/5e65b81301956bdd32c5e8da691fde3fbd6e61283b65d2bac590b8f43765/grpcio-1.12.1-cp27-cp27m-win32.whl
              << 200 OK 1.35m
127.0.0.1:62313: GET https://files.pythonhosted.org/packages/e1/c3/bcce8247da4e6f95a900489b6f7ff3d14d93df40d69875fe4164c1b9544a/grpcio-1.12.1-cp27-cp27mu-manylinux1_i686.whl
              << 200 OK 8.01m
127.0.0.1:62313: GET https://files.pythonhosted.org/packages/ed/89/03924c56e9044b0842a014fcc0a81f55975028d1caa9cdd14234a230bc70/grpcio-1.12.1-cp27-cp27m-win_amd64.whl
              << 200 OK 1.35m
127.0.0.1:62313: GET https://files.pythonhosted.org/packages/d7/f6/ddeab13c25b8451f05875587801ad87e4e0fc23c4e3eb328c4bd1a80a415/grpcio-1.12.1-cp36-cp36m-linux_armv7l.whl
              << 200 OK 7.77m
127.0.0.1:62313: GET https://files.pythonhosted.org/packages/2d/a4/4d1d73c0339e987ea173f44cf62ec6b40fb91e0336c09c960c4a44137552/grpcio-1.12.1-cp35-cp35m-linux_armv7l.whl
              << 200 OK 7.76m
127.0.0.1:62313: GET https://files.pythonhosted.org/packages/76/27/b03ec8fc96745cde68d6ec29115f9a444945a6acc45209c5772378cc4d66/grpcio-1.12.1-cp35-cp35m-macosx_10_7_intel.whl
              << 200 OK 1.83m
127.0.0.1:62313: GET https://files.pythonhosted.org/packages/30/24/8e247548321e52c266a639b51a838ec19b41fb6bfd27e3bbef018496752e/grpcio-1.12.1-cp27-cp27m-manylinux1_x86_64.whl
              << 200 OK 8.47m
127.0.0.1:62313: GET https://files.pythonhosted.org/packages/80/c9/e582b962a4a3aa2684666ff67fc994a042b1b0e444eb6672eb9740f7b59a/grpcio-1.12.1-cp34-cp34m-macosx_10_7_intel.whl
              << 200 OK 1.84m
127.0.0.1:62313: GET https://files.pythonhosted.org/packages/a2/25/6d910070a4a07c32633c2376075d5dc03e90f69f855d700e3f73c1affebb/grpcio-1.12.1-cp27-cp27m-macosx_10_12_x86_64.whl
              << 200 OK 1.57m
127.0.0.1:62313: GET https://files.pythonhosted.org/packages/33/38/58f3e8d133de1f2e911206ead03799621205079c303ae5b27e7350051f4a/grpcio-1.12.1.tar.gz
              << 200 OK 13.56m
127.0.0.1:62313: GET https://files.pythonhosted.org/packages/68/57/da122cbfc1b7815381480b23044fff06b90f58c1be9310e68c2d6b1d623c/grpcio-1.12.1-cp36-cp36m-macosx_10_7_intel.whl
              << 200 OK 1.82m
127.0.0.1:62313: GET https://files.pythonhosted.org/packages/c6/b8/47468178ba19143e89b2da778eed660b84136c0a877224e79cc3c1c3fd32/grpcio-1.12.1-cp35-cp35m-manylinux1_x86_64.whl
              << 200 OK 8.55m
127.0.0.1:62313: GET https://files.pythonhosted.org/packages/5d/8b/104918993129d6c919a16826e6adcfa4a106c791da79fb9655c5b22ad9ff/grpcio-1.12.1-cp36-cp36m-win_amd64.whl
              << 200 OK 1.37m
127.0.0.1:62313: GET https://files.pythonhosted.org/packages/94/6c/02e9cb803cd7b9608c9c1768d86d31c61b088f5b9513a203c10fa7e905d8/grpcio-1.12.1-cp36-cp36m-win32.whl
              << 200 OK 1.12m
127.0.0.1:62313: GET https://files.pythonhosted.org/packages/2a/ed/71169dccb7f9250d17031068579832371a72891d8e64891265370ca6e264/grpcio-1.12.1-cp27-cp27mu-linux_armv7l.whl
              << 200 OK 7.68m
127.0.0.1:62313: GET https://files.pythonhosted.org/packages/63/38/d73bf5b1ef950dbab8203122b9681137b35012492ecfec56719be109e343/grpcio-1.12.1-cp27-cp27m-manylinux1_i686.whl
              << 200 OK 8.01m
127.0.0.1:62313: GET https://files.pythonhosted.org/packages/13/71/87628a8edec5bffc86c5443d2cb9a569c3b65c7ff0ad05d5e6ee68042297/grpcio-1.12.1-cp36-cp36m-manylinux1_i686.whl
              << 200 OK 8.11m
127.0.0.1:62313: GET https://files.pythonhosted.org/packages/1d/0d/146582f71161a0074dda2378617ae5f7e2c3d6cf62d4588eb586c1d6b675/grpcio-1.12.1-cp27-cp27mu-manylinux1_x86_64.whl
              << 200 OK 8.47m
127.0.0.1:62313: GET https://files.pythonhosted.org/packages/9e/3a/6aceb4fccacf6d2d7d087190c221a90f14b2bdcb56cbee5af24b7050278b/grpcio-1.12.1-cp34-cp34m-win_amd64.whl
              << 200 OK 1.35m
127.0.0.1:62313: GET https://files.pythonhosted.org/packages/f9/fa/a0187d220544b744dd3bb0d8b8ec716d130159160bf627415b2880ae599a/grpcio-1.12.1-cp34-cp34m-win32.whl
              << 200 OK 1.35m
127.0.0.1:62313: GET https://files.pythonhosted.org/packages/dd/aa/ac8e3c6badf1744f04be7d35fa95dae56df12b956f861285c8cced2a22cb/grpcio-1.12.1-cp34-cp34m-linux_armv7l.whl
              << 200 OK 7.76m
127.0.0.1:62313: GET https://files.pythonhosted.org/packages/38/2a/94665daafbcf0214adcf77ad8f5aed8b9dfcbfa871115c7890d88b1b8f3c/grpcio-1.12.1-cp34-cp34m-manylinux1_x86_64.whl
              << 200 OK 8.58m
127.0.0.1:62313: GET https://files.pythonhosted.org/packages/0d/33/22ad4a9dcefe330180cdb2d24fdd980af2a7a2dc03af208a408fd48195e0/grpcio-1.12.1-cp35-cp35m-win_amd64.whl
              << 200 OK 1.36m
127.0.0.1:62313: GET https://files.pythonhosted.org/packages/b5/13/9e8e5d68a15c51b251e512955a971214fd8425b237e6d6a04f0257c5d090/grpcio-1.12.1-cp34-cp34m-manylinux1_i686.whl
              << 200 OK 8.11m
127.0.0.1:62313: GET https://files.pythonhosted.org/packages/21/41/66ab386c65be68b4e907f2cd35223965aea2a086bcd0bd6825999e0bda7c/grpcio-1.12.1-cp35-cp35m-win32.whl
              << 200 OK 1.12m
127.0.0.1:62313: GET https://files.pythonhosted.org/packages/f7/db/fc084f59804a32a8d6efb467896a505f4dc93ea89ec44da856b91f05a5cb/grpcio-1.12.1-cp35-cp35m-manylinux1_i686.whl
              << 200 OK 8.09m
127.0.0.1:62313: clientdisconnect
127.0.0.1:62311: clientdisconnect
127.0.0.1:62315: clientdisconnect

Counting up some of the unnecessary requests:

  • 4 x win32
  • 4 x arm
  • 4 x macosx

Etc. It seems like a quick win to do some simple filtering based on the host OS and architecture?
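
As a rough illustration of that kind of filtering, here is a sketch built on the packaging library (not necessarily what pipenv uses internally; the helper name is made up):

from packaging.tags import sys_tags
from packaging.utils import parse_wheel_filename

# Tags the running interpreter/platform can install, e.g. cp36-cp36m-manylinux1_x86_64.
SUPPORTED = set(sys_tags())

def wheel_is_compatible(filename: str) -> bool:
    """Keep a wheel only if at least one of its tags matches the current platform."""
    _name, _version, _build, tags = parse_wheel_filename(filename)
    return not SUPPORTED.isdisjoint(tags)

# On a Linux/CPython 3.6 host this keeps the manylinux1_x86_64 wheel
# and skips the win32 and macosx ones.
for fn in ("grpcio-1.12.1-cp36-cp36m-manylinux1_x86_64.whl",
           "grpcio-1.12.1-cp27-cp27m-win32.whl",
           "grpcio-1.12.1-cp36-cp36m-macosx_10_7_intel.whl"):
    print(fn, wheel_is_compatible(fn))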

@AndreasPresthammer so your script is just timing an uncached lock vs. installing with a lock. We know that it’s the locking that’s slow. In the case of numpy it’s probably because it has had to use sdists for resolution in the past, which meant compilation. We can use wheels now; that may speed things up.

It does have the general feel of being slower than it should be, but maybe I’m underestimating the issue. If I look at the running processes on my computer, I can see the python one running the whole time pipenv is running, and it never goes above ~15% CPU; it should probably use more than that if it’s doing CPU-intensive work like hashing files. Also, I’ve used other package managers that hash dependencies, like yarn, and they are pretty fast.

Perhaps a message such as “This may take a long time” would help to reassure people until a clearer solution is decided upon.

@pablote holy crap, that is slow! Do note that this is partly due to the installation of numpy, which I’m sure we are probably compiling from source to lock or something stupid. It would help if we provided useful output here.

@giantas that’s not helpful. Please provide your Pipfile, the output, and the duration with pip install. Also note whether you can install individual packages.

@kavdev @jtratner introduced a feature to cache the hashes as well, so that should make a sizeable improvement.

Fixed! The lock is wicked fast now.

Glad you got things resolved, @NicolasWebDev! We’re working on getting this sped up more; hopefully #373 will be a step closer in the next release.