poetry: Poetry is extremely slow when resolving the dependencies

  • I am on the latest Poetry version.
  • I have searched the issues of this repo and believe that this is not a duplicate.
  • If an exception occurs when executing a command, I executed it again in debug mode (-vvv option).

Issue

I created an empty project and ran poetry add allennlp. It takes ages to resolve the dependencies.

About this issue

  • Original URL
  • State: closed
  • Created 4 years ago
  • Reactions: 395
  • Comments: 271 (42 by maintainers)

Commits related to this issue

Most upvoted comments

Yes, I’m running into the same problem. Resolving dependencies takes forever. I tried using a VPN to get through the GFW; nevertheless, it is still not working. I also tried changing the pip source and writing a local source in the toml file; neither works. It’s driving me nuts.

(Screenshot 2020-03-09, 12:28 AM)

First, it’s capitalized PyPI.

Second, there is no way for PyPI to know dependencies for all packages without executing arbitrary code – which is difficult to do safely and expensive (computationally and financially). PyPI is run on donated infrastructure from sponsors, maintained by volunteers and does not have millions of dollars of funding like many other language ecosystems’ package indexes.


For anyone interested in further reading, here’s an article written by a PyPI admin on this topic: https://dustingram.com/articles/2018/03/05/why-pypi-doesnt-know-dependencies/

No conflict. Poetry is slow as hell.

(Screenshot 2020-04-19 at 20:21:11)

I’m currently using this workaround:

poetry export -f requirements.txt > requirements.txt
python -m pip install -r requirements.txt
poetry install

It takes far less time to install the package locally since all deps are already installed.

Make sure to run poetry shell beforehand to enter the created virtual environment, so the packages install into it instead of into the user/global path.
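
Putting that workaround together, a minimal sketch (assuming poetry export is available in your Poetry version; newer releases may need the poetry-plugin-export plugin):

# enter the project's virtualenv first so pip installs into it, not the global site-packages
poetry shell

# resolve once with Poetry, install the pinned set quickly with pip,
# then let poetry install finish its own bookkeeping (everything is already present)
poetry export -f requirements.txt > requirements.txt
python -m pip install -r requirements.txt
poetry install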

First of all, I want to say there is ongoing work to improve the dependency resolution.

However, there is only so much Poetry can do given the current state of the Python ecosystem. I invite you to read https://python-poetry.org/docs/faq/#why-is-the-dependency-resolution-process-slow to learn a little more about why dependency resolution can be slow.

If you report that Poetry is slow, we would appreciate a pyproject.toml that reproduces the issue so we can debug what’s going on and whether it’s on Poetry’s end or just the expected behavior.

@gagarine Could you provide the pyproject.toml file you are using?

Poetry being slow to resolve dependencies seems to be a recurring issue:

  • #476 - Poetry resolving dependencies is amazingly slow
  • #819 - Resolving dependencies are slow
  • #832 - Poetry update never finishes resolve and Poetry show --outdated hangs
  • #1047 - Poetry doesn’t resolve on MacOs Mojave

A personal Heroku app is not going to be as valuable a target as PyPI would be. Neither is a $10/month Heroku app going to be able to support the millions of API requests that PyPI gets every day. The problem isn’t writing a script to run a setup.py file in a sandbox, but the logistics and challenges of providing that for the entire ecosystem.

“It works 90% of the time” is not an approach that can be taken by the canonical package index (which has to be used by everyone) but can be taken by specific tools (which users opt into using). Similar to how poetry can use an AST parser for setup.py files which works >90% of the time, to avoid the overhead of a subprocess call, but pip shouldn’t.

Anyway, I wanted to call out that “just blame PyPI folks because they don’t care/are lazy” is straight up wrong IMO – there are reasons that things are the way they are. That doesn’t mean we shouldn’t improve them, but it’s important to understand why we’re where we are. I’m going to step away now.

Maybe a stepping stone to a solution could be to add a flag to show some more info regarding dependency resolution - e.g. for each package, how long it took, and issues encountered/processes used. This would at least let us see where slowdowns are coming from, and potentially let us send PRs to other projects to provide better/more appropriate metadata?
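
Until such a flag exists, a rough way to see where the time goes is to timestamp Poetry’s own verbose output; a minimal sketch using only the shell:

# prefix each line of the solver's verbose output with the wall-clock time,
# which makes slow package lookups easy to spot
poetry update -vvv 2>&1 | while IFS= read -r line; do
  printf '%s %s\n' "$(date +%T)" "$line"
done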

Hey dudes - as Sebastian implied, the root cause is the Python ecosystem’s inconsistent/incomplete way of specifying dependencies and package metadata. Unfortunately, the PyPI team is treating this as a won’t-fix.

In particular, with the PyPI JSON endpoint, an empty dep list could mean either “no dependencies” or “dependencies not specified”. The PyPI team doesn’t want to differentiate between these two cases, for reasons I don’t follow.
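
To see the ambiguity for yourself, a sketch assuming curl and jq are installed (the two example packages are taken from the logs later in this thread):

# autoflake 1.4 ships only an sdist, so the JSON API carries no dependency data
curl -s https://pypi.org/pypi/autoflake/1.4/json | jq '.info.requires_dist'
# expected: null (or an empty list) -- which could mean "no dependencies" or "not declared"

# flake8 3.9.2 ships a wheel, so its requirements are listed explicitly
curl -s https://pypi.org/pypi/flake8/3.9.2/json | jq '.info.requires_dist'
# expected: a list of requirement strings (pyflakes, pycodestyle, mccabe, ...)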

The solution is to work around this by maintaining a separate cache from PyPI that properly handles this distinction, and perhaps refusing to use packages that don’t properly specify deps. However, this latter aspect may be tough, due to deep dependency trees.

Python’s grown a lot over the decades, and much remains from its early days. There’s a culture of no breaking changes at any cost.

Having to run arbitrary Python code to find dependencies is fucked, but … we can do this once for each noncompliant package and cache the result.

This has gotten much worse recently, not sure what happened.

Before you step away - can you think of a reason PyPI shouldn’t differentiate between no dependencies and missing dependency data?

If going through existing releases is too bold, what about for new ones?

Hi. I resolved my problem with slow dependency resolution.

  1. Delete the virtualenv (find it with which python)
  2. Delete the poetry.lock file
  3. Create a new virtualenv:
     poetry env use 3.10.3
  4. Clear every cache in the list (on error, retry the command):
     $ poetry cache list
     $ poetry cache clear --all pypi
  5. Reinstall the dependencies:
     poetry install

If you have problems with numpy, psycopg, or other binary libraries on a Mac M1, research installation issues for that specific dependency (e.g. search for “install <dep> m1”). Most fixes come down to prefixing the command with arch -x86_64.
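
For example, a sketch of that approach (assumes Rosetta 2 and an x86_64-capable Python are installed; numpy and psycopg2-binary are just example package names):

# run the whole resolution/build under x86_64 emulation
arch -x86_64 poetry add numpy

# or target a single troublesome binary package inside the active virtualenv
arch -x86_64 python -m pip install psycopg2-binary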

I’m on Fedora 34, send help: Resolving dependencies... (15134.3s)

Hi, PyPI admin here, just want to clear a few things up.

There are no rate limits for the JSON API or Simple API. Additionally, our rate limits are just that: they immediately return 429 when the limit is hit. They don’t slow down the request.

We haven’t implemented any rate limiting at the CDN layer (Fastly) and I have no reason to believe they are doing any rate limiting on our behalf without us realizing it.

I strongly suspect, due to the lack of similar complaints from other installers and other consumers of our JSON APIs (there are a significant number), that this is something specific to Poetry, but I don’t have any other ideas on what it might be.

(Screenshot) Left it overnight...

I think the root cause is that Python’s been around for a while and tries to maintain backwards compatibility. I agree - setup.py isn’t an elegant way to do things, and a file that declares dependencies and metadata is a better system. The wheel format specifies dependencies in a static metadata file, but there are still many older packages that don’t use this format.

As a newer language, Rust benefited by learning from the successes and failures of existing ones, i.e. it has nice tools like Cargo, docs, clippy, fmt etc. It’s possible to implement tools/defaults like this for Python, but it involves a big change and potentially backwards incompatibility. There are equivalents for many of these (pyproject.toml, black etc.), but they’re not officially supported or widely adopted. Look at how long it took Python 3 to be widely adopted for a taste of the challenge.

Could this be due to downloading packages from PyPI to inspect their dependencies when they are not properly specified?

It seems so. I checked the detailed log: poetry kept retrying to resolve the dependency for botocore, without success. So I assume the dependency can eventually be resolved if enough time is given.

However, is there any way to get around this?

BTW, I also think it would be better to give a warning if some dependencies are not properly specified and could not be resolved after a number of attempts.

Disabling IPv6 on macOS fixed the issue: System Preferences > Network > Advanced > TCP/IP tab > set “Configure IPv6” to “Link-local only”.

(Screenshot 2022-04-22 at 13:02:41)
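
The same can be done from a terminal; a sketch (this uses -setv6off, i.e. it disables IPv6 on the service entirely rather than setting it to link-local only, and the service name may differ on your machine):

# find the exact service name ("Wi-Fi", "Ethernet", ...), then turn IPv6 off for it
networksetup -listallnetworkservices
sudo networksetup -setv6off "Wi-Fi"

# restore the default behaviour later
sudo networksetup -setv6automatic "Wi-Fi"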

Just to add another data point to the conversation. Running poetry update on many of our projects now takes > 15 minutes.

I understand that comparing pip and poetry install is not an apples-to-apples comparison, and that there are many variables outside poetry’s control - however, it is hard to believe that spending 15 minutes resolving a small number of dependencies is unavoidable.

I created a vaguely representative list of dependencies for our projects and put the identical deps in both a pyproject.toml (see https://gist.github.com/jacques-/82b15d76bab3540f98b658c03c9778ea) and Pipfile (see https://gist.github.com/jacques-/293e531e4308bd4d6ad8eabea5299f57).

Poetry resolved this on my machine in around 10-11 minutes, while pipenv did the same in around 1 to 1.25 minutes - roughly a 10x difference.

Unless I’m missing a big part of the puzzle here, both pipenv and poetry are doing similar dependency resolution and are working from the same repositories, so there is no external reason the performance should be this different. It would be great to see this issue prioritised, along with some of the proposed fixes that are ready to merge, e.g. https://github.com/python-poetry/poetry/pull/2149
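
For anyone who wants to reproduce this kind of comparison, a minimal sketch, assuming the two gist files are saved into separate directories (hypothetically named poetry-project and pipenv-project here) with both tools installed:

# resolve the same dependency set with each tool and compare wall-clock time
cd poetry-project && time poetry lock && cd ..
cd pipenv-project && time pipenv lock && cd ..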

P.S. thanks for making an awesome tool, poetry has made our lives better since we started using it!

I noticed the same pyproject.toml environment was taking 5000+ seconds on local macOS vs 120 seconds on cloud-hosted Ubuntu. I’ve written up a detailed investigation here; here are two ways to alleviate your Poetry woes:

  1. Internet speed matters a lot: simply using an Ethernet cable takes my download speed from 70 Mbps to 500 Mbps, giving me a 6x performance boost when using Poetry. poetry update makes a lot of API calls to PyPI and has to download certain packages to resolve their dependencies (sometimes for good reasons).
  2. Use CI to resolve your Poetry environments: I wrote this Poetry lock & export GitHub Action to resolve Poetry environments in CI (usage instructions here). It takes a few minutes to push, click, wait for the PR, merge, and pull locally, but it saves me an hour of waiting.

Other points of note:

  • macOS seems to compile 15%-50% slower than Ubuntu. This is for a “pandas-only” environment, controlling for network speed & architecture, using GitHub Actions to run these performance tests. See my results and code here.
  • Some packages add a huge amount of time. For example, facenet-pytorch makes my local poetry update go from 100s to 600s.
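
Independent of the linked Action’s actual implementation, the core of such a CI job is roughly the following sketch (pin the Poetry version you actually use; recent Poetry releases may need the poetry-plugin-export plugin for the export step):

# resolve in CI, then commit the updated lock/export files back (or open a PR with them)
pip install poetry
poetry lock
poetry export -f requirements.txt --output requirements.txt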

Having read through most of this thread 2-3 times now, it sounds like a way forward would be to identify packages that don’t distribute a wheel, and fix them to include said wheel. It also sounds like it wouldn’t be completely perfect, but apparently would improve the situation quite a lot. https://pythonwheels.com/ has a short explanation of how to do this under “My package is white. What can I do?”.
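
For maintainers, publishing a wheel alongside the sdist is usually a small change; a sketch using the build and twine tools (older projects may instead add a bdist_wheel step to their existing release process):

# from the project root of a package that currently ships only an sdist
python -m pip install --upgrade build twine
python -m build                  # writes dist/<pkg>.whl and dist/<pkg>.tar.gz
python -m twine upload dist/*    # upload both to PyPI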

I’m wondering how hard it would be to write a little bot that creates an automated PR (focusing on packages whose code is on GitHub for now) with a fix that includes a wheel in the distribution. In cases where the fix is too hard to create automatically, the bot could open an issue with links to good explanations about how to improve the package and why it matters. Has anyone tried something like this already?

geopandas seems to take a particularly long time. This was on a new project as the first dependency:

Resolving dependencies... (3335.6s)

Hi,

I would like to invite everyone interested in how dependencies should be declared to this discussion on python.org

fin swimmer

I’m new to (more serious) Python and don’t understand the big drama. Yet setup.py seems like a powerful but very bad idea. Is dependency management terrible in Python because of setup.py?

Can someone post a couple of examples where a txt file is not enough and setup.py was absolutely necessary?

“is a feature that has enabled better compatibility across an increasingly broad spectrum of platforms.”

Cargo does it like this: https://doc.rust-lang.org/cargo/reference/specifying-dependencies.html#platform-specific-dependencies - is this not enough for Python?

Why doesn’t Poetry create its own package repository, avoiding setup.py and using its own dependency declaration? It could take time… but a bot could automate pull requests on most Python modules based on the kind of techniques used in https://github.com/David-OConnor/pydeps

It’s not as tough as you imply.

You accept some risk by running the arbitrary code, but accepting things as they are isn’t the right approach. We’re already forcing this on anyone who installs Python packages; it’s what triggers the delays cited in this thread.

I have the above repo running on a $10/month Heroku plan, and it works well.

I’ve made the assumption that if dependencies are specified, they’re specified correctly, so I only check the ones that show as having no deps. This won’t work every time, but does in a large majority of cases.

Related: Projects like Poetry are already taking a swing at preventing this in the future: Specifying deps in pyproject.toml, Pipfile etc.

The master branch (and therefore upcoming 1.2.0 release) puts a timeout on all requests.

That’s not going to solve anyone’s connectivity problems for them, but should make it clearer when that is what the problem is. Better, anyway, than hanging indefinitely.

I used to pick Poetry over pipenv because it installed dependencies WAY FASTER. Sadly, this has changed over time and now I’m getting unresponsive, endless installations over and over again (the second time in two days):

(Screenshot 2022-07-03 at 19:29:34)

Yes, I am aware of https://python-poetry.org/docs/faq/#why-is-the-dependency-resolution-process-slow, and I don’t think that letting users wait an indeterminately long period of time is the correct response to this challenge.

I would prefer to be informed that Poetry saw some issues with the deps I want to install and be able to opt out of this indefinite version guessing. Otherwise, Poetry becomes effectively unusable.

I worked around this issue by disabling IPv6 as explained here: https://stackoverflow.com/questions/50787522/extremely-slow-pip-installs

Tried out the newest release! Much much much faster now!

Well, “slow” is an understatement - I left poetry update running overnight and it’s still going at 100% CPU and using 10.04 GB of memory:

(Screenshot)

That’s the extent of it - it’ll install sub-dependencies, but whichever one you install last wins. Tools like Poetry, Cargo, npm, pyflow etc. store info about the relationships between all dependencies and attempt to find a solution that satisfies all constraints. The particular issue of this thread is that the Python ecosystem provides no reliable way of determining a package’s dependencies without installing the package.

@finswimmer I checked the discussion. Seems like they are reinventing the wheel instead of copying/pasting something that works (Composer, Cargo, …).

For requirements.txt, I’m on the one hand not sure how you denote extras at all, and even if you can, you would need to repeat the requirements of a within the requirements of b. This is prone to human error.

For sure requirements.txt is not good.

You mean replacing PyPI? Good luck with that.

Yes. But why make Poetry if it’s not to replace PyPI and requirements.txt?

If Poetry is compatible with PyPI, there is no incentive to add a pyproject.toml. Perhaps I don’t even know I should add one. Now, if every time I tried to install a package that has no pyproject.toml the command line proposed that I open an issue on that project with a ready-to-use template, this could speed things up.

This fixed my issue: poetry cache clear --all pypi

Here is my attempt at a minimal reproducible example.

  1. I cleared the poetry cache:

    • Note: I needed to apply this patch in order to not have the “NoneType attribute error”.
    • poetry cache clear --all pypi
  2. I created a new poetry project: poetry new speed

  3. I added these lines to the end of pyproject.toml:

    [tool.poetry.dev-dependencies]
    autoflake = "^1.4"
    autopep8 = "^1.5.5"
    flake8 = "^3.8.4"
    ipykernel = "^5.5.5"
    mypy = "^0.930"
    
    Full pyproject.toml:
     [tool.poetry]
     name = "speed"
     version = "0.1.0"
     description = ""
     authors = ["Adam Teichert <adam.teichert@snow.edu>"]
     readme = "README.md"
    

     [tool.poetry.dependencies]
     python = "^3.8"

     [build-system]
     requires = ["poetry-core"]
     build-backend = "poetry.core.masonry.api"

     [tool.poetry.dev-dependencies]
     autoflake = "^1.4"
     autopep8 = "^1.5.5"
     flake8 = "^3.8.4"
     ipykernel = "^5.5.5"
     mypy = "^0.930"

  4. I ran py-spy record --idle --threads --subprocesses --format flamegraph --output flamegraph.svg poetry update -- -vvv multiple times, canceling each time after several seconds of inactivity and saving each successive run to a different flamegraph image.

    Full poetry -vvv output:
       $ py-spy record --idle --threads --subprocesses --format flamegraph --output flamegraph.svg poetry update -- -vvv
       py-spy> Sampling process 100 times a second. Press Control-C to exit.
    

    Loading configuration file /home/adam/.config/pypoetry/config.toml Using virtualenv: /home/adam/.cache/pypoetry/virtualenvs/speed-oYrO2tXU-py3.8 Updating dependencies Resolving dependencies… 1: fact: speed is 0.1.0 1: derived: speed 1: fact: speed depends on autoflake (^1.4) 1: fact: speed depends on autopep8 (^1.5.5) 1: fact: speed depends on flake8 (^3.8.4) 1: fact: speed depends on ipykernel (^5.5.5) 1: fact: speed depends on mypy (^0.930) 1: selecting speed (0.1.0) 1: derived: mypy (>=0.930,<0.931) 1: derived: ipykernel (>=5.5.5,<6.0.0) 1: derived: flake8 (>=3.8.4,<4.0.0) 1: derived: autopep8 (>=1.5.5,<2.0.0) 1: derived: autoflake (>=1.4,<2.0) PyPI: No release information found for mypy-0.500, skipping PyPI: 1 packages found for mypy >=0.930,<0.931 PyPI: No release information found for ipykernel-4.0.0, skipping PyPI: No release information found for ipykernel-4.0.0.dev, skipping PyPI: 2 packages found for ipykernel >=5.5.5,<6.0.0 PyPI: 4 packages found for flake8 >=3.8.4,<4.0.0 PyPI: 4 packages found for autopep8 >=1.5.5,<2.0.0 PyPI: 1 packages found for autoflake >=1.4,<2.0 PyPI: Getting info for mypy (0.930) from PyPI 1: fact: mypy (0.930) depends on typing-extensions (>=3.10) 1: fact: mypy (0.930) depends on mypy-extensions (>=0.4.3) 1: fact: mypy (0.930) depends on tomli (>=1.1.0) 1: selecting mypy (0.930) 1: derived: tomli (>=1.1.0) 1: derived: mypy-extensions (>=0.4.3) 1: derived: typing-extensions (>=3.10) PyPI: 7 packages found for tomli >=1.1.0 PyPI: 1 packages found for mypy-extensions >=0.4.3 PyPI: 8 packages found for typing-extensions >=3.10 PyPI: Getting info for autoflake (1.4) from PyPI PyPI: No dependencies found, downloading archives PyPI: Downloading sdist: autoflake-1.4.tar.gz 1: fact: autoflake (1.4) depends on pyflakes (>=1.1.0) 1: selecting autoflake (1.4) 1: derived: pyflakes (>=1.1.0) PyPI: No release information found for pyflakes-0.2.0, skipping PyPI: No release information found for pyflakes-0.2.1, skipping PyPI: 16 packages found for pyflakes >=1.1.0 PyPI: Getting info for mypy-extensions (0.4.3) from PyPI 1: selecting mypy-extensions (0.4.3) PyPI: Getting info for ipykernel (5.5.6) from PyPI 1: fact: ipykernel (5.5.6) depends on ipython-genutils () 1: fact: ipykernel (5.5.6) depends on ipython (>=5.0.0) 1: fact: ipykernel (5.5.6) depends on traitlets (>=4.1.0) 1: fact: ipykernel (5.5.6) depends on jupyter-client () 1: fact: ipykernel (5.5.6) depends on tornado (>=4.2) 1: fact: ipykernel (5.5.6) depends on appnope (*) 1: selecting ipykernel (5.5.6) 1: derived: appnope 1: derived: tornado (>=4.2) 1: derived: jupyter-client 1: derived: traitlets (>=4.1.0) 1: derived: ipython (>=5.0.0) 1: derived: ipython-genutils PyPI: 11 packages found for appnope * PyPI: 22 packages found for tornado >=4.2 PyPI: No release information found for jupyter-client-4.0.0.dev, skipping PyPI: 51 packages found for jupyter-client * ^C 1: Version solving took 21.440 seconds. 1: Tried 1 solutions.

    py-spy> Stopped sampling because Control-C pressed py-spy> Wrote flamegraph data to 'flamegraph.svg'. Samples: 2165 Errors: 0 $ py-spy record --idle --threads --subprocesses --format flamegraph --output flamegraph2.svg poetry update -- -vvv py-spy> Sampling process 100 times a second. Press Control-C to exit.

    Loading configuration file /home/adam/.config/pypoetry/config.toml Using virtualenv: /home/adam/.cache/pypoetry/virtualenvs/speed-oYrO2tXU-py3.8 Updating dependencies Resolving dependencies… 1: fact: speed is 0.1.0 1: derived: speed 1: fact: speed depends on autoflake (^1.4) 1: fact: speed depends on autopep8 (^1.5.5) 1: fact: speed depends on flake8 (^3.8.4) 1: fact: speed depends on ipykernel (^5.5.5) 1: fact: speed depends on mypy (^0.930) 1: selecting speed (0.1.0) 1: derived: mypy (>=0.930,<0.931) 1: derived: ipykernel (>=5.5.5,<6.0.0) 1: derived: flake8 (>=3.8.4,<4.0.0) 1: derived: autopep8 (>=1.5.5,<2.0.0) 1: derived: autoflake (>=1.4,<2.0) PyPI: No release information found for mypy-0.500, skipping PyPI: 1 packages found for mypy >=0.930,<0.931 PyPI: No release information found for ipykernel-4.0.0, skipping PyPI: No release information found for ipykernel-4.0.0.dev, skipping PyPI: 2 packages found for ipykernel >=5.5.5,<6.0.0 PyPI: 4 packages found for flake8 >=3.8.4,<4.0.0 PyPI: 4 packages found for autopep8 >=1.5.5,<2.0.0 PyPI: 1 packages found for autoflake >=1.4,<2.0 1: fact: mypy (0.930) depends on typing-extensions (>=3.10) 1: fact: mypy (0.930) depends on mypy-extensions (>=0.4.3) 1: fact: mypy (0.930) depends on tomli (>=1.1.0) 1: selecting mypy (0.930) 1: derived: tomli (>=1.1.0) 1: derived: mypy-extensions (>=0.4.3) 1: derived: typing-extensions (>=3.10) PyPI: 7 packages found for tomli >=1.1.0 PyPI: 1 packages found for mypy-extensions >=0.4.3 PyPI: 8 packages found for typing-extensions >=3.10 1: fact: autoflake (1.4) depends on pyflakes (>=1.1.0) 1: selecting autoflake (1.4) 1: derived: pyflakes (>=1.1.0) PyPI: No release information found for pyflakes-0.2.0, skipping PyPI: No release information found for pyflakes-0.2.1, skipping PyPI: 16 packages found for pyflakes >=1.1.0 1: selecting mypy-extensions (0.4.3) 1: fact: ipykernel (5.5.6) depends on ipython-genutils () 1: fact: ipykernel (5.5.6) depends on ipython (>=5.0.0) 1: fact: ipykernel (5.5.6) depends on traitlets (>=4.1.0) 1: fact: ipykernel (5.5.6) depends on jupyter-client () 1: fact: ipykernel (5.5.6) depends on tornado (>=4.2) 1: fact: ipykernel (5.5.6) depends on appnope () 1: selecting ipykernel (5.5.6) 1: derived: appnope 1: derived: tornado (>=4.2) 1: derived: jupyter-client 1: derived: traitlets (>=4.1.0) 1: derived: ipython (>=5.0.0) 1: derived: ipython-genutils PyPI: 11 packages found for appnope * PyPI: 22 packages found for tornado >=4.2 PyPI: No release information found for jupyter-client-4.0.0.dev, skipping PyPI: 51 packages found for jupyter-client * PyPI: No release information found for traitlets-4.0.0.dev, skipping PyPI: 16 packages found for traitlets >=4.1.0 PyPI: No release information found for ipython-0.6.10, skipping PyPI: No release information found for ipython-0.6.11, skipping PyPI: No release information found for ipython-0.6.12, skipping PyPI: No release information found for ipython-0.6.13, skipping PyPI: No release information found for ipython-0.6.14, skipping PyPI: No release information found for ipython-0.6.15, skipping PyPI: No release information found for ipython-0.6.4, skipping PyPI: No release information found for ipython-0.6.5, skipping PyPI: No release information found for ipython-0.6.6, skipping PyPI: No release information found for ipython-0.6.7, skipping PyPI: No release information found for ipython-0.6.8, skipping PyPI: No release information found for ipython-0.6.9, skipping PyPI: No release information found for ipython-0.7.0, skipping PyPI: No release 
information found for ipython-0.7.1, skipping PyPI: No release information found for ipython-0.7.1.fix1, skipping PyPI: No release information found for ipython-0.7.2, skipping PyPI: No release information found for ipython-0.7.4.svn.r2010, skipping PyPI: No release information found for ipython-0.8.0, skipping PyPI: No release information found for ipython-0.8.1, skipping PyPI: No release information found for ipython-0.8.2, skipping PyPI: No release information found for ipython-0.8.3, skipping PyPI: No release information found for ipython-0.8.4, skipping PyPI: No release information found for ipython-0.9, skipping PyPI: No release information found for ipython-0.9.1, skipping PyPI: No release information found for ipython-4.0.0-b1, skipping PyPI: 74 packages found for ipython >=5.0.0 PyPI: 2 packages found for ipython-genutils * PyPI: Getting info for ipython-genutils (0.2.0) from PyPI PyPI: No dependencies found, downloading archives PyPI: Downloading wheel: ipython_genutils-0.2.0-py2.py3-none-any.whl 1: selecting ipython-genutils (0.2.0) PyPI: Getting info for flake8 (3.9.2) from PyPI 1: fact: flake8 (3.9.2) depends on pyflakes (>=2.3.0,<2.4.0) 1: fact: flake8 (3.9.2) depends on pycodestyle (>=2.7.0,<2.8.0) 1: fact: flake8 (3.9.2) depends on mccabe (>=0.6.0,<0.7.0) 1: selecting flake8 (3.9.2) 1: derived: mccabe (>=0.6.0,<0.7.0) 1: derived: pycodestyle (>=2.7.0,<2.8.0) 1: derived: pyflakes (>=2.3.0,<2.4.0) PyPI: No release information found for mccabe-0.0.0, skipping PyPI: 2 packages found for mccabe >=0.6.0,<0.7.0 PyPI: No release information found for pycodestyle-0.0.0, skipping PyPI: 1 packages found for pycodestyle >=2.7.0,<2.8.0 PyPI: Getting info for pycodestyle (2.7.0) from PyPI PyPI: No dependencies found, downloading archives PyPI: Downloading wheel: pycodestyle-2.7.0-py2.py3-none-any.whl 1: selecting pycodestyle (2.7.0) PyPI: Getting info for pyflakes (2.3.1) from PyPI PyPI: No dependencies found, downloading archives PyPI: Downloading wheel: pyflakes-2.3.1-py2.py3-none-any.whl 1: selecting pyflakes (2.3.1) PyPI: Getting info for mccabe (0.6.1) from PyPI PyPI: No dependencies found, downloading archives PyPI: Downloading wheel: mccabe-0.6.1-py2.py3-none-any.whl 1: selecting mccabe (0.6.1) PyPI: Getting info for autopep8 (1.6.0) from PyPI 1: fact: autopep8 (1.6.0) depends on pycodestyle (>=2.8.0) 1: fact: autopep8 (1.6.0) depends on toml () 1: derived: not autopep8 (==1.6.0) PyPI: Getting info for autopep8 (1.5.7) from PyPI 1: fact: autopep8 (1.5.7) depends on pycodestyle (>=2.7.0) 1: fact: autopep8 (1.5.7) depends on toml (*) 1: selecting autopep8 (1.5.7) 1: derived: toml PyPI: 16 packages found for toml * PyPI: Getting info for tomli (2.0.1) from PyPI PyPI: No dependencies found, downloading archives PyPI: Downloading wheel: tomli-2.0.1-py3-none-any.whl 1: selecting tomli (2.0.1) PyPI: Getting info for typing-extensions (4.2.0) from PyPI PyPI: No dependencies found, downloading archives PyPI: Downloading wheel: typing_extensions-4.2.0-py3-none-any.whl 1: selecting typing-extensions (4.2.0) PyPI: Getting info for traitlets (5.1.1) from PyPI 1: selecting traitlets (5.1.1) PyPI: Getting info for toml (0.10.2) from PyPI PyPI: No dependencies found, downloading archives PyPI: Downloading wheel: toml-0.10.2-py2.py3-none-any.whl 1: selecting toml (0.10.2) PyPI: Getting info for tornado (6.1) from PyPI PyPI: No dependencies found, downloading archives PyPI: Downloading sdist: tornado-6.1.tar.gz ^C 1: Version solving took 18.430 seconds. 1: Tried 1 solutions.

    py-spy> Stopped sampling because Control-C pressed py-spy> Wrote flamegraph data to 'flamegraph2.svg'. Samples: 1890 Errors: 0 $ py-spy record --idle --threads --subprocesses --format flamegraph --output flamegraph3.svg poetry update -- -vvv py-spy> Sampling process 100 times a second. Press Control-C to exit.

    Loading configuration file /home/adam/.config/pypoetry/config.toml Using virtualenv: /home/adam/.cache/pypoetry/virtualenvs/speed-oYrO2tXU-py3.8 Updating dependencies Resolving dependencies… 1: fact: speed is 0.1.0 1: derived: speed 1: fact: speed depends on autoflake (^1.4) 1: fact: speed depends on autopep8 (^1.5.5) 1: fact: speed depends on flake8 (^3.8.4) 1: fact: speed depends on ipykernel (^5.5.5) 1: fact: speed depends on mypy (^0.930) 1: selecting speed (0.1.0) 1: derived: mypy (>=0.930,<0.931) 1: derived: ipykernel (>=5.5.5,<6.0.0) 1: derived: flake8 (>=3.8.4,<4.0.0) 1: derived: autopep8 (>=1.5.5,<2.0.0) 1: derived: autoflake (>=1.4,<2.0) PyPI: No release information found for mypy-0.500, skipping PyPI: 1 packages found for mypy >=0.930,<0.931 PyPI: No release information found for ipykernel-4.0.0, skipping PyPI: No release information found for ipykernel-4.0.0.dev, skipping PyPI: 2 packages found for ipykernel >=5.5.5,<6.0.0 PyPI: 4 packages found for flake8 >=3.8.4,<4.0.0 PyPI: 4 packages found for autopep8 >=1.5.5,<2.0.0 PyPI: 1 packages found for autoflake >=1.4,<2.0 1: fact: mypy (0.930) depends on typing-extensions (>=3.10) 1: fact: mypy (0.930) depends on mypy-extensions (>=0.4.3) 1: fact: mypy (0.930) depends on tomli (>=1.1.0) 1: selecting mypy (0.930) 1: derived: tomli (>=1.1.0) 1: derived: mypy-extensions (>=0.4.3) 1: derived: typing-extensions (>=3.10) PyPI: 7 packages found for tomli >=1.1.0 PyPI: 1 packages found for mypy-extensions >=0.4.3 PyPI: 8 packages found for typing-extensions >=3.10 1: fact: autoflake (1.4) depends on pyflakes (>=1.1.0) 1: selecting autoflake (1.4) 1: derived: pyflakes (>=1.1.0) PyPI: No release information found for pyflakes-0.2.0, skipping PyPI: No release information found for pyflakes-0.2.1, skipping PyPI: 16 packages found for pyflakes >=1.1.0 1: selecting mypy-extensions (0.4.3) 1: fact: ipykernel (5.5.6) depends on ipython-genutils () 1: fact: ipykernel (5.5.6) depends on ipython (>=5.0.0) 1: fact: ipykernel (5.5.6) depends on traitlets (>=4.1.0) 1: fact: ipykernel (5.5.6) depends on jupyter-client () 1: fact: ipykernel (5.5.6) depends on tornado (>=4.2) 1: fact: ipykernel (5.5.6) depends on appnope () 1: selecting ipykernel (5.5.6) 1: derived: appnope 1: derived: tornado (>=4.2) 1: derived: jupyter-client 1: derived: traitlets (>=4.1.0) 1: derived: ipython (>=5.0.0) 1: derived: ipython-genutils PyPI: 11 packages found for appnope * PyPI: 22 packages found for tornado >=4.2 PyPI: No release information found for jupyter-client-4.0.0.dev, skipping PyPI: 51 packages found for jupyter-client * PyPI: No release information found for traitlets-4.0.0.dev, skipping PyPI: 16 packages found for traitlets >=4.1.0 PyPI: No release information found for ipython-0.6.10, skipping PyPI: No release information found for ipython-0.6.11, skipping PyPI: No release information found for ipython-0.6.12, skipping PyPI: No release information found for ipython-0.6.13, skipping PyPI: No release information found for ipython-0.6.14, skipping PyPI: No release information found for ipython-0.6.15, skipping PyPI: No release information found for ipython-0.6.4, skipping PyPI: No release information found for ipython-0.6.5, skipping PyPI: No release information found for ipython-0.6.6, skipping PyPI: No release information found for ipython-0.6.7, skipping PyPI: No release information found for ipython-0.6.8, skipping PyPI: No release information found for ipython-0.6.9, skipping PyPI: No release information found for ipython-0.7.0, skipping PyPI: No release 
information found for ipython-0.7.1, skipping PyPI: No release information found for ipython-0.7.1.fix1, skipping PyPI: No release information found for ipython-0.7.2, skipping PyPI: No release information found for ipython-0.7.4.svn.r2010, skipping PyPI: No release information found for ipython-0.8.0, skipping PyPI: No release information found for ipython-0.8.1, skipping PyPI: No release information found for ipython-0.8.2, skipping PyPI: No release information found for ipython-0.8.3, skipping PyPI: No release information found for ipython-0.8.4, skipping PyPI: No release information found for ipython-0.9, skipping PyPI: No release information found for ipython-0.9.1, skipping PyPI: No release information found for ipython-4.0.0-b1, skipping PyPI: 74 packages found for ipython >=5.0.0 PyPI: 2 packages found for ipython-genutils * 1: selecting ipython-genutils (0.2.0) 1: fact: flake8 (3.9.2) depends on pyflakes (>=2.3.0,<2.4.0) 1: fact: flake8 (3.9.2) depends on pycodestyle (>=2.7.0,<2.8.0) 1: fact: flake8 (3.9.2) depends on mccabe (>=0.6.0,<0.7.0) 1: selecting flake8 (3.9.2) 1: derived: mccabe (>=0.6.0,<0.7.0) 1: derived: pycodestyle (>=2.7.0,<2.8.0) 1: derived: pyflakes (>=2.3.0,<2.4.0) PyPI: No release information found for mccabe-0.0.0, skipping PyPI: 2 packages found for mccabe >=0.6.0,<0.7.0 PyPI: No release information found for pycodestyle-0.0.0, skipping PyPI: 1 packages found for pycodestyle >=2.7.0,<2.8.0 1: selecting pycodestyle (2.7.0) 1: selecting pyflakes (2.3.1) 1: selecting mccabe (0.6.1) 1: fact: autopep8 (1.6.0) depends on pycodestyle (>=2.8.0) 1: fact: autopep8 (1.6.0) depends on toml () 1: derived: not autopep8 (==1.6.0) 1: fact: autopep8 (1.5.7) depends on pycodestyle (>=2.7.0) 1: fact: autopep8 (1.5.7) depends on toml () 1: selecting autopep8 (1.5.7) 1: derived: toml PyPI: 16 packages found for toml * 1: selecting tomli (2.0.1) 1: selecting typing-extensions (4.2.0) 1: selecting traitlets (5.1.1) 1: selecting toml (0.10.2) PyPI: Getting info for tornado (6.1) from PyPI PyPI: No dependencies found, downloading archives PyPI: Downloading sdist: tornado-6.1.tar.gz 1: selecting tornado (6.1) PyPI: Getting info for jupyter-client (7.3.0) from PyPI 1: fact: jupyter-client (7.3.0) depends on entrypoints () 1: fact: jupyter-client (7.3.0) depends on jupyter-core (>=4.9.2) 1: fact: jupyter-client (7.3.0) depends on nest-asyncio (>=1.5.4) 1: fact: jupyter-client (7.3.0) depends on python-dateutil (>=2.8.2) 1: fact: jupyter-client (7.3.0) depends on pyzmq (>=22.3) 1: fact: jupyter-client (7.3.0) depends on tornado (>=6.0) 1: fact: jupyter-client (7.3.0) depends on traitlets () 1: selecting jupyter-client (7.3.0) 1: derived: pyzmq (>=22.3) 1: derived: python-dateutil (>=2.8.2) 1: derived: nest-asyncio (>=1.5.4) 1: derived: jupyter-core (>=4.9.2) 1: derived: entrypoints PyPI: 1 packages found for pyzmq >=22.3 PyPI: No release information found for python-dateutil-0.1, skipping PyPI: No release information found for python-dateutil-0.3, skipping PyPI: No release information found for python-dateutil-0.4, skipping PyPI: No release information found for python-dateutil-0.5, skipping PyPI: No release information found for python-dateutil-1.0, skipping PyPI: No release information found for python-dateutil-1.1, skipping PyPI: No release information found for python-dateutil-1.2, skipping PyPI: No release information found for python-dateutil-2.0, skipping PyPI: 1 packages found for python-dateutil >=2.8.2 PyPI: 2 packages found for nest-asyncio >=1.5.4 PyPI: No release information 
found for jupyter-core-4.0.0.dev, skipping PyPI: 2 packages found for jupyter-core >=4.9.2 PyPI: 7 packages found for entrypoints * PyPI: Getting info for pyzmq (22.3.0) from PyPI PyPI: No dependencies found, downloading archives PyPI: Downloading sdist: pyzmq-22.3.0.tar.gz 1: fact: pyzmq (22.3.0) depends on py () 1: fact: pyzmq (22.3.0) depends on cffi () 1: selecting pyzmq (22.3.0) 1: derived: cffi 1: derived: py PyPI: 67 packages found for cffi * PyPI: No release information found for py-0.8.0-alpha2, skipping PyPI: No release information found for py-0.9.0, skipping PyPI: No release information found for py-1.4.32.dev1, skipping PyPI: 61 packages found for py * PyPI: Getting info for python-dateutil (2.8.2) from PyPI 1: fact: python-dateutil (2.8.2) depends on six (>=1.5) 1: selecting python-dateutil (2.8.2) 1: derived: six (>=1.5) PyPI: 18 packages found for six >=1.5 PyPI: Getting info for nest-asyncio (1.5.5) from PyPI PyPI: No dependencies found, downloading archives PyPI: Downloading wheel: nest_asyncio-1.5.5-py3-none-any.whl 1: selecting nest-asyncio (1.5.5) PyPI: Getting info for jupyter-core (4.10.0) from PyPI 1: fact: jupyter-core (4.10.0) depends on traitlets () 1: fact: jupyter-core (4.10.0) depends on pywin32 (>=1.0) 1: selecting jupyter-core (4.10.0) 1: derived: pywin32 (>=1.0) PyPI: No release information found for pywin32-210, skipping PyPI: No release information found for pywin32-214, skipping PyPI: 11 packages found for pywin32 >=1.0 PyPI: Getting info for entrypoints (0.4) from PyPI PyPI: No dependencies found, downloading archives PyPI: Downloading wheel: entrypoints-0.4-py3-none-any.whl 1: selecting entrypoints (0.4) PyPI: Getting info for six (1.16.0) from PyPI ^C 1: Version solving took 44.474 seconds. 1: Tried 1 solutions.

    py-spy> Stopped sampling because Control-C pressed py-spy> Wrote flamegraph data to 'flamegraph3.svg'. Samples: 7208 Errors: 1 $ py-spy record --idle --threads --subprocesses --format flamegraph --output flamegraph4.svg poetry update -- -vvv py-spy> Sampling process 100 times a second. Press Control-C to exit.

    Loading configuration file /home/adam/.config/pypoetry/config.toml Using virtualenv: /home/adam/.cache/pypoetry/virtualenvs/speed-oYrO2tXU-py3.8 Updating dependencies Resolving dependencies… 1: fact: speed is 0.1.0 1: derived: speed 1: fact: speed depends on autoflake (^1.4) 1: fact: speed depends on autopep8 (^1.5.5) 1: fact: speed depends on flake8 (^3.8.4) 1: fact: speed depends on ipykernel (^5.5.5) 1: fact: speed depends on mypy (^0.930) 1: selecting speed (0.1.0) 1: derived: mypy (>=0.930,<0.931) 1: derived: ipykernel (>=5.5.5,<6.0.0) 1: derived: flake8 (>=3.8.4,<4.0.0) 1: derived: autopep8 (>=1.5.5,<2.0.0) 1: derived: autoflake (>=1.4,<2.0) PyPI: No release information found for mypy-0.500, skipping PyPI: 1 packages found for mypy >=0.930,<0.931 PyPI: No release information found for ipykernel-4.0.0, skipping PyPI: No release information found for ipykernel-4.0.0.dev, skipping PyPI: 2 packages found for ipykernel >=5.5.5,<6.0.0 PyPI: 4 packages found for flake8 >=3.8.4,<4.0.0 PyPI: 4 packages found for autopep8 >=1.5.5,<2.0.0 PyPI: 1 packages found for autoflake >=1.4,<2.0 1: fact: mypy (0.930) depends on typing-extensions (>=3.10) 1: fact: mypy (0.930) depends on mypy-extensions (>=0.4.3) 1: fact: mypy (0.930) depends on tomli (>=1.1.0) 1: selecting mypy (0.930) 1: derived: tomli (>=1.1.0) 1: derived: mypy-extensions (>=0.4.3) 1: derived: typing-extensions (>=3.10) PyPI: 7 packages found for tomli >=1.1.0 PyPI: 1 packages found for mypy-extensions >=0.4.3 PyPI: 8 packages found for typing-extensions >=3.10 1: fact: autoflake (1.4) depends on pyflakes (>=1.1.0) 1: selecting autoflake (1.4) 1: derived: pyflakes (>=1.1.0) PyPI: No release information found for pyflakes-0.2.0, skipping PyPI: No release information found for pyflakes-0.2.1, skipping PyPI: 16 packages found for pyflakes >=1.1.0 1: selecting mypy-extensions (0.4.3) 1: fact: ipykernel (5.5.6) depends on ipython-genutils () 1: fact: ipykernel (5.5.6) depends on ipython (>=5.0.0) 1: fact: ipykernel (5.5.6) depends on traitlets (>=4.1.0) 1: fact: ipykernel (5.5.6) depends on jupyter-client () 1: fact: ipykernel (5.5.6) depends on tornado (>=4.2) 1: fact: ipykernel (5.5.6) depends on appnope () 1: selecting ipykernel (5.5.6) 1: derived: appnope 1: derived: tornado (>=4.2) 1: derived: jupyter-client 1: derived: traitlets (>=4.1.0) 1: derived: ipython (>=5.0.0) 1: derived: ipython-genutils PyPI: 11 packages found for appnope * PyPI: 22 packages found for tornado >=4.2 PyPI: No release information found for jupyter-client-4.0.0.dev, skipping PyPI: 51 packages found for jupyter-client * PyPI: No release information found for traitlets-4.0.0.dev, skipping PyPI: 16 packages found for traitlets >=4.1.0 PyPI: No release information found for ipython-0.6.10, skipping PyPI: No release information found for ipython-0.6.11, skipping PyPI: No release information found for ipython-0.6.12, skipping PyPI: No release information found for ipython-0.6.13, skipping PyPI: No release information found for ipython-0.6.14, skipping PyPI: No release information found for ipython-0.6.15, skipping PyPI: No release information found for ipython-0.6.4, skipping PyPI: No release information found for ipython-0.6.5, skipping PyPI: No release information found for ipython-0.6.6, skipping PyPI: No release information found for ipython-0.6.7, skipping PyPI: No release information found for ipython-0.6.8, skipping PyPI: No release information found for ipython-0.6.9, skipping PyPI: No release information found for ipython-0.7.0, skipping PyPI: No release 
information found for ipython-0.7.1, skipping PyPI: No release information found for ipython-0.7.1.fix1, skipping PyPI: No release information found for ipython-0.7.2, skipping PyPI: No release information found for ipython-0.7.4.svn.r2010, skipping PyPI: No release information found for ipython-0.8.0, skipping PyPI: No release information found for ipython-0.8.1, skipping PyPI: No release information found for ipython-0.8.2, skipping PyPI: No release information found for ipython-0.8.3, skipping PyPI: No release information found for ipython-0.8.4, skipping PyPI: No release information found for ipython-0.9, skipping PyPI: No release information found for ipython-0.9.1, skipping PyPI: No release information found for ipython-4.0.0-b1, skipping PyPI: 74 packages found for ipython >=5.0.0 PyPI: 2 packages found for ipython-genutils * 1: selecting ipython-genutils (0.2.0) 1: fact: flake8 (3.9.2) depends on pyflakes (>=2.3.0,<2.4.0) 1: fact: flake8 (3.9.2) depends on pycodestyle (>=2.7.0,<2.8.0) 1: fact: flake8 (3.9.2) depends on mccabe (>=0.6.0,<0.7.0) 1: selecting flake8 (3.9.2) 1: derived: mccabe (>=0.6.0,<0.7.0) 1: derived: pycodestyle (>=2.7.0,<2.8.0) 1: derived: pyflakes (>=2.3.0,<2.4.0) PyPI: No release information found for mccabe-0.0.0, skipping PyPI: 2 packages found for mccabe >=0.6.0,<0.7.0 PyPI: No release information found for pycodestyle-0.0.0, skipping PyPI: 1 packages found for pycodestyle >=2.7.0,<2.8.0 1: selecting pycodestyle (2.7.0) 1: selecting pyflakes (2.3.1) 1: selecting mccabe (0.6.1) 1: fact: autopep8 (1.6.0) depends on pycodestyle (>=2.8.0) 1: fact: autopep8 (1.6.0) depends on toml () 1: derived: not autopep8 (==1.6.0) 1: fact: autopep8 (1.5.7) depends on pycodestyle (>=2.7.0) 1: fact: autopep8 (1.5.7) depends on toml () 1: selecting autopep8 (1.5.7) 1: derived: toml PyPI: 16 packages found for toml * 1: selecting tomli (2.0.1) 1: selecting typing-extensions (4.2.0) 1: selecting traitlets (5.1.1) 1: selecting toml (0.10.2) 1: selecting tornado (6.1) 1: fact: jupyter-client (7.3.0) depends on entrypoints () 1: fact: jupyter-client (7.3.0) depends on jupyter-core (>=4.9.2) 1: fact: jupyter-client (7.3.0) depends on nest-asyncio (>=1.5.4) 1: fact: jupyter-client (7.3.0) depends on python-dateutil (>=2.8.2) 1: fact: jupyter-client (7.3.0) depends on pyzmq (>=22.3) 1: fact: jupyter-client (7.3.0) depends on tornado (>=6.0) 1: fact: jupyter-client (7.3.0) depends on traitlets () 1: selecting jupyter-client (7.3.0) 1: derived: pyzmq (>=22.3) 1: derived: python-dateutil (>=2.8.2) 1: derived: nest-asyncio (>=1.5.4) 1: derived: jupyter-core (>=4.9.2) 1: derived: entrypoints PyPI: 1 packages found for pyzmq >=22.3 PyPI: No release information found for python-dateutil-0.1, skipping PyPI: No release information found for python-dateutil-0.3, skipping PyPI: No release information found for python-dateutil-0.4, skipping PyPI: No release information found for python-dateutil-0.5, skipping PyPI: No release information found for python-dateutil-1.0, skipping PyPI: No release information found for python-dateutil-1.1, skipping PyPI: No release information found for python-dateutil-1.2, skipping PyPI: No release information found for python-dateutil-2.0, skipping PyPI: 1 packages found for python-dateutil >=2.8.2 PyPI: 2 packages found for nest-asyncio >=1.5.4 PyPI: No release information found for jupyter-core-4.0.0.dev, skipping PyPI: 2 packages found for jupyter-core >=4.9.2 PyPI: 7 packages found for entrypoints * 1: fact: pyzmq (22.3.0) depends on py () 1: fact: pyzmq (22.3.0) 
depends on cffi () 1: selecting pyzmq (22.3.0) 1: derived: cffi 1: derived: py PyPI: 67 packages found for cffi * PyPI: No release information found for py-0.8.0-alpha2, skipping PyPI: No release information found for py-0.9.0, skipping PyPI: No release information found for py-1.4.32.dev1, skipping PyPI: 61 packages found for py * 1: fact: python-dateutil (2.8.2) depends on six (>=1.5) 1: selecting python-dateutil (2.8.2) 1: derived: six (>=1.5) PyPI: 18 packages found for six >=1.5 1: selecting nest-asyncio (1.5.5) 1: fact: jupyter-core (4.10.0) depends on traitlets () 1: fact: jupyter-core (4.10.0) depends on pywin32 (>=1.0) 1: selecting jupyter-core (4.10.0) 1: derived: pywin32 (>=1.0) PyPI: No release information found for pywin32-210, skipping PyPI: No release information found for pywin32-214, skipping PyPI: 11 packages found for pywin32 >=1.0 1: selecting entrypoints (0.4) PyPI: Getting info for six (1.16.0) from PyPI PyPI: No dependencies found, downloading archives PyPI: Downloading wheel: six-1.16.0-py2.py3-none-any.whl 1: selecting six (1.16.0) PyPI: Getting info for ipython (8.2.0) from PyPI 1: fact: ipython (8.2.0) depends on backcall () 1: fact: ipython (8.2.0) depends on decorator () 1: fact: ipython (8.2.0) depends on jedi (>=0.16) 1: fact: ipython (8.2.0) depends on matplotlib-inline () 1: fact: ipython (8.2.0) depends on pickleshare () 1: fact: ipython (8.2.0) depends on prompt-toolkit (>=2.0.0,❤️.0.0 || >3.0.0,❤️.0.1 || >3.0.1,❤️.1.0) 1: fact: ipython (8.2.0) depends on pygments (>=2.4.0) 1: fact: ipython (8.2.0) depends on setuptools (>=18.5) 1: fact: ipython (8.2.0) depends on stack-data () 1: fact: ipython (8.2.0) depends on traitlets (>=5) 1: fact: ipython (8.2.0) depends on pexpect (>4.3) 1: fact: ipython (8.2.0) depends on appnope () 1: fact: ipython (8.2.0) depends on colorama () 1: selecting ipython (8.2.0) 1: derived: colorama 1: derived: pexpect (>4.3) 1: derived: stack-data 1: derived: setuptools (>=18.5) 1: derived: pygments (>=2.4.0) 1: derived: prompt-toolkit (>=2.0.0,!=3.0.0,!=3.0.1,❤️.1.0) 1: derived: pickleshare 1: derived: matplotlib-inline 1: derived: jedi (>=0.16) 1: derived: decorator 1: derived: backcall PyPI: 42 packages found for colorama * PyPI: No release information found for pexpect-0.97, skipping PyPI: No release information found for pexpect-2.01, skipping PyPI: 6 packages found for pexpect >4.3 PyPI: 13 packages found for stack-data * PyPI: No release information found for setuptools-13.0, skipping PyPI: 332 packages found for setuptools >=18.5 PyPI: 20 packages found for pygments >=2.4.0 PyPI: 38 packages found for prompt-toolkit >=2.0.0,❤️.0.0 || >3.0.0,❤️.0.1 || >3.0.1,❤️.1.0 PyPI: 12 packages found for pickleshare * PyPI: 4 packages found for matplotlib-inline * PyPI: Unable to parse version “0.8.0-final0” for the jedi package, skipping PyPI: Unable to parse version “0.8.1-final0” for the jedi package, skipping PyPI: 6 packages found for jedi >=0.16 PyPI: No release information found for decorator-3.4.1, skipping PyPI: 34 packages found for decorator * PyPI: 2 packages found for backcall * PyPI: Getting info for backcall (0.2.0) from PyPI PyPI: No dependencies found, downloading archives PyPI: Downloading wheel: backcall-0.2.0-py2.py3-none-any.whl 1: selecting backcall (0.2.0) PyPI: Getting info for matplotlib-inline (0.1.3) from PyPI 1: fact: matplotlib-inline (0.1.3) depends on traitlets () 1: selecting matplotlib-inline (0.1.3) PyPI: Getting info for jedi (0.18.1) from PyPI ^C 1: Version solving took 24.798 seconds. 
1: Tried 1 solutions.

    py-spy> Stopped sampling because Control-C pressed py-spy> Wrote flamegraph data to 'flamegraph4.svg'. Samples: 2419 Errors: 0 $ py-spy record --idle --threads --subprocesses --format flamegraph --output flamegraph5.svg poetry update -- -vvv py-spy> Sampling process 100 times a second. Press Control-C to exit.

    Loading configuration file /home/adam/.config/pypoetry/config.toml Using virtualenv: /home/adam/.cache/pypoetry/virtualenvs/speed-oYrO2tXU-py3.8 Updating dependencies Resolving dependencies… 1: fact: speed is 0.1.0 1: derived: speed 1: fact: speed depends on autoflake (^1.4) 1: fact: speed depends on autopep8 (^1.5.5) 1: fact: speed depends on flake8 (^3.8.4) 1: fact: speed depends on ipykernel (^5.5.5) 1: fact: speed depends on mypy (^0.930) 1: selecting speed (0.1.0) 1: derived: mypy (>=0.930,<0.931) 1: derived: ipykernel (>=5.5.5,<6.0.0) 1: derived: flake8 (>=3.8.4,<4.0.0) 1: derived: autopep8 (>=1.5.5,<2.0.0) 1: derived: autoflake (>=1.4,<2.0) PyPI: No release information found for mypy-0.500, skipping PyPI: 1 packages found for mypy >=0.930,<0.931 PyPI: No release information found for ipykernel-4.0.0, skipping PyPI: No release information found for ipykernel-4.0.0.dev, skipping PyPI: 2 packages found for ipykernel >=5.5.5,<6.0.0 PyPI: 4 packages found for flake8 >=3.8.4,<4.0.0 PyPI: 4 packages found for autopep8 >=1.5.5,<2.0.0 PyPI: 1 packages found for autoflake >=1.4,<2.0 1: fact: mypy (0.930) depends on typing-extensions (>=3.10) 1: fact: mypy (0.930) depends on mypy-extensions (>=0.4.3) 1: fact: mypy (0.930) depends on tomli (>=1.1.0) 1: selecting mypy (0.930) 1: derived: tomli (>=1.1.0) 1: derived: mypy-extensions (>=0.4.3) 1: derived: typing-extensions (>=3.10) PyPI: 7 packages found for tomli >=1.1.0 PyPI: 1 packages found for mypy-extensions >=0.4.3 PyPI: 8 packages found for typing-extensions >=3.10 1: fact: autoflake (1.4) depends on pyflakes (>=1.1.0) 1: selecting autoflake (1.4) 1: derived: pyflakes (>=1.1.0) PyPI: No release information found for pyflakes-0.2.0, skipping PyPI: No release information found for pyflakes-0.2.1, skipping PyPI: 16 packages found for pyflakes >=1.1.0 1: selecting mypy-extensions (0.4.3) 1: fact: ipykernel (5.5.6) depends on ipython-genutils () 1: fact: ipykernel (5.5.6) depends on ipython (>=5.0.0) 1: fact: ipykernel (5.5.6) depends on traitlets (>=4.1.0) 1: fact: ipykernel (5.5.6) depends on jupyter-client () 1: fact: ipykernel (5.5.6) depends on tornado (>=4.2) 1: fact: ipykernel (5.5.6) depends on appnope () 1: selecting ipykernel (5.5.6) 1: derived: appnope 1: derived: tornado (>=4.2) 1: derived: jupyter-client 1: derived: traitlets (>=4.1.0) 1: derived: ipython (>=5.0.0) 1: derived: ipython-genutils PyPI: 11 packages found for appnope * PyPI: 22 packages found for tornado >=4.2 PyPI: No release information found for jupyter-client-4.0.0.dev, skipping PyPI: 51 packages found for jupyter-client * PyPI: No release information found for traitlets-4.0.0.dev, skipping PyPI: 16 packages found for traitlets >=4.1.0 PyPI: No release information found for ipython-0.6.10, skipping PyPI: No release information found for ipython-0.6.11, skipping PyPI: No release information found for ipython-0.6.12, skipping PyPI: No release information found for ipython-0.6.13, skipping PyPI: No release information found for ipython-0.6.14, skipping PyPI: No release information found for ipython-0.6.15, skipping PyPI: No release information found for ipython-0.6.4, skipping PyPI: No release information found for ipython-0.6.5, skipping PyPI: No release information found for ipython-0.6.6, skipping PyPI: No release information found for ipython-0.6.7, skipping PyPI: No release information found for ipython-0.6.8, skipping PyPI: No release information found for ipython-0.6.9, skipping PyPI: No release information found for ipython-0.7.0, skipping PyPI: No release 
information found for ipython-0.7.1, skipping PyPI: No release information found for ipython-0.7.1.fix1, skipping PyPI: No release information found for ipython-0.7.2, skipping PyPI: No release information found for ipython-0.7.4.svn.r2010, skipping PyPI: No release information found for ipython-0.8.0, skipping PyPI: No release information found for ipython-0.8.1, skipping PyPI: No release information found for ipython-0.8.2, skipping PyPI: No release information found for ipython-0.8.3, skipping PyPI: No release information found for ipython-0.8.4, skipping PyPI: No release information found for ipython-0.9, skipping PyPI: No release information found for ipython-0.9.1, skipping PyPI: No release information found for ipython-4.0.0-b1, skipping PyPI: 74 packages found for ipython >=5.0.0 PyPI: 2 packages found for ipython-genutils * 1: selecting ipython-genutils (0.2.0) 1: fact: flake8 (3.9.2) depends on pyflakes (>=2.3.0,<2.4.0) 1: fact: flake8 (3.9.2) depends on pycodestyle (>=2.7.0,<2.8.0) 1: fact: flake8 (3.9.2) depends on mccabe (>=0.6.0,<0.7.0) 1: selecting flake8 (3.9.2) 1: derived: mccabe (>=0.6.0,<0.7.0) 1: derived: pycodestyle (>=2.7.0,<2.8.0) 1: derived: pyflakes (>=2.3.0,<2.4.0) PyPI: No release information found for mccabe-0.0.0, skipping PyPI: 2 packages found for mccabe >=0.6.0,<0.7.0 PyPI: No release information found for pycodestyle-0.0.0, skipping PyPI: 1 packages found for pycodestyle >=2.7.0,<2.8.0 1: selecting pycodestyle (2.7.0) 1: selecting pyflakes (2.3.1) 1: selecting mccabe (0.6.1) 1: fact: autopep8 (1.6.0) depends on pycodestyle (>=2.8.0) 1: fact: autopep8 (1.6.0) depends on toml () 1: derived: not autopep8 (==1.6.0) 1: fact: autopep8 (1.5.7) depends on pycodestyle (>=2.7.0) 1: fact: autopep8 (1.5.7) depends on toml () 1: selecting autopep8 (1.5.7) 1: derived: toml PyPI: 16 packages found for toml * 1: selecting tomli (2.0.1) 1: selecting typing-extensions (4.2.0) 1: selecting traitlets (5.1.1) 1: selecting toml (0.10.2) 1: selecting tornado (6.1) 1: fact: jupyter-client (7.3.0) depends on entrypoints () 1: fact: jupyter-client (7.3.0) depends on jupyter-core (>=4.9.2) 1: fact: jupyter-client (7.3.0) depends on nest-asyncio (>=1.5.4) 1: fact: jupyter-client (7.3.0) depends on python-dateutil (>=2.8.2) 1: fact: jupyter-client (7.3.0) depends on pyzmq (>=22.3) 1: fact: jupyter-client (7.3.0) depends on tornado (>=6.0) 1: fact: jupyter-client (7.3.0) depends on traitlets () 1: selecting jupyter-client (7.3.0) 1: derived: pyzmq (>=22.3) 1: derived: python-dateutil (>=2.8.2) 1: derived: nest-asyncio (>=1.5.4) 1: derived: jupyter-core (>=4.9.2) 1: derived: entrypoints PyPI: 1 packages found for pyzmq >=22.3 PyPI: No release information found for python-dateutil-0.1, skipping PyPI: No release information found for python-dateutil-0.3, skipping PyPI: No release information found for python-dateutil-0.4, skipping PyPI: No release information found for python-dateutil-0.5, skipping PyPI: No release information found for python-dateutil-1.0, skipping PyPI: No release information found for python-dateutil-1.1, skipping PyPI: No release information found for python-dateutil-1.2, skipping PyPI: No release information found for python-dateutil-2.0, skipping PyPI: 1 packages found for python-dateutil >=2.8.2 PyPI: 2 packages found for nest-asyncio >=1.5.4 PyPI: No release information found for jupyter-core-4.0.0.dev, skipping PyPI: 2 packages found for jupyter-core >=4.9.2 PyPI: 7 packages found for entrypoints * 1: fact: pyzmq (22.3.0) depends on py () 1: fact: pyzmq (22.3.0) 
depends on cffi () 1: selecting pyzmq (22.3.0) 1: derived: cffi 1: derived: py PyPI: 67 packages found for cffi * PyPI: No release information found for py-0.8.0-alpha2, skipping PyPI: No release information found for py-0.9.0, skipping PyPI: No release information found for py-1.4.32.dev1, skipping PyPI: 61 packages found for py * 1: fact: python-dateutil (2.8.2) depends on six (>=1.5) 1: selecting python-dateutil (2.8.2) 1: derived: six (>=1.5) PyPI: 18 packages found for six >=1.5 1: selecting nest-asyncio (1.5.5) 1: fact: jupyter-core (4.10.0) depends on traitlets () 1: fact: jupyter-core (4.10.0) depends on pywin32 (>=1.0) 1: selecting jupyter-core (4.10.0) 1: derived: pywin32 (>=1.0) PyPI: No release information found for pywin32-210, skipping PyPI: No release information found for pywin32-214, skipping PyPI: 11 packages found for pywin32 >=1.0 1: selecting entrypoints (0.4) 1: selecting six (1.16.0) 1: fact: ipython (8.2.0) depends on backcall () 1: fact: ipython (8.2.0) depends on decorator () 1: fact: ipython (8.2.0) depends on jedi (>=0.16) 1: fact: ipython (8.2.0) depends on matplotlib-inline () 1: fact: ipython (8.2.0) depends on pickleshare () 1: fact: ipython (8.2.0) depends on prompt-toolkit (>=2.0.0,❤️.0.0 || >3.0.0,❤️.0.1 || >3.0.1,❤️.1.0) 1: fact: ipython (8.2.0) depends on pygments (>=2.4.0) 1: fact: ipython (8.2.0) depends on setuptools (>=18.5) 1: fact: ipython (8.2.0) depends on stack-data () 1: fact: ipython (8.2.0) depends on traitlets (>=5) 1: fact: ipython (8.2.0) depends on pexpect (>4.3) 1: fact: ipython (8.2.0) depends on appnope () 1: fact: ipython (8.2.0) depends on colorama () 1: selecting ipython (8.2.0) 1: derived: colorama 1: derived: pexpect (>4.3) 1: derived: stack-data 1: derived: setuptools (>=18.5) 1: derived: pygments (>=2.4.0) 1: derived: prompt-toolkit (>=2.0.0,!=3.0.0,!=3.0.1,❤️.1.0) 1: derived: pickleshare 1: derived: matplotlib-inline 1: derived: jedi (>=0.16) 1: derived: decorator 1: derived: backcall PyPI: 42 packages found for colorama * PyPI: No release information found for pexpect-0.97, skipping PyPI: No release information found for pexpect-2.01, skipping PyPI: 6 packages found for pexpect >4.3 PyPI: 13 packages found for stack-data * PyPI: No release information found for setuptools-13.0, skipping PyPI: 332 packages found for setuptools >=18.5 PyPI: 20 packages found for pygments >=2.4.0 PyPI: 38 packages found for prompt-toolkit >=2.0.0,❤️.0.0 || >3.0.0,❤️.0.1 || >3.0.1,❤️.1.0 PyPI: 12 packages found for pickleshare * PyPI: 4 packages found for matplotlib-inline * PyPI: Unable to parse version “0.8.0-final0” for the jedi package, skipping PyPI: Unable to parse version “0.8.1-final0” for the jedi package, skipping PyPI: 6 packages found for jedi >=0.16 PyPI: No release information found for decorator-3.4.1, skipping PyPI: 34 packages found for decorator * PyPI: 2 packages found for backcall * 1: selecting backcall (0.2.0) 1: fact: matplotlib-inline (0.1.3) depends on traitlets () 1: selecting matplotlib-inline (0.1.3) PyPI: Getting info for jedi (0.18.1) from PyPI 1: fact: jedi (0.18.1) depends on parso (>=0.8.0,<0.9.0) 1: selecting jedi (0.18.1) 1: derived: parso (>=0.8.0,<0.9.0) PyPI: 4 packages found for parso >=0.8.0,<0.9.0 PyPI: Getting info for parso (0.8.3) from PyPI 1: selecting parso (0.8.3) PyPI: Getting info for pickleshare (0.7.5) from PyPI 1: selecting pickleshare (0.7.5) PyPI: Getting info for stack-data (0.2.0) from PyPI 1: fact: stack-data (0.2.0) depends on executing () 1: fact: stack-data (0.2.0) depends on asttokens 
() 1: fact: stack-data (0.2.0) depends on pure-eval () 1: selecting stack-data (0.2.0) 1: derived: pure-eval 1: derived: asttokens 1: derived: executing PyPI: 8 packages found for pure-eval * PyPI: 24 packages found for asttokens * PyPI: 24 packages found for executing * PyPI: Getting info for pure-eval (0.2.2) from PyPI 1: selecting pure-eval (0.2.2) PyPI: Getting info for pygments (2.12.0) from PyPI PyPI: No dependencies found, downloading archives PyPI: Downloading wheel: Pygments-2.12.0-py3-none-any.whl 1: selecting pygments (2.12.0) PyPI: Getting info for asttokens (2.0.5) from PyPI 1: fact: asttokens (2.0.5) depends on six () 1: selecting asttokens (2.0.5) PyPI: Getting info for executing (0.8.3) from PyPI PyPI: No dependencies found, downloading archives PyPI: Downloading wheel: executing-0.8.3-py2.py3-none-any.whl 1: selecting executing (0.8.3) PyPI: Getting info for decorator (5.1.1) from PyPI PyPI: No dependencies found, downloading archives PyPI: Downloading wheel: decorator-5.1.1-py3-none-any.whl 1: selecting decorator (5.1.1) PyPI: Getting info for prompt-toolkit (3.0.29) from PyPI 1: fact: prompt-toolkit (3.0.29) depends on wcwidth () 1: selecting prompt-toolkit (3.0.29) 1: derived: wcwidth PyPI: 17 packages found for wcwidth * PyPI: Getting info for wcwidth (0.2.5) from PyPI 1: selecting wcwidth (0.2.5) PyPI: Getting info for setuptools (62.1.0) from PyPI 1: selecting setuptools (62.1.0) PyPI: Getting info for pexpect (4.8.0) from PyPI 1: fact: pexpect (4.8.0) depends on ptyprocess (>=0.5) 1: selecting pexpect (4.8.0) 1: derived: ptyprocess (>=0.5) PyPI: 5 packages found for ptyprocess >=0.5 PyPI: Getting info for ptyprocess (0.7.0) from PyPI PyPI: No dependencies found, downloading archives PyPI: Downloading wheel: ptyprocess-0.7.0-py2.py3-none-any.whl 1: selecting ptyprocess (0.7.0) PyPI: Getting info for appnope (0.1.3) from PyPI PyPI: No dependencies found, downloading archives PyPI: Downloading wheel: appnope-0.1.3-py2.py3-none-any.whl 1: selecting appnope (0.1.3) PyPI: Getting info for pywin32 (303) from PyPI PyPI: No dependencies found, downloading archives PyPI: Downloading wheel: pywin32-303-cp310-cp310-win32.whl 1: selecting pywin32 (303) PyPI: Getting info for colorama (0.4.4) from PyPI PyPI: No dependencies found, downloading archives PyPI: Downloading wheel: colorama-0.4.4-py2.py3-none-any.whl 1: selecting colorama (0.4.4) PyPI: Getting info for py (1.11.0) from PyPI PyPI: No dependencies found, downloading archives PyPI: Downloading wheel: py-1.11.0-py2.py3-none-any.whl 1: selecting py (1.11.0) PyPI: Getting info for cffi (1.15.0) from PyPI 1: fact: cffi (1.15.0) depends on pycparser () 1: selecting cffi (1.15.0) 1: derived: pycparser PyPI: 21 packages found for pycparser * PyPI: Getting info for pycparser (2.21) from PyPI PyPI: No dependencies found, downloading archives PyPI: Downloading wheel: pycparser-2.21-py2.py3-none-any.whl ^C 1: Version solving took 23.435 seconds. 1: Tried 1 solutions.

    py-spy> Stopped sampling because Control-C pressed py-spy> Wrote flamegraph data to 'flamegraph5.svg'. Samples: 2412 Errors: 0 $ py-spy record --idle --threads --subprocesses --format flamegraph --output flamegraph6.svg poetry update -- -vvv py-spy> Sampling process 100 times a second. Press Control-C to exit.

    Loading configuration file /home/adam/.config/pypoetry/config.toml Using virtualenv: /home/adam/.cache/pypoetry/virtualenvs/speed-oYrO2tXU-py3.8 Updating dependencies Resolving dependencies… 1: fact: speed is 0.1.0 1: derived: speed 1: fact: speed depends on autoflake (^1.4) 1: fact: speed depends on autopep8 (^1.5.5) 1: fact: speed depends on flake8 (^3.8.4) 1: fact: speed depends on ipykernel (^5.5.5) 1: fact: speed depends on mypy (^0.930) 1: selecting speed (0.1.0) 1: derived: mypy (>=0.930,<0.931) 1: derived: ipykernel (>=5.5.5,<6.0.0) 1: derived: flake8 (>=3.8.4,<4.0.0) 1: derived: autopep8 (>=1.5.5,<2.0.0) 1: derived: autoflake (>=1.4,<2.0) PyPI: No release information found for mypy-0.500, skipping PyPI: 1 packages found for mypy >=0.930,<0.931 PyPI: No release information found for ipykernel-4.0.0, skipping PyPI: No release information found for ipykernel-4.0.0.dev, skipping PyPI: 2 packages found for ipykernel >=5.5.5,<6.0.0 PyPI: 4 packages found for flake8 >=3.8.4,<4.0.0 PyPI: 4 packages found for autopep8 >=1.5.5,<2.0.0 PyPI: 1 packages found for autoflake >=1.4,<2.0 1: fact: mypy (0.930) depends on typing-extensions (>=3.10) 1: fact: mypy (0.930) depends on mypy-extensions (>=0.4.3) 1: fact: mypy (0.930) depends on tomli (>=1.1.0) 1: selecting mypy (0.930) 1: derived: tomli (>=1.1.0) 1: derived: mypy-extensions (>=0.4.3) 1: derived: typing-extensions (>=3.10) PyPI: 7 packages found for tomli >=1.1.0 PyPI: 1 packages found for mypy-extensions >=0.4.3 PyPI: 8 packages found for typing-extensions >=3.10 1: fact: autoflake (1.4) depends on pyflakes (>=1.1.0) 1: selecting autoflake (1.4) 1: derived: pyflakes (>=1.1.0) PyPI: No release information found for pyflakes-0.2.0, skipping PyPI: No release information found for pyflakes-0.2.1, skipping PyPI: 16 packages found for pyflakes >=1.1.0 1: selecting mypy-extensions (0.4.3) 1: fact: ipykernel (5.5.6) depends on ipython-genutils () 1: fact: ipykernel (5.5.6) depends on ipython (>=5.0.0) 1: fact: ipykernel (5.5.6) depends on traitlets (>=4.1.0) 1: fact: ipykernel (5.5.6) depends on jupyter-client () 1: fact: ipykernel (5.5.6) depends on tornado (>=4.2) 1: fact: ipykernel (5.5.6) depends on appnope () 1: selecting ipykernel (5.5.6) 1: derived: appnope 1: derived: tornado (>=4.2) 1: derived: jupyter-client 1: derived: traitlets (>=4.1.0) 1: derived: ipython (>=5.0.0) 1: derived: ipython-genutils PyPI: 11 packages found for appnope * PyPI: 22 packages found for tornado >=4.2 PyPI: No release information found for jupyter-client-4.0.0.dev, skipping PyPI: 51 packages found for jupyter-client * PyPI: No release information found for traitlets-4.0.0.dev, skipping PyPI: 16 packages found for traitlets >=4.1.0 PyPI: No release information found for ipython-0.6.10, skipping PyPI: No release information found for ipython-0.6.11, skipping PyPI: No release information found for ipython-0.6.12, skipping PyPI: No release information found for ipython-0.6.13, skipping PyPI: No release information found for ipython-0.6.14, skipping PyPI: No release information found for ipython-0.6.15, skipping PyPI: No release information found for ipython-0.6.4, skipping PyPI: No release information found for ipython-0.6.5, skipping PyPI: No release information found for ipython-0.6.6, skipping PyPI: No release information found for ipython-0.6.7, skipping PyPI: No release information found for ipython-0.6.8, skipping PyPI: No release information found for ipython-0.6.9, skipping PyPI: No release information found for ipython-0.7.0, skipping PyPI: No release 
information found for ipython-0.7.1, skipping PyPI: No release information found for ipython-0.7.1.fix1, skipping PyPI: No release information found for ipython-0.7.2, skipping PyPI: No release information found for ipython-0.7.4.svn.r2010, skipping PyPI: No release information found for ipython-0.8.0, skipping PyPI: No release information found for ipython-0.8.1, skipping PyPI: No release information found for ipython-0.8.2, skipping PyPI: No release information found for ipython-0.8.3, skipping PyPI: No release information found for ipython-0.8.4, skipping PyPI: No release information found for ipython-0.9, skipping PyPI: No release information found for ipython-0.9.1, skipping PyPI: No release information found for ipython-4.0.0-b1, skipping PyPI: 74 packages found for ipython >=5.0.0 PyPI: 2 packages found for ipython-genutils * 1: selecting ipython-genutils (0.2.0) 1: fact: flake8 (3.9.2) depends on pyflakes (>=2.3.0,<2.4.0) 1: fact: flake8 (3.9.2) depends on pycodestyle (>=2.7.0,<2.8.0) 1: fact: flake8 (3.9.2) depends on mccabe (>=0.6.0,<0.7.0) 1: selecting flake8 (3.9.2) 1: derived: mccabe (>=0.6.0,<0.7.0) 1: derived: pycodestyle (>=2.7.0,<2.8.0) 1: derived: pyflakes (>=2.3.0,<2.4.0) PyPI: No release information found for mccabe-0.0.0, skipping PyPI: 2 packages found for mccabe >=0.6.0,<0.7.0 PyPI: No release information found for pycodestyle-0.0.0, skipping PyPI: 1 packages found for pycodestyle >=2.7.0,<2.8.0 1: selecting pycodestyle (2.7.0) 1: selecting pyflakes (2.3.1) 1: selecting mccabe (0.6.1) 1: fact: autopep8 (1.6.0) depends on pycodestyle (>=2.8.0) 1: fact: autopep8 (1.6.0) depends on toml () 1: derived: not autopep8 (==1.6.0) 1: fact: autopep8 (1.5.7) depends on pycodestyle (>=2.7.0) 1: fact: autopep8 (1.5.7) depends on toml () 1: selecting autopep8 (1.5.7) 1: derived: toml PyPI: 16 packages found for toml * 1: selecting tomli (2.0.1) 1: selecting typing-extensions (4.2.0) 1: selecting traitlets (5.1.1) 1: selecting toml (0.10.2) 1: selecting tornado (6.1) 1: fact: jupyter-client (7.3.0) depends on entrypoints () 1: fact: jupyter-client (7.3.0) depends on jupyter-core (>=4.9.2) 1: fact: jupyter-client (7.3.0) depends on nest-asyncio (>=1.5.4) 1: fact: jupyter-client (7.3.0) depends on python-dateutil (>=2.8.2) 1: fact: jupyter-client (7.3.0) depends on pyzmq (>=22.3) 1: fact: jupyter-client (7.3.0) depends on tornado (>=6.0) 1: fact: jupyter-client (7.3.0) depends on traitlets () 1: selecting jupyter-client (7.3.0) 1: derived: pyzmq (>=22.3) 1: derived: python-dateutil (>=2.8.2) 1: derived: nest-asyncio (>=1.5.4) 1: derived: jupyter-core (>=4.9.2) 1: derived: entrypoints PyPI: 1 packages found for pyzmq >=22.3 PyPI: No release information found for python-dateutil-0.1, skipping PyPI: No release information found for python-dateutil-0.3, skipping PyPI: No release information found for python-dateutil-0.4, skipping PyPI: No release information found for python-dateutil-0.5, skipping PyPI: No release information found for python-dateutil-1.0, skipping PyPI: No release information found for python-dateutil-1.1, skipping PyPI: No release information found for python-dateutil-1.2, skipping PyPI: No release information found for python-dateutil-2.0, skipping PyPI: 1 packages found for python-dateutil >=2.8.2 PyPI: 2 packages found for nest-asyncio >=1.5.4 PyPI: No release information found for jupyter-core-4.0.0.dev, skipping PyPI: 2 packages found for jupyter-core >=4.9.2 PyPI: 7 packages found for entrypoints * 1: fact: pyzmq (22.3.0) depends on py () 1: fact: pyzmq (22.3.0) 
depends on cffi () 1: selecting pyzmq (22.3.0) 1: derived: cffi 1: derived: py PyPI: 67 packages found for cffi * PyPI: No release information found for py-0.8.0-alpha2, skipping PyPI: No release information found for py-0.9.0, skipping PyPI: No release information found for py-1.4.32.dev1, skipping PyPI: 61 packages found for py * 1: fact: python-dateutil (2.8.2) depends on six (>=1.5) 1: selecting python-dateutil (2.8.2) 1: derived: six (>=1.5) PyPI: 18 packages found for six >=1.5 1: selecting nest-asyncio (1.5.5) 1: fact: jupyter-core (4.10.0) depends on traitlets () 1: fact: jupyter-core (4.10.0) depends on pywin32 (>=1.0) 1: selecting jupyter-core (4.10.0) 1: derived: pywin32 (>=1.0) PyPI: No release information found for pywin32-210, skipping PyPI: No release information found for pywin32-214, skipping PyPI: 11 packages found for pywin32 >=1.0 1: selecting entrypoints (0.4) 1: selecting six (1.16.0) 1: fact: ipython (8.2.0) depends on backcall () 1: fact: ipython (8.2.0) depends on decorator () 1: fact: ipython (8.2.0) depends on jedi (>=0.16) 1: fact: ipython (8.2.0) depends on matplotlib-inline () 1: fact: ipython (8.2.0) depends on pickleshare () 1: fact: ipython (8.2.0) depends on prompt-toolkit (>=2.0.0,❤️.0.0 || >3.0.0,❤️.0.1 || >3.0.1,❤️.1.0) 1: fact: ipython (8.2.0) depends on pygments (>=2.4.0) 1: fact: ipython (8.2.0) depends on setuptools (>=18.5) 1: fact: ipython (8.2.0) depends on stack-data () 1: fact: ipython (8.2.0) depends on traitlets (>=5) 1: fact: ipython (8.2.0) depends on pexpect (>4.3) 1: fact: ipython (8.2.0) depends on appnope () 1: fact: ipython (8.2.0) depends on colorama () 1: selecting ipython (8.2.0) 1: derived: colorama 1: derived: pexpect (>4.3) 1: derived: stack-data 1: derived: setuptools (>=18.5) 1: derived: pygments (>=2.4.0) 1: derived: prompt-toolkit (>=2.0.0,!=3.0.0,!=3.0.1,❤️.1.0) 1: derived: pickleshare 1: derived: matplotlib-inline 1: derived: jedi (>=0.16) 1: derived: decorator 1: derived: backcall PyPI: 42 packages found for colorama * PyPI: No release information found for pexpect-0.97, skipping PyPI: No release information found for pexpect-2.01, skipping PyPI: 6 packages found for pexpect >4.3 PyPI: 13 packages found for stack-data * PyPI: No release information found for setuptools-13.0, skipping PyPI: 332 packages found for setuptools >=18.5 PyPI: 20 packages found for pygments >=2.4.0 PyPI: 38 packages found for prompt-toolkit >=2.0.0,❤️.0.0 || >3.0.0,❤️.0.1 || >3.0.1,❤️.1.0 PyPI: 12 packages found for pickleshare * PyPI: 4 packages found for matplotlib-inline * PyPI: Unable to parse version “0.8.0-final0” for the jedi package, skipping PyPI: Unable to parse version “0.8.1-final0” for the jedi package, skipping PyPI: 6 packages found for jedi >=0.16 PyPI: No release information found for decorator-3.4.1, skipping PyPI: 34 packages found for decorator * PyPI: 2 packages found for backcall * 1: selecting backcall (0.2.0) 1: fact: matplotlib-inline (0.1.3) depends on traitlets () 1: selecting matplotlib-inline (0.1.3) 1: fact: jedi (0.18.1) depends on parso (>=0.8.0,<0.9.0) 1: selecting jedi (0.18.1) 1: derived: parso (>=0.8.0,<0.9.0) PyPI: 4 packages found for parso >=0.8.0,<0.9.0 1: selecting parso (0.8.3) 1: selecting pickleshare (0.7.5) 1: fact: stack-data (0.2.0) depends on executing () 1: fact: stack-data (0.2.0) depends on asttokens () 1: fact: stack-data (0.2.0) depends on pure-eval () 1: selecting stack-data (0.2.0) 1: derived: pure-eval 1: derived: asttokens 1: derived: executing PyPI: 8 packages found for pure-eval * PyPI: 24 
packages found for asttokens * PyPI: 24 packages found for executing * 1: selecting pure-eval (0.2.2) 1: selecting pygments (2.12.0) 1: fact: asttokens (2.0.5) depends on six () 1: selecting asttokens (2.0.5) 1: selecting executing (0.8.3) 1: selecting decorator (5.1.1) 1: fact: prompt-toolkit (3.0.29) depends on wcwidth () 1: selecting prompt-toolkit (3.0.29) 1: derived: wcwidth PyPI: 17 packages found for wcwidth * 1: selecting wcwidth (0.2.5) 1: selecting setuptools (62.1.0) 1: fact: pexpect (4.8.0) depends on ptyprocess (>=0.5) 1: selecting pexpect (4.8.0) 1: derived: ptyprocess (>=0.5) PyPI: 5 packages found for ptyprocess >=0.5 1: selecting ptyprocess (0.7.0) 1: selecting appnope (0.1.3) 1: selecting pywin32 (303) 1: selecting colorama (0.4.4) 1: selecting py (1.11.0) 1: fact: cffi (1.15.0) depends on pycparser () 1: selecting cffi (1.15.0) 1: derived: pycparser PyPI: 21 packages found for pycparser * PyPI: Getting info for pycparser (2.21) from PyPI PyPI: No dependencies found, downloading archives PyPI: Downloading wheel: pycparser-2.21-py2.py3-none-any.whl 1: selecting pycparser (2.21) 1: Version solving took 0.771 seconds. 1: Tried 1 solutions.

    Writing lock file

    Finding the necessary packages for the current system

    Package operations: 38 installs, 0 updates, 0 removals

    • Installing six (1.16.0): Pending… • Installing six (1.16.0): Installing… • Installing six (1.16.0) • Installing asttokens (2.0.5): Pending… • Installing asttokens (2.0.5): Installing… • Installing executing (0.8.3): Pending… • Installing executing (0.8.3): Installing… • Installing parso (0.8.3): Pending… • Installing asttokens (2.0.5) • Installing executing (0.8.3): Pending… • Installing executing (0.8.3): Installing… • Installing executing (0.8.3) • Installing parso (0.8.3): Pending… • Installing parso (0.8.3): Installing… • Installing parso (0.8.3) • Installing ptyprocess (0.7.0): Pending… • Installing ptyprocess (0.7.0): Installing… • Installing ptyprocess (0.7.0) • Installing pure-eval (0.2.2): Pending… • Installing pure-eval (0.2.2): Installing… • Installing pure-eval (0.2.2) • Installing traitlets (5.1.1): Pending… • Installing traitlets (5.1.1): Installing… • Installing traitlets (5.1.1) • Installing wcwidth (0.2.5): Pending… • Installing wcwidth (0.2.5): Installing… • Installing wcwidth (0.2.5) • Installing backcall (0.2.0): Pending… • Installing backcall (0.2.0): Installing… • Installing decorator (5.1.1): Pending… • Installing decorator (5.1.1): Installing… • Installing nest-asyncio (1.5.5): Pending… • Installing nest-asyncio (1.5.5): Installing… • Installing backcall (0.2.0) • Installing decorator (5.1.1): Pending… • Installing decorator (5.1.1): Installing… • Installing decorator (5.1.1) • Installing nest-asyncio (1.5.5): Pending… • Installing nest-asyncio (1.5.5): Installing… • Installing jedi (0.18.1): Pending… • Installing nest-asyncio (1.5.5) • Installing jedi (0.18.1): Pending… • Installing jedi (0.18.1): Installing… • Installing matplotlib-inline (0.1.3): Pending… • Installing matplotlib-inline (0.1.3): Installing… • Installing matplotlib-inline (0.1.3) • Installing jupyter-core (4.10.0): Pending… • Installing jedi (0.18.1) • Installing matplotlib-inline (0.1.3): Pending… • Installing matplotlib-inline (0.1.3): Installing… • Installing matplotlib-inline (0.1.3) • Installing jupyter-core (4.10.0): Pending… • Installing jupyter-core (4.10.0): Installing… • Installing jupyter-core (4.10.0) • Installing pexpect (4.8.0): Pending… • Installing pexpect (4.8.0): Installing… • Installing pexpect (4.8.0) • Installing entrypoints (0.4): Pending… • Installing entrypoints (0.4): Installing… • Installing entrypoints (0.4) • Installing pickleshare (0.7.5): Pending… • Installing pickleshare (0.7.5): Installing… • Installing prompt-toolkit (3.0.29): Pending… • Installing prompt-toolkit (3.0.29): Installing… • Installing pygments (2.12.0): Pending… • Installing pygments (2.12.0): Installing… • Installing pickleshare (0.7.5) • Installing prompt-toolkit (3.0.29): Pending… • Installing prompt-toolkit (3.0.29): Installing… • Installing pygments (2.12.0): Pending… • Installing prompt-toolkit (3.0.29) • Installing pygments (2.12.0): Pending… • Installing pygments (2.12.0): Installing… • Installing python-dateutil (2.8.2): Pending… • Installing pygments (2.12.0) • Installing python-dateutil (2.8.2): Pending… • Installing python-dateutil (2.8.2): Installing… • Installing python-dateutil (2.8.2) • Installing pyzmq (22.3.0): Pending… • Installing pyzmq (22.3.0): Installing… • Installing pyzmq (22.3.0) • Installing setuptools (62.1.0): Pending… • Installing setuptools (62.1.0): Installing… • Installing setuptools (62.1.0) • Installing stack-data (0.2.0): Pending… • Installing stack-data (0.2.0): Installing… • Installing stack-data (0.2.0) • Installing tornado (6.1): Pending… • Installing 
tornado (6.1): Installing… py-spy> 2.77s behind in sampling, results may be inaccurate. Try reducing the sampling rate • Installing tornado (6.1) • Installing ipython (8.2.0): Pending… • Installing ipython (8.2.0): Installing… • Installing ipython-genutils (0.2.0): Pending… • Installing ipython-genutils (0.2.0): Installing… • Installing jupyter-client (7.3.0): Pending… • Installing jupyter-client (7.3.0): Installing… • Installing mccabe (0.6.1): Pending… • Installing mccabe (0.6.1): Installing… • Installing pycodestyle (2.7.0): Pending… • Installing pycodestyle (2.7.0): Installing… • Installing ipython (8.2.0) • Installing ipython-genutils (0.2.0): Pending… • Installing ipython-genutils (0.2.0): Installing… • Installing ipython-genutils (0.2.0) • Installing jupyter-client (7.3.0): Pending… • Installing jupyter-client (7.3.0): Installing… • Installing jupyter-client (7.3.0) • Installing mccabe (0.6.1): Pending… • Installing mccabe (0.6.1): Installing… • Installing mccabe (0.6.1) • Installing pycodestyle (2.7.0): Pending… • Installing pycodestyle (2.7.0): Installing… • Installing pycodestyle (2.7.0) • Installing mypy-extensions (0.4.3): Pending… • Installing mypy-extensions (0.4.3): Installing… • Installing mypy-extensions (0.4.3) • Installing pyflakes (2.3.1): Pending… • Installing pyflakes (2.3.1): Installing… • Installing pyflakes (2.3.1) • Installing toml (0.10.2): Pending… • Installing toml (0.10.2): Installing… • Installing toml (0.10.2) • Installing tomli (2.0.1): Pending… • Installing tomli (2.0.1): Installing… py-spy> 1.39s behind in sampling, results may be inaccurate. Try reducing the sampling rate • Installing tomli (2.0.1) • Installing autoflake (1.4): Pending… • Installing autoflake (1.4): Installing… • Installing autoflake (1.4) • Installing autopep8 (1.5.7): Pending… • Installing autopep8 (1.5.7): Installing… • Installing autopep8 (1.5.7) • Installing flake8 (3.9.2): Pending… • Installing flake8 (3.9.2): Installing… • Installing flake8 (3.9.2) • Installing ipykernel (5.5.6): Pending… • Installing ipykernel (5.5.6): Installing… • Installing ipykernel (5.5.6) • Installing mypy (0.930): Pending… • Installing mypy (0.930): Installing… • Installing mypy (0.930)

    py-spy> Stopped sampling because process exited py-spy> Wrote flamegraph data to ‘flamegraph6.svg’. Samples: 60451 Errors: 0

    

[Attached flamegraphs: flamegraph, flamegraph2, flamegraph3, flamegraph4, flamegraph5, flamegraph6]

For all of us using private repositories, I found issue #4035 and made a PR for it in #4353. Even with @ls-jad-elkik's suggestion, dependency resolution is still slow, since poetry will still check both on PyPI and in any secondary = true repositories for each package.

Pip doesn’t have dependency resolution.

Yes. But why make poetry if it's not to replace PyPI and requirements.txt?

You seem to confuse multiple parts of the ecosystem. I would distinguish those entities:

  • The software which people want to share
  • Software Repository: The platform on which people want to share it (e.g. PyPI)
  • Package Format: The format in which they want to share it (e.g. wheels)
  • Package Builder: The software people want to use to build the package (setuptools / poetry)
  • Package Uploader: The software people want to use to upload it (twine / poetry)
  • Package Manager: The software people want to use to install the package (pip / poetry) and its dependencies.
  • Environment Manager: The software people use to encapsulate project environments (pipenv / poetry)

Under the hood, I think, poetry uses a couple of those base tools. It is just meant to show a more consistent interface to the user.

I found that poetry update was being very slow especially when I defined a secondary source.

I only wanted to use the secondary source for my company’s proprietary packages.

I think (not sure) that poetry is calling out to the secondary source when it's not necessary.

My setup was, e.g.:

[tool.poetry.dependencies]
python = "^3.9"
pint = "^0.17"
# many more public packages...
acme-private-decimals = "^0.0.6"

[[tool.poetry.source]]
name = "acme-private"
url = "http://www.acme-private.com/pypi/"
secondary = true

https://github.com/python-poetry/poetry/blob/a3aafa840de950f81e99553e52190a54c94d6bce/docs/cli.md

(but not documented here: https://python-poetry.org/docs/dependency-specification/ )

Explicitly defining the source per package seemed to help a lot, although it's unfortunately verbose.

[tool.poetry.dependencies]
python = "^3.9"
pint = { version = "^0.17", source = "PyPI" }
acme-private-decimals = { version = "^0.0.6", source="acme-private" }

[[tool.poetry.source]]
name = "acme-private"
url = "http://www.acme-private.com/pypi/"
secondary = true

Hopefully, this will be a help to someone. A colleague of mine last week started having issues with one of our projects where poetry would never finish resolving deps. We tried uninstalling poetry and then installing via brew. We then tried removing the virtual environment instance associated with the project but no luck.

We eventually wondered if it might be a local caching issue. We ended up running the following command to blow out the pypi cache.

poetry cache clear --all pypi

Once it was cleared he was able to perform poetry update and poetry install again without issue. I can’t guarantee it will help everyone but hopefully, it’ll be useful to some on this thread.

We’ve been having issues with slow resolution, and I’ve looked into it a bit deeper. Seems to be (partially) caused by the private registry being a primary source, and (partially) by a bad cache.

Poetry version:

Poetry version 1.1.11

I did a cProfile of poetry update -vvv and saw where the bottleneck was.

The pyproject.toml I’m testing with has quite a few packages, some of them public, others from the private repo:

[[tool.poetry.source]]
name = "our-private-repo"
url = "https://our-private-repo.jfrog.io/artifactory/api/pypi/pypi-local-our-private-repo/simple"
secondary = true

By making it secondary = true, the time went from 70.x seconds down to 33.x seconds. And that's because:

  • when it was secondary = false: LegacyRepository._get_release_info was called 50 times (30k ms total)
  • when it was secondary = true: PyPiRepository._get_release_info was only called once (422 ms total)

At first glance it looks like the caching config is very different between LegacyRepository and PyPiRepository.

The size of the pypi cache was in the 50MB range, while our private repo had a cache in the range of a few hundred KBs.

The other 30s are spent in pool.find_packages (100 times) -> LegacyRepository.find_packages (50 times) -> _get (50 times) called regardless of secondary.

Here’s the gist with the profiling script I was using. Can’t share the pyproject.toml because it would need private credentials anyway.
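For reference, here is a rough sketch of that kind of profiling script (not the actual gist above): it runs poetry update -vvv under cProfile and prints where the cumulative time goes (e.g. LegacyRepository._get_release_info). It assumes python -m poetry works with the Poetry installed into the current interpreter.

import cProfile
import pstats
import runpy
import sys

# Pretend we invoked `poetry update -vvv` on the command line.
sys.argv = ["poetry", "update", "-vvv"]

profiler = cProfile.Profile()
try:
    profiler.enable()
    runpy.run_module("poetry", run_name="__main__")
except SystemExit:
    pass  # poetry's CLI exits via sys.exit(); swallow it so we can dump stats
finally:
    profiler.disable()

profiler.dump_stats("poetry_update.prof")
# Sort by cumulative time to see which repository calls dominate.
pstats.Stats("poetry_update.prof").sort_stats("cumulative").print_stats(30)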

If the issue is that certain python libraries do not specify the requirements, could poetry have some command/option to shed some light onto which libraries are creating the problem? This would help developers either:

  • Omitting the install of those libraries/replacing them with others.
  • Requesting a fix on the library’s repository.

Poetry took forever to install a blank project; the only things in pyproject.toml were the Python version and pytest. Then I upgraded pip from version 20.2.3 to 21.1.2. Now poetry install finishes in a flash. I can stop pulling out my remaining strands of hair…

(also: pypi ¯\_(ツ)_/¯)

@joannayoo0117 @thesofakillers Try poetry 1.2.0rc1 if you haven't already. Our lock times went from hours to minutes with poetry 1.2.0b3 and 1.2.0rc1. You can wait a bit until the final release of 1.2.0 is out, which should happen next week. Beware that, in general, you can't use poetry 1.1 with poetry.lock files created by poetry 1.2. We are also setting installer.max-workers to 1.

Hey, I had this issue. Solving dependencies ran for over 1000 seconds without finishing when trying to add a dependency. Ran poetry add <dep> -vvv and it showed that it was having issues resolving boto3; it was trying every minor version. I installed the same boto3 that <dep> expected, and it worked fine. Maybe this helps someone out there.

What I found was that a corrupted Poetry cache (probably due to Ctrl-C'ing something poetry was doing mid-operation) caused dependency resolution to hang forever.

To fix it I cleared the cache:

$ poetry cache clear --all pypi

Also, running poetry commands with -vvv is very useful to see if poetry is still actually working on resolving dependencies or is hung up.

I’m having these issues:

  • Poetry install gets stuck.
  • Poetry add gets stuck in “resolving dependencies…”.
  • poetry self update gets stuck (I know I’m using the latest but wanted to try just in case)

Things tried that didn’t change anything:

  • Removing the virtual env to make a fresh install.
  • I had a repositories.testpypi config set to https://test.pypi.org/legacy/, I removed it just in case.
  • Ran everything with -vvv; the only difference is that poetry install then shows a few packages that it is skipping, and then it hits the first one it wants to install and gets stuck.
  • Disconnected from wifi and switched to ethernet.

What did provide a different result:

  • Connected via wifi to my phone's hotspot.

After that the new results are:

  • poetry install: worked like a charm and super fast.
  • poetry add openpyxl: apparently stuck at resolving dependencies in the first try, tried again after some minutes and got resolved in 8 seconds and the package successfully installed.
  • poetry self update: now immediately answers “You are using the latest version”.

Great! I don’t understand the issue but I can finally continue developing, thanks to everyone suggesting workarounds in this thread.

After that I switched back to ethernet and tried:

  • poetry install: says no new dependencies to install, fair enough.
  • poetry update (I don't really need it but won't harm, and for the sake of debugging this issue): after 163 seconds trying to 'resolve dependencies' I cancelled.
  • Tried poetry update again: after 60 seconds I cancelled.
  • Switched back again to phone hotspot connection and did poetry update -vvv, worked like a charm and super fast.

I hope this helps in some way, thanks everyone for the efforts.

Clearing the cache is most likely related to clearing out partial/incomplete/corrupted by concurrent usage downloads that can cause an indefinite hang.

i can confirm https://github.com/python-poetry/poetry/issues/2094#issuecomment-1070340891

My “workaround” is frequent retrying:

  • wait about 10s (depends on size of dependency graph)
  • use -vv, see if it hangs “too long”
  • do a keyboard interrupt
  • try again
  • usually after about 5–10 repeats it works without the “delay”

Same problem here on poetry 2.13.0 and Ubuntu 20.04.3 LTS. Adding jupyter takes forever.

I wonder if maybe some of you are finding themselves in a situation similar to this:

The current project's Python requirement (>=3.6) is not compatible with some of the required packages Python requirement:
    - isort requires Python >=3.6,<4.0, so it will not be satisfied for Python >=4.0

  Because isort (5.7.0) requires Python >=3.6,<4.0
   and no versions of isort match >5.7.0,<6.0.0, isort is forbidden.

For projects with few dependencies this fails quickly, but I could also imagine that with lots of dependencies it could keep resolving for what seems like forever. Might be totally unrelated, might be a false lead. But maybe it's worth looking into this…

In your pyproject.toml, set the Python requirement to a fixed major-minor version and see if it helps the dependency resolution. For example:

[tool.poetry.dependencies]
python = "3.6"

instead of >=3.6 or ^3.6, etc.

[I am not a maintainer]

My recommendation to pin or restrict the version ranges of some dependencies, or even some indirect dependencies, is just a workaround to help the dependency resolution algorithm in cases where it is struggling to find a suitable combination of distributions to install. If/when poetry’s dependency resolution gets better those version pins and restrictions could probably be removed.

Otherwise, there are some things that maybe help (or maybe not, hard to tell, since there are so many cases presented here, and I’m not even sure they are all due to dependency resolution):

  • check that the python restriction in your pyproject.toml is compatible with your dependencies (I think I remember seeing quite some cases where it would lead to unsolvable dependencies, I could try to find those again to show you what I’m talking about)
  • clear poetry’s cache
  • build your own index server containing only the dependencies you need (so that the dependency resolution only considers a very limited amount of distributions)
  • check your network traffic (make sure that it is not a download speed issue)
  • try different Python versions (maybe some versions are easier to solve)
  • help reviewing/improving the code of the dependency resolution algorithm
  • get in touch with the maintainers and see how to support them (financially, etc.)

@cglacet

Usually in these situations what works best is lobbying for better (simpler/clearer/broader/…) standards to be promoted by the authority. In this case I guess it's PyPA? Or maybe both PSF and PyPA?

Yes. That would be PyPA. They know all about these kinds of issues. They are actively working on solving them. These things take time. There is no need to lobby. There is need to participate with writing good documentation and good code. And most important of all, donate to fund developers to work full time on it.

What about this news: New pip resolver to roll out this year?

This is a part of the work, yes. Once this rolls out, PyPA will be able to move on to solving other packaging issues. This work was partly done thanks to financial grants (money donations).

You can read more about related, ongoing work (these links are only a short, semi-random selection, but they are all somewhat intertwined):

I never worked on any project that actively creates Python packages, so I might be wrong, but from my point of view packaging is not something that is currently crystal clear. It seems like there are way too many ways of doing one thing, so people like me don't really know what they should do, because in the end we have no idea about the impact of our choices. Which, ultimately, leads to the problems you are talking about here.

Yes. From my point of view, the issue is that the overwhelming majority of advice found on the internet (articles, blogs, StackOverflow answers, etc.) is either outdated, misguided, or plain wrong.

A good reference is this website (from PyPA itself):

If you follow poetry’s workflows you are already in very good hands, and you should not worry about anything too much. Upload wheels! Well, you need to upload both sdists and wheels. The sdists are still very important, do not forget them.

For what it's worth, I find poetry's documentation to be a good way of preaching for better solutions. It's so clean it makes you want to make things cleaner.

Yes, it is also doing a very good job at getting rid of outdated, bad practices.

[Sadly somehow, there are always users pushing for poetry to adapt to their own broken workflows, instead of users changing their habits for the clean workflows of poetry. It is a constant battle.]

From the user perspective its already a very good improvement to have a standards such as PEP 518 – Specifying Minimum Build System Requirements for Python Projects, but that’s apparently not sufficient? Or maybe the problem you are debating here only arise for older projects ?

Yes, this was another great step forward. Python packaging ecosystem is improving a lot these days.

And yes, exactly, a great hurdle is keeping the compatibility with older projects. This slows down the work a lot. In particular older, broken setuptools / distutils setup.py based projects are very problematic. Although it is nowadays entirely possible to write clean, well-behaved, and standard-conform setuptools based projects.


[I am writing this off the top of my head, according to the bits of info I have gathered here and there along the way. I do not have insight into all the processes involved, so there might be some inaccuracies. Feel free to correct me. Feel free to ask me for clarifications.]

Hi all,

This issue has gotten quite long and meandering, with many disparate causes, solutions, fixed issues, perhaps still extant bugs, and many “me too” comments all discussed.

I’m going to close this issue as most of the root causes discussed within have either been solved in 1.2 or 1.3 (the changes in 1.3 are behavior changes not eligible for backport).

If you are having issues with Poetry taking a long time to resolve dependencies, please first open a Discussion or start on Discord, as many of them are related to configuration and large search space (basically, you’re creating exponential work for the solver and should tighten your constraints). Tooling to advise the user should be possible (if difficult to develop) in the long run, and anyone interested in tackling this hairy problem should reach out to the team via a Discussion or on Discord.

Past that, please make sure to test with the latest code (both on the 1.2 branch and master branch presently) when trying to reproduce resolver issues as we are making improvements all the time, and your issue may be fixed and pending release already.

Finally, good reproductions are needed for this category of issue. Many times they are related to transient network issues, pathologically bad cases due to decisions made around (low traffic, private) custom package indexes, or a corrupted cache/bad local config. Reproducing in a container with publicly available packages will mean that someone can dissect your issue and possibly fix it. If you can’t reproduce it with public packages, but you can with private packages, there are still options – everything from sharing details with the team in private, to creating ‘imitation’ repositories to reproduce an issue.

Please refrain from commenting on this issue more if it’s not a mitigation/common solution – “me too” is not very helpful and will send dozens of emails, and if you can reproduce bad performance consistently/in a clean environment it should be an issue. If you’re stuck and need help, ask for support using one of the methods mentioned above.

I noticed the same pyproject.toml environment was taking 5000+ seconds on local macOS vs 120 seconds on cloud-hosted Ubuntu. I’ve written up a detailed investigation here, here are two ways to alleviate your Poetry woes: […] 2. Use CI to resolve your Poetry environments: I wrote this Poetry lock & export Github Action to resolve Poetry environments in CI (usage instructions here). It takes a few minutes to push, click, wait for the PR, merge, pull on local. But it saves me an hour of waiting. […] * macOS seems to compile 15%-50% slower than Ubuntu. This is for a “pandas-only” environment, controlling for network speed & architecture using Github Actions to run these performance tests. See my results and code here.

Hi everyone,

inspired by the quoted comment, I wrote a “one-line” docker command to perform poetry lock from within a container. For me this makes a huge improvement, going from occasionally >1 hour, when I run poetry lock directly on my M1 Mac shell, down to consistently sub 90 seconds, when run inside the container on the same machine.

docker run \
-v $PWD/pyproject.toml:/workdir/_curr/pyproject.toml \
-v $PWD/poetry.lock:/workdir/_curr/poetry.lock \
-v pypoetry_cache:/root/.cache/pypoetry \
--workdir=/workdir/_curr \
--rm \
-t \
python:3.9.6 \
bash -c "pip install --quiet poetry && poetry lock --no-update"

…to be run from the same folder, where the pyproject.toml and poetry.lock are located.

This is a base command for public dependencies only. You can also e.g. resolve relative local dependencies by mounting them into /workdir or from private repositories, by passing the authorization config via environment vars.

I can only speculate why this works. Maybe by making poetry believe it is running on an ARM64 Linux platform instead of a Mac, it is actually able to pull and analyze pre-built wheels in the majority of cases, instead of installing and building from source, or something like that?

PS: A proper poetry install can only be done on the bare target machine, not from within a container AFAIK, but that's not the slow bit in my case.

This helped me: poetry cache clear --all .. Then re-run poetry update -vvv. I assume some faulty/interrupted download got in the way.

I think what might be useful here is some tooling to at least help you find out which packages are the ones causing the issues, so you can move those specific ones out. Is there any way to do an analysis? The output of poetry lock is very spartan. Right now I'm just adding dependencies one by one and hoping I hit upon the slow one, which looks similar to your approach @john-sandall.

@finswimmer What’s the process for identifying a/the bottleneck in dependency resolution? If there’s not something already, it would be great to have a built-in tool/flag that would be able to give us feedback on slow-to-resolve packages.

I’d happily spend the time needed talking to, working with, and submitting PRs to projects that are slowing things down for my team, if I had a good way to identify.
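Until something built-in exists, one crude way to get a hint is to scrape the -vvv log yourself. The sketch below counts how often each package shows up in the "Getting info for …" and "Downloading wheel: …" lines that poetry update -vvv prints (the exact wording may differ between Poetry versions, so treat the regexes as approximations); packages that force many metadata downloads are usually the slow ones.

import re
from collections import Counter

# First run: poetry update -vvv 2>&1 | tee poetry_update.log
getting_info = re.compile(r"Getting info for (\S+) \(([^)]+)\)")
downloading = re.compile(r"Downloading wheel: (\S+)")

info_counts, download_counts = Counter(), Counter()
with open("poetry_update.log") as log:
    for line in log:
        if (m := getting_info.search(line)) is not None:
            info_counts[m.group(1)] += 1
        elif (m := downloading.search(line)) is not None:
            download_counts[m.group(1)] += 1

print("Most metadata lookups:")
for package, count in info_counts.most_common(15):
    print(f"  {count:4d}  {package}")

print("Archives downloaded just to read metadata:")
for archive, count in download_counts.most_common(15):
    print(f"  {count:4d}  {archive}")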

Things that helped on my end were to update poetry, clear the cache, disable the new installer, and set an upper limit on the Python version:

poetry self update
poetry cache clear --all .
poetry config experimental.new-installer false

[tool.poetry.dependencies]
python = ">=3.9,<4.0"

Then create a debug log to confirm fewer iterations of dependency checks:

poetry update -vvv | tee ./poetry_update.log

@abn Sure, I’ll submit a new issue soon. I managed to get it working, poetry update now runs in 22 seconds.

I just noticed that the issue for me seemed related to using boto3 without specifying a version in the package I was importing. So I had package A that I built using poetry with boto3 = '*'. That did seem to resolve fairly quickly. But when I tried to import package A into a new package, B, it took >10 minutes to resolve (if it would ever finish). I specified the version used by package A for boto3 in package B, and it resolved my dependencies in < 30 seconds.

In my case the slow dependency resolution in Poetry was related to an IPv6 issue (also see this related answer on StackOverflow). Temporarily disabling IPv6 solved it. On Ubuntu this can be achieved using the following commands:

sudo sysctl -w net.ipv6.conf.all.disable_ipv6=1
sudo sysctl -w net.ipv6.conf.default.disable_ipv6=1
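If you want to check whether IPv6 is the culprit before turning it off, a small standard-library probe like the one below compares the TCP handshake time to pypi.org over IPv4 and IPv6; an IPv6 attempt that times out or takes many seconds points at this problem.

import socket
import time

HOST, PORT = "pypi.org", 443

for family, label in ((socket.AF_INET, "IPv4"), (socket.AF_INET6, "IPv6")):
    try:
        # Resolve an address of the requested family, then time the connect.
        sockaddr = socket.getaddrinfo(HOST, PORT, family, socket.SOCK_STREAM)[0][4]
        start = time.monotonic()
        with socket.create_connection((sockaddr[0], PORT), timeout=5):
            elapsed = time.monotonic() - start
        print(f"{label}: connected to {sockaddr[0]} in {elapsed:.2f}s")
    except OSError as exc:
        print(f"{label}: failed ({exc})")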

@tall-josh Because Poetry had (maybe still has?) a bug where corrupted cache entries make Poetry hang forever when trying to resolve dependencies.

For me this seemed to occur if I Ctrl+C’d Poetry while it was doing an install and it was downloading packages.

I observed this on 1.1.14, so perhaps it’s fixed in 1.2+.

@tall-josh Have you tried disabling ipv6? It solved it for me. Not sure if it works for private pypi though.

A lock died after ~10 hrs overnight:

$ poetry lock
Updating dependencies
Resolving dependencies... (36023.6s)

  ConnectionError

  HTTPSConnectionPool(host='stuff.work.lan', port=443): Max retries exceeded with url: /artifactory/api/pypi/python-public-proxy/simple/python-dateutil/ (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x2c1d30cc8>: Failed to establish a new connection: [Errno 60] Operation timed out'))

My config:

[tool.poetry.dependencies]
python = "^3.7.12" # unfortunately stuck on 3.7 for deployment platform purposes
click = "^8.0.1"
pandas = "^1.2.0"
pyodbc = "^4.0.30"
retry = "^0.9.2"
Pillow = "^8.3.2"
PyYAML = "^6.0"
requests = "^2"
internal-certs = { version = ">=2022", source ="subproject" }

[tool.poetry.dev-dependencies]
pytest = "^6.2.4"
black = "^21.5b1"
flake8 = "^3.9.2"
pytest-mock = "^3.6.1"

[build-system]
requires = ["poetry-core>=1.0.0"]
build-backend = "poetry.core.masonry.api"

[tool.poetry]
name = "our-tools"
version = "0.1.0"
description = "A template repo for our team."
authors = []
packages = [
    { include = "tools" },
]

[[tool.poetry.source]]
name = "python-public-proxy"
url = "https://stuff.work.lan/artifactory/api/pypi/python-public-proxy/simple"
default = true

[[tool.poetry.source]]
name = "subproject"
url = "https://stuff.work.lan/artifactory/api/pypi/subproject/simple"
secondary = true

N.b. stuff.work.lan is a generally reliable Artifactory instance inside my corporate firewall. Its name has been changed for public posting, obvs. I’m connecting over a VPN but can pretty normally hit 15 MB/s download with bursts up to ~25 MB/s, so throughput isn’t a culprit.

I’m trying the cache clear with poetry cache clear --all python-public-proxy and rerunning with -vvv and capturing the output. Peeking at it, I’m seeing numpy a lot.

Edit: Clearing the cache seemed to be effective. The captured run took about 10 minutes.

I temporarily disabled my IPv6 to work on the project with poetry. In my case, I am developing in a Linux WSL2 environment, and I followed the tips on this site to disable it:

https://itsfoss.com/disable-ipv6-ubuntu-linux/

@abn I was curious too and tried to do this, but in the meantime IPv6 suddenly became supported on my connection and I can no longer reproduce the issue.

For anyone who still has this issue it would be very interesting to see a flamegraph created by:

brew install py-spy
# in a poetry project:
poetry cache clear --all pypi
sudo py-spy record --idle --threads --subprocesses --format flamegraph --output flamegraph.svg poetry update

Same problem here. Adding jupyter take forever.

I have removed jupyter and ipykernel from pyproject.toml and just installed those packages using conda and included it in my environment.yml file. Everything works right now!

Make sure that poetry --version is 1.1.12 because some older version that I have had was funky when installing numpy.

Same problem here. Sometimes it takes a few seconds, sometimes a long time. It's not related to specific packages. I am experiencing this issue even when I start a new project from scratch with poetry init. But not all the time. Really annoying. See https://github.com/python-poetry/poetry/issues/4855 for more.

for me the issue got resolved by itself, so I think it was something to do with my ISP (or PyPI fixed something and didn’t report on it, which I think is more unlikely)

Hi @gchamon, did you figure out what was going on? I'm afraid I'm facing a similar issue (#4855). I do not know how I can troubleshoot the problem with my ISP or with PyPI.

  • Speed tests are fine, but I believe I'm just scratching the surface

It went away by itself. I believe it was a problem with my ISP…

Maybe make sure you are getting a valid ipv6 address? I wasn’t getting ipv6 because network manager was conflicting with dhcpcd. Disabled dhcpcd and I could use dhcpv6.

Sometimes you can monitor SNR, TX, RX values for your modem. Check if those metrics are healthy.

DNS can also be a problem. Maybe try different DNS servers. I recommend trying to rollout your own with pihole + unbound but that setup is more involved and may not help you with your problem with pypi

I have the same delay problem here. On Windows this no longer happens; using Linux (Ubuntu 20.04 via WSL2), it has an “infinite” delay.

I found my problem was with the internet connection via WSL2. I changed the nameserver in the /etc/resolv.conf file to 8.8.8.8, and Poetry (version 1.1.11) was normal again. https://github.com/microsoft/WSL/issues/5420#issuecomment-646479747

I found that this occurred when I tried to add pandas to a pyproject with python = “^3.7.9”.

I think what was causing the issue was numpy (which pandas installs). When I tried to add numpy separately, I got the warning numpy requires Python >=3.7.9,<3.11, so it will not be satisfied for Python >=3.7,<4.0.0.

Once I switched my pyproject python to python = ">=3.7.9,<3.11", this slowness immediately went away.

This is perhaps a naive question, but why can’t we just not do dependency resolution? Since pip doesn’t seem to do this and works just fine, why does Poetry do it?

Correction: pip used to “not do dependency resolution”. And it was often not working fine, that was probably one of the main reasons why people wrote poetry and migrated to it.

pip now has an actual dependency resolution algorithm since some months, I can’t find the actual release date and version number. Looks like it’s 20.3 (2020-11-30):

What if we create a service from pydeps which can take millions of requests from around the world, then make poetry to use that service to resolve dependencies?

If you are looking for projects to help contribute to that don’t yet have wheels, this site lists the top 360 packages, a handful of which don’t have wheels: https://pythonwheels.com/

@cglacet

So if I get it correctly there are two problems, 1) we need to retrieve the whole package to answer any question about it, 2) answering dependency questions could potentially take some time because you need to open some files. I can very well understand why 1) is a problem because package registry are slow, but why is 2) really a problem?

1. Yes. True for both wheels and sdists. They have to be downloaded. Although there is some ongoing work that would result in the possibility to skip the download for the wheels.

2. Yes and no. True for both wheels and sdists, these archives have to be “opened” and some files have to be read to figure if there are dependencies and what they are. But this is not the part that is slow. The slow part, is that for sdists (not for wheels) just opening the archive and reading some files is not enough, those files have to be built (execute the setup.py for example) and in some cases a resource intensive compilation step is necessary (C extensions for example need to be compiled with a C compiler which is usually the very slow bit of the whole process).

Ins’t there a cache for storing/reading already installed packages?

As far as I know, there is and subsequent dependency resolutions for the same projects should be faster (download and build steps can be partially be skipped). The wheels built locally in previous attempts are reused.

Are there some references I could use to read more about this without adding interferences to this thread (I find it interesting but that’s probably off-topic for maintainers or even most users).

Yes, a bit off-topic but I believe it is helpful for the next users wondering the slow dependency resolution to read some insight into why.

Some good reading I could find on the spot:


Update:

Thinking about it more, I realize I might have mischaracterised things. Getting the metadata (including dependency requirements) out of an sdist, does not require compiling the C extensions (setup.py build). It should be enough to get the egg info (setup.py egg_info).

As far as I know: If your project and all its dependencies (and their dependencies) are available for your platform (Python interpreter minor version, operating system, and CPU bitness) as wheels, then it is the best case scenario. Because in a wheel, the dependencies are defined statically (no need to build a sdist to figure out what the exact dependencies are, for example).

You, as the developer (maintainer) of a project, the best you can do to help lower the difficulty of dependency resolution for everyone else, is to distribute the wheels of your project for as many platforms as possible (upload the .whl files to PyPI). Often projects (libraries, applications) are made of pure Python code (no C extension for example) so just 1 wheel is enough to cover all platforms.
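To make the "dependencies are defined statically" point concrete: the requirement list of a wheel is plain text inside its .dist-info/METADATA file, so it can be read without executing any build step. A small illustration (the wheel file name is just an example of something you have downloaded locally):

import zipfile
from email.parser import Parser

wheel_path = "requests-2.28.1-py3-none-any.whl"  # any local wheel works

with zipfile.ZipFile(wheel_path) as wheel:
    metadata_name = next(
        name for name in wheel.namelist() if name.endswith(".dist-info/METADATA")
    )
    metadata = Parser().parsestr(wheel.read(metadata_name).decode())

print(metadata["Name"], metadata["Version"])
for requirement in metadata.get_all("Requires-Dist") or []:
    print("  Requires-Dist:", requirement)

For an sdist there is no such guarantee, which is exactly why the resolver sometimes has to download and build things just to learn their dependencies.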

@David-OConnor, what's your suggestion for resolving things in the immediate term? How can I determine which package is causing the slowdown? I am more than happy to make a PR to whichever project that is, but as it is now, any change to pyproject.toml takes upwards of 20 minutes. When I run with -vvv, I see 1: derived: pyyaml (^5.3.1) as the last line before it hangs for several minutes, but I would assume you are doing installation asynchronously or something.

I figured I would add more to this issue. It’s taking more than 20 minutes for me:

gcoakes@workstation ~/s/sys-expect (master) [1]> time poetry add --dev 'pytest-asyncio'
The currently activated Python version 3.7.7 is not supported by the project (^3.8).
Trying to find and use a compatible version. 
Using python3.8 (3.8.2)
Using version ^0.12.0 for pytest-asyncio

Updating dependencies
Resolving dependencies... (655.1s)

Writing lock file


Package operations: 1 install, 0 updates, 0 removals

  - Installing pytest-asyncio (0.12.0)

________________________________________________________
Executed in   20.98 mins   fish           external 
   usr time    4.96 secs    0.00 micros    4.96 secs 
   sys time    0.35 secs  560.00 micros    0.35 secs 

This is the pyproject.toml:

[tool.poetry]
name = "sys-expect"
version = "0.1.0"
description = ""
readme = "README.md"
include = [
    "sys_expect/**/*.html",
    "sys_expect/**/*.js",
]

[tool.poetry.dependencies]
python = "^3.8"
pyyaml = "^5.3.1"
serde = "^0.8.0"
aiohttp = "^3.6.2"
async_lru = "^1.0.2"
astunparse = "^1.6.3"
coloredlogs = "^14.0"
aiofiles = "^0.5.0"

[tool.poetry.dev-dependencies]
pytest = "^5.4"
black = "^19.10b0"
isort = { version = "^4.3.21", extras = ["pyproject"] }
flakehell = "^0.3.3"
flake8-bugbear = "^20.1"
flake8-mypy = "^17.8"
flake8-builtins = "^1.5"
coverage = "^5.1"
pytest-asyncio = "^0.12.0"

[tool.poetry.scripts]
sys-expect = 'sys_expect.cli:run'

[tool.isort]
multi_line_output = 3
include_trailing_comma = true
force_grid_wrap = 0
use_parentheses = true
line_length = 88

[tool.flakehell.plugins]
pyflakes = ["+*"]
flake8-bugbear = ["+*"]
flake8-mypy = ["+*"]
flake8-builtins = ["+*"]

[build-system]
requires = ["poetry>=0.12"]
build-backend = "poetry.masonry.api"

Can you think of a reason PyPI shouldn't differentiate between no dependencies and missing dependency data?

It’d be more productive to file an issue on https://github.com/pypa/warehouse, to ask this. There’s either a good reason, or PyPI would be open to adding this functionality. In the latter case, depending on how the details work out, it might need to be standardized like pyproject.toml was before poetry adopted it, so that the entire ecosystem can depend on and utilize it.

Reducing the search space is not a workaround, but a common issue (as mentioned in my comment above). That being said, the version you have written should be equivalent (^3.10.0 and ^3.10) – any differences you noticed with that change are likely unrelated.

@kache, it appears to search through dependencies depth-first, rather than breadth-first. As a result, you've probably got something earlier in your pyproject.toml that depends on ddtrace, so the dependency resolver grabbed that version and tried to resolve using that, rather than the ddtrace version you've specified.

I've had some success moving the dependencies I want exact-version logic for earlier in the pyproject.toml file, so they get prioritized.

(I also disabled IPv6, upgraded to poetry 1.2.x, and reduced the possible space for the troubling AWS libraries (boto3 and awscli, for me), so those go at the very end of my dependency file and have only a few recent versions to chew through.)

I’m seeing dependency resolution time between 5 and 35 seconds most of the time now.

Not yet. TBH I've only been looking when I get the overhead. My unfortunate workaround at the moment is to pip install when I need to keep moving, then do a poetry lock overnight as needed, then poetry install the following morning.

OK thanks. My workaround is to use poetry add for every package I have instead of poetry install.

Weird behaviour:

  • poetry install: 30 min+ dependency resolution
  • poetry add: I did a basic script that adds every package sequentially. It takes around 5–6 min to resolve dependencies for the same toml used before.

I can’t understand the issue here… Might be a caching problem?

@tall-josh Have you tried disabling ipv6? It solved it for me. Not sure if it works for private pypi though.

I’m on a work machine where I don’t think that’s an option for me. I can try the resolution on my local machine with IPV6 disable and see if that helps though. Cheers.

Update: I tried disabling ipv6 but it did not seem to have an effect 😦

Hello,

For PyPI packages poetry works well, but in my work we use Codeartifact and there is the issue.

I added something like

[[tool.poetry.source]]
name = "foo"
url = "https://foo.bar/simple/"
secondary = true

and then mine started going very slowly, almost like all the caches stopped working. The flame graph is full of:

get_release_info (poetry/repositories/pypi_repository.py:223)
find_packages (poetry/repositories/legacy_repository.py:264)

Is it possible that the cache headers on my private repo are wrong? But if I set it to secondary, I would assume that if it found the package in the primary it would not even use the private repo.

@teichert Nice work pulling these together! the giant chunks of time in read() from ssl.py could point to time spent waiting for bytes from an SSL socket that for some reason isn’t sending any data, right?

Disabling IPv6 on MacOS fixed the issue. System Preferences > Network > Advanced > TCP/IP Tab > Set “Configure IPv6” to “Link-local only”.


Wow, this solved it for me. Thank you!

Running poetry inside an ubuntu container (see answer by @sekalkowski) worked perfectly. Looks to be an issue related to MacOS (I have an M1 chip)

This answer in the FAQ is misleading, as this doesn't seem to be an error with PyPI's API: https://python-poetry.org/docs/faq/#why-is-the-dependency-resolution-process-slow

It doesn't feel like latency to me. I was doing tests using a fiber connection over ethernet, and my latency to pypi.org is around 14 ms and super stable.

I want to add some more information to my tests: I'm also installing lots of packages every time I build a docker image or use tox for running the tests. Both things happen very often (on a daily basis) and we're using poetry in these builds as well. This issue didn't show up for either the docker or the tox build processes.

I was able to work around the dependency resolution, at least, by just running poetry update over and over again (run with -vvv to examine the output). It seems like the ‘non-deterministic’ behavior described above might just be the resolver making a little more progress each time.

Interestingly, even after the dependencies were resolved, the installer still got hung up on ‘pending…’ for several numerical libraries. It looks like something is going on with gcc or zlib or some other Linux package.

I tried setting poetry config installer.parallel false at some point… not sure if that is helping or hurting.

Just upgraded to Python 3.10 using pyenv on WSL 2.

I noticed that the workflow is single-threaded. Any chance we could use all available threads, or at least half of them, for parsing?

I’m facing the same issue here (recently changed my ISP)

poetry_issue

[tool.poetry]
name = "my-app"
version = "0.1.0"
description = ""
authors = [""]

[tool.poetry.dependencies]
python = "^3.9"
azure-functions = "^1.8.0"
azure-identity = "^1.7.1"
azure-keyvault-secrets = "^4.3.0"
xmltodict = "^0.12.0"
aiohttp = "^3.8.1"

[tool.poetry.dev-dependencies]

[build-system]
requires = ["poetry-core>=1.0.0"]
build-backend = "poetry.core.masonry.api"

Screen Shot 2021-12-21 at 1 44 48 PM

I think I’m waiting far too long for poetry to finish.

Hello @lassepe,

thanks for sharing your experience. The bottleneck for the dependency resolution there (and I have seen this in other cases before) is the dependency awscli. awscli doesn’t provide any information about its dependencies via PyPI’s JSON API, so poetry needs to download the package and extract and parse the information. Furthermore, there are plenty of versions, and it seems that every version has different dependencies. This is pretty much the worst case for building a dependency tree.

I have no idea about the algorithm and whether it’s possible to optimize it for such a situation. I guess it would help at least a bit if awscli provided the dependency information via the API.
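
For anyone curious, here is a quick way to check whether a given project publishes dependency metadata through PyPI’s JSON API (a rough sketch – awscli is just the example discussed above; an empty or null requires_dist means poetry has to download and inspect the package itself):

# Print the declared dependencies of the latest awscli release, if any
curl -s https://pypi.org/pypi/awscli/json \
  | python -c "import json, sys; print(json.load(sys.stdin)['info']['requires_dist'])"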

Anybody else who comes here with an absurdly long (upwards of 1 hour) dependency resolution, it’s probably because you are being prompted for the passphrase to an ssh key. The timer display keeps overwriting the last line in the terminal, such as the ssh password prompt. Solution: make sure any keys you need are added to ssh-agent.
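
A minimal sketch of that fix (the key path is just an example – add whichever key your private repositories actually use):

# Start an agent for this shell session and load the key up front,
# so poetry's git/ssh calls never block on a hidden passphrase prompt
eval "$(ssh-agent -s)"
ssh-add ~/.ssh/id_ed25519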

Here’s a pretty good summary of why pip and requirements.txt aren’t sufficient for many use cases:

https://modelpredict.com/wht-requirements-txt-is-not-enough

Those numbers are very surprising. There does not seem to be anything particularly difficult happening here. It does not seem to be the dependency resolver that is struggling (doing lots of back-tracking or things like that). I wonder what it could be… Maybe open a separate bug ticket, I feel like this issue might be different than the rest of the thread.

@zillionare I went through the same issue and fixed it after setting up a VPN to get around the GFW.

I guess this is the root cause in my case too. I have set up a proxy for PyPI; it still runs slowly (with -vvv on, I can see it progressing), and I understand how it works now: a lot of files need to be downloaded before the dependencies are resolved.

I hope poetry can support mirror package sources, so the performance issue can be solved.

@cglacet

Uploading a poetry.lock to CI will avoid resolving dependencies.
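
A minimal sketch of what that CI step can look like, assuming both pyproject.toml and poetry.lock are committed to the repository:

# Installs from the committed lock file instead of re-resolving;
# --no-root skips installing the project itself, if that's what you want
poetry install --no-interaction --no-root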

The discussion on python.org might be interesting for some as well: Standardized way for receiving dependencies

Yep. Anecdotally, if deps are specified at all on PyPI, they’re probably accurate. If not, it could mean either that deps aren’t specified or that there are no deps.

PyPI not fixing this is irresponsible. Yes, I’m throwing spears at people doing work in their free time, but this needs to be fixed, even if that means being mean.

@David-OConnor Is there a technical reason for that? Isn’t it possible to check whether a package correctly specifies its dependencies?

Discussions here are quite interesting for a noob like me. I have a very naïve question, though: are packages built/published with poetry “correctly specifying their dependencies”? In other words, if I only add packages built with poetry, will the resolving phase be lightning fast? Or will this still apply:

… the python ecosystem provides no reliable way of determining a package’s dependencies without installing the package

I first came here because I thought 40 s to resolve a single package’s dependencies was slow, but when I see minutes and hours on the counter I suppose it is normal.

I guess it’s not a good idea to use poetry for creating Docker images (in CI pipelines, for example)?

Another example. Adding black took 284 seconds.

% poetry add --dev black
Using version ^20.8b1 for black

Updating dependencies
Resolving dependencies... (284.1s)

Writing lock file

Package operations: 6 installs, 0 updates, 0 removals

  • Installing appdirs (1.4.4)
  • Installing mypy-extensions (0.4.3)
  • Installing pathspec (0.8.0)
  • Installing typed-ast (1.4.1)
  • Installing typing-extensions (3.7.4.3)
  • Installing black (20.8b1)

Unfortunately I can’t share the pyproject.toml.

For anyone coming from mainland China (如果你来自中国大陆), add this to pyproject.toml:

[[tool.poetry.source]]
name = "aliyun"
url = "https://mirrors.aliyun.com/pypi/simple/"
default = true

Why?

A PyPI mirror with faster network access in mainland China.
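
On Poetry 1.2+ the same source can also be added from the command line (a sketch – the exact flags have changed between releases, so double-check poetry source add --help):

# Register the mirror as the default package source in pyproject.toml
poetry source add aliyun https://mirrors.aliyun.com/pypi/simple/ --default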

Following @lmarsden’s suggestion, I managed to speed up the process by making sure that sites/servers that prefer IPv4 use IPv4. On Ubuntu I modified /etc/gai.conf by removing the # (uncommenting) from the following line: # precedence ::ffff:0:0/96 100
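
A rough shell equivalent of that edit (it appends an active precedence rule rather than uncommenting the existing line, which should have the same effect):

# Prefer IPv4-mapped addresses when resolving hostnames
echo 'precedence ::ffff:0:0/96  100' | sudo tee -a /etc/gai.conf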

I realise that now, as I mention in #2338. I’m therefore not that interested in poetry at the moment. I thought it was like Composer and https://packagist.org, but it looks mostly like a wrapper around different legacy tools.