setuptools: `InvalidVersion` exception when invalid version used on Setuptools 66

setuptools version

66.0.0

Python version

3.8

OS

Ubuntu 20.04

Additional environment information

only happening when not running inside a venv

Description

Trying to install certain pip packages, e.g. ssdeep, results in a pkg_resources.extern.packaging.version.InvalidVersion exception. This seems to be related to a known bug where certain Ubuntu/Debian packages install Python packages (in this case python3-distro-info and python-debian) with versions that don’t conform to PEP 440.
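To illustrate why a string like '0.23ubuntu1' is rejected, here is a hedged sketch using a simplified subset of the PEP 440 grammar (the real rules live in packaging.version and also cover epochs and pre/post/dev releases; the regex below is an illustration, not the spec):

```python
import re

# Simplified subset of the PEP 440 grammar: a numeric release segment plus an
# optional "+local" suffix. The full grammar (packaging.version) also covers
# epochs and pre/post/dev releases; this sketch only illustrates the failure.
SIMPLIFIED_PEP440 = re.compile(
    r"^\d+(\.\d+)*"                        # release: 0.23, 1.0, ...
    r"(\+[a-z0-9]+([.\-_][a-z0-9]+)*)?$"   # optional local version label
)

def looks_pep440(version):
    return SIMPLIFIED_PEP440.match(version) is not None

print(looks_pep440("0.23ubuntu1"))   # False: letters fused into the release
print(looks_pep440("0.23+ubuntu1"))  # True: "ubuntu1" as a local version label
```

This is why the suggested conformant spelling is 0.23+ubuntu1: the same information moves into the local version segment, which PEP 440 permits.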

Expected behavior

With setuptools <66.0.0 this does not cause an error.

How to Reproduce

  1. docker run -it --rm --entrypoint=bash ubuntu:focal
  2. apt update && apt install python3 python3-pip python3-distro-info python-debian libfuzzy-dev -y
  3. python3 -m pip install -U setuptools pip wheel
  4. python3 -m pip install ssdeep

Output

python3 -m pip install --user ssdeep --no-cache-dir
Collecting ssdeep
  Downloading ssdeep-3.4.tar.gz (110 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 110.8/110.8 kB 3.6 MB/s eta 0:00:00
  Preparing metadata (setup.py) ... error
  error: subprocess-exited-with-error
  
  × python setup.py egg_info did not run successfully.
  │ exit code: 1
  ╰─> [28 lines of output]
      Traceback (most recent call last):
        File "<string>", line 2, in <module>
        File "<pip-setuptools-caller>", line 34, in <module>
        File "/tmp/pip-install-3dwjl5j0/ssdeep_f2bd2f0d9f384d9a92bf4f0f6f3090d3/setup.py", line 108, in <module>
          setup(
        File "/home/vagrant/.local/lib/python3.8/site-packages/setuptools/__init__.py", line 86, in setup
          _install_setup_requires(attrs)
        File "/home/vagrant/.local/lib/python3.8/site-packages/setuptools/__init__.py", line 80, in _install_setup_requires
          dist.fetch_build_eggs(dist.setup_requires)
        File "/home/vagrant/.local/lib/python3.8/site-packages/setuptools/dist.py", line 874, in fetch_build_eggs
          resolved_dists = pkg_resources.working_set.resolve(
        File "/home/vagrant/.local/lib/python3.8/site-packages/pkg_resources/__init__.py", line 815, in resolve
          dist = self._resolve_dist(
        File "/home/vagrant/.local/lib/python3.8/site-packages/pkg_resources/__init__.py", line 844, in _resolve_dist
          env = Environment(self.entries)
        File "/home/vagrant/.local/lib/python3.8/site-packages/pkg_resources/__init__.py", line 1044, in __init__
          self.scan(search_path)
        File "/home/vagrant/.local/lib/python3.8/site-packages/pkg_resources/__init__.py", line 1077, in scan
          self.add(dist)
        File "/home/vagrant/.local/lib/python3.8/site-packages/pkg_resources/__init__.py", line 1096, in add
          dists.sort(key=operator.attrgetter('hashcmp'), reverse=True)
        File "/home/vagrant/.local/lib/python3.8/site-packages/pkg_resources/__init__.py", line 2631, in hashcmp
          self.parsed_version,
        File "/home/vagrant/.local/lib/python3.8/site-packages/pkg_resources/__init__.py", line 2678, in parsed_version
          self._parsed_version = parse_version(self.version)
        File "/home/vagrant/.local/lib/python3.8/site-packages/pkg_resources/_vendor/packaging/version.py", line 266, in __init__
          raise InvalidVersion(f"Invalid version: '{version}'")
      pkg_resources.extern.packaging.version.InvalidVersion: Invalid version: '0.23ubuntu1'
      [end of output]
  
  note: This error originates from a subprocess, and is likely not a problem with pip.
error: metadata-generation-failed

× Encountered error while generating package metadata.
╰─> See above for output.

note: This is an issue with the package mentioned above, not pip.
hint: See above for details.

About this issue

  • Original URL
  • State: open
  • Created a year ago
  • Reactions: 129
  • Comments: 92 (25 by maintainers)

Most upvoted comments

This issue is by design (#2497). Non-conformant versions have been discouraged since ~2014, deprecated in packaging since late 2020 and deprecated in Setuptools since Oct 2021.

Projects should adapt to provide conformant versions (i.e. 0.23+ubuntu1) or environments that rely on such packages should pin to Setuptools < 66. These changes are necessary to keep Setuptools healthy and aligned with the goals of the packaging ecosystem.

Unfortunately, deprecation and removal is the main way I know to inform downstream consumers of breaking changes like these.

If the disruption is too great or the workarounds aren’t suitable, this project may consider backing out the changes for another limited period, but only as a mitigation measure. Please first explore mitigations such as pinning setuptools in build-system.requires in your projects, or working with the pip team to allow specifying constraints on build-system requirements.
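The pinning mitigation mentioned above can be sketched in pyproject.toml (a hedged example: the pin is per-project and only affects the build-time setuptools under PEP 517 builds):

```toml
[build-system]
# Temporary mitigation, not a fix: keep the build-time setuptools below 66
# until the offending versions in the environment are corrected.
requires = ["setuptools<66"]
build-backend = "setuptools.build_meta"
```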

I’m also facing this error

Please use the thumbs-up on the original post to register that this issue also affects your project.

I’m also facing this error

Quick note that in some cases internal dependencies several steps down the chain from the package I’m using are bringing in the new version of setuptools, making it hard or impossible to pin setuptools as a user who’s running pip install. It also means install workflows that worked yesterday are broken today, even when pinning every line of my requirements.txt. This means wide impact without an available workaround for users. Am I missing an available workaround? Please consider coordinating the rollout of this change with the pip team (et al.) to allow for effective pinning.

Since the error message is somewhat cryptic (it would help if it also printed out the offending package in addition to the version), I dug around a bit to help match the version from the error message to the package that caused it:

@jaraco This needs to be fixed. As of now, setuptools will refuse to install a package, because it uses a package, that uses a package, that uses a system package where Canonical has released a version that has a version number that doesn’t conform to the PEP format.

That should not be an error. A warning, yes, an error, no. That Canonical released a package with a bad version number should NOT prevent the installation of the package I’m currently trying to install.

I totally agree that Canonical should not release and install such packages, I completely agree that tools should refuse to build packages with such versions, but refusing to install a package because a dependency, far down the line, has an incorrect number is NOT the correct implementation.

I’m also facing this error

Please use the thumbs-up on the original post to register that this issue also affects your project.

Thumbs up 👍, yes this broke my build. And thumbs down 👎, I’m staying late tonight to fix a bug that’s a critical…spelling error.

Unfortunately, deprecation and removal is the main way I know to inform downstream consumers of breaking changes like these.

Yeah, it’s a tough problem. But deprecated or not, I generally assume that core tools work. Version numbers in particular have been historically very lenient, so this is a really fundamental change that caught me off guard.

Build logs are noise, unless there’s a problem. So while it’s possible that I’ve seen the xxx is an invalid version and will not be supported in a future release warning flying by in the logs, to me, it isn’t a call to action. It’s just one line among hundreds.

Suffice to say, if something is going to break, I’d prefer a more detailed message, for example:

******************************************************************************** 
The version number <xxx> on package <yyy> is no longer valid.

By January 2023, you need to update version numbers to conform with PEP440 
or your builds may fail. You can retain the old behavior by passing 
the --allow-deprecated-version-numbers option to setup. 

See <some url> for details.
********************************************************************************

I’d also suggest putting the message at the end of the build log, not the beginning. Then it probably would be the first thing that I see after I run a build.

Python was known for being permissive. This is a very Java-esque push. Did having a git commit hash in the version number ever break dependency resolution for setuptools? No! But setuptools turning draconian on versioning breaks everyone’s solutions. Be permissive, not restrictive.

PEP 440 should allow for stuff such as 1.2.3.sha1

The use case is a common library you have to install from an in-house repository: you have two competing versions built for two checked-out feature branches, you would like to run your system testing with each of them, and using consecutive unreleased version numbers for the two feature branches is not a solution.

PEP 440 seems to concern itself with public packages released into the wild but forgets about important dev use cases.
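For what it's worth, PEP 440's local version segment ("+<label>") already covers the commit-hash case described above; a minimal sketch (the hash value is hypothetical, and the "g" prefix mirrors the setuptools_scm convention):

```python
# The base version stays comparable under PEP 440 rules, while the local
# version label carries the per-branch commit identifier.
base = "1.2.3"
commit = "9f2c1ab"                  # hypothetical short git hash
dev_version = f"{base}+g{commit}"
print(dev_version)                  # 1.2.3+g9f2c1ab
```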

This is so stupid! Invalid version: '0.23ubuntu1' (package: distro-info) Now my entire pip is broken.

Thank you @jaraco for the suggestion. I would agree if our private dev versions would affect anyone anywhere… but they don’t. When the branches are merged the version collapses back to normal semantic versioning.

The reason I am advocating against this change here is that the tooling now sabotages people doing what they want, while being concerned with use cases out in the wild. PEP 440 is fine as it is for the public use cases, and I do not necessarily advocate for including my use case in the standard. I am only saying that the tooling should not enforce this unless I ask it to, as when I run flake8 or black or isort. Nothing was broken with a permissive system, and if anything the people facing this Debian issue (and whatever else) are trying to restrict what the tooling can do just to serve their use case.

We worked around our problem (for now, anyway), but this direction will eventually increase the proliferation of forks if folks get annoyed enough about the authoritarian thinking that drives changes like these. Then we get another distutils-vs-setuptools battle for a decade, with supposedly more sensible tooling that has to reinvent everything setuptools already does correctly, only to add a few flags that turn the restrictive features off.

It should always have been a warning only in this tool: “WARNING: Dear user, the version 'bollox.3.%$£' does not comply with the versioning standards outlined in PEP 440. If you plan to release this package publicly, consider changing to a compliant version.” And when that does not happen, PyPI can still reject the package on upload. That is it; that is where the line should be.

Btw. this feels so relevant right now https://xkcd.com/927/

I’m getting this issue when I’m using setup.py with https://github.com/pypa/setuptools_scm I have the following configuration:

        use_scm_version={
            'version_scheme': 'python-simplified-semver',
            'local_scheme': 'dirty-tag',
        },
        setup_requires=['setuptools_scm'],

With setuptools 66.0.0, I get

...
           self._parsed_version = parse_version(self.version)
        File "/usr/local/lib/python3.10/dist-packages/pkg_resources/_vendor/packaging/version.py", line 266, in __init__
          raise InvalidVersion(f"Invalid version: '{version}'")
      pkg_resources.extern.packaging.version.InvalidVersion: Invalid version: '1.1build1'
      [end of output]
  
  note: This error originates from a subprocess, and is likely not a problem with pip.
error: metadata-generation-failed

When I roll back the setuptools version to 65.7.0, it works as expected (e.g. my package version ends up as 1.0.20220516.dev56).
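For setuptools_scm users, a hedged alternative is the pyproject.toml-based configuration, which declares setuptools_scm in build-system.requires and so avoids the legacy setup_requires / fetch_build_eggs code path shown in the traceback:

```toml
# Sketch of the pyproject.toml equivalent of the setup.py configuration above.
[build-system]
requires = ["setuptools", "setuptools_scm"]
build-backend = "setuptools.build_meta"

[tool.setuptools_scm]
version_scheme = "python-simplified-semver"
local_scheme = "dirty-tag"
```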

Hi there @pradyunsg

So I read your explanation, and it seems to me that your problem is with Debian-based Linux distributions. Would it not be more prudent to enforce this versioning scheme in Debian/Ubuntu? I really do not see a reason for Python to limit its dev versioning ability (the sha1-inclusion ability explained above) and whatever other use cases there might be. I think a package should be allowed to have whatever version: if I want to call one version “rubberduck” and the other one “whatever”, I should not be prevented from building and installing that package, as long as I maintain both the repository and the packages. This way one has the freedom to build sophisticated Python-based development workflows that may not have been foreseen by the people who wrote PEP 440.

Where the strict enforcement should happen is public repositories: not accepting certain versions, or ignoring incorrect ones completely. Ubuntu therefore should not hand poorly versioned packages to a “legacy” build system. I might add at this point that we are pushing everyone into a corner to support some legacy system, instead of just dropping it, for the sake of Ubuntu/Debian users only. Nobody else would be affected.

This is a bit similar to the case when systemd developers expect the entire linux community to bend over backwards for them when their code causes a new problem and now everyone else should adapt.

I do appreciate that this has been communicated and the decision made, but I do not think setuptools should fail to build my package whatever version number I provide. I should be allowed to name things my way; this is the ethos of Python. When it comes to shared resources like public repos, by all means they should filter what comes in. That is my take on it. If Ubuntu breaks, then Ubuntu should make sure they do not provide you with garbage; we should not force everyone to bend over backwards just because Ubuntu’s versioning scheme does not align with pip’s. So what? In most cases it is prudent anyway to use one’s own Python, and never the system Python on Linux distros, which is full of random hacks that sooner or later break something for the average application developer.

PS: I understand that you already pushed this through and everyone is forced to comply (until someone forks setuptools again and we begin another format war), and that at this point I am unlikely to change your mind. But I posit that you fixed the problem in a part of the system that did not cause it, on the wrong level, by turning the wrong dial; you also went against one of the attractive aspects of development in Python, its reluctance to force people to do things the one way. As long as something does not break things, it should not be governed with an iron fist, and this did not break things for 63 major versions, from what I gather from the explanation the Debian people gave.

FWIW, to provide a fuller explanation of what’s happening here:

  • When pip tries to build a project with the “legacy” build system, the executed setup.py for the package is exposed to the entire set of packages available on that Python installation.
  • setuptools needs to determine what’s installed to support setup_requires with the legacy install mechanism, for which it’s calling pkg_resources.working_set.resolve which looks at the entire environment.
  • As that scans the environment, it trips on the invalid version if a package in the environment has an improper version.
  • Debian/Ubuntu’s system Python has packages installed with invalid versions (as may your organization’s Python packages).
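The chain above can be mimicked with a toy model: scanning sorts every distribution in the environment by parsed version, so one bad version aborts the whole scan before the requested package is even considered. The parser and the environment data below are illustrative, not pkg_resources internals:

```python
# Toy strict parser standing in for pkg_resources' parsed_version.
def strict_parse(version):
    parts = version.split(".")
    if not all(part.isdigit() for part in parts):
        raise ValueError(f"Invalid version: {version!r}")
    return tuple(int(part) for part in parts)

environment = [("requests", "2.28.1"), ("distro-info", "0.23ubuntu1")]

try:
    # Mirrors dists.sort(key=operator.attrgetter('hashcmp')) in the traceback.
    environment.sort(key=lambda item: strict_parse(item[1]))
except ValueError as exc:
    # The package being installed never gets a chance: the scan itself fails.
    print(f"scan aborted: {exc}")
```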

Please consider coordinating the rollout of this change with the pip team (et al) to allow for effective pinning

puts on pip and packaging maintainer hat

Hi. I am probably the person who set the ball rolling on this form of strictness within the Python packaging ecosystem. I can also tell you that I’ve been involved in multiple discussions with setuptools’ maintainers around this change (~all of those are in public spaces). 😃

Pinning things would not have helped in this case – this isn’t related to how setuptools builds your package, or how pip handles the built project, or how the two of them interact. The issue is with the environment you’re building in, which contains a package with a bad version – specifically, in OP, it’s because Debian’s Python packages have done things that degrade the UX (honestly, it’s not the first time). The only people who could have seen this warning didn’t report it to the right people, or handle it themselves, in… 2-9 years.

I’m certain that, if the maintainers of setuptools had known about this failure mode, they would’ve provided some mechanism to deal with it. However, if we don’t know about a failure mode, because Debian (or your internal packages) does things that trigger it and no one told us, we can’t do anything about it. And once it’s been broken, the churn costs have already been paid – it doesn’t really make sense to undo things at that point.

At the end of the day, this is not a straightforward failure – it’s not an issue with your package (which pip would’ve warned you about), following the best practice of isolating from your system-provided Python environment will not trigger this issue, and the folks who made the “mistake” of using non-PEP 440 versions didn’t act on the warnings when they were being presented.


Since this has been reported, there have been multiple improvements to make this less painful of a failure mode and https://github.com/pypa/setuptools/issues/3780 tracks making deprecations coming out of setuptools more visible.

I can confidently say that the maintainers of setuptools, pip etc. are well aware of the network effects of these changes, and that we understand it can be difficult to deal with increased strictness causing issues because a transitive dependency three layers down does things incorrectly. To that end, we’ve been using the mechanisms we have to communicate this. Both setuptools and pip have been printing warnings related to this, and pip is going to get stricter around this area as well this year (the change needed to happen in setuptools first, for complicated reasons that I’m not going to go into).

While I’m sure that there’s more that the volunteers who maintain the foundational pieces of the Python ecosystem could have done to reduce pain for you, I would like to implore you to think about what you could’ve done to have flagged this to them.


Am I missing an available workaround?

Multiple ways to resolve or work around this have already been mentioned since this was posted. I’ll try to reiterate what they are (and hopefully, my explanations above help clarify how/why each fixes or avoids the issue).

  • If you “own” the package affected, you can do one of the following:

  • If you do not “own” the problematic package, you should inform the maintainers of the package and can do one of the following:

    • pass --use-pep517 to pip[^1] to opt all builds into the pyproject.toml-based build system, or
    • isolate your installation process from the system by working in a virtual environment that can’t see system-provided packages.

Finally, I genuinely appreciate that setuptools as a project has made this change, even though it’s a slightly painful one and a potentially disruptive one. I should note that this is a change that genuinely makes things better and, honestly, increased strictness in what these tools accept will make things better for the overall ecosystem.

[^1]: That option has a bad name, we’ll change it to a better one some day – but it does what we need, enabling the pyproject.toml-based build system for all projects.

PS: The setuptools team has wanted to get rid of pkg_resources for a vast array of suboptimal behaviours, like the one that triggered this issue, but it’s a complicated problem and there hasn’t been sufficient (primarily-volunteer) developer availability to drive that effort over the line.

A workaround for this error is to use pip install --use-pep517, which will prevent the deprecated behaviour from being triggered.

Hi @abravalheri, I tried it out and can confirm that it installs without errors. Thank you for your response!

TL;DR for those looking for workarounds:

Option 1: Try pip install --use-pep517 and see if this solves your problem.
Option 2: Always install packages using pip inside a virtual environment.

Please try to identify the packages/projects that are using invalid versions and notify the maintainers.

Meanwhile, setuptools will keep working to remove pkg_resources. Unfortunately, I believe this change cannot be reverted, since we do need to pursue compatibility with the latest versions of packaging.

Just want to emphasize, in case it was missed, that suggesting back-pinning is essentially asking people to fork, and contributes to the fragmentation of the community, aside from the fact that pinning is not always possible. Thus it’s barely a mitigation, far from a “solution”.

Should it fail if it ALREADY IS INSTALLED? No.

This point is a good one. Thanks for highlighting.

Perhaps pkg_resources should just get out of the business of validating installed version numbers.

@astrojuanlu It’s worth noting that with unoserver, I have to install the virtualenv with --system-site-packages, as it uses the uno libraries provided by LibreOffice, which aren’t available any other way.

The vibe here has been extremely negative

That’s because the setuptools maintainers’ response has been negative and refusing to take this issue seriously.

Setuptools should simply not fail to install a package because an already installed package has a version number it doesn’t like. That is a bug, and maintainers claiming it isn’t and not listening to people’s actual problems is the source of the negativity.

I see many people pinning setuptools, but not many pull requests to the transitive dependencies with invalid version numbers. I think it would be helpful to mention which packages are triggering this error.

Switching to setuptools 65 solved it for me, now I have new error. pip install setuptools==65

sudo apt-get remove --auto-remove python3-distro-info

As far as I am concerned, while developing complex systems the last thing I would like to deal with is pedantic people messing with tooling and ignoring the use cases most people rely on when developing software, i.e. adding a commit hash and selecting the version based on it. This is nothing new, and it is standard practice because it just makes sense.

I have already spent way too much time arguing, and all I hear in response are pedantic arguments that things should be nice and line up, or else someone will throw a fit. Software development is never perfect; that is the reality, deal with it. This is why Python is popular: it is permissive. You can make a mess with metaclasses, but you get to have them anyway, and there are use cases they enable that would otherwise be a pain to implement.

Pedantic versioning breaks this promise.

If someone insists on strictness, I may suggest Haskell or Java… except Java supports adding commit hashes in a version spec; isn’t that ironic… 😄

And you know, I hate Anaconda with a passion because of its super-slow and buggy dependency-resolution algorithm, and sometimes its overly restrictive or downright bad dependency structure (pyface pulling in GPL packages in conda but not in pip), but not even they prevent people from appending whatever they want to the end of the version string.

@xkszltl, saying it is a time bomb in user code to pin an old version of setuptools is accurate, and I would also posit that treating Python devs like Java devs is a bad trend. We do not need everything prescribed for us. PEP 8, for example, can be enforced with specific tooling, but Python still runs code that isn’t PEP 8 as long as it parses. That is how it has been built for decades: a social contract of sorts among Python devs that we do not stampede over each other unnecessarily as long as things work.

sudo apt-get -y -qq update
sudo apt-get install python3-pip python-dev build-essential
sudo pip3 install --upgrade setuptools
sudo pip3 install awscli --upgrade

I removed the setuptools upgrade step and it works fine.

sudo apt-get -y -qq update
sudo apt-get install python3-pip python-dev build-essential
sudo pip3 install awscli --upgrade

Yeah, it’s a tough problem. But deprecated or not, I generally assume that core tools work. Version numbers in particular have been historically very lenient, so this is a really fundamental change that caught me off guard.

Hi @sethrh, please note that this change is necessary so setuptools can update its dependency on the latest version of packaging. This is important because:

  1. Newer versions of packaging accumulate bug fixes and improvements. If setuptools is stuck with an older version, users will not benefit from those changes.
  2. Operating System packaging ecosystems want to be able to distribute setuptools and this may require harmonizing the versions of the dependencies on packaging with other tools such as pip (this depends on the OS).

Build logs are noise, unless there’s a problem. So while it’s possible that I’ve seen the xxx is an invalid version and will not be supported in a future release warning flying by in the logs, to me, it isn’t a call to action. It’s just one line among hundreds.

Please note that you can control the verbosity of build logs, for example:

python -m build -C--quiet

The output is “medium” verbose by default. Whether this is a good default is a completely separate discussion (some would argue that when there is an error in a CI build, it is good to have some logs; otherwise you have to change the scripts to be more verbose and re-run the entire process).

Am I missing an available workaround?

If I am not mistaken, using a virtual environment should work (since all the packages with offending versions seem to come via apt or the equivalent). If the package you are trying to install, or one of its dependencies, does not conform to PEP 440, this will of course not help. Uninstalling the system packages (python3-distro-info, python-debian, etc.) also seems to work, but they are probably needed by other packages: python3-distro-info, for example, is used by update-manager-core, which is in turn used by update-manager, which is used by ubuntu-desktop, so removing it might not be feasible.

For those that for some reason cannot use the suggested fix (i.e. installing things with pip install --use-pep517[^1]), version 67.5.1 implements a workaround for the specific problem described by @regebro:

Setuptools should simply not fail to install a package because an already installed package has a version number it doesn’t like.

Notes

* Non-PEP 440 compliant versions are still deprecated and no longer supported when building packages.
  Projects that don't comply with PEP 440 are _urged_ to change.
  Developers that support the idea that general projects should not necessarily be subject to PEP 440 are encouraged to create a discussion in the [Python discourse for packaging](https://discuss.python.org/c/packaging/14).
  
  * Setuptools issue tracker is not the best place to discuss this because it involves the entire packaging ecosystem and the standards that `setuptools` and `pip` are supposed to follow.

* Using the `--use-pep517` flag in pip is still the recommended solution[^1].

Tips

* Package developers are encouraged to run tests/automated builds that don't hide deprecation warnings (e.g. `python -X dev -m build`, `python -X dev -m pip -v` and [`pytest` warning filters](https://docs.pytest.org/en/7.1.x/how-to/capture-warnings.html))

* Package developers can make setuptools less verbose by setting [build's `--config-setting`](https://pypa-build.readthedocs.io/en/latest/#python--m-build---config-setting), e.g. `python -m build -C--quiet`.
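The pytest warning-filter tip above can be sketched as a pytest.ini fragment (hedged: the message regex is an assumption about the warning's wording, adjust to what your logs actually show):

```ini
[pytest]
filterwarnings =
    error:.*is an invalid version.*
```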

Based on the reproducer reported in #3772 (comment) by @kiorky, we can see 67.5.1 working for the complex scenario described above:

> docker run --rm -e DEBIAN_FRONTEND=noninteractive -it ubuntu:bionic@sha256:4a45212e9518f35983a976eead0de5eecc555a2f047134e9dd2cfc589076a00d bash

apt update && apt install -q -y virtualenv python3-distro-info python3-distutils

mkdir -p /tmp/mypkg
cd /tmp/mypkg
echo "aaa=1" > mymod.py
cat <<EOF > setup.py
from setuptools import setup
setup(
    name="mypkg",
    version='1.0',
    py_modules=['mymod']
)
EOF

## An older version will cause the error:
virtualenv --system-site-packages /tmp/.venv1
/tmp/.venv1/bin/python -m pip install -U 'pip==23.0.1' 'setuptools==67.4.0'
/tmp/.venv1/bin/python -m pip install -e .
# ...
# File "/tmp/.venv1/lib/python3.8/site-packages/pkg_resources/__init__.py", line 2694, in parsed_version
#   raise packaging.version.InvalidVersion(f"{str(ex)} {info}") from None
# pkg_resources.extern.packaging.version.InvalidVersion: Invalid version: '0.23ubuntu1' (package: distro-info)

## 67.5.1 will not cause the error:
virtualenv --system-site-packages /tmp/.venv2
/tmp/.venv2/bin/python -m pip install -U 'pip==23.0.1' 'setuptools==67.5.1'
/tmp/.venv2/bin/python -m pip -v install -e .
# Successfully installed mypkg-1.0

Footnotes

[^1]: Note that this is going to be the default in [`pip` 23.1](https://github.com/pypa/pip/blob/main/src/pip/_internal/utils/deprecation.py#L164-L175). See [pypa/pip#8559](https://github.com/pypa/pip/issues/8559) for details.

thx that worked for me!

sudo apt autoremove python3-debian python3-distro-info

after removing these two packages, the error below is gone:

pkg_resources.extern.packaging.version.InvalidVersion: Invalid version: '0.23ubuntu1'

Sorry if this should be a new issue. I have a problem where the version of Python itself is causing an Invalid Version error. I install Python from the tip of 3.12, then:

% .tox/anypy/bin/python -c "import pkg_resources as p; p.load_entry_point('coverage', 'console_scripts', 'coverage')()"
Traceback (most recent call last):
  File "/Users/nedbatchelder/coverage/trunk/.tox/anypy/lib/python3.12/site-packages/pkg_resources/__init__.py", line 2711, in _dep_map
    return self.__dep_map
           ^^^^^^^^^^^^^^
  File "/Users/nedbatchelder/coverage/trunk/.tox/anypy/lib/python3.12/site-packages/pkg_resources/__init__.py", line 2826, in __getattr__
    raise AttributeError(attr)
AttributeError: _Distribution__dep_map

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "/Users/nedbatchelder/coverage/trunk/.tox/anypy/lib/python3.12/site-packages/pkg_resources/__init__.py", line 522, in load_entry_point
    return get_distribution(dist).load_entry_point(group, name)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/nedbatchelder/coverage/trunk/.tox/anypy/lib/python3.12/site-packages/pkg_resources/__init__.py", line 2855, in load_entry_point
    return ep.load()
           ^^^^^^^^^
  File "/Users/nedbatchelder/coverage/trunk/.tox/anypy/lib/python3.12/site-packages/pkg_resources/__init__.py", line 2467, in load
    self.require(*args, **kwargs)
  File "/Users/nedbatchelder/coverage/trunk/.tox/anypy/lib/python3.12/site-packages/pkg_resources/__init__.py", line 2489, in require
    reqs = self.dist.requires(self.extras)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/nedbatchelder/coverage/trunk/.tox/anypy/lib/python3.12/site-packages/pkg_resources/__init__.py", line 2746, in requires
    dm = self._dep_map
         ^^^^^^^^^^^^^
  File "/Users/nedbatchelder/coverage/trunk/.tox/anypy/lib/python3.12/site-packages/pkg_resources/__init__.py", line 2713, in _dep_map
    self.__dep_map = self._filter_extras(self._build_dep_map())
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/nedbatchelder/coverage/trunk/.tox/anypy/lib/python3.12/site-packages/pkg_resources/__init__.py", line 2728, in _filter_extras
    invalid_marker(marker) or not evaluate_marker(marker)
    ^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/nedbatchelder/coverage/trunk/.tox/anypy/lib/python3.12/site-packages/pkg_resources/__init__.py", line 1415, in invalid_marker
    evaluate_marker(text)
  File "/Users/nedbatchelder/coverage/trunk/.tox/anypy/lib/python3.12/site-packages/pkg_resources/__init__.py", line 1433, in evaluate_marker
    return marker.evaluate()
           ^^^^^^^^^^^^^^^^^
  File "/Users/nedbatchelder/coverage/trunk/.tox/anypy/lib/python3.12/site-packages/pkg_resources/_vendor/packaging/markers.py", line 245, in evaluate
    return _evaluate_markers(self._markers, current_environment)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/nedbatchelder/coverage/trunk/.tox/anypy/lib/python3.12/site-packages/pkg_resources/_vendor/packaging/markers.py", line 151, in _evaluate_markers
    groups[-1].append(_eval_op(lhs_value, op, rhs_value))
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/nedbatchelder/coverage/trunk/.tox/anypy/lib/python3.12/site-packages/pkg_resources/_vendor/packaging/markers.py", line 109, in _eval_op
    return spec.contains(lhs, prereleases=True)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/nedbatchelder/coverage/trunk/.tox/anypy/lib/python3.12/site-packages/pkg_resources/_vendor/packaging/specifiers.py", line 565, in contains
    normalized_item = _coerce_version(item)
                      ^^^^^^^^^^^^^^^^^^^^^
  File "/Users/nedbatchelder/coverage/trunk/.tox/anypy/lib/python3.12/site-packages/pkg_resources/_vendor/packaging/specifiers.py", line 36, in _coerce_version
    version = Version(version)
              ^^^^^^^^^^^^^^^^
  File "/Users/nedbatchelder/coverage/trunk/.tox/anypy/lib/python3.12/site-packages/pkg_resources/_vendor/packaging/version.py", line 197, in __init__
    raise InvalidVersion(f"Invalid version: '{version}'")
pkg_resources.extern.packaging.version.InvalidVersion: Invalid version: '3.12.0a5+'

% .tox/anypy/bin/python -m pip freeze --all
attrs==22.2.0
colorama==0.4.6
-e git+ssh://git@github.com/nedbat/coveragepy.git@1edd0608fac9ffadf24da72820572bec43d0c8fd#egg=coverage
distlib==0.3.6
exceptiongroup==1.1.0
execnet==1.9.0
filelock==3.9.0
flaky==3.7.0
hypothesis==6.68.0
importlib-metadata==6.0.0
iniconfig==2.0.0
packaging==23.0
pip==23.0
platformdirs==3.0.0
pluggy==1.0.0
pytest==7.2.1
pytest-xdist==3.2.0
setuptools==67.2.0
sortedcontainers==2.4.0
tomli==2.0.1
typing_extensions==4.4.0
virtualenv==20.19.0
wheel==0.38.4
zipp==3.12.1

Why is setuptools trying to parse the Python version?

There’s a corresponding coverage.py issue about this.

> The discussion here hasn’t been very productive

Thanks to the commenters.

> In the meanwhile though, judging by the amount of upvotes and the number of comments flowing in, I think it would help if the maintainers edit the top comment and add a paragraph with the known workaround, which is IIUC to use a virtual environment and, if not possible, to pin setuptools, as well as telling people not to uninstall the offending packages at the risk of borking their system.

It’s not always possible:

  • We may not have sufficient access to pin pip or setuptools versions
  • The call to the setuptools installation may not be easily overridable
  • We may be unable to upgrade the system, for whatever reason

> pinning setuptools seems to do the trick.

It’s a temporary workaround at most, far from a solution.

> Let’s call it a win?

Better call it a failure: breaking stability and backward compatibility is always a failure. And even if the offending packages get fixed, already installed and running systems are suddenly broken without any intervention, where they had previously been qualified as working fine.

https://bugs.launchpad.net/bugs/1991606 is the master bug for the affected packages in Ubuntu. Half of those packages can be blamed on me (since I am the author of distro-info and involved in others). Let’s get them fixed in Ubuntu. I fixed most packages in Ubuntu 23.04 (lunar) - the fix for devscripts is pending and drslib needs to be fixed upstream. Next step: prepare stable release updates for Ubuntu 22.10, 22.04 LTS, 20.04 LTS, 18.04 LTS.

If someone wants to help, you can prepare the SRU bug template (see https://wiki.ubuntu.com/StableReleaseUpdates).

Yep, that concludes this detour; merging back to the main thread.

(BTW this is a little bit off topic, feel free to ignore as I don’t want to drag the focus away from pep440 enforcement.)

For people who wander into this branch, here’s your save point~

Some more apt packages with “illegal” versions I found for focal:

| apt package | python package | python package version |
| --- | --- | --- |
| devscripts | devscripts | 2.20.2ubuntu2 |
| drslib | drslib | 0.3.1p3 |
| python3-duecredit | duecredit | 0.7.0.debian3 |
| python3-ubuntutools | ubuntu-dev-tools | 0.176ubuntu20.04.1 |
| twms | twms | 0.06y |

The initial bug (faulty versions of python3-distro-info and python-debian) is finally fixed in Ubuntu >= 20.04 (focal).

For those that for some reason cannot use the suggested fix (i.e. installing things with pip install --use-pep517[^1]), version 67.5.1 implements a workaround for the specific problem described by @regebro:

> Setuptools should simply not fail to install a package because an already installed package has a version number it doesn’t like.

Notes

  • Non-PEP 440 compliant versions are still deprecated and no longer supported when building packages. Projects that don’t comply with PEP 440 are urged to change. Developers who believe that projects in general should not necessarily be subject to PEP 440 are encouraged to open a discussion on the Python Discourse for packaging.
    • Setuptools issue tracker is not the best place to discuss this because it involves the entire packaging ecosystem and the standards that setuptools and pip are supposed to follow.
  • Using the --use-pep517 flag in pip is still the recommended solution[^1].

Tips

  • Package developers are encouraged to run tests/automated builds that don’t hide deprecation warnings (e.g. python -X dev -m build, python -X dev -m pip -v and pytest warning filters)
  • Package developers can make setuptools less verbose by setting build’s --config-setting, e.g. python -m build -C--quiet.

Based on the reproducer reported in https://github.com/pypa/setuptools/issues/3772#issuecomment-1429496548 by @kiorky, we can see 67.5.1 working for the complex scenario described above:

```shell
# On the host: start the container
docker run --rm -e DEBIAN_FRONTEND=noninteractive -it ubuntu:bionic@sha256:4a45212e9518f35983a976eead0de5eecc555a2f047134e9dd2cfc589076a00d bash

# Inside the container:
apt update && apt install -q -y virtualenv python3-distro-info python3-distutils

mkdir -p /tmp/mypkg
cd /tmp/mypkg
echo "aaa=1" > mymod.py
cat <<EOF > setup.py
from setuptools import setup
setup(
    name="mypkg",
    version='1.0',
    py_modules=['mymod']
)
EOF

## An older version will cause the error:
virtualenv --system-site-packages /tmp/.venv1
/tmp/.venv1/bin/python -m pip install -U 'pip==23.0.1' 'setuptools==67.4.0'
/tmp/.venv1/bin/python -m pip install -e .
# ...
# File "/tmp/.venv1/lib/python3.8/site-packages/pkg_resources/__init__.py", line 2694, in parsed_version
#   raise packaging.version.InvalidVersion(f"{str(ex)} {info}") from None
# pkg_resources.extern.packaging.version.InvalidVersion: Invalid version: '0.23ubuntu1' (package: distro-info)

## 67.5.1 will not cause the error:
virtualenv --system-site-packages /tmp/.venv2
/tmp/.venv2/bin/python -m pip install -U 'pip==23.0.1' 'setuptools==67.5.1'
/tmp/.venv2/bin/python -m pip -v install -e .
# Successfully installed mypkg-1.0
```

[^1]: Note that this is going to be the default in pip 23.1. See pypa/pip#8559 for details.

The discussion here hasn’t been very productive but thanks to @regebro’s comment above I’ve realized that this issue is more nuanced than what I thought at the beginning.

Should pip install x fail if any dependency of x has an invalid version? Annoying, but yes. At the same time, it’s a very unlikely situation, since invalid versions are not possible in PyPI if I understand correctly. Judging by the comments, it doesn’t seem to be the most common situation.

But then, should pip install x fail if any package present in site-packages, related to x or not, has an invalid version? That’s the current situation and seems a bit far fetched, but perhaps I’m missing some context. It’s important to note though that using virtual environments averts the issue, and for the cases that it can’t be done (such as really depending on a system-installed package), pinning setuptools seems to do the trick.

At least this breakage triggered some offending packages, namely python3-distro-info and python3-reportbug, to change their versioning schemes, which is something that wouldn’t have happened otherwise. Let’s call it a win?

In the meanwhile though, judging by the amount of upvotes and the number of comments flowing in, I think it would help if the maintainers edit the top comment and add a paragraph with the known workaround, which is IIUC to use a virtual environment and, if not possible, to pin setuptools, as well as telling people not to uninstall the offending packages at the risk of borking their system.

I second everyone like @regebro and @kozmaz87: the main Python installer should never fail on what is known to have worked before.

Deliberately breaking compatibility and stability is simply intolerable, all the more so when it is done just for a stricter version parser.

Please reconsider making this just a warning again. Here, a lot of our install scripts are broken, with no option but to painfully find a way to pin down setuptools (where that is even possible), and then to track when setuptools or our package set (like the fixed Debian/Ubuntu packages, which will have to go through a complete SRU process to land…) will (maybe) have a fix. This causes useless pain, expense, technical debt, and maintenance cost for Python users.

Hello Folks,

While I am waiting (I have not heard back yet), I found a temporary workaround: modifying PKG-INFO under reportbug’s dist-packages directory from Version: 7.5.3-deb10u1 to Version: 7.5.3, to comply with PEP 440 (no dashes, etc.). My package installed successfully and I can move on now.

I am not sure if this can help anyone or not and by far is definitely not the fix. Also, not sure of the ramifications of modifying the package version, but… Thank you @bdrung for guiding me in the right direction!

Matt

For that specific issue with reportbug on Debian stable, please contact the reportbug maintainers (debian-reportbug@lists.debian.org) and ask them to release a stable update with the fix from https://salsa.debian.org/reportbug-team/reportbug/-/commit/ca2e3b530682cd0ed8aab5597b84f8ff303673cc

Even if it ends up not supporting bog standard basic dev use-cases because of it? LOL

Do not get me wrong I really enjoyed all the insight you provided and all but people who should read this thread won’t read it if we keep steering off topic.

I do recognize this (and I have always advocated for software to be less bloated); distros and repos were created to address Linux’s shortcomings in the ABI department, etc. But can we stick to the topic here? I do appreciate this conversation, I do, but I think it is a red herring at the moment.

Entry into Python for underdeveloped countries really isn’t going to inform this debate about whether we should have changed versioning-scheme enforcement in a packaging tool. If it were opt-in, I would not be here.

Having said that, I am also following the ticket about the Azure CLI being 1.2 GB to install on Linux because the Python SDK shipped inside it is that huge, so there is definitely a “Linux sucks as a deployment platform” debate to be had, but it is not pertinent to Python’s versioning.

I haven’t fully absorbed all the details above, but it does seem to me that there may be some important use cases excluded by PEP 440. If that’s the case, I would encourage you to distill the issue at pypa/packaging-problems or in the packaging discourse, where this concern is more likely to get the appropriate attention.

FWIW, to provide a fuller explanation of what’s happening here:

  • When pip tries to build a project with the “legacy” build system, the executed setup.py for the package is exposed to the entire set of packages available on that Python installation.
  • setuptools needs to determine what’s installed to support setup_requires with the legacy install mechanism, for which it’s calling pkg_resources.working_set.resolve which looks at the entire environment.
  • As that scans the environment, it trips on the invalid version if a package in the environment has an improper version.
  • Debian/Ubuntu’s system Python (or your $organization’s Python packages) has packages installed with invalid versions.
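The scan described above can be sketched in a few lines. This is an illustration, not setuptools’ actual code: it assumes the `packaging` library (version 22 or later, which removed `LegacyVersion`), and the `installed` mapping below is made up.

```python
# Hedged sketch: why scanning the whole environment trips on one bad
# version. packaging >= 22 raises InvalidVersion for any non-PEP 440
# string; the installed-package mapping here is illustrative.
from packaging.version import InvalidVersion, Version

def find_invalid(versions):
    """Return names whose version strings are not PEP 440 compliant."""
    bad = []
    for name, ver in versions.items():
        try:
            Version(ver)
        except InvalidVersion:
            bad.append(name)
    return bad

installed = {
    "distro-info": "0.23ubuntu1",  # Debian-style suffix, not PEP 440
    "requests": "2.28.1",          # fine
}
print(find_invalid(installed))  # the one bad entry poisons the whole scan
```

The point is that a single non-compliant distribution anywhere in site-packages is enough to abort the whole resolve, which is what 67.5.1’s workaround addresses.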

> Please consider coordinating the rollout of this change with the pip team (et al.) to allow for effective pinning

*puts on pip and packaging maintainer hat*

Hi. I am probably the person who set the ball rolling on this form of strictness within the Python packaging ecosystem. I can also tell you that I’ve been involved in multiple discussions with setuptools’ maintainers around this change (~all of those are in public spaces). 😃

Pinning things would not have helped in this case – this isn’t related to how setuptools builds your package, how pip handles the built project, or how the two of them interact. The issue is with the environment you’re building in, which contains a package with a bad version – specifically, in the OP’s case, it’s because Debian’s Python packages have done things that degrade the UX (honestly, not for the first time). The only people who could have seen this warning didn’t report it to the right people or handle it in… 2-9 years.

I’m certain that, if the maintainers of setuptools had known about this failure mode, they would have provided some mechanism to deal with it. However, if we don’t know about a failure mode, because Debian (or your internal packages) does things in a way that triggers it and no one told us, we can’t do anything about it. And once it’s been broken, the churn costs have already been paid – it doesn’t really make sense to undo things at that point.

At the end of the day, this is not a straightforward failure: it’s not an issue with your package (which pip would’ve warned you about); following the best practice of isolating from your system-provided Python environment will not trigger this issue; and the folks who made the “mistake” of using non-PEP 440 versions didn’t act on the warnings when they were being presented.

Since this has been reported, there have been multiple improvements to make this less painful of a failure mode and #3780 tracks making deprecations coming out of setuptools more visible.

I can confidently say that the maintainers of setuptools, pip etc. are well aware of the network effects of these changes, and that we understand it can be difficult to deal with increased strictness causing issues because a transitive dependency three layers down does things incorrectly. To that end, we’ve been using the mechanisms we have to communicate this. Both setuptools and pip have been printing warnings related to this, and pip is going to get stricter in this area as well this year (the change needed to happen in setuptools first, for complicated reasons that I’m not going to go into).

While I’m sure that there’s more that the volunteers who maintain the foundational pieces of the Python ecosystem could have done to reduce pain for you, I would like to implore you to think about what you could’ve done to have flagged this to them.

> Am I missing an available workaround?

Multiple ways to resolve or work around this have already been mentioned since this was posted. I’ll try to reiterate what they are (and hopefully, my explanations above help clarify how/why each of them fixes or avoids the issue).

  • If you “own” the package affected, you can do one of the following:

  • If you do not “own” the problematic package, you should inform the maintainers of the package and can do one of the following:

    • pass --use-pep517 to pip (see footnote 1) to opt all builds into the pyproject.toml-based build system, or
    • isolate your installation process from the system by working in a virtual environment that can’t see system-provided packages.
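The isolation option can be sketched as follows; this is a hedged illustration (the path is arbitrary, and `ssdeep` is just the example package from this thread):

```shell
# A venv created WITHOUT --system-site-packages cannot see the distro
# packages whose bad versions break the environment scan.
python3 -m venv /tmp/isolated
/tmp/isolated/bin/python -c "import sys; print(sys.prefix != sys.base_prefix)"  # True in a venv

# Alternatively, when the system environment must be used, opt builds into
# the pyproject.toml-based build system:
#   pip install --use-pep517 ssdeep
```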

Finally, I genuinely appreciate that setuptools as a project has made this change, even though it’s a slightly painful one and a potentially disruptive one. I should note that this is a change that genuinely makes things better and, honestly, increased strictness in what these tools accept will make things better for the overall ecosystem.

PS: The setuptools team has wanted to get rid of pkg_resources for a vast array of suboptimal behaviours, like the one that triggered this issue, but it’s a complicated problem and there hasn’t been sufficient (primarily-volunteer) developer availability to drive that effort over the line.

Footnotes

  1. That option has a bad name, we’ll change it to a better one some day – but it does what we need, enabling the pyproject.toml-based build system for all projects.

A few high-level comments on this:

  • “it’s because Debian’s Python packages have done things that degrade the UX”: It’s an incompatibility. I don’t think either side is in a good position to set a standard and enforce it on the other. If I had to choose, I would say Python should adapt to the distro’s behavior, as the distro is the “platform” in “cross-platform”. A similar argument also applies to the poor collaboration between Python’s and the distro’s package managers.
  • “you should inform the maintainers of the package …”: I have seen this in many places. This essentially uses the large user base to put more pressure on package maintainers, which doesn’t sound like a healthy relationship. And in my understanding it rarely works, because many people simply don’t have that feedback channel available, or the weeks-to-months of reporting and waiting for a new release have already made it meaningless to go that route.
    • There are also libraries that are no longer maintained, e.g. research projects or code accompanying papers.
    • Package maintainers may not be willing to backport patches to old versions, while the latest version can have behavior changes.
  • Regarding failure modes: it’s quite obvious that a change like this will trigger failures, so why gamble on the “mode”? Unless we can prove there is no high-visibility/high-impact failure mode at all, we should not base the decision on that assumption.

> … Projects should adapt to provide conformant versions (i.e. 0.23+ubuntu1) or environments that rely on such packages should pin to Setuptools < 66. These changes are necessary to keep Setuptools healthy and aligned with the goals of the packaging ecosystem. …

I would say version pinning should ONLY be suggested as a glass-breaking workaround, to cover the gap before an issue is fixed in the near future.

Pinning would make future upgrade even harder to push out. Given the combination of:

  • Fast movement of python ecosystem
  • Limited backporting, even for security features
  • Relatively short support cycle

pinning basically puts a time bomb in user code. IMO that’s not a good idea for infrastructure like setuptools, which is supposed to be reliable so that people can build their things on top of it.
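For completeness, here is a hedged sketch of the stop-gap pin that the changelog suggests (file name and package are illustrative):

```shell
# Stop-gap only, per the caveats above: hold setuptools below 66 via a
# constraints file so every pip invocation respects the pin.
echo "setuptools<66" > constraints.txt

# pip reads the PIP_CONSTRAINT environment variable as if -c were passed:
#   PIP_CONSTRAINT=constraints.txt pip install ssdeep
```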

I’m having the same error (pkg_resources.extern.packaging.version.InvalidVersion: Invalid version: '0.23ubuntu1') using the following command:

```shell
pip3 install --upgrade pip wheel setuptools && \
pip3 install scikit-build && \
pip3 install opencv-python
```

I have tried using older versions of setuptools, but that fails with a different error: Could not build wheels for opencv-python which use PEP 517 and cannot be installed directly.

I have the same issue with setuptools==65.5.1. When tagging and trying to install a local project I get:

pkg_resources.extern.packaging.version.InvalidVersion: Invalid version: '1.23.0-test.1.2.3'
$ pip list | grep setuptools
setuptools                65.5.1

1.23.0-test.1.2.3 is a valid semver 2.0.0 tag.

EDIT: this happens inside a venv (unlike the OP).

$ which python
/c/USERS/foobar/.virtualenvs/venv-3.8/Scripts/python

EDIT 2: Seems like an issue on my end - after checking PEP440 I learned that it’s not fully compatible with Semantic Versioning:

> Semantic versions containing a hyphen (pre-releases - clause 10) or a plus sign (builds - clause 11) are not compatible with this PEP and are not permitted in the public version field.
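The quoted rule is easy to check mechanically. A small sketch, assuming the `packaging` library (which implements PEP 440):

```python
# SemVer's hyphenated pre-release is rejected by PEP 440, while PEP 440's
# own pre-release spelling (a/b/rc) is accepted.
from packaging.version import InvalidVersion, Version

def is_pep440(v: str) -> bool:
    try:
        Version(v)
        return True
    except InvalidVersion:
        return False

print(is_pep440("1.23.0-test.1.2.3"))  # False: "-test" is not a PEP 440 segment
print(is_pep440("1.23.0rc1"))          # True: rc is a valid pre-release spelling
```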