scipy: BUG: PyPI `Requires-Python` on 1.7.1 should allow Python 3.10+
Describe your issue.
With Python 3.10rc2 installed, `pip install scipy` installs scipy 1.6.1 (the most recent version on PyPI that does not restrict Python `<3.10`) and not 1.7.1, resulting in a rather broken install. I get a lot of
Undefined symbol: _PyGen_Send
when running ordinary code. It looks like on PyPI scipy 1.7.1 requires Python `<3.10`.
Installing master (`pip install git+https://github.com/scipy/scipy`) fixes the problem.
Reproducing Code Example
see above
Error message
see above
SciPy/NumPy/Python version information
3.6.1
About this issue
- Original URL
- State: closed
- Created 3 years ago
- Reactions: 6
- Comments: 43 (35 by maintainers)
We will release 1.7.2 with Python 3.10 support very soon.
The other part of this is: it would be great to at some point prevent older SciPy versions from being picked up. This is a little tricky to achieve, but can be done by adding a new 1.6.x release which aborts a from-source build immediately with a clear error message. That’s something I may get around to at some point, but it needs thorough testing to ensure we’re not breaking anything.
The problem @NickleDave is complaining about is not that SciPy doesn’t work on 3.10, but rather that SciPy is setting
python_requires = "...,<3.10"
which is a misunderstanding of how the `Requires-Python` metadata slot works. This slot was added to allow packages to drop a Python version (2.7) without breaking people on that version[^1] - pip would simply scroll back in time and select the final release that included that Python version. It was never intended to cap Python; it was designed to allow dropping Python. If you want to cap Python, you can add a check in setup.py that throws an error if Python is too high. You should not use this metadata slot to cap Python. If you do use it that way, you must always use it, or yank any old releases that don’t cap; otherwise you get the problem above: pip installing SciPy on Python 3.10 will “work” but fetch the oldest version that didn’t have the cap. This is exactly the intended behavior, and it’s exactly wrong.
The good news is this bad solve only occurs on Python 3.10, where even a good solve would likely not work (assuming SciPy really doesn’t support Python 3.10).
This has another unpleasant effect. If you use a locking package manager (Poetry, PDM), the package manager will see that the lock file has a Python cap in it. It will then force the user to add a matching cap, since the lock file would be invalid on Python 3.10. If you are developing a library and don’t have a strong cap on SciPy, this is wrong - but it doesn’t matter; these systems prioritize the lock file over the library’s dependencies.
The bad news is that this affects all downstream libraries using a locking package manager, on any Python version.
IMO, a library’s Requires-Python should not depend on the lock file; if a system wants to force you to be truthful for a lock file’s sake, there should be a separate way to do that which does not get exported. Or it could just print a warning (“this lock file will only be valid on Python x to y”). But neither @frostming’s PDM nor Poetry supports this, as far as I know. Until this is fixed, the only advice I can give is to avoid locking package managers for developing libraries.
[^1]: This would probably have been a lot better as a “min_python” slot, but at the time it was important to allow older Python 3 versions to be dropped while keeping 2.7, so it was allowed to be an open expression. But it was never supposed to upper-cap!
This does not really make sense IMO.
I am not suggesting that scipy should properly “support” beta releases of Python i.e. fix issues quickly, but wouldn’t it make sense to be ready for when e.g. 3.10 becomes a full release?
And whilst it generally makes sense for the latest releases to require >= Python 3.x (the version that supports specific language features the release needs), and for older releases sometimes to require < Python 3.y (the version that stopped supporting specific language features the old release required), I am unclear what the purpose is of putting < Python 3.10 on a current package, so that PyPI automatically installs a lower version that is even less likely to support language features in the new beta.
This is tied to how the entire system works. Every file - wheels and SDists - is standalone and contains its own metadata. Files are hashed; you can never upload two different wheels with the same name but different hashes. If you use pip-tools’ `pip-compile`, you’ll get a requirements.txt that looks like https://github.com/pypa/manylinux/blob/main/docker/build_scripts/requirements3.10.txt - the hashes are listed for each file. You can’t change the files; that’s a core security requirement.

So the only way to modify metadata would be to add a new file that sits “alongside” the wheel or SDist file - probably alongside each one, since each file technically could have different metadata - and then modify setuptools, flit, and every other build system to produce them; pip and every other installer to respect them; PyPI and every other wheelhouse to serve them; pip-tools, Poetry, Pipenv, PDM, and every other locking system to handle them somehow; etc. And the incoming PEP 665. And any older version of anything (like pip) would not respect the new metadata.
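A small sketch of why hash pinning makes uploaded files (and therefore the metadata inside them) immutable. This is an illustration, not pip’s actual code: installers running with hash pins recompute the file’s SHA-256 and reject anything that differs from the pinned value.

```python
import hashlib


def sha256_of(data: bytes) -> str:
    """Hex SHA-256 digest, the form used in --hash=sha256:... pins."""
    return hashlib.sha256(data).hexdigest()


# Stand-in bytes for a real wheel file (hypothetical contents):
wheel_bytes = b"pretend wheel contents"

# Recorded in requirements.txt at lock time:
pinned = sha256_of(wheel_bytes)

# Later, at install time, the installer recomputes and compares:
assert sha256_of(wheel_bytes) == pinned

# Any change to the file - even a one-byte metadata edit - breaks the pin:
assert sha256_of(wheel_bytes + b"!") != pinned
```

This is why “just fix the metadata in place” is off the table: the fixed file would no longer match any existing pin.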
And, you’d have to be very careful to only allow a very limited subset of changes - you wouldn’t want an update of a package to add a new malicious dependency via metadata that you could not avoid via hashes!
Would you be able to get old versions of the dependency override files too? Other questions would need answering as well. It could be done, but it would be a major undertaking for a rather small benefit - ideally, everyone should try to avoid capping things, play nicely with deprecation periods, test alphas/betas/RCs, and accept that in the real world a library can’t specify its dependencies so perfectly that there will never be a breakage. That’s only possible when locking, for applications.
You should at least consider the scope of the problem and the other aspects before you call something “crazy stupid”. In fact, do other non-Python systems allow changing metadata? I think many of them do not. And it’s really easy to “fix” by releasing a new patch release with more limited metadata - this just is not ideal for large binary releases, but is fine for pure Python / smaller packages.
If someone carefully researches this, finds a core-dev sponsor to work with (I can suggest a few), and is willing to push this through, that’s what it would take to make it happen. The first step would be to discuss it on the Python forums.
Making all metadata immutable is crazily stupid - and this is an example just why that is.
Yes - for security reasons, most of the metadata should be immutable except metadata relating to co-dependencies - because you never know when co-dependency constraints are going to need to change.
I haven’t fully thought it through - but it might be possible to only allow increased constraints on existing releases, rather than removal of constraints.
Sorry if this is in the discussion somewhere, but why is it absolutely necessary to upper cap the `requires_python`? It has the drawback of causing `pip` to look for older versions that are not upper capped, as you have noticed, and it prevents people from testing with newer Python versions. It also forces everyone who depends on `scipy` to similarly upper cap their `requires_python`, having a cascading effect on all downstream libraries, as you might imagine. Why do we want to prevent people from using new versions of Python?

Specifying an upper cap in the setup.py also goes against the goals of what that file is doing. This is the distinction between “abstract” and “concrete” requirements, right? https://caremad.io/posts/2013/07/setup-vs-requirement/ https://packaging.python.org/en/latest/discussions/install-requires-vs-requirements/#install-requires-vs-requirements-files

What would be the downside of not capping it? Do we really expect that every new Python is just going to break `scipy`? I don’t think users are going to demand an immediate release of `scipy` for every new Python simply because it’s not explicitly upper capped in the `setup.py` file. And again, you are kind of limiting the number of people who can help you test or develop with new Python versions, as @Sophist-UK was saying.

I ask because I am running into a problem similar to what @NeilGirdhar describes, where a solver complains because my `requires_python` doesn’t upper cap like `scipy`’s does (in spite of me trying to change from `poetry` to `pdm` to avoid this 😢).

I am definitely not a packaging expert, but this post from @henryiii makes a pretty compelling case that libraries should not place upper caps on dependencies, and on Python in particular:
https://iscinumpy.dev/post/bound-version-constraints/
See also this discussion, in particular the points Paul Ganssle makes:
https://discuss.python.org/t/use-of-less-than-next-major-version-e-g-4-in-python-requires-setup-py/1066/3

I realize `scipy` is not exactly a pure Python package, and that’s putting it lightly, so maybe there’s some complexity in the build that makes this more complicated than I can understand.

I’ve started a discussion here: https://discuss.python.org/t/requires-python-upper-limits/12663
`Requires-Python` is a free-form string that gets parsed into a packaging SpecifierSet. It has no understanding of “maximum” or “minimum”. It can only answer one question: does a version of Python (including the patch version!) match it? It returns true if it does. So when a release “fails” the check, pip doesn’t know why, and assumes it needs to keep looking for one that passes. It was developed for IPython, which was the first project to want to drop Python 2 support from its main releases and keep just an LTS release for Python 2. It’s now a key component of NEP 29 - the fact that the latest libraries don’t have new releases for Python X doesn’t stop Python X from being used.
This obviously doesn’t support upper caps properly. That was never the intention of the metadata - it is supposed to help fix a broken solve, not break a working one. That’s why there is no requires-arch, etc. - which, by the way, accounted for about half the complaints above: Apple released Apple Silicon support on the existing 3.8, so a Python cap would not have fixed that.
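The “membership only” behavior described above can be seen directly with the third-party `packaging` library (the same one pip vendors internally); the specifier string here mirrors scipy 1.7.1’s cap:

```python
# How Requires-Python matching works: a SpecifierSet can only answer
# "does this version match?", never "why not" or "is it too new".
from packaging.specifiers import SpecifierSet

spec = SpecifierSet(">=3.7,<3.10")  # a cap like scipy 1.7.1's

print("3.9.7" in spec)   # True  -> this release is an install candidate
print("3.10.0" in spec)  # False -> pip silently skips the release and
                         #          keeps scrolling back to older ones
```

Because the `False` case carries no reason, pip’s only sensible strategy is to try progressively older releases - which is exactly the “scroll back in time” behavior this thread is about.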
Use setup.py to throw an error if you want to intentionally break something with a nice error.
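A minimal sketch of that setup.py approach. The function name, message text, and bounds are all illustrative, not scipy’s actual code; the point is that the error is explicit rather than a silent fallback to an old release:

```python
import sys

UNSUPPORTED = (3, 10)  # hypothetical: first Python this release can't handle


def python_cap_error(version):
    """Return an error message if `version` is too new, else None."""
    if version[:2] >= UNSUPPORTED:
        return (
            "This release does not support Python %d.%d or newer; "
            "install from master or use an older Python." % version[:2]
        )
    return None


# In a real setup.py you would run this before calling setup():
#     msg = python_cap_error(sys.version_info)
#     if msg:
#         sys.exit(msg)  # the user sees this instead of a broken install
print(python_cap_error((3, 9, 7)))    # None -> build proceeds
print(python_cap_error((3, 10, 0)))   # a readable, intentional error
```

Unlike the `Requires-Python` cap, this fails loudly on the release the user actually asked for, instead of letting pip select an uncapped older version.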
And, yes, updating metadata would be nice from a package’s standpoint. It’s just a mess from every other standpoint. 😃 That doesn’t mean it’s impossible someday; it just needs someone willing to work it out.
I don’t know what the technical solution might be, but as a user of SciPy I know what I want to happen - I want to get the latest version of SciPy that works on my version of Python. If there is no version of SciPy verified for my version of Python (i.e. because it is too new or a beta or too old), I want to be told that I need to use a different version of Python.
What I don’t want is for PyPI to install an earlier version of SciPy that is equally invalid for the version of Python I am running, because (as actually happened) the earlier version was NOT limited to the Python versions it would install against.
(In other words, to some extent this is an issue with older versions NOT being constrained rather than current version being constrained.)