llm: Homebrew release can't install plugins that use PyTorch (no PyTorch for Python 3.12 yet)

macOS Sonoma 14.0, M1 Pro

$ brew install llm
<skipped, no issues>
$ llm
Traceback (most recent call last):
  File "/opt/homebrew/bin/llm", line 5, in <module>
    from llm.cli import cli
  File "/opt/homebrew/Cellar/llm/0.11_1/libexec/lib/python3.12/site-packages/llm/__init__.py", line 18, in <module>
    from .plugins import pm
  File "/opt/homebrew/Cellar/llm/0.11_1/libexec/lib/python3.12/site-packages/llm/plugins.py", line 37, in <module>
    mod = importlib.import_module(plugin)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/Cellar/python@3.12/3.12.0/Frameworks/Python.framework/Versions/3.12/lib/python3.12/importlib/__init__.py", line 90, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/Cellar/llm/0.11_1/libexec/lib/python3.12/site-packages/llm/default_plugins/openai_models.py", line 6, in <module>
    import openai
  File "/opt/homebrew/Cellar/llm/0.11_1/libexec/lib/python3.12/site-packages/openai/__init__.py", line 19, in <module>
    from openai.api_resources import (
  File "/opt/homebrew/Cellar/llm/0.11_1/libexec/lib/python3.12/site-packages/openai/api_resources/__init__.py", line 1, in <module>
    from openai.api_resources.audio import Audio  # noqa: F401
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/Cellar/llm/0.11_1/libexec/lib/python3.12/site-packages/openai/api_resources/audio.py", line 4, in <module>
    from openai import api_requestor, util
  File "/opt/homebrew/Cellar/llm/0.11_1/libexec/lib/python3.12/site-packages/openai/api_requestor.py", line 24, in <module>
    import requests
  File "/opt/homebrew/Cellar/llm/0.11_1/libexec/lib/python3.12/site-packages/requests/__init__.py", line 147, in <module>
    from . import packages, utils
  File "/opt/homebrew/Cellar/llm/0.11_1/libexec/lib/python3.12/site-packages/requests/utils.py", line 24, in <module>
    from . import certs
  File "/opt/homebrew/Cellar/llm/0.11_1/libexec/lib/python3.12/site-packages/requests/certs.py", line 14, in <module>
    from certifi import where
ModuleNotFoundError: No module named 'certifi'

If I go and run:

$ brew install python-certifi

and try again:

$ llm
Traceback (most recent call last):
  File "/opt/homebrew/bin/llm", line 5, in <module>
    from llm.cli import cli
  File "/opt/homebrew/Cellar/llm/0.11_1/libexec/lib/python3.12/site-packages/llm/__init__.py", line 18, in <module>
    from .plugins import pm
  File "/opt/homebrew/Cellar/llm/0.11_1/libexec/lib/python3.12/site-packages/llm/plugins.py", line 37, in <module>
    mod = importlib.import_module(plugin)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/Cellar/python@3.12/3.12.0/Frameworks/Python.framework/Versions/3.12/lib/python3.12/importlib/__init__.py", line 90, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/Cellar/llm/0.11_1/libexec/lib/python3.12/site-packages/llm/default_plugins/openai_models.py", line 17, in <module>
    import yaml
ModuleNotFoundError: No module named 'yaml'

So I do:

$ brew install pyyaml

and now we are in business.
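
For anyone else hitting both tracebacks, the two missing modules can be installed in one step; this assumes nothing else is absent from the formula's environment:

$ brew install python-certifi pyyaml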

About this issue

  • State: closed
  • Created 8 months ago
  • Comments: 20 (14 by maintainers)

Most upvoted comments

My problem here is that I want my tools to be usable with zero knowledge of Python at all. Conda and venv are very much tools for Python developers.

I thought I had that with Homebrew, but clearly there are some nasty traps waiting for people that way as well.

One option is I could start baking this stuff into Datasette Desktop, which would give me the ability to maintain complete control over the bundled Python without it conflicting with whatever else is going on with their system.

Installing Python apps via Homebrew is generally problematic; it has been for me, anyway. Brew likes to update Python as often as it updates the other packages on the system, and library dependencies get incrementally out of sync, since Brew does not run pip upgrades. What works today might silently break tomorrow after the next Brew update run.

Also, managing a more mainstream Python distribution such as 3.11 alongside the latest that Brew might install (is it 3.12 now?) leads to mysterious errors, because ML-related PyPI packages tend not to work with the very latest Python. The same often goes for the latest CUDA toolkit.

I think you’ve found a systematic problem in this rabbit hole.

Gently guiding users to a venv or conda setup makes things much more stable, but also more complicated for newcomers. I'm not sure what the silver bullet for usability is here, short of turning the QuickStart into a small venv tutorial.

Some repos just include the venv commands as part of the README flow but never explain what they do, which is not obvious to getting-started-level newcomers.

Ideally the README would provide some guidance on an environment-managed upgrade flow, so that later installing something else alongside llm does not break llm running in the same env.
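
For reference, a minimal venv flow might look like this; the python3.11 interpreter name and paths are illustrative assumptions, not something tested against this formula:

$ python3.11 -m venv ~/.venvs/llm
$ ~/.venvs/llm/bin/pip install llm
$ ~/.venvs/llm/bin/llm --version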

I’ve stopped recommending Homebrew installation for the moment. I updated my blog entry too: https://simonwillison.net/2023/Oct/26/llm-embed-jina/

Aha!

$ /opt/homebrew/Cellar/llm/0.11_1/libexec/bin/python -m pip install torch
ERROR: Could not find a version that satisfies the requirement torch (from versions: none)
ERROR: No matching distribution found for torch

[notice] A new release of pip is available: 23.2.1 -> 23.3.1
[notice] To update, run: /opt/homebrew/Cellar/llm/0.11_1/libexec/bin/python -m pip install --upgrade pip
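
To confirm which interpreter that pip belongs to, the path from the notice above can be checked directly; the import line is a guessed quick diagnostic, not from the original report:

$ /opt/homebrew/Cellar/llm/0.11_1/libexec/bin/python --version
$ /opt/homebrew/Cellar/llm/0.11_1/libexec/bin/python -c 'import torch'

If the second command raises ModuleNotFoundError, the installed torch belongs to a different Python than the one the llm formula bundles.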

Another issue, maybe with a similar root cause? The error message makes no sense to me; torch is installed.

$ llm install llm-embed-jina
WARNING: Skipping /opt/homebrew/lib/python3.12/site-packages/packaging-23.2.dist-info due to invalid metadata entry 'name'
Collecting llm-embed-jina
  Using cached llm_embed_jina-0.1.2-py3-none-any.whl.metadata (3.0 kB)
Requirement already satisfied: llm in /opt/homebrew/Cellar/llm/0.11_1/libexec/lib/python3.12/site-packages (from llm-embed-jina) (0.11)
Collecting transformers (from llm-embed-jina)
  Using cached transformers-4.34.1-py3-none-any.whl.metadata (121 kB)
INFO: pip is looking at multiple versions of llm-embed-jina to determine which version is compatible with other requirements. This could take a while.
Collecting llm-embed-jina
  Using cached llm_embed_jina-0.1.1-py3-none-any.whl.metadata (3.0 kB)
  Using cached llm_embed_jina-0.1-py3-none-any.whl.metadata (3.0 kB)
ERROR: Cannot install llm-embed-jina==0.1, llm-embed-jina==0.1.1 and llm-embed-jina==0.1.2 because these package versions have conflicting dependencies.

The conflict is caused by:
    llm-embed-jina 0.1.2 depends on torch
    llm-embed-jina 0.1.1 depends on torch
    llm-embed-jina 0.1 depends on torch

To fix this you could try to:
1. loosen the range of package versions you've specified
2. remove package versions to allow pip attempt to solve the dependency conflict

ERROR: ResolutionImpossible: for help visit https://pip.pypa.io/en/latest/topics/dependency-resolution/#dealing-with-dependency-conflicts
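
The "conflicting dependencies" wording is misleading here: every published version of llm-embed-jina depends on torch, and (per this issue's title) no torch distribution exists yet for the Python 3.12 that the formula bundles, so the resolver rejects every candidate and gives up. One way to surface the underlying failure directly, assuming the same libexec interpreter as above:

$ /opt/homebrew/Cellar/llm/0.11_1/libexec/bin/python -m pip download torch --no-deps

This should reproduce the "No matching distribution found for torch" error from the earlier comment.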