llm: Homebrew release can't install plugins that use PyTorch (no PyTorch for Python 3.12 yet)
macOS Sonoma 14.0, M1 Pro
```
$ brew install llm
<skipped, no issues>
$ llm
Traceback (most recent call last):
  File "/opt/homebrew/bin/llm", line 5, in <module>
    from llm.cli import cli
  File "/opt/homebrew/Cellar/llm/0.11_1/libexec/lib/python3.12/site-packages/llm/__init__.py", line 18, in <module>
    from .plugins import pm
  File "/opt/homebrew/Cellar/llm/0.11_1/libexec/lib/python3.12/site-packages/llm/plugins.py", line 37, in <module>
    mod = importlib.import_module(plugin)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/Cellar/python@3.12/3.12.0/Frameworks/Python.framework/Versions/3.12/lib/python3.12/importlib/__init__.py", line 90, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/Cellar/llm/0.11_1/libexec/lib/python3.12/site-packages/llm/default_plugins/openai_models.py", line 6, in <module>
    import openai
  File "/opt/homebrew/Cellar/llm/0.11_1/libexec/lib/python3.12/site-packages/openai/__init__.py", line 19, in <module>
    from openai.api_resources import (
  File "/opt/homebrew/Cellar/llm/0.11_1/libexec/lib/python3.12/site-packages/openai/api_resources/__init__.py", line 1, in <module>
    from openai.api_resources.audio import Audio  # noqa: F401
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/Cellar/llm/0.11_1/libexec/lib/python3.12/site-packages/openai/api_resources/audio.py", line 4, in <module>
    from openai import api_requestor, util
  File "/opt/homebrew/Cellar/llm/0.11_1/libexec/lib/python3.12/site-packages/openai/api_requestor.py", line 24, in <module>
    import requests
  File "/opt/homebrew/Cellar/llm/0.11_1/libexec/lib/python3.12/site-packages/requests/__init__.py", line 147, in <module>
    from . import packages, utils
  File "/opt/homebrew/Cellar/llm/0.11_1/libexec/lib/python3.12/site-packages/requests/utils.py", line 24, in <module>
    from . import certs
  File "/opt/homebrew/Cellar/llm/0.11_1/libexec/lib/python3.12/site-packages/requests/certs.py", line 14, in <module>
    from certifi import where
ModuleNotFoundError: No module named 'certifi'
```
If I go and run:
```
$ brew install python-certifi
```
and try again:
```
$ llm
Traceback (most recent call last):
  File "/opt/homebrew/bin/llm", line 5, in <module>
    from llm.cli import cli
  File "/opt/homebrew/Cellar/llm/0.11_1/libexec/lib/python3.12/site-packages/llm/__init__.py", line 18, in <module>
    from .plugins import pm
  File "/opt/homebrew/Cellar/llm/0.11_1/libexec/lib/python3.12/site-packages/llm/plugins.py", line 37, in <module>
    mod = importlib.import_module(plugin)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/Cellar/python@3.12/3.12.0/Frameworks/Python.framework/Versions/3.12/lib/python3.12/importlib/__init__.py", line 90, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/Cellar/llm/0.11_1/libexec/lib/python3.12/site-packages/llm/default_plugins/openai_models.py", line 17, in <module>
    import yaml
ModuleNotFoundError: No module named 'yaml'
```
So I do:
```
$ brew install pyyaml
```
and now we are in business.
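The missing modules above only surfaced one at a time. A quicker way to see everything that is absent from llm's bundled interpreter is to probe for each dependency up front; this is just a sketch, and the module list is only the ones named in the tracebacks above (run it with the same Python that Homebrew's `llm` uses):

```python
import importlib.util

# Modules the tracebacks above complained about (plus openai itself).
# find_spec() returns None instead of raising when a module is absent,
# so we can report all the gaps in one pass.
for mod in ("certifi", "yaml", "openai"):
    found = importlib.util.find_spec(mod) is not None
    print(f"{mod}: {'ok' if found else 'MISSING'}")
```

Each `MISSING` line then maps to one `brew install python-<name>`-style fix instead of re-running `llm` after every install.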
About this issue
- State: closed
- Created 8 months ago
- Comments: 20 (14 by maintainers)
Commits related to this issue
- llm: migrate to `python@3.12`; llm: need `python-setuptools` — committed to Homebrew/homebrew-core by chenrui333 8 months ago
- Remove homebrew suggestion for the moment, refs #315 — committed to simonw/llm by simonw 8 months ago
- Remove homebrew suggestion for the moment, refs #315 !stable-docs — committed to simonw/llm by simonw 8 months ago
- Homebrew installation, refs #315, #397 This reverts commit abcb457b20367ee56e27602e3553bb4bd6a17312. — committed to simonw/llm by simonw 5 months ago
My problem here is that I want my tools to work for people with zero knowledge of Python at all. Conda and venv are very much tools for Python developers.
I thought I had that with Homebrew, but clearly there are some nasty traps waiting for people that way as well.
One option is I could start baking this stuff into Datasette Desktop, which would give me the ability to maintain complete control over the bundled Python without it conflicting with whatever else is going on with their system.
Installing Python apps via Homebrew is generally problematic, at least in my experience. Brew likes to update Python as often as it updates the other packages on the system, and library dependencies incrementally fall out of sync, since Brew does not run pip upgrades. What works today might silently break tomorrow after the next brew update run.
Also, managing a more mainstream Python distribution such as 3.11 alongside the latest that Brew might install (is it 3.12 now?) leads to mysterious errors, because the ML-related PyPI packages tend not to work with the very latest Python. The same often goes for the latest CUDA toolkit.
I think you’ve found a systemic problem down this rabbit hole.
Gently guiding users to a venv or conda setup makes things much more stable, but also more complicated for newcomers. I'm not sure what the silver bullet for usability is here, short of turning the quick start into a small venv tutorial.
Some repos just include the venv commands as part of the README flow, but then don't explain what they do, which is not obvious to getting-started-level newcomers.
Ideally the README would also provide some guidance for an environment-managed upgrade flow, so that later installing something else alongside llm does not break llm running in the same env.
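For reference, a minimal venv-based flow might look like the following; the `~/.venvs/llm` location is an arbitrary choice for illustration, not anything the project prescribes:

```shell
# Create an isolated environment just for llm, so Homebrew's Python
# upgrades can't pull its dependencies out from under it.
python3 -m venv ~/.venvs/llm

# Activate it; pip and llm below then refer to this environment only.
source ~/.venvs/llm/bin/activate

# Install llm (and later its plugins) into the environment.
pip install llm
llm --version
```

The trade-off discussed above stands: this is stable, but it is exactly the kind of venv tutorial a zero-Python-knowledge user was supposed to avoid.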
I’ve stopped recommending Homebrew installation for the moment. I updated my blog entry too: https://simonwillison.net/2023/Oct/26/llm-embed-jina/
Aha!
Another issue, maybe with a similar root cause? The error message makes no sense to me: Torch is installed.