llama_index: [Bug]: No module named 'llama_index.core.llms.generic_utils'
Bug Description
I installed llama-index in a Google Colab notebook:
!pip install llama-index-embeddings-anyscale
!pip install -U llama-index llama-index-core llama-index-llms-openai
I’m trying to import
from llama_index.embeddings.anyscale import AnyscaleEmbedding
I’m getting this error:
ModuleNotFoundError: No module named 'llama_index.core.llms.generic_utils'
Version
llama-index==0.10.10
Steps to Reproduce
Just install llama-index and llama-index-embeddings-anyscale on Colab and import:
from llama_index.embeddings.anyscale import AnyscaleEmbedding
Relevant Logs/Tracebacks
---------------------------------------------------------------------------
ModuleNotFoundError Traceback (most recent call last)
<ipython-input-7-6245d537c86e> in <cell line: 21>()
19 from langchain_core.documents import Document
20 from langchain.vectorstores.utils import maximal_marginal_relevance
---> 21 from llama_index.embeddings.anyscale import AnyscaleEmbedding
22 from itertools import islice
23 from typing import Iterable, Iterator, List, TypeVar, Dict, Tuple, Union, Optional
2 frames
/usr/local/lib/python3.10/dist-packages/llama_index/embeddings/anyscale/utils.py in <module>
1 from typing import Optional, Tuple
2
----> 3 from llama_index.core.llms.generic_utils import get_from_param_or_env
4
5 DEFAULT_ANYSCALE_API_BASE = "https://api.endpoints.anyscale.com/v1"
ModuleNotFoundError: No module named 'llama_index.core.llms.generic_utils'
---------------------------------------------------------------------------
NOTE: If your import is failing due to a missing package, you can
manually install dependencies using either !pip or !apt.
To view examples of installing some common dependencies, click the
"Open Examples" button below.
---------------------------------------------------------------------------
About this issue
- Original URL
- State: closed
- Created 4 months ago
- Comments: 18 (3 by maintainers)
I managed to solve this problem by copying generic_utils.py from llama_index/core/base/llms into llama_index/core/llms.
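If you'd rather not edit files inside site-packages, the same idea can be expressed as a small runtime alias, run before the failing import. This is only a stopgap sketch and assumes the helper module now lives under llama_index.core.base.llms.generic_utils, as the copy workaround above implies:

import sys
# stopgap: point the old module path at the relocated module so stale imports keep resolving
import llama_index.core.base.llms.generic_utils as generic_utils
sys.modules["llama_index.core.llms.generic_utils"] = generic_utils

from llama_index.embeddings.anyscale import AnyscaleEmbedding  # should now import cleanly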
🤖
Hey @MudassirAqeelAhmed! Good to see you back here. Hope you’re doing well.
Based on the error message you provided, it seems like the module 'llama_index.core.llms.generic_utils' is not found in your Python environment. This could be due to a few reasons:
1. The module 'llama_index.core.llms.generic_utils' does not exist in the LlamaIndex package. This could be a bug in the package, or the module might have been removed in a recent update.
2. The LlamaIndex package is not properly installed in your Python environment. You can try reinstalling the package using pip (see the example after this list).
3. The LlamaIndex package is not compatible with the Python version you are using. You can check the Python version compatibility of the LlamaIndex package.
4. The LlamaIndex package might have dependencies that are not installed in your Python environment. You can check the package documentation for any dependencies and install them.
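For point 2, a forced reinstall in Colab could look something like this (the package list simply mirrors the one used in the bug report; adjust as needed):

!pip uninstall -y llama-index llama-index-core llama-index-embeddings-anyscale
!pip install -U llama-index llama-index-core llama-index-llms-openai llama-index-embeddings-anyscale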
However, without more information about the LlamaIndex package and its dependencies, it’s hard to provide a definitive solution. Please check the package documentation or contact the package maintainers for more information.
In the meantime, I found some similar issues in the LlamaIndex repository that might be helpful:
- [Bug]: Getting module not found error for llama_index.core.llms.generic_utils, which was last updated on February 21, 2024. The suggested solution is to wait for about 10 minutes and then run pip install -U llama-index-core llama-index-llms-openai llama-index-embeddings-openai.
- [Bug]: No module named 'llama_index.core.llms.llama_utils', which was last updated on February 14, 2024. The suggested solution is to use temporary legacy imports or use the command-line tool to update existing code and notebooks after installing v0.10.0.
- [Bug]: langchain.embeddings.base not found, which was last updated on September 18, 2023. This issue was closed by a pull request: fix and bump langchain dependencies on September 18, 2023.
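For reference, the two migration paths mentioned in the second issue above might look roughly like this (based on the v0.10 migration notes; verify the exact CLI command with llamaindex-cli --help, and the notebook path is just a placeholder):

# option 1: temporary legacy imports keep pre-0.10 code running (requires the llama-index-legacy package)
from llama_index.legacy import VectorStoreIndex  # example symbol only; adjust to whatever your code uses

# option 2: let the CLI rewrite old imports in an existing notebook or project
!llamaindex-cli upgrade-file path/to/your_notebook.ipynb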
I hope this helps! If you have any other questions or need further clarification, feel free to ask.
Sources
About Dosu
This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research.
Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant.
If you want to continue the conversation, start your reply with @dosu-bot.
@ycd what are your project reqs? It seems like at this point I would just start a fresh venv with latest versions of things
And yea, sadly we can’t control docs from people that use the llama-index package 😅
@logan-markewich to be fair, I was just yapping; I solved it already. But to be more specific and add a little context: I’ve encountered about 3 integrations/projects that have obsolete documentation that doesn’t ‘just work’, and I even ended up monkey-patching some of them to get them working (not this one).
EDIT: I don’t remember the exact issue, but some of the out-of-date docs were from external projects that provide an integration with LlamaIndex (e.g. Chainlit).
I had the exact same issue, this worked for me as well, thank you!
Thank you! This worked for me as well! None of the other methods mentioned worked. Thank you!
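For anyone taking the fresh-environment route suggested earlier, the local setup could look roughly like this (my_venv and the package list are just examples):

python -m venv my_venv
source my_venv/bin/activate
pip install -U llama-index llama-index-core llama-index-embeddings-anyscale ipykernel
python -m ipykernel install --user --name my_venv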
and then select the my_venv kernel in your notebook.