llama_index: [Bug]: ModuleNotFoundError: No module named 'llama_index.cli.command_line.rag'; 'llama_index.cli.command_line' is not a package

Bug Description

I get this error when attempting to upgrade a file with llamaindex-cli upgrade-file <file_path>:

(venv) PS C:\Users\aaols\PycharmProjects\experiments> llamaindex-cli upgrade-file indexing.py
Traceback (most recent call last):
  File "C:\Python310\lib\runpy.py", line 196, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "C:\Python310\lib\runpy.py", line 86, in _run_code
    exec(code, run_globals)
  File "C:\Users\aaols\PycharmProjects\experiments\venv\Scripts\llamaindex-cli.exe\__main__.py", line 4, in <module>
  File "C:\Users\aaols\PycharmProjects\experiments\venv\lib\site-packages\llama_index\cli\command_line.py", line 4, in <module>
    from llama_index.cli.command_line.rag import RagCLI, default_ragcli_persist_dir
ModuleNotFoundError: No module named 'llama_index.cli.command_line.rag'; 'llama_index.cli.command_line' is not a package

My llama deps

llama-index==0.10.6
llama-index-agent-openai==0.1.1
llama-index-cli==0.1.2.post1
llama-index-core==0.10.10
llama-index-embeddings-openai==0.1.1
llama-index-legacy==0.9.48
llama-index-llms-openai==0.1.2
llama-index-multi-modal-llms-openai==0.1.1
llama-index-program-openai==0.1.2
llama-index-question-gen-openai==0.1.1
llama-index-readers-file==0.1.3
llama-index-readers-github==0.1.5
llama-index-vector-stores-chroma==0.1.2
llama-index-vector-stores-deeplake==0.1.1
llamaindex-py-client==0.1.13

Version

0.10.6

Steps to Reproduce

Run the command llamaindex-cli upgrade-file <file_path> on a file that still contains old, pre-v0.10.0 imports.

Relevant Logs/Tracebacks

No response

About this issue

  • Original URL
  • State: closed
  • Created 4 months ago
  • Comments: 17 (3 by maintainers)

Most upvoted comments

🤖

Hey @aaronjolson! Great to see you back with us. I hope all has been going well since our last chat. Let’s dive into this issue and see how we can get it resolved for you.

Based on the error message you provided, it seems like Python is unable to find the module ‘llama_index.cli.command_line.rag’. This could be due to several reasons:

  1. The module ‘llama_index.cli.command_line.rag’ does not exist in your project structure.
  2. The module exists, but it’s not in a directory that Python searches for modules. Python searches the directories on sys.path, which includes any listed in the PYTHONPATH environment variable; if the module is not in one of these directories, Python won’t be able to find it.
  3. The module exists and is in a directory on the search path, but the directory doesn’t contain an __init__.py file. Python requires an __init__.py file in a directory to treat the directory as a package.
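The resolution failures above can be reproduced with the standard library's importlib; the sketch below uses the stdlib json package as a stand-in, since the failure modes are the same for any dotted module path (including the "is not a package" error from this traceback):

```python
import importlib.util

def locate(name: str) -> None:
    """Report whether Python can resolve a dotted module path."""
    try:
        spec = importlib.util.find_spec(name)
    except ModuleNotFoundError as exc:
        # Raised when a parent on the path is a plain module rather than a
        # package -- the same failure as "'llama_index.cli.command_line' is
        # not a package" in the traceback above.
        print(f"{name}: {exc}")
        return
    if spec is None:
        print(f"{name}: not found on sys.path")
    else:
        print(f"{name}: resolved to {spec.origin}")

locate("json")                    # an importable package
locate("json.no_such_submodule")  # package exists, submodule does not
locate("json.decoder.nothing")    # json.decoder is a module, not a package
```

Running this against the failing name from the traceback would show which of the three cases applies in your environment.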

In the provided context, there is an import statement for ‘llama_index.cli.command_line.rag’ at the top of the file:

from llama_index.cli.command_line.rag import RagCLI, default_ragcli_persist_dir

This suggests that ‘llama_index.cli.command_line.rag’ is expected to be a module in your project. If you’re encountering a ModuleNotFoundError, I would recommend checking the above points in your project to resolve the issue.

However, based on the LlamaIndex codebase, the module ‘llama_index.cli.command_line.rag’ does not exist. The correct module seems to be ‘llama_index.cli.rag’. You can find this in the llama-index-cli/llama_index/cli/rag/__init__.py file.

Therefore, you might want to change your import statement to:

from llama_index.cli.rag import RagCLI, default_ragcli_persist_dir
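To confirm that the corrected path actually resolves in a given environment (this assumes the llama-index-cli package is installed), a quick guarded check avoids a hard crash either way:

```python
# Guarded import check: prints a result in both cases instead of crashing.
try:
    from llama_index.cli.rag import RagCLI, default_ragcli_persist_dir
    resolved = True
    print("llama_index.cli.rag resolved:", RagCLI.__name__)
except ModuleNotFoundError as exc:
    resolved = False
    print("llama-index-cli is not importable here:", exc)
```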

Additionally, I noticed that you’re using LlamaIndex version 0.10.6, but the latest version is v0.10.10. It might be beneficial to upgrade to the latest version, as this issue might have been resolved in a newer version.

You can upgrade LlamaIndex by running:

pip install --upgrade llama-index
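Because the llama-index meta-package and its sub-packages are versioned independently (as the dependency list above shows), upgrading the meta-package alone can leave llama-index-core or llama-index-cli behind; a hedged sequence that upgrades all three together:

```shell
# Upgrade the meta-package together with the core and CLI sub-packages,
# since "pip install --upgrade llama-index" alone may leave them pinned.
pip install --upgrade llama-index llama-index-core llama-index-cli

# Verify which versions actually landed.
pip show llama-index-core llama-index-cli
```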

If the issue persists after making these changes, please provide more information about your setup, such as your Python version, operating system, and the exact steps you’re taking when you encounter this error. This will help in diagnosing the problem.

I hope this helps! If you have any other questions, feel free to ask.


About Dosu

This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research.


OK, I can confirm it worked just now. I first tried pip install -U llama-index, which didn’t pick up anything. Then pip install -U llama-index-core picked up llama_index_core-0.10.11.post1-py3-none-any.whl, but that didn’t fix it. Running pip install -U llama-index again got 0.10.11, and the upgrade command finally ran. Thanks.

Proof it works in colab, https://colab.research.google.com/drive/1o9GiupK995a-0UG86716Kli_ZYy3r7wD?usp=sharing

Closing this issue out. For anyone encountering this issue, just run the upgrade above.

@logan-markewich Here are my upgraded deps; I had blown away my venv.

llama-index==0.10.10
llama-index-agent-openai==0.1.4
llama-index-cli==0.1.3
llama-index-core==0.10.10
llama-index-embeddings-openai==0.1.5
llama-index-indices-managed-llama-cloud==0.1.2
llama-index-legacy==0.9.48
llama-index-llms-openai==0.1.5
llama-index-multi-modal-llms-openai==0.1.3
llama-index-program-openai==0.1.3
llama-index-question-gen-openai==0.1.2
llama-index-readers-file==0.1.4
llama-index-readers-llama-parse==0.1.2
llama-index-vector-stores-chroma==0.1.2
llama-parse==0.3.4
llamaindex-py-client==0.1.13

still getting

(venv) PS C:\Users\aaols\PycharmProjects\ai-experiments> llamaindex-cli upgrade-file indexing.py
Traceback (most recent call last):
  File "C:\Python310\lib\runpy.py", line 196, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "C:\Python310\lib\runpy.py", line 86, in _run_code
    exec(code, run_globals)
  File "C:\Users\aaols\PycharmProjects\ai-experiments\venv\Scripts\llamaindex-cli.exe\__main__.py", line 4, in <module>
    from llama_index.core.command_line.command_line import main
ModuleNotFoundError: No module named 'llama_index.core.command_line.command_line'