mlc-llm: "dist/vicuna-v1-7b/float16/vicuna-v1-7b_vulkan_float16.so" not found

I installed mlc chat according to the documentation, but encountered an error

Cannot find vicuna-v1-7b lib in preferred path "dist/vicuna-v1-7b/float16/vicuna-v1-7b_vulkan_float16.so" or other candidate paths
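For anyone hitting this, a quick sanity check is to verify whether the library file the runtime expects actually exists on disk. This is only a sketch; the path below is copied from the error message above, so adjust it for your model and quantization:

```shell
# Path taken from the error message; adjust if your model/quantization differs.
LIB="dist/vicuna-v1-7b/float16/vicuna-v1-7b_vulkan_float16.so"

if [ -f "$LIB" ]; then
  echo "found: $LIB"
else
  echo "missing: $LIB"
  # List whatever shared libraries were actually downloaded under dist/,
  # so the names can be compared against the one the runtime wants.
  find dist -name '*.so' 2>/dev/null
fi
```

If `find` prints a library with a different name or directory layout, that mismatch is likely the cause of the error.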


My environment is Ubuntu 22.04 running under Windows WSL.

nvidia-smi output:

Mon May 15 21:34:53 2023
+---------------------------------------------------------------------------------------+
| NVIDIA-SMI 530.46                 Driver Version: 531.61       CUDA Version: 12.1     |
|-----------------------------------------+----------------------+----------------------+
| GPU  Name                  Persistence-M| Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp  Perf            Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
|                                         |                      |               MIG M. |
|=========================================+======================+======================|
|   0  NVIDIA GeForce RTX 3060         On | 00000000:01:00.0  On |                  N/A |
|  0%   38C    P8               19W / 170W|   3702MiB / 12288MiB |      1%      Default |
|                                         |                      |                  N/A |
+-----------------------------------------+----------------------+----------------------+

+---------------------------------------------------------------------------------------+
| Processes:                                                                            |
|  GPU   GI   CI        PID   Type   Process name                            GPU Memory |
|        ID   ID                                                             Usage      |
|=======================================================================================|
|    0   N/A  N/A        22      G   /Xwayland                                 N/A      |
|    0   N/A  N/A        23      G   /Xwayland                                 N/A      |
|    0   N/A  N/A        25      G   /Xwayland                                 N/A      |
+---------------------------------------------------------------------------------------+

About this issue

  • State: closed
  • Created a year ago
  • Comments: 24 (5 by maintainers)

Most upvoted comments

Thanks for reporting. This is likely due to the M1 build lagging behind; we are upgrading the build and will report back here.

The M1 build should now be fixed. Please try:

conda install -c mlc-ai -c conda-forge mlc-chat-nightly --force-reinstall

and then run the instructions here again: https://mlc.ai/mlc-llm/#windows-linux-mac
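If the reinstall appears to succeed but the error persists, it may help to confirm which nightly build actually got installed before re-running the demo. A minimal check, assuming conda is on your PATH:

```shell
# Show the installed mlc-chat-nightly build so its version can be compared
# against the one announced as fixed; fall back gracefully if conda is absent.
if command -v conda >/dev/null 2>&1; then
  conda list mlc-chat-nightly
else
  echo "conda not found on PATH"
fi
```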

The osx-arm64 nightly should be fixed.

@ScorpionOO8 Try specifying the version number, e.g.

$ conda install -c mlc-ai mlc-chat-nightly=0.1.dev40

That works for me, thanks!

I don’t know why the path error happened, but I have a similar problem on an M1 Mac: #138. It seems that following the instructions (here: https://mlc.ai/mlc-llm/#windows-linux-mac) leads to a dist/vicuna-v1-7b folder that does not contain a float16 folder (https://huggingface.co/mlc-ai/demo-vicuna-v1-7b-int3/tree/main).
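A quick way to confirm the mismatch described above is to list what the download actually produced under dist/ and compare it with the float16 directory the runtime is looking for (directory names taken from the comments in this thread):

```shell
# Show the contents of the downloaded model directory, if it exists,
# so they can be compared against the expected "float16" subfolder.
ls -la dist/vicuna-v1-7b 2>/dev/null || echo "dist/vicuna-v1-7b not found"
```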

I don’t know; I’m just a beginner who has a similar problem on an M1 machine.

Yes, and I have tried downloading from the other link, “https://huggingface.co/mlc-ai/demo-vicuna-v1-7b-int4”, which includes a float16 folder, but it still doesn’t work.