mlc-llm: "dist/vicuna-v1-7b/float16/vicuna-v1-7b_vulkan_float16.so" not found
I installed MLC Chat according to the documentation, but encountered this error:
Cannot find vicuna-v1-7b lib in preferred path "dist/vicuna-v1-7b/float16/vicuna-v1-7b_vulkan_float16.so" or other candidate paths
My environment is Ubuntu 22.04 under Windows WSL.
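In case it helps triage, here is a quick sanity check for whether the compiled model lib is actually on disk. The path is copied from the error message above; the `dist/` layout is assumed to be the default one from the docs, so adjust if yours differs:

```shell
# Quick sanity check: does the lib named in the error message actually exist?
# Path copied from the error above; adjust if your dist/ layout differs.
LIB="dist/vicuna-v1-7b/float16/vicuna-v1-7b_vulkan_float16.so"
if [ -f "$LIB" ]; then
  echo "model lib found: $LIB"
else
  echo "model lib missing: $LIB"
fi
```

If this prints "missing", the CLI's error is expected and the lib needs to be downloaded or built before retrying.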
nvidia-smi output:
Mon May 15 21:34:53 2023
+---------------------------------------------------------------------------------------+
| NVIDIA-SMI 530.46                 Driver Version: 531.61       CUDA Version: 12.1     |
|-----------------------------------------+----------------------+----------------------+
| GPU  Name                Persistence-M  | Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp  Perf          Pwr:Usage/Cap |         Memory-Usage  | GPU-Util  Compute M. |
|                                         |                      |               MIG M. |
|=========================================+======================+======================|
|   0  NVIDIA GeForce RTX 3060       On  | 00000000:01:00.0  On |                  N/A |
|  0%   38C    P8              19W / 170W |  3702MiB / 12288MiB  |      1%      Default |
|                                         |                      |                  N/A |
+-----------------------------------------+----------------------+----------------------+

+---------------------------------------------------------------------------------------+
| Processes:                                                                            |
|  GPU   GI   CI        PID   Type   Process name                            GPU Memory |
|        ID   ID                                                             Usage      |
|=======================================================================================|
|    0   N/A  N/A        22    G   /Xwayland                                        N/A |
|    0   N/A  N/A        23    G   /Xwayland                                        N/A |
|    0   N/A  N/A        25    G   /Xwayland                                        N/A |
+---------------------------------------------------------------------------------------+
About this issue
- State: closed
- Created a year ago
- Comments: 24 (5 by maintainers)
Commits related to this issue
- Minor follow-up for PR#117 (#149) * make local-id non-required * fix — committed to sunggg/mlc-llm by sunggg 6 months ago
Thanks for reporting. This is likely due to the M1 build lagging behind; we are upgrading the build and will report back here.
The M1 build should now be fixed, please try to do
And then try running the instructions here again: https://mlc.ai/mlc-llm/#windows-linux-mac
The osx-arm64 nightly should be fixed.
that works for me, thanks!
@ScorpionOO8 Try specifying the version number, e.g.
Yes, and I have tried downloading from the other link: https://huggingface.co/mlc-ai/demo-vicuna-v1-7b-int4, which includes a float16 folder, but it still doesn’t work.
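For what it's worth, the Hugging Face repo holds the model weights, while the `*_vulkan_float16.so` is a platform-specific compiled lib that has to be present alongside them. A check like the following shows which piece is missing; note that `MODEL_DIR` and the `params` entry are assumptions based on the error message and the usual `dist/` layout, not something confirmed in this thread:

```shell
# Check both pieces the runtime needs: the weights and the compiled Vulkan lib.
# MODEL_DIR and the "params" name are assumptions based on the error message
# and the usual dist/ layout; adjust to your actual paths.
MODEL_DIR="dist/vicuna-v1-7b/float16"
for f in "$MODEL_DIR/params" "$MODEL_DIR/vicuna-v1-7b_vulkan_float16.so"; do
  if [ -e "$f" ]; then
    echo "present: $f"
  else
    echo "missing: $f"
  fi
done
```

If only the `.so` is reported missing, the download worked but the platform-specific lib still needs to be built or fetched separately.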