bitsandbytes: AttributeError: module 'bitsandbytes.nn' has no attribute 'Linear4bit'
===================================BUG REPORT===================================
Welcome to bitsandbytes. For bug reports, please run
python -m bitsandbytes
and submit this information together with your error trace to: https://github.com/TimDettmers/bitsandbytes/issues
bin /usr/local/lib/python3.10/dist-packages/bitsandbytes/libbitsandbytes_cuda118.so
CUDA_SETUP: WARNING! libcudart.so not found in any environmental path. Searching in backup paths...
CUDA SETUP: CUDA runtime path found: /usr/local/cuda/lib64/libcudart.so.11.0
CUDA SETUP: Highest compute capability among GPUs detected: 8.0
CUDA SETUP: Detected CUDA version 118
CUDA SETUP: Loading binary /usr/local/lib/python3.10/dist-packages/bitsandbytes/libbitsandbytes_cuda118.so...
/usr/local/lib/python3.10/dist-packages/bitsandbytes/cuda_setup/main.py:145: UserWarning: /usr/lib64-nvidia did not contain ['libcudart.so', 'libcudart.so.11.0', 'libcudart.so.12.0'] as expected! Searching further paths...
warn(msg)
/usr/local/lib/python3.10/dist-packages/bitsandbytes/cuda_setup/main.py:145: UserWarning: WARNING: The following directories listed in your path were found to be non-existent: {PosixPath('/sys/fs/cgroup/memory.events /var/colab/cgroup/jupyter-children/memory.events')}
warn(msg)
/usr/local/lib/python3.10/dist-packages/bitsandbytes/cuda_setup/main.py:145: UserWarning: WARNING: The following directories listed in your path were found to be non-existent: {PosixPath('http'), PosixPath('8013'), PosixPath('//172.28.0.1')}
warn(msg)
/usr/local/lib/python3.10/dist-packages/bitsandbytes/cuda_setup/main.py:145: UserWarning: WARNING: The following directories listed in your path were found to be non-existent: {PosixPath('--logtostderr --listen_host=172.28.0.12 --target_host=172.28.0.12 --tunnel_background_save_url=https'), PosixPath('//colab.research.google.com/tun/m/cc48301118ce562b961b3c22d803539adc1e0c19/gpu-a100-s-2r5rpr2idmjk0 --tunnel_background_save_delay=10s --tunnel_periodic_background_save_frequency=30m0s --enable_output_coalescing=true --output_coalescing_required=true')}
warn(msg)
/usr/local/lib/python3.10/dist-packages/bitsandbytes/cuda_setup/main.py:145: UserWarning: WARNING: The following directories listed in your path were found to be non-existent: {PosixPath('/env/python')}
warn(msg)
/usr/local/lib/python3.10/dist-packages/bitsandbytes/cuda_setup/main.py:145: UserWarning: WARNING: The following directories listed in your path were found to be non-existent: {PosixPath('//ipykernel.pylab.backend_inline'), PosixPath('module')}
warn(msg)
/usr/local/lib/python3.10/dist-packages/bitsandbytes/cuda_setup/main.py:145: UserWarning: Found duplicate ['libcudart.so', 'libcudart.so.11.0', 'libcudart.so.12.0'] files: {PosixPath('/usr/local/cuda/lib64/libcudart.so.11.0'), PosixPath('/usr/local/cuda/lib64/libcudart.so')}... We'll flip a coin and try one of these, in order to fail forward.
Either way, this might cause trouble in the future:
If you get CUDA error: invalid device function errors, the above might be the cause and the solution is to make sure only one ['libcudart.so', 'libcudart.so.11.0', 'libcudart.so.12.0'] in the paths that we search based on your env.
warn(msg)
╭─────────────────────────────── Traceback (most recent call last) ────────────────────────────────╮
│ in <cell line: 10>:10 │
│ │
│ /usr/local/lib/python3.10/dist-packages/peft/__init__.py:22 in <module>                         │
│ │
│ 19 │
│ 20 __version__ = "0.4.0.dev0"                                                                    │
│ 21 │
│ ❱ 22 from .mapping import MODEL_TYPE_TO_PEFT_MODEL_MAPPING, PEFT_TYPE_TO_CONFIG_MAPPING, get_ │
│ 23 from .peft_model import ( │
│ 24 │ PeftModel, │
│ 25 │ PeftModelForCausalLM, │
│ │
│ /usr/local/lib/python3.10/dist-packages/peft/mapping.py:16 in <module> │
│ │
│ 13 # See the License for the specific language governing permissions and │
│ 14 # limitations under the License. │
│ 15 │
│ ❱ 16 from .peft_model import ( │
│ 17 │ PeftModel, │
│ 18 │ PeftModelForCausalLM, │
│ 19 │ PeftModelForSeq2SeqLM, │
│ │
│ /usr/local/lib/python3.10/dist-packages/peft/peft_model.py:31 in <module> │
│ │
│ 28 from transformers.modeling_outputs import SequenceClassifierOutput, TokenClassifierOutpu │
│ 29 from transformers.utils import PushToHubMixin │
│ 30 │
│ ❱ 31 from .tuners import ( │
│ 32 │ AdaLoraModel, │
│ 33 │ AdaptionPromptModel, │
│ 34 │ LoraModel, │
│ │
│ /usr/local/lib/python3.10/dist-packages/peft/tuners/__init__.py:21 in <module>                  │
│ │
│ 18 # limitations under the License. │
│ 19 │
│ 20 from .adaption_prompt import AdaptionPromptConfig, AdaptionPromptModel │
│ ❱ 21 from .lora import LoraConfig, LoraModel │
│ 22 from .adalora import AdaLoraConfig, AdaLoraModel │
│ 23 from .p_tuning import PromptEncoder, PromptEncoderConfig, PromptEncoderReparameterizatio │
│ 24 from .prefix_tuning import PrefixEncoder, PrefixTuningConfig │
│ │
│ /usr/local/lib/python3.10/dist-packages/peft/tuners/lora.py:735 in <module> │
│ │
│ 732 │ │ │ │ result += output │
│ 733 │ │ │ return result │
│ 734 │ │
│ ❱ 735 │ class Linear4bit(bnb.nn.Linear4bit, LoraLayer): │
│ 736 │ │ # Lora implemented in a dense layer │
│ 737 │ │ def __init__(                                                                          │
│ 738 │ │ │ self, │
╰──────────────────────────────────────────────────────────────────────────────────────────────────╯
AttributeError: module 'bitsandbytes.nn' has no attribute 'Linear4bit'
About this issue
- Original URL
- State: closed
- Created a year ago
- Comments: 16
Pretty sure it's related to https://github.com/huggingface/peft/pull/476
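For reference, a quick way to tell whether an installed bitsandbytes should expose `nn.Linear4bit` is to compare its version against the release that introduced 4-bit support. The `0.39.0` threshold below is an assumption based on the bitsandbytes release history (the QLoRA/4-bit release), not something stated in this issue:

```python
# Sketch: decide whether a given bitsandbytes version is expected to
# provide bnb.nn.Linear4bit. Assumption: 4-bit layers first shipped in
# bitsandbytes 0.39.0 -- verify against the release notes for your setup.

def supports_linear4bit(bnb_version: str) -> bool:
    """Return True if this bitsandbytes version should have nn.Linear4bit."""
    # Compare the first three numeric components as a tuple, e.g. "0.38.1" -> (0, 38, 1)
    parts = tuple(int(p) for p in bnb_version.split(".")[:3])
    return parts >= (0, 39, 0)

if __name__ == "__main__":
    import importlib.metadata
    try:
        installed = importlib.metadata.version("bitsandbytes")
        print(installed, "->", supports_linear4bit(installed))
    except importlib.metadata.PackageNotFoundError:
        print("bitsandbytes is not installed")
```

If this returns False for your installed version, upgrading bitsandbytes (rather than pinning peft) is the other way out of this error.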
How to fix it: wherever your `requirements.txt` uses `git+https://github.com/huggingface/peft`, replace it with `git+https://github.com/huggingface/peft@smangrul/release-v0.3.0` for now. That seems to be the latest stable release.

I got this running on Windows 11: change `--quantization_bit 4` to `--quantization_bit 8`, then copy the entire `bitsandbytes-windows-main\bitsandbytes` folder to `C:\Users\<user>\AppData\Roaming\Python\Python310\site-packages\bitsandbytes`. The error occurs because bitsandbytes does not support Windows, and bitsandbytes-windows currently supports only 8-bit quantization.
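The pin suggested above can be applied in place with a one-liner; this is a minimal sketch (GNU sed syntax) that creates a demo `requirements.txt` in the current directory purely for illustration — point the `sed` at your real file instead:

```shell
# Demo file standing in for your real requirements.txt
printf 'git+https://github.com/huggingface/peft\n' > requirements.txt

# Swap the unpinned peft requirement for the release-v0.3.0 branch
# mentioned in the comment above (GNU sed; on macOS use `sed -i ''`).
sed -i 's#^git+https://github.com/huggingface/peft$#git+https://github.com/huggingface/peft@smangrul/release-v0.3.0#' requirements.txt

cat requirements.txt
```

After editing the real file, reinstall with `pip install -r requirements.txt` so the pinned peft replaces the one currently importing `bnb.nn.Linear4bit`.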