Nuitka: OSError: Can't get source for . TorchScript requires source access in order to carry out compilation, make sure original .py files are available.
Nuitka: 1.9
Commercial: None
Python: 3.10.2 (tags/v3.10.2:a58ebcc, Jan 17 2022, 14:12:15) [MSC v.1929 64 bit (AMD64)]
Flavor: CPython Official
Executable: C:\Users\Tensor\Desktop\awqprototype.venv\Scripts\python.exe
OS: Windows
Arch: x86_64
WindowsRelease: 10
Version C compiler: C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.32.31326\bin\Hostx64\x64\cl.exe (cl 14.3)
```python
from awq import AutoAWQForCausalLM
from transformers import AutoTokenizer, TextStreamer

# use instead pip3 install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cu117


def main():
    quant_path = r"C:\Users\Tensor\Desktop\awqprototype\dolphin-2.1-mistral-7B-AWQ"
    quant_file = r"model.safetensors"

    # Load model
    model = AutoAWQForCausalLM.from_quantized(quant_path, quant_file, fuse_layers=True)
    tokenizer = AutoTokenizer.from_pretrained(quant_path, trust_remote_code=True)
    streamer = TextStreamer(tokenizer, skip_special_tokens=True)

    # Convert prompt to tokens
    prompt_template = """\
A chat between a curious user and an artificial intelligence assistant. The assistant gives helpful, detailed, and polite answers to the user's questions.
USER: {prompt}
ASSISTANT:"""

    tokens = tokenizer(
        prompt_template.format(prompt="How are you today?"),
        return_tensors='pt'
    ).input_ids.cuda()

    # Generate output
    generation_output = model.generate(
        tokens,
        streamer=streamer,
        max_new_tokens=512
    )


main()
```
- Provide in your issue the Nuitka options used
```shell
python -m nuitka --standalone --include-distribution-metadata=jinja2 --include-distribution-metadata=psutil --include-distribution-metadata=torchaudio --include-distribution-metadata=pandas --include-distribution-metadata=torchvision --include-distribution-metadata=torch --include-distribution-metadata=openai --include-distribution-metadata=scipy --report=awq.txt --plugin-enable=anti-bloat --noinclude-unittest-mode=allow .\main.py
```
[awq.txt](https://github.com/Nuitka/Nuitka/files/13453393/awq.txt)
```
Traceback (most recent call last):
  File "C:\Users\Tensor\Desktop\AWQPRO~1\MAIN~1.DIS\torch\_sources.py", line 23, in get_source_lines_and_file
  File "C:\Users\Tensor\Desktop\AWQPRO~1\MAIN~1.DIS\inspect.py", line 1129, in getsourcelines
  File "C:\Users\Tensor\Desktop\AWQPRO~1\MAIN~1.DIS\inspect.py", line 958, in findsource
OSError: could not get source code
```
The above exception was the direct cause of the following exception:
```
Traceback (most recent call last):
  File "C:\Users\Tensor\Desktop\AWQPRO~1\MAIN~1.DIS\main.py", line 2, in <module>
  File "<frozen importlib._bootstrap>", line 1027, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1006, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 688, in _load_unlocked
  File "C:\Users\Tensor\Desktop\AWQPRO~1\MAIN~1.DIS\awq\__init__.py", line 2, in <module awq>
  File "<frozen importlib._bootstrap>", line 1027, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1006, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 688, in _load_unlocked
  File "C:\Users\Tensor\Desktop\AWQPRO~1\MAIN~1.DIS\awq\models\__init__.py", line 7, in <module awq.models>
  File "<frozen importlib._bootstrap>", line 1027, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1006, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 688, in _load_unlocked
  File "C:\Users\Tensor\Desktop\AWQPRO~1\MAIN~1.DIS\awq\models\gpt_bigcode.py", line 2, in <module awq.models.gpt_bigcode>
  File "<frozen importlib._bootstrap>", line 1027, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1006, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 688, in _load_unlocked
  File "C:\Users\Tensor\Desktop\AWQPRO~1\MAIN~1.DIS\transformers\models\gpt_bigcode\modeling_gpt_bigcode.py", line 55, in <module transformers.models.gpt_bigcode.modeling_gpt_bigcode>
  File "C:\Users\Tensor\Desktop\AWQPRO~1\MAIN~1.DIS\torch\jit\_script.py", line 1338, in script
  File "C:\Users\Tensor\Desktop\AWQPRO~1\MAIN~1.DIS\torch\jit\frontend.py", line 262, in get_jit_def
  File "C:\Users\Tensor\Desktop\AWQPRO~1\MAIN~1.DIS\torch\_sources.py", line 120, in parse_def
  File "C:\Users\Tensor\Desktop\AWQPRO~1\MAIN~1.DIS\torch\_sources.py", line 32, in get_source_lines_and_file
OSError: Can't get source for <compiled_function upcast_masked_softmax at 0x0000023F260966B0>.
TorchScript requires source access in order to carry out compilation, make sure original .py files are available.
```
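One workaround sometimes suggested for this class of failure (a sketch, not verified against this exact build): PyTorch honors the `PYTORCH_JIT` environment variable, and setting it to `0` before `torch` is first imported disables TorchScript compilation, so `@torch.jit.script`-decorated functions such as `upcast_masked_softmax` run as plain Python and no longer need their original `.py` sources.

```python
import os

# Possible workaround (untested with Nuitka-compiled binaries):
# PYTORCH_JIT=0 disables TorchScript, so @torch.jit.script functions
# fall back to eager execution and no source introspection is needed.
# This must be set before torch is imported anywhere in the process.
os.environ["PYTORCH_JIT"] = "0"

# import torch  # only import torch after the variable is set
```

Whether the eager fallback is acceptable depends on the model; for inference-only use it usually just costs a little speed.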
About this issue
- Original URL
- State: closed
- Created 7 months ago
- Comments: 29 (17 by maintainers)
Deleting is the most effective approach and lets you verify it. Otherwise, yes, there are Nuitka options that allow you to exclude modules from the build.
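The exclusion option being referred to is presumably `--nofollow-import-to`, which tells Nuitka not to compile a module or package and to leave it to normal run-time importing; the module name below is only illustrative:

```shell
# Sketch: keep the problematic TorchScript-using module out of the
# compiled build so its .py source remains available at run time.
# The excluded package name here is an assumption, not a tested fix.
python -m nuitka --standalone \
    --nofollow-import-to=transformers.models.gpt_bigcode \
    .\main.py
```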