SHARK: RX 7900 XTX "rocm://1 is not supported"
Important note: it is working, but only with Vulkan, not with ROCm. However, I have installed ROCm 5.5, which does list the RX 7900 XTX as supported: https://rocm.docs.amd.com/en/docs-5.5.1/release/windows_support.html#windows-supported-gpus
Verifying that model artifacts were downloaded successfully to C:\Users\f1am3d\.local/shark_tank/euler_scale_model_input_fp16_torch\euler_scale_model_input_fp16_torch.mlir...
loading existing vmfb from: C:\Users\f1am3d\Downloads\stable-diffusion\shark\euler_scale_model_input_fp16.vmfb
rocm://1 is not supported.
failed to download model, falling back and using import_mlir
loading existing vmfb from: C:\Users\f1am3d\Downloads\stable-diffusion\shark\euler_scale_model_input_1_512_512_rocm_fp16.vmfb
rocm://1 is not supported.
Traceback (most recent call last):
File "apps\stable_diffusion\src\schedulers\shark_eulerdiscrete.py", line 115, in compile
File "apps\stable_diffusion\src\utils\utils.py", line 110, in get_shark_model
File "apps\stable_diffusion\src\utils\utils.py", line 70, in _compile_module
File "shark\shark_inference.py", line 210, in load_module
self.shark_runner = SharkRunner(
^^^^^^^^^^^^
File "shark\shark_runner.py", line 84, in __init__
sys.exit(1)
SystemExit: 1
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "gradio\routes.py", line 488, in run_predict
File "gradio\blocks.py", line 1431, in process_api
File "gradio\blocks.py", line 1117, in call_function
File "gradio\utils.py", line 350, in async_iteration
File "gradio\utils.py", line 343, in __anext__
File "anyio\to_thread.py", line 33, in run_sync
File "anyio\_backends\_asyncio.py", line 2101, in run_sync_in_worker_thread
File "anyio\_backends\_asyncio.py", line 828, in run
File "gradio\utils.py", line 326, in run_sync_iterator_async
File "gradio\utils.py", line 695, in gen_wrapper
File "ui\txt2img_ui.py", line 160, in txt2img_inf
File "apps\stable_diffusion\src\schedulers\sd_schedulers.py", line 103, in get_schedulers
File "apps\stable_diffusion\src\schedulers\shark_eulerdiscrete.py", line 130, in compile
File "apps\stable_diffusion\src\schedulers\shark_eulerdiscrete.py", line 93, in _import
File "apps\stable_diffusion\src\utils\utils.py", line 178, in compile_through_fx
File "apps\stable_diffusion\src\utils\utils.py", line 70, in _compile_module
File "shark\shark_inference.py", line 210, in load_module
self.shark_runner = SharkRunner(
^^^^^^^^^^^^
File "shark\shark_runner.py", line 84, in __init__
sys.exit(1)
SystemExit: 1
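For context on what the error likely means: in a device URI such as `rocm://1`, the part before `://` selects the driver and the number is a device index, so `rocm://1` asks for a *second* ROCm GPU. If the runtime only enumerates one ROCm device (index 0), index 1 is rejected. The sketch below illustrates that validation pattern with hypothetical helper names; it is not SHARK's actual code.

```python
# Hypothetical sketch of device-URI validation, assuming the convention
# "scheme://index" where the index must be below the visible device count.

def parse_device_uri(uri: str) -> tuple[str, int]:
    """Split 'rocm://1' into ('rocm', 1); a bare 'rocm' means index 0."""
    scheme, _, index = uri.partition("://")
    return scheme, int(index) if index else 0

def validate_device(uri: str, visible_devices: dict[str, int]) -> str:
    """Raise if the requested index exceeds what the driver enumerates."""
    scheme, index = parse_device_uri(uri)
    count = visible_devices.get(scheme, 0)
    if index >= count:
        # Mirrors the message seen in the log above.
        raise ValueError(f"{uri} is not supported.")
    return f"{scheme} device {index}"

# With a single visible ROCm GPU, only index 0 is valid:
print(validate_device("rocm://0", {"rocm": 1}))
# validate_device("rocm://1", {"rocm": 1}) would raise ValueError.
```

If this is what is happening here, the fix would be to select device index 0 (or whichever index the runtime actually reports) rather than index 1.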
About this issue
- State: closed
- Created 10 months ago
- Comments: 20 (3 by maintainers)
@njsharpe Thank you for this advice. I will try it and let you know my results.