Stable-Diffusion-WebUI-TensorRT: Unable to install TensorRT on automatic1111 1.8.0

Hello

I tried installing TensorRT on automatic1111 version 1.8.0 and it fails (I also tested on 1.7.0, and there were no problems):

*** Error running install.py for extension D:\stable-diffusion\sd180\webui\extensions\Stable-Diffusion-WebUI-TensorRT.
*** Command: "D:\stable-diffusion\sd180\system\python\python.exe" "D:\stable-diffusion\sd180\webui\extensions\Stable-Diffusion-WebUI-TensorRT\install.py"
*** Error code: 1
*** stderr: Traceback (most recent call last):
***   File "D:\stable-diffusion\sd180\webui\extensions\Stable-Diffusion-WebUI-TensorRT\install.py", line 3, in <module>
***     from importlib_metadata import version
*** ModuleNotFoundError: No module named 'importlib_metadata'
You are up to date with the most recent release.
Launching Web UI with arguments: --update-check --xformers
*** Error loading script: trt.py
    Traceback (most recent call last):
      File "D:\stable-diffusion\sd180\webui\modules\scripts.py", line 527, in load_scripts
        script_module = script_loading.load_module(scriptfile.path)
      File "D:\stable-diffusion\sd180\webui\modules\script_loading.py", line 10, in load_module
        module_spec.loader.exec_module(module)
      File "<frozen importlib._bootstrap_external>", line 883, in exec_module
      File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
      File "D:\stable-diffusion\sd180\webui\extensions\Stable-Diffusion-WebUI-TensorRT\scripts\trt.py", line 8, in <module>
        from polygraphy.logger import G_LOGGER
    ModuleNotFoundError: No module named 'polygraphy'
---

On my other PC I am getting almost the same error, but instead it says “ModuleNotFoundError: Launch”.
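For reference, both tracebacks above are plain missing-module errors in the webui's bundled Python, not extension bugs per se. A minimal sketch (assuming you run it with the embedded python.exe shown in the log, e.g. D:\stable-diffusion\sd180\system\python\python.exe) to check which of the reported dependencies are actually absent from that environment:

```python
import importlib.util

# The two modules the extension failed to import in the tracebacks above.
REQUIRED = ["importlib_metadata", "polygraphy"]

def missing_modules(names):
    """Return the subset of module names that cannot be found by the importer."""
    return [n for n in names if importlib.util.find_spec(n) is None]

print(missing_modules(REQUIRED))
```

Any names it prints can then be installed into that same environment with the embedded interpreter's pip (e.g. `python.exe -m pip install importlib_metadata polygraphy`); this is a workaround sketch, not something the extension's install.py is documented to require manually.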

About this issue

  • State: open
  • Created 4 months ago
  • Comments: 31

Most upvoted comments

A really simple method solved all the problems I mentioned before (error popups etc.): starting the webui with administrator permissions.

So I’m just confused about the “Could not open file D:\stable-diffusion\forge test\webui\models\Unet-onnx\juggernautXL_v9Rdphoto2Lightning.onnx” part: where would I get this onnx file from?

Bro, keep that running; it will probably take a few moments, and then it will start generating the TRT engine.


On Wed, Mar 13, 2024 at 10:49 PM NLJ @.***> wrote:

Installed without any problems with the forge “fork” of automatic1111.

@Hapseleg https://github.com/Hapseleg Have you gotten it to function? An NVIDIA dev stated 1 month ago that TensorRT cannot work in Forge. Maybe (I’m not saying this would actually work) with a TRT model created in A1111 and moved into the Forge files.

Well… I got it installed and I am able to start up Forge, but once I press “Generate Default Engines” it complains about my checkpoint not containing an onnx model, or something like that. I tried to research it a bit but gave up after some time 😅
