diffusers: Graph compile breaks due to logging.info

Describe the bug

When compiling the UNet via torch.compile(pipeline.unet), I receive an error: Dynamo trips over a logger.info call inside UNet2DConditionModel.forward (see the logs below).

Reproduction

  1. Load SD 2.1 ptx0/pseudo-flex-base.
  2. Compile the UNet with torch.compile (a minimal sketch follows below).
  3. Observe that the error prevents compilation from continuing.
  4. Remove the logging.info line from the module.
  5. Observe that compilation now succeeds.
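
For reference, a minimal reproduction sketch. The repo id and the location of the logger.info call come from this report; the fp16 dtype, CUDA device, prompt, and the non-multiple-of-64 height (chosen so the forward pass should reach the branch that emits the logged message) are assumptions on my part:

import torch
from diffusers import DiffusionPipeline

# Load the SD 2.1 checkpoint named in the report (fp16 + CUDA are assumptions).
pipeline = DiffusionPipeline.from_pretrained(
    "ptx0/pseudo-flex-base", torch_dtype=torch.float16
).to("cuda")

# Compile the UNet; torch.compile is lazy, so the error only surfaces
# once the UNet is actually called during inference.
pipeline.unet = torch.compile(pipeline.unet)

# A height that is not a multiple of 64 (assumption) should route the UNet through
# the "Forward upsample size to force interpolation output size." branch.
image = pipeline(
    "a photo of a cat", height=760, width=1024, num_inference_steps=2
).images[0]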

Logs

in skipfiles: Logger.info  | info /usr/lib/python3.9/logging/__init__.py

from user code:
   File "/notebooks/container/discord-tron-client/.venv/lib/python3.9/site-packages/diffusers/models/unet_2d_condition.py", line 730, in forward
    logger.info("Forward upsample size to force interpolation output size.")

Set torch._dynamo.config.verbose=True for more information
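
As the last log line suggests, a fuller Dynamo traceback can be obtained by enabling verbose output before compiling (a minimal sketch):

import torch._dynamo

# Enable verbose Dynamo output, as suggested by the error message above.
torch._dynamo.config.verbose = True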

System Info

Running on the sd_xl branch of diffusers.

  • diffusers version: 0.18.0.dev0
  • Python version: 3.9.16
  • PyTorch version (GPU?): 2.0.1+cu117 (True)
  • Huggingface_hub version: 0.14.1
  • Transformers version: 4.30.2
  • Accelerate version: 0.18.0
  • xFormers version: 0.0.20.dev526

Who can help?

@patrickvonplaten

About this issue

  • State: closed
  • Created a year ago
  • Comments: 15 (10 by maintainers)

Most upvoted comments

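The workaround posted in the comments deletes the offending logger.info line(s) from the installed copy of unet_2d_condition.py: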
PYTHONVERS="3.9"
# Strip every logger.info line from the installed module (adjust the venv path to your environment).
sed -i '/logger\.info/d' venv/lib/python${PYTHONVERS}/site-packages/diffusers/models/unet_2d_condition.py
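
Note that this patches the package in place under site-packages, so it has to be re-applied after every diffusers reinstall or upgrade.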