diffusers: Graph compile breaks due to logging.info
Describe the bug
When compiling the unet via torch.compile(pipeline.unet), I receive an error.
Reproduction
- Load SD 2.1 (ptx0/pseudo-flex-base)
- Compile the unet (see the sketch after this list)
- Observe that the error prevents compilation from continuing
- Remove the logging.info line from the module
- Compilation then works.
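A minimal reproduction sketch under the assumptions that the checkpoint loads through StableDiffusionPipeline and runs in float16 on CUDA (neither detail is stated in the report):

```python
import torch
from diffusers import StableDiffusionPipeline

# Load the SD 2.1-based checkpoint named in the report
# (the pipeline class and dtype are assumptions).
pipeline = StableDiffusionPipeline.from_pretrained(
    "ptx0/pseudo-flex-base", torch_dtype=torch.float16
).to("cuda")

# Compile the unet as described in the report.
pipeline.unet = torch.compile(pipeline.unet)

# Running a prompt invokes the compiled forward pass, which hits the
# logger.info call in unet_2d_condition.py that dynamo reports as a skipfile.
image = pipeline("a photo of an astronaut riding a horse").images[0]
```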
Logs
in skipfiles: Logger.info | info /usr/lib/python3.9/logging/__init__.py
from user code:
File "/notebooks/container/discord-tron-client/.venv/lib/python3.9/site-packages/diffusers/models/unet_2d_condition.py", line 730, in forward
logger.info("Forward upsample size to force interpolation output size.")
Set torch._dynamo.config.verbose=True for more information
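As a possible workaround mirroring the observation above that removing the logging.info line lets compilation proceed, the module-level logger call could be silenced without editing the installed file. This is a sketch, not an upstream fix, and it assumes the module-level logger attribute name used by diffusers 0.18:

```python
import diffusers.models.unet_2d_condition as unet_mod

# Replace the module-level logger's info method with a no-op so dynamo never
# traces into the stdlib logging call (mirrors "remove the logging.info line").
unet_mod.logger.info = lambda *args, **kwargs: None

# The error message also suggests enabling verbose dynamo output for more detail.
import torch._dynamo
torch._dynamo.config.verbose = True
```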
System Info
- diffusers version: 0.18.0.dev0 (sd_xl branch)
- Python version: 3.9.16
- PyTorch version (GPU?): 2.0.1+cu117 (True)
- Huggingface_hub version: 0.14.1
- Transformers version: 4.30.2
- Accelerate version: 0.18.0
- xFormers version: 0.0.20.dev526
Who can help?
About this issue
- Original URL
- State: closed
- Created a year ago
- Comments: 15 (10 by maintainers)
Most upvoted comments
bghira on Aug 7, 2023