onnx: PyTorch export crashes with onnx >= 1.8.0 unless onnx is imported first on Windows
Similar to ~#2808~ https://github.com/apple/coremltools/issues/920 and https://github.com/onnx/onnx/issues/2940
The script below crashes when onnx is imported after PyTorch. It completes successfully when onnx is imported before PyTorch, or when the installed onnx version is <= 1.7.0.
```python
# import onnx  # uncomment and the export passes
import io

import torch
from onnx import ModelProto  # noqa: F401

class M(torch.nn.Module):
    def forward(self, x, y):
        return x + y

x = torch.randn(2, 3)
y = torch.randn(2, 3)

f = io.BytesIO()
torch.onnx.export(M(), (x, y), f, verbose=True, input_names=['x', 'y'])
```
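The workaround referenced by the commits later in this thread amounts to forcing the import order. A minimal sketch (the helper name is hypothetical; it only enforces that onnx, when installed, is loaded before torch):

```python
import importlib
import sys

def ensure_onnx_before_torch():
    """Hypothetical helper: import onnx ahead of torch.

    Raises if torch is already loaded without onnx; silently skips any
    package that is not installed in this environment.
    """
    if "torch" in sys.modules and "onnx" not in sys.modules:
        raise RuntimeError("torch was imported first; onnx must come before it")
    for name in ("onnx", "torch"):
        try:
            importlib.import_module(name)
        except ImportError:
            pass  # package not installed; nothing to order
```

Calling this at the top of the program's entry point keeps the ordering in one place instead of relying on every module to import onnx first.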
Both packages were installed as follows:

```shell
pip install torch==1.8.1+cpu -f https://download.pytorch.org/whl/torch_stable.html
pip install onnx==1.8
```
Based on previous issues, I suspect this is also related to mismatched protobuf versions. Can someone explain what is going on underneath, and how to resolve this and prevent it in the future?
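One way to start debugging is to check which protobuf distribution each environment actually resolves to. A diagnostic sketch using only the standard library (the names passed in are PyPI distribution names; anything not installed simply reports None):

```python
import importlib.metadata as md

def installed_version(dist_name):
    """Return the installed version of a distribution, or None if absent."""
    try:
        return md.version(dist_name)
    except md.PackageNotFoundError:
        return None

# The native extensions in torch and onnx each carry protobuf symbols;
# comparing the Python-level versions can hint at the mismatch suspected above.
for dist in ("protobuf", "onnx", "torch"):
    print(dist, installed_version(dist))
```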
About this issue
- State: closed
- Created 3 years ago
- Reactions: 1
- Comments: 31 (31 by maintainers)
Commits related to this issue
- added workaround for https://github.com/onnx/onnx/issues/3493 — committed to GoodDogAI/rossac by lostmsu 3 years ago
Then let's make a PR and fix this. In the meantime I will ask the VC++ team for help.
That was because at https://github.com/jcwchen/onnx/blob/b1c8cdfc137c43364a37a16d9e9b4f3895a2e61d/.github/workflows/release_win.yml#L77 you set protobuf_MSVC_STATIC_RUNTIME to ON. You need to set it to OFF.
Build it with “RelWithDebInfo”, then try it again.
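With the flag flipped, a release configuration along these lines should link protobuf against the dynamic MSVC runtime; the flag name is the real protobuf CMake option, while the rest of the invocation is an assumed sketch:

```shell
rem Sketch of a Windows release configure step: protobuf_MSVC_STATIC_RUNTIME=OFF
rem links protobuf against the dynamic CRT (/MD) instead of the static one (/MT).
cmake -S . -B build ^
      -Dprotobuf_MSVC_STATIC_RUNTIME=OFF ^
      -DCMAKE_BUILD_TYPE=RelWithDebInfo
cmake --build build --config RelWithDebInfo
```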
Now I can guess: it must be because something dynamically linked to protobuf, and there is a version conflict.