openvino: openvino.tools.pot.load_model fails
System information (version)
- OpenVINO => 2022.3.0-9052-9752fafe8eb-releases/2022/3
- Operating System / Platform => Ubuntu 20.04
- Compiler => gcc
- Problem classification => model quantization
Detailed description
I have a FastSpeech2 ONNX model with three inputs, one of which has a dynamic shape.
I converted it to OpenVINO IR by running: mo --input_model ${path} --input input1,input2,input3 --input_shape [1,1],[1,-1],[1,1], and the conversion completes without errors.
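As a quick sanity check (independent of POT), the IR can be read back through the runtime API to inspect the input shapes; a minimal sketch, with the .xml path as a placeholder:
from openvino.runtime import Core
core = Core()
ov_model = core.read_model("fastspeech2.xml")
# Print each input name and its partial shape; the dynamic dimension shows up as '?'
for model_input in ov_model.inputs:
    print(model_input.get_any_name(), model_input.get_partial_shape())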
When I tried to quantize this model to INT8 following the official documentation, I got an error:
from openvino.tools.pot import load_model
model_config = {"model_name": "fastspeech2", "model": "xxx.xml", "weights": "xxx.bin"}
model = load_model(model_config=model_config) # error
# quantize below
...
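For context, the remainder of the quantization flow I am following (adapted from the POT DefaultQuantization examples) is roughly the sketch below; it is never reached because the failure already happens at load_model. The CalibrationLoader class, the calibration samples, and the output paths are placeholders, and the exact (data, annotation) item format expected by DataLoader.__getitem__ should be checked against the POT docs for the installed version:
from openvino.tools.pot import DataLoader, IEEngine, create_pipeline
from openvino.tools.pot import compress_model_weights, save_model

class CalibrationLoader(DataLoader):
    # Placeholder loader: 'samples' is a list of dicts mapping the three
    # input names to numpy arrays with the shapes used during conversion.
    def __init__(self, samples):
        super().__init__(config={})
        self._samples = samples
    def __len__(self):
        return len(self._samples)
    def __getitem__(self, index):
        return self._samples[index], None  # (data, annotation)

engine_config = {"device": "CPU"}
algorithms = [{"name": "DefaultQuantization",
               "params": {"target_device": "ANY", "preset": "performance", "stat_subset_size": 300}}]
data_loader = CalibrationLoader(samples=[])          # placeholder calibration data
engine = IEEngine(config=engine_config, data_loader=data_loader)
pipeline = create_pipeline(algorithms, engine)
compressed_model = pipeline.run(model)               # 'model' from load_model above
compress_model_weights(compressed_model)
save_model(compressed_model, save_path="quantized", model_name="fastspeech2_int8")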
Error log:
File "tools/openvino_fastspeech2_compare.py", line 146, in quant_vino_model
model = load_model(model_config=model_config)
File "/root/pyenv/fmo/lib/python3.8/site-packages/openvino/tools/pot/graph/model_utils.py", line 19, in load_model
return CompressedModel(config=model_config, target_device=target_device)
File "/root/pyenv/fmo/lib/python3.8/site-packages/openvino/tools/pot/graph/nx_model.py", line 39, in __init__
self._from_config(kwargs['config'], target_device)
File "/root/pyenv/fmo/lib/python3.8/site-packages/openvino/tools/pot/graph/nx_model.py", line 60, in _from_config
self._models.append({'model': load_graph(model_config, target_device)})
File "/root/pyenv/fmo/lib/python3.8/site-packages/openvino/tools/pot/graph/graph_utils.py", line 48, in load_graph
graph_from_ir, meta_data = stdout_redirect(restore_graph_from_ir, xml_path, bin_path)
File "/root/pyenv/fmo/lib/python3.8/site-packages/openvino/tools/pot/utils/logger.py", line 120, in stdout_redirect
res = fn(*args, **kwargs)
File "/root/pyenv/fmo/lib/python3.8/site-packages/openvino/tools/mo/utils/ir_reader/restore_graph.py", line 40, in restore_graph_from_ir
new_graph = copy_graph_with_ops(ir.graph)
File "/root/pyenv/fmo/lib/python3.8/site-packages/openvino/tools/mo/utils/ir_reader/layer_to_class.py", line 427, in copy_graph_with_ops
new_graph.clean_up()
File "/root/pyenv/fmo/lib/python3.8/site-packages/openvino/tools/mo/graph/graph.py", line 999, in clean_up
shape_inference(self)
File "/root/pyenv/fmo/lib/python3.8/site-packages/openvino/tools/mo/middle/passes/eliminate.py", line 164, in shape_inference
node.infer(node)
File "/root/pyenv/fmo/lib/python3.8/site-packages/openvino/tools/mo/ops/broadcast.py", line 58, in infer
assert target_shape is not None, 'Output shape is not defined for node "{}"'.format(node_name)
AssertionError: Output shape is not defined for node "Expand_5927"
I also tried simplifying the ONNX model with onnxsim and converting the simplified model to OpenVINO IR with the same mo command, but loading that IR with load_model still fails, this time with a different error.
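The simplification step was along these lines (assuming the standard onnxsim command line; exact flags may differ by version): python3 -m onnxsim fastspeech2.onnx fastspeech2_sim.onnx
Error log for the IR produced from the simplified model: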
File "tools/openvino_fastspeech2_compare.py", line 146, in quant_vino_model
model = load_model(model_config=model_config)
File "/root/pyenv/fmo/lib/python3.8/site-packages/openvino/tools/pot/graph/model_utils.py", line 19, in load_model
return CompressedModel(config=model_config, target_device=target_device)
File "/root/pyenv/fmo/lib/python3.8/site-packages/openvino/tools/pot/graph/nx_model.py", line 39, in __init__
self._from_config(kwargs['config'], target_device)
File "/root/pyenv/fmo/lib/python3.8/site-packages/openvino/tools/pot/graph/nx_model.py", line 60, in _from_config
self._models.append({'model': load_graph(model_config, target_device)})
File "/root/pyenv/fmo/lib/python3.8/site-packages/openvino/tools/pot/graph/graph_utils.py", line 48, in load_graph
graph_from_ir, meta_data = stdout_redirect(restore_graph_from_ir, xml_path, bin_path)
File "/root/pyenv/fmo/lib/python3.8/site-packages/openvino/tools/pot/utils/logger.py", line 120, in stdout_redirect
res = fn(*args, **kwargs)
File "/root/pyenv/fmo/lib/python3.8/site-packages/openvino/tools/mo/utils/ir_reader/restore_graph.py", line 40, in restore_graph_from_ir
new_graph = copy_graph_with_ops(ir.graph)
File "/root/pyenv/fmo/lib/python3.8/site-packages/openvino/tools/mo/utils/ir_reader/layer_to_class.py", line 427, in copy_graph_with_ops
new_graph.clean_up()
File "/root/pyenv/fmo/lib/python3.8/site-packages/openvino/tools/mo/graph/graph.py", line 999, in clean_up
shape_inference(self)
File "/root/pyenv/fmo/lib/python3.8/site-packages/openvino/tools/mo/middle/passes/eliminate.py", line 164, in shape_inference
node.infer(node)
File "/root/pyenv/fmo/lib/python3.8/site-packages/openvino/tools/mo/ops/MatMul.py", line 132, in infer
assert compatible_dims(A_shape[-1], B_shape[-2]), \
AssertionError: MatMul input shapes are incorrect. COL_INDEX_DIMs are not equal. Node: MatMul_280. Shapes: [masked_array(data=[1, --, 2],
mask=[False, True, False],
fill_value=-1000000007), masked_array(data=[ 1, 512, 512],
mask=False,
fill_value=-1000000007)]
Has anyone encountered the same problem?
@mvafin Thanks, I set a breakpoint at broadcast.py#L53, and here is what I can get:
@andrei-kochin Hi, any updates, or is there anything else I need to provide?