coremltools: RuntimeError: PyTorch convert function for op 'fft_rfftn' not implemented.
I have been trying to convert a PyTorch model to Core ML, but the conversion fails on the `fft_rfftn` and `fft_irfftn` operators. Could you explain how to register these operators?
File "convert.py", line 27, in <module>
ct.ImageType(name='mask', shape=mask_input.shape)])
File "/Users/admin/anaconda3/envs/py36/lib/python3.6/site-packages/coremltools/converters/_converters_entry.py", line 316, in convert
**kwargs
File "/Users/admin/anaconda3/envs/py36/lib/python3.6/site-packages/coremltools/converters/mil/converter.py", line 175, in mil_convert
return _mil_convert(model, convert_from, convert_to, ConverterRegistry, MLModel, compute_units, **kwargs)
File "/Users/admin/anaconda3/envs/py36/lib/python3.6/site-packages/coremltools/converters/mil/converter.py", line 207, in _mil_convert
**kwargs
File "/Users/admin/anaconda3/envs/py36/lib/python3.6/site-packages/coremltools/converters/mil/converter.py", line 293, in mil_convert_to_proto
prog = frontend_converter(model, **kwargs)
File "/Users/admin/anaconda3/envs/py36/lib/python3.6/site-packages/coremltools/converters/mil/converter.py", line 103, in __call__
return load(*args, **kwargs)
File "/Users/admin/anaconda3/envs/py36/lib/python3.6/site-packages/coremltools/converters/mil/frontend/torch/load.py", line 80, in load
raise e
File "/Users/admin/anaconda3/envs/py36/lib/python3.6/site-packages/coremltools/converters/mil/frontend/torch/load.py", line 72, in load
prog = converter.convert()
File "/Users/admin/anaconda3/envs/py36/lib/python3.6/site-packages/coremltools/converters/mil/frontend/torch/converter.py", line 230, in convert
convert_nodes(self.context, self.graph)
File "/Users/admin/anaconda3/envs/py36/lib/python3.6/site-packages/coremltools/converters/mil/frontend/torch/ops.py", line 67, in convert_nodes
"PyTorch convert function for op '{}' not implemented.".format(node.kind)
RuntimeError: PyTorch convert function for op 'fft_rfftn' not implemented.
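For context, the composite-operator route I have in mind is the one from the coremltools documentation: decorate a function with `register_torch_op` so the converter can lower the torch op to MIL primitives. Below is a minimal sketch of that pattern using the docs' `selu` example as a stand-in (the op name and decomposition are illustrative only, not an `fft_rfftn` implementation); what I don't see is how to express the FFT itself in MIL ops, hence this question.

```python
# Minimal sketch of coremltools' composite-operator registration
# (pattern from the coremltools docs; `selu` is a stand-in op, not fft_rfftn).
from coremltools.converters.mil import Builder as mb
from coremltools.converters.mil.frontend.torch.torch_op_registry import register_torch_op
from coremltools.converters.mil.frontend.torch.ops import _get_inputs


@register_torch_op(override=True)  # override=True in case the op already has a registration
def selu(context, node):
    # Fetch the single tensor input of the torch node.
    x = _get_inputs(context, node, expected=1)[0]
    # Rebuild SELU from MIL primitives: selu(x) = scale * elu(x, alpha).
    x = mb.elu(x=x, alpha=1.6732632423543772)
    x = mb.mul(x=x, y=1.0507009873554805, name=node.name)
    # Make the result available under the node's name for downstream ops.
    context.add(x)
```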
About this issue
- Original URL
- State: closed
- Created 3 years ago
- Reactions: 9
- Comments: 29 (1 by maintainers)
@RahulBhalley - Thanks for the suggestion. I’ve created an internal issue.
Coremltools 6.2 includes support for PyTorch’s `fft_rfftn` op. Thanks @junpeiz for adding that.
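For anyone landing here, a minimal conversion sketch on coremltools ≥ 6.2 (the toy module and names below are my own, not from this thread): the module returns the magnitude of the `rfftn` output, since complex tensors are not a supported Core ML output type, and the target format is set to `mlprogram`.

```python
import torch
import coremltools as ct

# Toy module (illustrative): real n-dimensional FFT over the last two dims,
# returning the magnitude so the model output is a real-valued tensor.
class RFFTMagnitude(torch.nn.Module):
    def forward(self, x):
        return torch.abs(torch.fft.rfftn(x, dim=(-2, -1)))

example = torch.rand(1, 3, 64, 64)
traced = torch.jit.trace(RFFTMagnitude().eval(), example)

# Requires coremltools >= 6.2, where the fft_rfftn op is supported.
mlmodel = ct.convert(
    traced,
    inputs=[ct.TensorType(name="x", shape=example.shape)],
    convert_to="mlprogram",
)
```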
@hzphzp PyTorch’s `rfft2` is just a special case of `rfftn` (see https://pytorch.org/docs/stable/generated/torch.fft.rfft2.html), and it should be trivial to use `rfftn` to replace `rfft2` in your model. If you still feel it’s necessary to add `rfft2`, please open another issue, as this issue is for `rfftn` specifically.
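As a quick illustration of that equivalence (example mine): `torch.fft.rfft2` defaults to the last two dimensions, so it matches `torch.fft.rfftn` with `dim=(-2, -1)`.

```python
import torch

x = torch.rand(4, 8, 16)

# rfft2 is rfftn restricted to the last two dimensions (its default),
# so the two calls agree and rfftn can stand in for rfft2.
a = torch.fft.rfft2(x)
b = torch.fft.rfftn(x, dim=(-2, -1))
print(torch.allclose(a, b))  # True
```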