coremltools: Error when converting tensorflow model to CoreML
Issue
I have a model trained in TensorFlow 2.x. It works perfectly in TensorFlow, OpenVINO, and ONNX Runtime, but it fails to convert to Core ML. Inference in TensorFlow is correct, but when I try to convert the model to the Core ML format I get the following error.
---------------------------------------------------------------------------
InvalidArgumentError Traceback (most recent call last)
File ~/SageMaker/envs/coreml_env/lib64/python3.8/site-packages/tensorflow/python/framework/importer.py:496, in _import_graph_def_internal(graph_def, input_map, return_elements, validate_colocation_constraints, name, producer_op_list)
495 try:
--> 496 results = c_api.TF_GraphImportGraphDefWithResults(
497 graph._c_graph, serialized, options) # pylint: disable=protected-access
498 results = c_api_util.ScopedTFImportGraphDefResults(results)
InvalidArgumentError: Input 0 of node Model1/FPN/FPN1/bn/AssignNewValue was passed float from Model1/FPN/FPN1/bn/FusedBatchNormV3/ReadVariableOp/resource:0 incompatible with expected resource.
During handling of the above exception, another exception occurred:
ValueError Traceback (most recent call last)
Cell In[15], line 13
6 width = 256
8 input_shape = ct.Shape(shape=(ct.RangeDim(lower_bound=1, upper_bound=-1),
9 ct.RangeDim(lower_bound=height, upper_bound=1024),
10 ct.RangeDim(lower_bound=width, upper_bound=1024),
11 3))
---> 13 c_model = ct.convert(model, inputs=[ct.TensorType(shape=input_shape, name=input_name)], source='tensorflow')
Source Code
Here is the source code for loading the model and converting it to the Core ML format:
import tensorflow as tf
import coremltools as ct
model_pth = "./temp_with_weights_model.h5"
model = tf.keras.models.load_model(model_pth)
print(ct.__version__)
input_name = model.inputs[0].name
height = 256
width = 256
input_shape = ct.Shape(shape=(ct.RangeDim(lower_bound=1, upper_bound=-1),
ct.RangeDim(lower_bound=height, upper_bound=1024),
ct.RangeDim(lower_bound=width, upper_bound=1024),
3))
c_model = ct.convert(model, inputs=[ct.TensorType(shape=input_shape, name=input_name)], source='tensorflow')
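(If the unbounded RangeDim turns out to be part of the problem, one possible workaround, not confirmed for this model, is to enumerate a small set of supported input sizes instead of using fully dynamic ranges. The sizes below are illustrative and reuse model and input_name from above.)
# Hedged alternative: a few enumerated shapes instead of open-ended RangeDim
enumerated_shape = ct.EnumeratedShapes(shapes=[(1, 256, 256, 3),
                                               (1, 512, 512, 3),
                                               (1, 1024, 1024, 3)],
                                       default=(1, 256, 256, 3))
c_model = ct.convert(model, inputs=[ct.TensorType(shape=enumerated_shape, name=input_name)], source='tensorflow')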
About this issue
- Original URL
- State: closed
- Created a year ago
- Comments: 17
@YifanShenSZ I replaced my custom batch norm with the official TensorFlow BatchNormalization layer, but the model still does not convert with dynamic shapes. If I make the shape static, the model converts to Core ML and works properly. I can share the error as well as the layer that is causing the issue for dynamic shapes.
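For reference, a minimal sketch of the static-shape conversion described above (the 256x256 size comes from the snippet in the issue; batch size 1 is an assumption):
# Fixed batch, height, and width; 256 taken from the snippet above
static_shape = (1, 256, 256, 3)
c_model = ct.convert(model, inputs=[ct.TensorType(shape=static_shape, name=input_name)], source='tensorflow')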
I tried bisecting the model into parts. The BatchNorm layer after the Conv layer in the ConvUnits custom layer is causing the issue. If I remove the batch norm layer, the model converts to Core ML, but the converted model is not as accurate as the original.
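A minimal sketch of the kind of Conv + BatchNorm block being described (layer names and sizes are illustrative, not the actual ConvUnits implementation); converting it on its own with dynamic spatial dimensions can confirm whether a plain BatchNormalization layer reproduces the failure:
import tensorflow as tf
import coremltools as ct

# Illustrative Conv -> BatchNorm -> ReLU block, not the actual ConvUnits layer
inputs = tf.keras.Input(shape=(None, None, 3))
x = tf.keras.layers.Conv2D(32, 3, padding='same')(inputs)
x = tf.keras.layers.BatchNormalization()(x)
outputs = tf.keras.layers.ReLU()(x)
toy_model = tf.keras.Model(inputs, outputs)

# Dynamic height/width, fixed batch size of 1 (an assumption)
input_shape = ct.Shape(shape=(1,
                              ct.RangeDim(lower_bound=256, upper_bound=1024),
                              ct.RangeDim(lower_bound=256, upper_bound=1024),
                              3))
c_toy = ct.convert(toy_model, inputs=[ct.TensorType(shape=input_shape)], source='tensorflow')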