tensorflow: Cannot convert a TF2 saved model to a TensorRT engine and save it.
System information
- Have I written custom code (as opposed to using a stock example script provided in TensorFlow):
- OS Platform and Distribution (e.g., Linux Ubuntu 16.04): Ubuntu 18.04
- Mobile device (e.g. iPhone 8, Pixel 2, Samsung Galaxy) if the issue happens on mobile device:
- TensorFlow installed from (source or binary): binary
- TensorFlow version: nightly
- Python version: 3.7
- Bazel version (if compiling from source):
- GCC/Compiler version (if compiling from source):
- CUDA/cuDNN version: 10.0
- GPU model and memory: 2070Ti 8GB
Describe the current behavior
I want to convert a TF2 saved model to a TensorRT engine (already built). I followed the instructions here, but it only gives me errors. I give my code and logs below.
BTW, I have TensorRT 6.0.1 installed on my PC.
Describe the expected behavior
Convert saved model to a TensorRT engine and save it.
Code to reproduce the issue
import numpy as np
import tensorflow as tf
from tensorflow.python.compiler.tensorrt import trt_convert as trt
model = tf.keras.applications.MobileNetV2(input_shape=[128, 128, 3], include_top=False)
model.save("dir1/")
input_saved_model_dir = "dir1/"
output_saved_model_dir = "dir2/"
params = trt.DEFAULT_TRT_CONVERSION_PARAMS._replace(precision_mode='FP16', maximum_cached_engines=16, is_dynamic_op=True)
converter = trt.TrtGraphConverterV2(input_saved_model_dir=input_saved_model_dir, conversion_params=params)
converter.convert()
# converter.save(output_saved_model_dir)
def my_input_fn():
    num_runs = 10
    for _ in range(num_runs):
        inp1, inp2 = np.random.random([4, 128, 128, 3]), np.random.random([4, 128, 128, 1])
        yield inp1, inp2
converter.build(input_fn=my_input_fn) # Generate corresponding TRT engines
converter.save(output_saved_model_dir) # Generated engines will be saved.
Other info / logs
WARNING:tensorflow:From /home/shl666/anaconda3/envs/tf-nightly/lib/python3.7/site-packages/tensorflow_core/python/ops/resource_variable_ops.py:1788: calling BaseResourceVariable.__init__ (from tensorflow.python.ops.resource_variable_ops) with constraint is deprecated and will be removed in a future version.
Instructions for updating:
If using Keras pass *_constraint arguments to layers.
[11/29/2019 21:46:27 WARNING] From /home/shl666/anaconda3/envs/tf-nightly/lib/python3.7/site-packages/tensorflow_core/python/ops/resource_variable_ops.py:1788: calling BaseResourceVariable.__init__ (from tensorflow.python.ops.resource_variable_ops) with constraint is deprecated and will be removed in a future version.
Instructions for updating:
If using Keras pass *_constraint arguments to layers.
INFO:tensorflow:Assets written to: inference/sample/saved_model/assets
[11/29/2019 21:46:28 INFO] Assets written to: inference/sample/saved_model/assets
INFO:tensorflow:Linked TensorRT version: (6, 0, 1)
INFO:tensorflow:Loaded TensorRT version: (6, 0, 1)
---------------------------------------------------------------------------
TypeError Traceback (most recent call last)
<ipython-input-5-9b2262c7c046> in <module>
14 yield inp1, inp2
15
---> 16 converter.build(input_fn=my_input_fn) # Generate corresponding TRT engines
17 converter.save(output_saved_model_dir) # Generated engines will be saved.
~/anaconda3/envs/tf-nightly/lib/python3.7/site-packages/tensorflow_core/python/compiler/tensorrt/trt_convert.py in build(self, input_fn)
1049 """
1050 for inp in input_fn():
-> 1051 self._converted_func(*map(ops.convert_to_tensor, inp))
1052
1053 def save(self, output_saved_model_dir):
~/anaconda3/envs/tf-nightly/lib/python3.7/site-packages/tensorflow_core/python/eager/function.py in __call__(self, *args, **kwargs)
1549 TypeError: For invalid positional/keyword argument combinations.
1550 """
-> 1551 return self._call_impl(args, kwargs)
1552
1553 def _call_impl(self, args, kwargs, cancellation_manager=None):
~/anaconda3/envs/tf-nightly/lib/python3.7/site-packages/tensorflow_core/python/eager/function.py in _call_impl(self, args, kwargs, cancellation_manager)
1568 "of {}), got {}. When calling a concrete function, positional "
1569 "arguments may not be bound to Tensors within nested structures."
-> 1570 ).format(self._num_positional_args, self._arg_keywords, args))
1571 args = list(args)
1572 for keyword in self._arg_keywords[len(args):]:
TypeError: Expected at most 1 positional arguments (and the rest keywords, of ['input_3']), got (<tf.Tensor: shape=(4, 128, 128, 3), dtype=float64, numpy=
array([[[[0.37694877, 0.14433618, 0.15041287],
[0.28948027, 0.52537459, 0.51882755],
[0.48836555, 0.45480828, 0.0779434 ],
....
....
....
[0.02064168, 0.76855384, 0.64690949],
[0.98667418, 0.9156569 , 0.58136711],
[0.84539588, 0.7338271 , 0.15894349]],
[[0.03578103, 0.24472914, 0.62393987],
[0.51681037, 0.75395885, 0.83268599],
[0.81616645, 0.4863684 , 0.25114351],
...,
[0.08293289, 0.90668744, 0.94976784],
[0.39264721, 0.33914333, 0.58584557],
[0.57539905, 0.29829974, 0.33732885]]]])>, <tf.Tensor: shape=(4, 128, 128, 1), dtype=float64, numpy=
array([[[[0.55649055],
[0.97303716],
[0.90465544],
...,
[0.76755675],
[0.07142947],
[0.80021061]],
....
....
....
[[0.88080569],
[0.53601004],
[0.69139559],
...,
[0.89977499],
[0.73776336],
[0.65023825]]]])>). When calling a concrete function, positional arguments may not be bound to Tensors within nested structures.
About this issue
- State: closed
- Created 5 years ago
- Comments: 18 (7 by maintainers)
Are you sure your model has two placeholder inputs? I think input_fn is expected to yield a list/tuple of input data with the same length as the number of inputs in the model. In your case:
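For example, a minimal sketch of what my_input_fn could look like, assuming the model really has only the single MobileNetV2 input of shape [128, 128, 3]:

def my_input_fn():
    # Yield a one-element tuple matching the model's single input:
    # a batch of 4 random 128x128 RGB images, cast to float32.
    num_runs = 10
    for _ in range(num_runs):
        inp = np.random.random([4, 128, 128, 3]).astype(np.float32)
        yield (inp,)

converter.build(input_fn=my_input_fn)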
This will ensure that an engine with batch_size=4 is created and cached, according to the documentation.
TF doesn’t fully support TRT7 yet. @bixia1 may know more details