tensorflow: Type inference failed. This indicates an invalid graph that escaped type checking. Error message: INVALID_ARGUMENT
Issue Type
Bug
Source
binary
Tensorflow Version
2.9.1
Custom Code
No
OS Platform and Distribution
Linux Ubuntu 20.04
Mobile device
No response
Python version
3.8
Bazel version
No response
GCC/Compiler version
No response
CUDA/cuDNN version
No response
GPU model and memory
No response
Current Behaviour?
When running `model.predict` I am getting the warning below. What does it mean and how can I fix it?
Standalone code to reproduce the issue
The model is a keras.functional.Functional model whose first layers come from TensorFlow Hub pretrained models, followed by two Dense layers at the end. Compiling and fitting do not give any warning; however, when running `model.predict(["examplestring"])` I get the warning.
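A minimal sketch of this kind of model (the hub handles, layer sizes, and loss here are placeholders, not the exact ones I use):

```python
import tensorflow as tf
import tensorflow_hub as hub
import tensorflow_text  # noqa: F401 - registers ops needed by the BERT preprocessing SavedModel

# Placeholder TF Hub handles; the actual pretrained models are not shown in this issue.
preprocess = hub.KerasLayer(
    "https://tfhub.dev/tensorflow/bert_en_uncased_preprocess/3", name="preprocessing")
encoder = hub.KerasLayer(
    "https://tfhub.dev/tensorflow/small_bert/bert_en_uncased_L-2_H-128_A-2/2",
    trainable=False, name="encoder")

text_in = tf.keras.Input(shape=(), dtype=tf.string, name="text")
pooled = encoder(preprocess(text_in))["pooled_output"]
x = tf.keras.layers.Dense(64, activation="relu")(pooled)
out = tf.keras.layers.Dense(1, activation="sigmoid")(x)
model = tf.keras.Model(text_in, out)

model.compile(optimizer="adam", loss="binary_crossentropy")  # no warning here
model.predict(["examplestring"])  # the warning is emitted during this call
```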
Relevant log output
2022-08-09 15:37:17.301561: W tensorflow/core/common_runtime/forward_type_inference.cc:231] Type inference failed. This indicates an invalid graph that escaped type checking. Error message: INVALID_ARGUMENT: expected a subtype of type_id: TFT_TENSOR
args {
type_id: TFT_LEGACY_VARIANT
}
for input 2 of a homogeneous container 1001, got type_id: TFT_RAGGED
args {
type_id: TFT_INT32
}
while inferring type of node 'model/preprocessing/StatefulPartitionedCall/StatefulPartitionedCall/StatefulPartitionedCall/StatefulPartitionedCall/StatefulPartitionedCall/bert_pack_inputs/PartitionedCall/map/while/body/_337/map/while/TensorArrayV2Write/TensorListSetItem'
About this issue
- State: open
- Created 2 years ago
- Comments: 28 (7 by maintainers)
I also got a `Type inference failed` error when running `tf.keras.Model.fit()` in TF 2.9. I didn't see this kind of error in TF 2.8 with identical code.
Source
binary
Tensorflow Version
2.9.1
OS Platform and Distribution
Linux Ubuntu 18.04
Python version
3.7
Current behaviour
Run tf.keras.Model.fit() and the error shows up.
Standalone code to reproduce the issue
Link to source code: https://drive.google.com/file/d/1k78lpGVthB7nthEkYgUs3JNJTuR79r5E/view?usp=sharing To reproduce the error, start up Jupyter with a terminal and run all cells. The error will show up in the terminal.
Note: In the source code, I used numpy to generate random training data because the dataset is private and can’t be shared at this moment.
Relevant log output
Cannot reopen from my side; I will open a new issue.
The same warning always occurs. I am training a model made of LSTMs with masking and one Dense layer before the output. I use miniconda with Python 3.8, TF 2.11, and cudatoolkit 11.1.1.
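A minimal sketch of the kind of setup described above (shapes, layer sizes, and random data are placeholder assumptions, not the actual model or dataset):

```python
import numpy as np
import tensorflow as tf

# Illustrative sizes only: masked LSTM stack with one Dense layer before the output.
model = tf.keras.Sequential([
    tf.keras.layers.Masking(mask_value=0.0, input_shape=(20, 8)),  # mask padded timesteps
    tf.keras.layers.LSTM(32, return_sequences=True),
    tf.keras.layers.LSTM(32),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")

x = np.random.rand(64, 20, 8).astype("float32")  # placeholder data
y = np.random.rand(64, 1).astype("float32")
model.fit(x, y, epochs=1)  # the warning was reported to appear during fit() in this kind of setup
```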
I’m also seeing this warning when trying to train a custom image segmentation model with pixellib
Met similar problem on TF 2.9.2 (multi-worker run; each rank prints the same kind of warning):

2022-10-21 06:40:24.709289: W tensorflow/core/common_runtime/forward_type_inference.cc:231] Type inference failed. This indicates an invalid graph that escaped type checking. Error message: INVALID_ARGUMENT: expected compatible input types, but input 1:
type_id: TFT_OPTIONAL
args {
  type_id: TFT_PRODUCT
  args {
    type_id: TFT_TENSOR
    args {
      type_id: TFT_INT32
    }
  }
}
is neither a subtype nor a supertype of the combined inputs preceding it:
type_id: TFT_OPTIONAL
args {
  type_id: TFT_PRODUCT
  args {
    type_id: TFT_TENSOR
    args {
      type_id: TFT_HALF
    }
  }
}
while inferring type of node 'cond_39/output/_24'

The other ranks print equivalent warnings (TFT_INT32 vs TFT_HALF, TFT_HALF vs TFT_INT32, and TFT_LEGACY_VARIANT vs TFT_HALF) for nodes 'cond_39/output/_24', 'cond_39/output/_23', and 'cond_39/output/_19', followed by repeated lines of the form "[2cf69587bce7:07009] Read -1, expected <N>, errno = 1" from each worker.
Met similar problem on TF 2.9/2.10
2022-10-05 14:05:12.123396: W tensorflow/core/common_runtime/forward_type_inference.cc:231] Type inference failed. This indicates an invalid graph that escaped type checking. Error message: INVALID_ARGUMENT: expected compatible input types, but input 1:
type_id: TFT_OPTIONAL
args {
  type_id: TFT_PRODUCT
  args {
    type_id: TFT_TENSOR
    args {
      type_id: TFT_BOOL
    }
  }
}
is neither a subtype nor a supertype of the combined inputs preceding it:
type_id: TFT_OPTIONAL
args {
  type_id: TFT_PRODUCT
  args {
    type_id: TFT_TENSOR
    args {
      type_id: TFT_LEGACY_VARIANT
    }
  }
}
@sushreebarsa OK! I've posted the issue on the Keras repo (Link). Thank you.