tensorflow: [TF2.0.0]ValueError: The two structures don't have the same nested structure.
I can run my code with tensorflow==2.0.0b1, but after updating TensorFlow to 2.0.0 I get an error. This is my code:
import tensorflow as tf
from tensorflow import feature_column

print('tf version:', tf.__version__)

feature_description = {
    'h_k_u_watchanch_his': tf.io.VarLenFeature(tf.string),
    'a_gender': tf.io.FixedLenFeature(shape=(1,), dtype=tf.int64),
    'l_label': tf.io.FixedLenFeature([], tf.int64)
}

feature_columns = []
thal = feature_column.categorical_column_with_hash_bucket(
    'h_k_u_watchanch_his', hash_bucket_size=100
)
thal_one_hot = feature_column.embedding_column(thal, dimension=10, combiner='mean')
feature_columns.append(thal_one_hot)

dataSet = tf.data.TFRecordDataset(
    "/Users/lyx/projects/recommend/embedding/tmp/PUSH.TFRecords/dt=20191012/hour=10/part-r-00000")

def _parse_function(serialized_example):
    feature = tf.io.parse_single_example(
        serialized_example,
        feature_description
    )
    label = feature.get('l_label')
    return feature, label

parsed_dataset = dataSet.map(_parse_function)

input1 = tf.keras.Input(shape=(), name='h_k_u_watchanch_his', sparse=True, dtype=tf.string)
input2 = tf.keras.Input(shape=(), name='a_gender', dtype=tf.int64)
input_layers = {'h_k_u_watchanch_his': input1, 'a_gender': input2}

feature_layer = tf.keras.layers.DenseFeatures(feature_columns, name='DenseFeatures')(input_layers)
outputs = tf.keras.layers.Dense(1, activation='sigmoid')(feature_layer)
model = tf.keras.Model(inputs=[input1, input2], outputs=outputs)

model.compile(
    optimizer='adam',
    loss='binary_crossentropy',
    metrics=['accuracy']
)
model.fit(
    x=parsed_dataset,
    validation_data=parsed_dataset,
    epochs=5,
)
loss, accuracy = model.evaluate(parsed_dataset)
print("Accuracy", accuracy)
Error output:
tf version: 2.0.0
2019-10-18 09:59:10.095365: I tensorflow/core/platform/cpu_feature_guard.cc:142] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 FMA
2019-10-18 09:59:10.118556: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x7fbcd9c21e00 executing computations on platform Host. Devices:
2019-10-18 09:59:10.118572: I tensorflow/compiler/xla/service/service.cc:175] StreamExecutor device (0): Host, Default Version
Traceback (most recent call last):
  File "/usr/local/lib/python3.7/site-packages/tensorflow_core/python/util/nest.py", line 318, in assert_same_structure
    expand_composites)
ValueError: The two structures don't have the same nested structure.
First structure: type=TensorSpec str=TensorSpec(shape=(1,), dtype=tf.int64, name=None)
Second structure: type=SparseTensor str=SparseTensor(indices=Tensor("h_k_u_watchanch_his/indices:0", shape=(None, 1), dtype=int64), values=Tensor("h_k_u_watchanch_his/values:0", shape=(None,), dtype=string), dense_shape=Tensor("h_k_u_watchanch_his/shape:0", shape=(1,), dtype=int64))
More specifically: Substructure "type=SparseTensor str=SparseTensor(indices=Tensor("h_k_u_watchanch_his/indices:0", shape=(None, 1), dtype=int64), values=Tensor("h_k_u_watchanch_his/values:0", shape=(None,), dtype=string), dense_shape=Tensor("h_k_u_watchanch_his/shape:0", shape=(1,), dtype=int64))" is a sequence, while substructure "type=TensorSpec str=TensorSpec(shape=(1,), dtype=tf.int64, name=None)" is not
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
  File "/Users/lyx/projects/recommend/embedding/tmp/test.py", line 52, in <module>
    epochs=5,
  File "/usr/local/lib/python3.7/site-packages/tensorflow_core/python/keras/engine/training.py", line 728, in fit
    use_multiprocessing=use_multiprocessing)
  File "/usr/local/lib/python3.7/site-packages/tensorflow_core/python/keras/engine/training_v2.py", line 224, in fit
    distribution_strategy=strategy)
  File "/usr/local/lib/python3.7/site-packages/tensorflow_core/python/keras/engine/training_v2.py", line 547, in _process_training_inputs
    use_multiprocessing=use_multiprocessing)
  File "/usr/local/lib/python3.7/site-packages/tensorflow_core/python/keras/engine/training_v2.py", line 594, in _process_inputs
    steps=steps)
  File "/usr/local/lib/python3.7/site-packages/tensorflow_core/python/keras/engine/training.py", line 2497, in _standardize_user_data
    nest.assert_same_structure(a, b, expand_composites=True)
  File "/usr/local/lib/python3.7/site-packages/tensorflow_core/python/util/nest.py", line 325, in assert_same_structure
    % (str(e), str1, str2))
ValueError: The two structures don't have the same nested structure.
First structure: type=TensorSpec str=TensorSpec(shape=(1,), dtype=tf.int64, name=None)
Second structure: type=SparseTensor str=SparseTensor(indices=Tensor("h_k_u_watchanch_his/indices:0", shape=(None, 1), dtype=int64), values=Tensor("h_k_u_watchanch_his/values:0", shape=(None,), dtype=string), dense_shape=Tensor("h_k_u_watchanch_his/shape:0", shape=(1,), dtype=int64))
More specifically: Substructure "type=SparseTensor str=SparseTensor(indices=Tensor("h_k_u_watchanch_his/indices:0", shape=(None, 1), dtype=int64), values=Tensor("h_k_u_watchanch_his/values:0", shape=(None,), dtype=string), dense_shape=Tensor("h_k_u_watchanch_his/shape:0", shape=(1,), dtype=int64))" is a sequence, while substructure "type=TensorSpec str=TensorSpec(shape=(1,), dtype=tf.int64, name=None)" is not
Entire first structure:
.
Entire second structure:
.
Looking forward to your reply!
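For readers hitting the same error: one way this kind of mismatch is sometimes sidestepped in TF 2.0 is to densify the VarLenFeature during parsing so the dataset never yields a SparseTensor. This is only a hypothetical sketch, not the fix adopted in this thread; it reuses the feature_description and dataSet defined above and assumes that padding the variable-length feature with empty strings is acceptable for your data.
def _parse_function_dense(serialized_example):
    # Same parsing as above, but convert the variable-length sparse feature to
    # a dense string tensor so Keras only ever sees dense tensors.
    feature = tf.io.parse_single_example(serialized_example, feature_description)
    feature['h_k_u_watchanch_his'] = tf.sparse.to_dense(
        feature['h_k_u_watchanch_his'], default_value="")
    label = feature.get('l_label')
    return feature, label

# Pad the variable-length feature so examples can be batched; note that the ""
# padding becomes an extra hashed category, which may or may not be acceptable.
parsed_dataset = dataSet.map(_parse_function_dense).padded_batch(
    32,
    padded_shapes=({'h_k_u_watchanch_his': [None],
                    'a_gender': [1],
                    'l_label': []}, []))

# The matching model input would then be dense and variable-length, e.g.
# tf.keras.Input(shape=(None,), name='h_k_u_watchanch_his', dtype=tf.string)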
About this issue
- Original URL
- State: closed
- Created 5 years ago
- Reactions: 5
- Comments: 27 (8 by maintainers)
You basically solved the issue by just avoiding it. I think this is still an important issue even in TF 2.3.0. I get the error when doing transfer learning with Hugging Face BERT models. I created my model and saved it at the end of training. To deploy it with Flask, I simply tried to load the saved model but got:
ValueError: The two structures don't have the same nested structure.
First structure: type=TensorSpec str=TensorSpec(shape=(None, 512), dtype=tf.int32, name='inputs')
Second structure: type=dict str={'input_ids': TensorSpec(shape=(None, 5), dtype=tf.int32, name='input_ids')}
@Subfly @MattdaVill
I solved it with the approach from the huggingface-transformers issue about this problem: https://github.com/huggingface/transformers/issues/3627
I think the reason that works may be that self.bert = TFBertMainLayer(config, name="bert") and TFBertMainLayer is inherited from tf.keras.layers.Layer.
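For illustration, here is a minimal sketch of the pattern that comment seems to describe: wrapping TFBertMainLayer (a regular tf.keras.layers.Layer) inside a subclassed Keras model instead of saving and reloading a whole TFBertModel. The class name and classification head are made up for the example, and the import path and return type of TFBertMainLayer vary across transformers releases.
import tensorflow as tf
from transformers import BertConfig
# The import path for TFBertMainLayer is version-dependent; in older
# transformers releases it lives in transformers.modeling_tf_bert.
from transformers.modeling_tf_bert import TFBertMainLayer

class BertClassifier(tf.keras.Model):
    # Hypothetical wrapper model, not code from this thread.
    def __init__(self, config, num_labels):
        super().__init__()
        # TFBertMainLayer is a tf.keras.layers.Layer, so Keras tracks and
        # serializes it like any other layer.
        self.bert = TFBertMainLayer(config, name="bert")
        self.classifier = tf.keras.layers.Dense(num_labels, activation="softmax")

    def call(self, inputs):
        # inputs is a dict such as {"input_ids": ..., "attention_mask": ...}.
        # Older transformers releases return a (sequence_output, pooled_output, ...)
        # tuple; index 1 is the pooled [CLS] representation.
        outputs = self.bert(inputs)
        pooled_output = outputs[1]
        return self.classifier(pooled_output)

config = BertConfig.from_pretrained("bert-base-uncased")
model = BertClassifier(config, num_labels=2)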
I had a similar problem involving tf.map_fn and I solved it by indicating the dtype, as advised here. I see that map_fn is not involved here, but maybe something similar is happening where a dtype is not specified where it should be. For map_fn, it is actually stated in the docs that you should provide it if the dtype of the output is not the same as that of the input.
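As a small illustration of that map_fn point (a standalone sketch, unrelated to the code in this issue): when the mapped function returns a different dtype than its input, tf.map_fn needs the output dtype spelled out.
import tensorflow as tf

strings = tf.constant(["1.5", "2.0", "3.25"])
# Without dtype=..., map_fn assumes the output dtype matches the input
# (tf.string here) and fails; newer TF versions name this argument
# fn_output_signature instead of dtype.
numbers = tf.map_fn(
    lambda s: tf.strings.to_number(s, out_type=tf.float32),
    strings,
    dtype=tf.float32,
)
print(numbers)  # tf.Tensor([1.5  2.   3.25], shape=(3,), dtype=float32)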
@amahendrakar I can confirm that this issue was solved in TF-nightly (2.4).
Thank you for the support!