tensorflow: OperatorNotAllowedInGraphError: Iterating over a symbolic `tf.Tensor` is not allowed when using a dataset with tuples


Issue Type

Bug

Have you reproduced the bug with TF nightly?

No

Source

source

Tensorflow Version

2.11

Custom Code

Yes

OS Platform and Distribution

Windows

Mobile device

No response

Python version

3.9

Bazel version

No response

GCC/Compiler version

No response

CUDA/cuDNN version

No response

GPU model and memory

No response

Current Behaviour?

I am trying to create my own transformer and train it. For this purpose, I use a `tf.data.Dataset` to handle my data. The data is created with a code snippet from the [tf.data.Dataset.from_tensor_slices() documentation](https://www.tensorflow.org/api_docs/python/tf/data/Dataset#from_tensor_slices). Nevertheless, TensorFlow gives me the following error when I call the fit() method:

> "OperatorNotAllowedInGraphError: Iterating over a symbolic tf.Tensor is not allowed: AutoGraph did convert this function. This might indicate you are trying to use an unsupported feature."

The code has been reduced significantly, just enough to reproduce the issue.

I've also tried passing the data as a dictionary instead of a tuple in the dataset, and a couple more things, but nothing worked. It seems that I am missing something.
Here is a link to [google colab example](https://colab.research.google.com/drive/1mn6iseJLnJwTmwakYa2XuxszKtR6sV9G#scrollTo=Cj9g0bGN1Fo3)

Standalone code to reproduce the issue

```python
import numpy as np
import tensorflow as tf

batched_features = tf.constant([[[1, 3], [2, 3]],
                                [[2, 1], [1, 2]],
                                [[3, 3], [3, 2]]], shape=(3, 2, 2))
batched_labels = tf.constant([['A', 'A'],
                              ['B', 'B'],
                              ['A', 'B']], shape=(3, 2, 1))
dataset = tf.data.Dataset.from_tensor_slices((batched_features, batched_labels))
dataset = dataset.batch(1)
for element in dataset.as_numpy_iterator():
    print(element)

class MyTransformer(tf.keras.Model):
    def __init__(self):
        super().__init__()

    def call(self, inputs, training):
        print(type(inputs))
        # this tuple unpacking is what triggers OperatorNotAllowedInGraphError inside fit()
        feature, label = inputs
        return feature

model = MyTransformer()
model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=1e-3),
              loss=tf.keras.losses.BinaryCrossentropy(),
              metrics=[tf.keras.metrics.BinaryAccuracy(),
                       tf.keras.metrics.FalseNegatives()])

model.fit(dataset, batch_size=1, epochs=1)
```

Relevant log output

No response

About this issue

  • Original URL
  • State: closed
  • Created a year ago
  • Comments: 19 (5 by maintainers)

Most upvoted comments

Hi @mihail-vladov,

  1. Why does the model.fit function not call the model.call function according to the documentation?

model.fit does call the call() function; you can confirm this by adding a print() inside it. However, when you pass a dataset of (features, labels) tuples to model.fit, Keras unpacks each tuple internally and passes only the features tensor to call(). Inside call(), `inputs` is therefore a single symbolic Tensor, and unpacking it as a tuple iterates over that Tensor, which raises the error.
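A minimal sketch of this behaviour (hypothetical `SmallModel` name, simplified to numeric features and labels so the MSE loss is computable): when call() accepts the features tensor alone, fit() runs without the iteration error.

```python
import numpy as np
import tensorflow as tf

class SmallModel(tf.keras.Model):
    def __init__(self):
        super().__init__()
        self.dense = tf.keras.layers.Dense(2)

    def call(self, inputs, training=False):
        # inputs is the features tensor alone; Keras has already split
        # the (features, labels) tuple inside fit()
        return self.dense(tf.cast(inputs, tf.float32))

model = SmallModel()
model.compile(optimizer="adam", loss="mse")

features = np.ones((4, 2), dtype=np.float32)
labels = np.zeros((4, 2), dtype=np.float32)
dataset = tf.data.Dataset.from_tensor_slices((features, labels)).batch(2)

history = model.fit(dataset, epochs=1, verbose=0)  # no iteration error
```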

  2. What is the correct way to obtain the feature and label data in the model.call function when I pass a dataset to the model.fit function?

To get custom behaviour tailored to your requirements, you need to override train_step. Please refer to the attached tutorials for more details.
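As a hedged sketch of that approach (not the reporter's actual model; a hypothetical regression with an MSE loss for illustration), the (features, labels) tuple is unpacked inside an overridden train_step, and call() only ever sees the features:

```python
import numpy as np
import tensorflow as tf

class MyTransformer(tf.keras.Model):
    """Sketch: unpack (features, labels) in train_step, not in call()."""

    def __init__(self):
        super().__init__()
        self.dense = tf.keras.layers.Dense(1)
        self.loss_fn = tf.keras.losses.MeanSquaredError()
        self.loss_tracker = tf.keras.metrics.Mean(name="loss")

    def call(self, features, training=False):
        # call() handles features only; labels never reach it
        return self.dense(tf.cast(features, tf.float32))

    def train_step(self, data):
        features, labels = data  # the dataset tuple is available here
        with tf.GradientTape() as tape:
            predictions = self(features, training=True)
            loss = self.loss_fn(labels, predictions)
        grads = tape.gradient(loss, self.trainable_variables)
        self.optimizer.apply_gradients(zip(grads, self.trainable_variables))
        self.loss_tracker.update_state(loss)
        return {"loss": self.loss_tracker.result()}

    @property
    def metrics(self):
        # listing the tracker here lets Keras reset it between epochs
        return [self.loss_tracker]

features = np.random.rand(6, 2).astype("float32")
labels = np.random.rand(6, 1).astype("float32")
dataset = tf.data.Dataset.from_tensor_slices((features, labels)).batch(2)

model = MyTransformer()
model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=1e-3))
history = model.fit(dataset, epochs=1, verbose=0)
```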

Thank you!