tensorflow: Keras Model functional API with custom submodel not working in eager execution?

System information

  • Have I written custom code (as opposed to using a stock example script provided in TensorFlow): Yes
  • OS Platform and Distribution (e.g., Linux Ubuntu 16.04): Darwin localhost 17.5.0 Darwin Kernel Version 17.5.0: Fri Apr 13 19:32:32 PDT 2018; root:xnu-4570.51.2~1/RELEASE_X86_64 x86_64 Mac OS X 10.13.4
  • TensorFlow installed from (source or binary): binary
  • TensorFlow version (use command below): 1.8
  • Python version: 2.7
  • Bazel version (if compiling from source): N/A
  • GCC/Compiler version (if compiling from source): N/A
  • CUDA/cuDNN version: N/A
  • GPU model and memory: N/A
  • Exact command to reproduce: See below

Describe the problem

Consider the following keras model with a custom submodel:

tf.enable_eager_execution()

class SubModel(tf.keras.Model):
    def __init__(self):
        super(SubModel, self).__init__()
        self.layer = tf.keras.layers.Dense(3) 
    def call(self, inputs):
        return self.layer(inputs)

def MyModel():
    input = tf.keras.Input(shape=(3, 3))
    m = SubModel()
    output = m(input)
    return tf.keras.Model(input, output)

m = MyModel()
m(tf.constant(tf.ones([3, 3])))

where SubModel is a custom model and MyModel() uses it through the functional API. This code raises the following error:

File "/Library/Python/2.7/site-packages/tensorflow/python/keras/_impl/keras/engine/network.py", line 639, in compute_output_shape raise NotImplementedError

I think it might be because the submodel cannot compute its output shape. So I added a compute_output_shape method to the SubModel class, as if it were a custom layer:

class SubModel(tf.keras.Model):
    ...
    def compute_output_shape(self, input_shape):
        return (input_shape[0], 3)

Now the NotImplementedError is gone, but running the model raises a new error:

AssertionError: Could not compute output DeferredTensor('None', shape=(3,), dtype=float32)

Now I don’t know what to do. The code works in non-eager mode, so I guess this is a bug in the Keras functional API under eager execution?

About this issue

  • Original URL
  • State: closed
  • Created 6 years ago
  • Comments: 26 (6 by maintainers)

Most upvoted comments

How can I specify the input shape in the imperative (Model Subclassing) API without calling compile, so that I can convert the model to SavedModel format? If that is not possible, it’s very strange to position subclassing as the main way to create new models…

@David-Mao I know that, but it’s impossible in your situation. It helps to understand why nearly all APIs in Keras are compatible with both eager and graph mode: classes like tf.keras.layers.Conv2D are just combinations of internal TensorFlow operations like tf.get_variable and tf.nn.conv2d, which are naturally compatible with both. But there are exceptions. The most typical one is tf.placeholder, which stands for an empty, symbolic tensor in graph mode. In eager mode every operation must immediately produce an output tensor with a concrete value, so an operation like tf.placeholder makes no sense there and will never be eager-compatible; the same goes for tf.keras.Input, which makes use of it.
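The distinction above can be sketched in plain Python, with no TensorFlow involved (the SymbolicTensor class and both functions below are hypothetical stand-ins): a graph-mode placeholder carries only a shape and records ops for later, while eager mode demands a concrete value from every operation immediately.

```python
# Hypothetical sketch (no TensorFlow): why a value-less placeholder fits
# graph mode but not eager mode.

class SymbolicTensor:
    """Graph mode: a node that knows its shape but has no value yet."""
    def __init__(self, shape):
        self.shape = shape

def graph_matmul(a, b):
    # Building a graph only records the op and infers the result shape;
    # nothing is computed until the graph is run with real data.
    return SymbolicTensor((a.shape[0], b.shape[1]))

def eager_matmul(a, b):
    # Eager mode: the op must return a concrete value right away,
    # so a placeholder with no value has nothing to contribute here.
    return [[sum(x * y for x, y in zip(row, col)) for col in zip(*b)]
            for row in a]

x = SymbolicTensor((2, 3))
w = SymbolicTensor((3, 4))
out = graph_matmul(x, w)                   # fine: still symbolic, shape (2, 4)
val = eager_matmul([[1, 2]], [[3], [4]])   # fine: concrete values, [[11]]
# eager_matmul(x, w) would fail -- a placeholder has no values to multiply.
```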

Yep, just sent auto list tracking for review, then we’ll do auto dict and tuple/namedtuple tracking.

@David-Mao

class MyModel(Model):
    def __init__(self):
        super(MyModel, self).__init__()
        # Note: `self.layers = []` would fail here -- `layers` is a
        # built-in read-only property on tf.keras.Model.
        for i in range(20):
            setattr(self, "layer%i" % i, MyLayer())

    def call(self, inputs):
        x = inputs[0]
        y = inputs[1]
        for i in range(20):
            x = getattr(self, "layer%i" % i)(x, y)
        return x

Hi @David-Mao,

I hit the same issue: creating layers in a loop raises “AttributeError: can’t set attribute”. It turns out that “self.layers” conflicts with an existing attribute of tf.keras.Model.

class MyModel(tf.keras.Model):
    def __init__(self):
        super(MyModel, self).__init__()
        self.layers = []  # conflicts with the built-in `layers` property
        for i in range(20):
            self.layers.append(tf.keras.layers.Dense(units=10))
    def call(self, inputs):
        x = inputs
        for i in range(20):
            x = self.layers[i](x)  # chain each layer onto the previous output
        return x
input_layer = tf.keras.layers.Input(shape=(10,))
modules = MyModel()(input_layer)
model = tf.keras.Model(inputs=input_layer, outputs=modules)
model.summary()

which gives you the AttributeError.

However,

class MyModel(tf.keras.Model):
    def __init__(self):
        super(MyModel, self).__init__()
        self.iamlist = []  # a name that doesn't collide with Model's attributes
        for i in range(20):
            self.iamlist.append(tf.keras.layers.Dense(units=10))
    def call(self, inputs):
        x = inputs
        for i in range(20):
            x = self.iamlist[i](x)  # chain each layer onto the previous output
        return x
input_layer = tf.keras.layers.Input(shape=(10,))
modules = MyModel()(input_layer)
model = tf.keras.Model(inputs=input_layer, outputs=modules)
model.summary()

this case passes. The only difference is “self.layers” -> “self.iamlist”.

So I think the conclusion is: don’t define attributes that conflict with the built-in attributes of tf.keras.Model, such as “self.layers”, “self.output”, and so on. (This applies whether or not eager execution is enabled.)

Hope it helps
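The failure mode described above can be reproduced in plain Python without TensorFlow (Base and Child below are hypothetical stand-ins): tf.keras.Model exposes `layers` as a read-only property, and assigning to a property that has no setter raises an AttributeError, while any name the base class does not claim is assigned normally.

```python
# Hypothetical sketch (no TensorFlow) of the attribute collision:
# a read-only property on the base class blocks assignment in a subclass.

class Base:
    @property
    def layers(self):  # read-only: no setter is defined
        return list(self._tracked)

class Child(Base):
    def __init__(self):
        self._tracked = []
        try:
            self.layers = []     # collides with the property -> AttributeError
        except AttributeError as e:
            self.error = str(e)  # e.g. "can't set attribute" (exact wording
                                 # varies by Python version)
        self.iamlist = []        # a fresh name works fine

c = Child()
```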

Yes, the intention is for the Keras API to work smoothly with or without eager execution enabled. Thanks for the report.

@David-Mao : Having list-valued attributes should be functional soon (CC @allenlavoie who was adding support for that). So the example you mentioned in https://github.com/tensorflow/tensorflow/issues/20338#issuecomment-400887101 will be supported in a future release (@allenlavoie can correct me if I’m wrong).

@fchollet @pavithrasv : Could you comment on the issue here?