keras: Masking zeros not supported in some layers

Hi,

I am trying to implement a model over zero-padded sequences. The problem is that when I use mask_zero=True, some layers do not support it. For example, in the following code, the Dense layer raises an error saying it does not support masking:

from keras.layers import Lambda
from keras import backend as K

# Mean over time: average the LSTM outputs over the time axis,
# collapsing (batch, time, features) to (batch, features)
def MeanOverTime():
    layer = Lambda(lambda x: K.mean(x, axis=1),
                   output_shape=lambda s: (s[0], s[2]))
    return layer

model.add(Embedding(vocab_size, emb_dim, mask_zero=True))
model.add(LSTM(lstm_dim, return_sequences=True))
model.add(MeanOverTime())
model.add(Dense(10))
model.add(Activation('softmax'))
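To illustrate why the mask matters here, averaging over the padding skews the result (toy numpy example with made-up data):

```python
import numpy as np

# Two sequences of length 4 with 2 features; the second sequence
# has its last two timesteps zero-padded.
x = np.array([
    [[1., 1.], [2., 2.], [3., 3.], [4., 4.]],
    [[2., 2.], [4., 4.], [0., 0.], [0., 0.]],
])
mask = np.array([[1., 1., 1., 1.],
                 [1., 1., 0., 0.]])

naive = x.mean(axis=1)  # averages over the padding too
masked = (x * mask[..., None]).sum(axis=1) / mask.sum(axis=1, keepdims=True)

print(naive[1])   # [1.5 1.5] -- dragged toward zero by the padding
print(masked[1])  # [3. 3.]   -- mean over the real timesteps only
```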

Is there an easy way to fix this? Thanks.

Kaveh

About this issue

  • State: closed
  • Created 8 years ago
  • Reactions: 3
  • Comments: 26 (2 by maintainers)

Most upvoted comments

This is what I did. Hope it helps someone…

from keras import backend as K
from keras.engine.topology import Layer

class MeanOverTime(Layer):
    """Masked mean over the time axis: (batch, time, feat) -> (batch, feat)."""

    def __init__(self, **kwargs):
        self.supports_masking = True
        super(MeanOverTime, self).__init__(**kwargs)

    def call(self, x, mask=None):
        if mask is not None:
            # Zero out the masked timesteps, then divide by the number of
            # real timesteps (clamped to 1 so an all-masked sequence does
            # not divide by zero).
            mask = K.cast(mask, K.floatx())
            x = x * K.expand_dims(mask)
            return K.sum(x, axis=1) / K.maximum(
                K.sum(mask, axis=1, keepdims=True), 1.0)
        return K.mean(x, axis=1)

    def get_output_shape_for(self, input_shape):
        return (input_shape[0], input_shape[-1])

    def compute_mask(self, input, input_mask=None):
        # Consume the mask so downstream layers (e.g. Dense) never see it.
        return None

Hi @braingineer , I’m new to Keras and I want to process sentences with different numbers of words in a CNN. I used zero-padding, but the layers after the Embedding layer don’t support masking. Is there any way to solve this?

inputs = Input(shape=(1, max_len), dtype='int32')
x = Embedding(vocab_size, dim, weights=GloVe, input_length=max_len)(inputs)
x = Reshape((1, max_len, 50))(x)
x = Convolution2D(nb_filter, n_gram, dim, init='glorot_uniform',
                  activation='linear', border_mode='valid',
                  subsample=(1, 1))(x)
x = MaxPooling2D(pool_size=(2, 1))(x)
x = Flatten()(x)
out = Dense(10)(x)

Conv_sen= Model(inputs,out)

By the way, does keras support global pooling? Thanks.
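For what it’s worth, global pooling just collapses the spatial axes per channel (recent Keras versions expose this as GlobalAveragePooling2D / GlobalMaxPooling2D); a numpy sketch of the arithmetic, with made-up feature maps:

```python
import numpy as np

# Toy feature maps, channels-last layout: (batch, height, width, channels)
feature_maps = np.arange(2 * 3 * 3 * 4, dtype=float).reshape(2, 3, 3, 4)

gap = feature_maps.mean(axis=(1, 2))  # global average pooling -> (2, 4)
gmp = feature_maps.max(axis=(1, 2))   # global max pooling     -> (2, 4)
```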