keras: Masking zeros not supported in some layers
Hi,
I am trying to implement a model over zero-padded sequences. The problem is that when I use mask_zero=True,
some downstream layers do not support the mask. For example, in the following code, the Dense layer throws an error saying it does not support masking:
# Mean over time implementation
def MeanOverTime():
    layer = Lambda(lambda x: K.mean(x, axis=1), output_shape=lambda s: (s[0], s[2]))
    return layer
model.add(Embedding(vocab_size, emb_dim, mask_zero=True))
model.add(LSTM(lstm_dim, return_sequences=True))
model.add(MeanOverTime())
model.add(Dense(10))
model.add(Activation('softmax'))
Is there an easy way to fix this? Thanks.
Kaveh
About this issue
- Original URL
- State: closed
- Created 8 years ago
- Reactions: 3
- Comments: 26 (2 by maintainers)
This is what I did. Hope it helps someone…
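(The snippet from this comment was not captured here. A common workaround for the original question, not necessarily what this commenter did, is a custom layer with supports_masking = True that averages only the non-masked timesteps. The computation such a layer performs can be sketched in plain numpy:)

```python
import numpy as np

def masked_mean_over_time(x, mask):
    # x:    (batch, timesteps, features) activations, e.g. LSTM output
    # mask: (batch, timesteps) with 1 for real tokens, 0 for padding
    mask = mask.astype(x.dtype)[:, :, None]     # broadcast over features
    summed = (x * mask).sum(axis=1)             # padded steps contribute 0
    counts = np.maximum(mask.sum(axis=1), 1.0)  # avoid division by zero
    return summed / counts

# Example: one sequence of 3 timesteps where the last step is padding
x = np.array([[[1.0, 2.0], [3.0, 4.0], [9.0, 9.0]]])
mask = np.array([[1, 1, 0]])
print(masked_mean_over_time(x, mask))  # [[2. 3.]]
```

Inside a Keras layer, the mask would arrive via the mask argument of call(); the layer would also override compute_mask() to return None so that layers after it (like Dense) never see a mask.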
Hi @braingineer , I'm new to keras and I want to process sentences with different numbers of words in a CNN. I used zero-padding, but the layers after the Embedding layer don't support masking. Is there any way to solve this?
inputs = Input(shape=(1, max_len), dtype='int32')
x = Embedding(vocab_size, dim, weights=GloVe, input_length=max_len)(inputs)
x = Reshape((1, max_len, 50))(x)
x = Convolution2D(nb_filter, n_gram, dim, init='glorot_uniform', activation='linear', border_mode='valid', subsample=(1, 1))(x)
x = MaxPooling2D(pool_size=(2, 1))(x)  # was (x1): undefined name
x = Flatten()(x)
out = Dense(10)(x)
Conv_sen = Model(inputs, out)
By the way, does keras support global pooling? Thanks.
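(Later Keras versions do ship global pooling layers, GlobalMaxPooling1D and GlobalAveragePooling1D, which collapse the whole time axis into one vector per feature. What they compute is just a reduction over axis 1; a numpy sketch, with made-up example values:)

```python
import numpy as np

def global_max_pool_1d(x):
    # (batch, timesteps, features) -> (batch, features), max over time
    return x.max(axis=1)

def global_avg_pool_1d(x):
    # (batch, timesteps, features) -> (batch, features), mean over time
    return x.mean(axis=1)

x = np.array([[[1.0, 5.0], [4.0, 2.0], [0.0, 0.0]]])
print(global_max_pool_1d(x))  # [[4. 5.]]
print(global_avg_pool_1d(x))  # note: a plain mean still counts padded steps
```

Note that neither built-in pooling layer is mask-aware, so for zero-padded input the average still divides by the full (padded) length; the masked-mean approach above is needed if that matters.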