tensorflow: error on saving RNN layer with recurrent_dropout parameter as saved_model

    from tensorflow.keras.layers import (Input, GRU, Bidirectional,
                                         TimeDistributed, Dense, Activation)
    from tensorflow.keras.models import Model

    layer_input   = Input(shape=(10, 100))
    layer_bi_rnn  = Bidirectional(GRU(units=10, recurrent_dropout=0.2, return_sequences=True))(layer_input)
    layer_dense   = TimeDistributed(Dense(5))(layer_bi_rnn)
    layer_act     = Activation('softmax')(layer_dense)
    model         = Model([layer_input], layer_act)

    model.compile(loss='categorical_crossentropy')

Saving this model in the SavedModel format on tf-nightly 2.1.0-dev20191104 raises the following error:

Attempted to save a function b'__inference_forward_lstm_1_layer_call_fn_19037' which references a symbolic Tensor Tensor("dropout/mul_1:0", shape=(None, 256), dtype=float32) that is not a simple constant. This is not supported.

After experimenting with the parameters, I concluded that this issue is caused by the recurrent_dropout parameter. Any suggestions?

About this issue

  • State: closed
  • Created 5 years ago
  • Comments: 20 (2 by maintainers)

Most upvoted comments

Thanks! But I need to save in the TF (SavedModel) format to use TF Serving.
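A workaround that may still yield a SavedModel for TF Serving (a sketch, not confirmed in this thread): since dropout is only active during training and adds no weights, you can rebuild the identical architecture with recurrent_dropout=0, copy the trained weights into it, and export that clone. The build function and the export_dir path below are illustrative names, not from the issue.

```python
import tensorflow as tf
from tensorflow.keras.layers import (Input, GRU, Bidirectional,
                                     TimeDistributed, Dense, Activation)
from tensorflow.keras.models import Model

def build(recurrent_dropout):
    """Same architecture as in the issue, dropout rate parameterized."""
    inp = Input(shape=(10, 100))
    x = Bidirectional(GRU(units=10, recurrent_dropout=recurrent_dropout,
                          return_sequences=True))(inp)
    x = TimeDistributed(Dense(5))(x)
    out = Activation('softmax')(x)
    return Model(inp, out)

trained = build(0.2)   # the model trained with recurrent dropout
export  = build(0.0)   # identical architecture, dropout disabled

# Dropout contributes no trainable variables, so the weight lists match.
export.set_weights(trained.get_weights())

# Exporting the dropout-free clone avoids the tracing error.
tf.saved_model.save(export, 'export_dir')
```

At inference time both models compute the same function, because dropout is inactive outside training, so the exported clone serves identical predictions.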

@NLP-ZY Just make sure your model's save path ends with ".h5"; that solved my own problem.
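To illustrate the suggestion above: a path ending in ".h5" makes model.save use the HDF5 format instead of SavedModel, which sidesteps the function-tracing step that fails with recurrent_dropout. A minimal sketch using the model from the issue (the model.h5 filename is just an example):

```python
import tensorflow as tf
from tensorflow.keras.layers import (Input, GRU, Bidirectional,
                                     TimeDistributed, Dense, Activation)
from tensorflow.keras.models import Model

# Same model as in the question.
layer_input  = Input(shape=(10, 100))
layer_bi_rnn = Bidirectional(GRU(units=10, recurrent_dropout=0.2,
                                 return_sequences=True))(layer_input)
layer_dense  = TimeDistributed(Dense(5))(layer_bi_rnn)
layer_act    = Activation('softmax')(layer_dense)
model        = Model([layer_input], layer_act)
model.compile(loss='categorical_crossentropy')

# The ".h5" extension selects the HDF5 format, so no symbolic-tensor
# tracing error is raised.
model.save('model.h5')
restored = tf.keras.models.load_model('model.h5')
```

Note that HDF5 files cannot be loaded directly by TF Serving, which expects the SavedModel format.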

Same error here! With dropout and recurrent_dropout set on a GRU, calling tf.saved_model.save after training raises the error. Saving the model directly without training works fine.