keras: Training much slower on Keras 2.0.4 than on Keras 1.2.2
Hi everyone,
I recently updated Keras from version 1.2.2 to 2.0.4 and saw a big drop in training speed.
I'm basically trying to learn a mapping between movie titles and movie plot summaries (synopses) using seq2seq LSTMs. While my model trained quickly under Keras 1.2.2, it takes a lot longer to train under the new version.
from keras.models import Sequential
from keras.layers import Embedding, LSTM, Dense, TimeDistributed
from keras.optimizers import Adam

model = Sequential()
# mask_zero=True masks padded timesteps in the LSTM layers below
model.add(Embedding(vocab_size, EMBEDDING_DIM, input_length=MAX_LENGTH, mask_zero=True))
model.add(LSTM(1024, return_sequences=True))
model.add(LSTM(1024, return_sequences=True))
# per-timestep softmax over the vocabulary
model.add(TimeDistributed(Dense(vocab_size, activation='softmax')))
model.compile(loss='sparse_categorical_crossentropy', optimizer=Adam(), metrics=['accuracy'])
I did not change the model or any of the surrounding code between Keras 1.2.2 and Keras 2.0.4. In both cases I trained on an NVIDIA Quadro K6000 with an up-to-date Theano backend, and I updated Keras simply by running pip uninstall keras followed by pip install keras.
However, the training is much slower.
Using Keras 1.2.2:

Epoch 1/10
100/100 [==============================] - 5s - loss: 2.2261 - acc: 0.8375

Using Keras 2.0.4:

Epoch 1/10
100/100 [==============================] - 18s - loss: 3.3189 - acc: 0.8277
In the example above the difference is only about 13 seconds per epoch, but when training on the whole dataset the run becomes much, much slower.
Does anyone know what I'm doing wrong and could point me in the right direction?
About this issue
- State: closed
- Created 7 years ago
- Reactions: 2
- Comments: 17 (7 by maintainers)
@Blockost How do you run the training? Note that if you are using fit_generator, its parameters changed in Keras 2; you might be running more steps per epoch than before.
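To make the difference concrete, here is a sketch of the same training call under both versions. The numbers, the train_gen generator, and the batch size are hypothetical, not taken from the issue; model is the model defined above.

num_samples = 3200   # hypothetical dataset size
batch_size = 32      # hypothetical batch size yielded by train_gen

# Keras 1.2.2: the second argument counts SAMPLES per epoch,
# so one epoch runs 3200 / 32 = 100 batches.
model.fit_generator(train_gen, samples_per_epoch=num_samples, nb_epoch=10)

# Keras 2.0.4: the argument is now steps_per_epoch and counts BATCHES.
# Reusing the old sample count runs 3200 batches (102,400 samples)
# per epoch, i.e. 32x more work than intended.
model.fit_generator(train_gen, steps_per_epoch=num_samples, epochs=10)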
Yes, the value I'm passing there is the number of training samples per epoch.
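That explains the slowdown: each Keras 2 epoch is running batch_size times as many batches as before. A minimal fix, assuming the same hypothetical num_samples and a fixed batch_size as in the sketch above, is to convert the sample count into a batch count:

import math

# steps_per_epoch must be the number of batches, not samples
steps = math.ceil(num_samples / batch_size)
model.fit_generator(train_gen, steps_per_epoch=steps, epochs=10)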