tensorflow: Cannot train canned estimators in multiple estimator.train() calls when using tf.keras.optimizers or tf.optimizers
Please make sure that this is a bug. As per our GitHub Policy, we only address code/doc bugs, performance issues, feature requests and build/installation issues on GitHub. tag:bug_template
System information
- Have I written custom code (as opposed to using a stock example script provided in TensorFlow): Yes
- Platform: Code run in google Colab
- Python version: Python 3
- Tensorflow version: v2.0.0-rc1-51-g2646d23 2.0.0-rc2
Describe the current behavior When training a canned estimator with multiple estimator.train() calls while using any tf.keras.optimizers optimizer, the optimizer raises an exception.
Describe the expected behavior Repeated estimator.train() calls train for the given number of steps.
Code to reproduce the issue Lightly edited example using canned estimators: https://gist.github.com/JoshEZiegler/2a923a707d831ca7efd33dbfbf9779c9
Other info / logs
RuntimeError                              Traceback (most recent call last)
<ipython-input-17-75a55ecc34ba> in <module>()
      5 classifier.train(
      6     input_fn=lambda: input_fn(train, train_y, training=True),
----> 7     steps=500)
7 frames
/tensorflow-2.0.0-rc2/python3.6/tensorflow_core/python/keras/optimizer_v2/optimizer_v2.py in iterations(self, variable)
    660   def iterations(self, variable):
    661     if self._iterations is not None:
--> 662       raise RuntimeError("Cannot set `iterations` to a new Variable after "
    663                          "the Optimizer weights have been created")
    664     self._iterations = variable

RuntimeError: Cannot set `iterations` to a new Variable after the Optimizer weights have been created
About this issue
- Original URL
- State: closed
- Created 5 years ago
- Reactions: 2
- Comments: 30 (6 by maintainers)
@JoshEZiegler Right. My use case was a bit different and I missed that. But since we can pass any callable, maybe something like this will work for you:
More on what `partial` does can be found in the docs. In my code it worked as expected.

Instead of `partial`, use a lambda to create the optimizer object each time. It's much easier that way.

@yhliang2018 This issue still shows up with tf-nightly-2.2.0.dev20200306. See this colab notebook.
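The workaround above boils down to handing the estimator a factory instead of an optimizer instance, so a fresh optimizer is built on every train() call. A minimal sketch of the pattern, using a stand-in class in place of a real tf.keras optimizer (the `FakeOptimizer` name and the commented `DNNClassifier` usage are illustrative assumptions, not code from the issue):

```python
from functools import partial

class FakeOptimizer:
    """Stand-in for e.g. tf.keras.optimizers.Adam, just to show the pattern."""
    def __init__(self, learning_rate=0.001):
        self.learning_rate = learning_rate

# Either a partial or a lambda works: both are zero-argument callables
# that construct a brand-new optimizer each time they are invoked.
make_opt_partial = partial(FakeOptimizer, learning_rate=0.01)
make_opt_lambda = lambda: FakeOptimizer(learning_rate=0.01)

opt_a = make_opt_partial()
opt_b = make_opt_partial()
assert opt_a is not opt_b                 # a fresh object per call
assert opt_a.learning_rate == 0.01

# In the real code this would look something like:
# classifier = tf.estimator.DNNClassifier(
#     ...,
#     optimizer=lambda: tf.keras.optimizers.Adam(learning_rate=0.01))
```

Because each call returns an optimizer whose weights have not been created yet, the `iterations` setter never fires the RuntimeError.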
Was there a specific version where you believe it should work?
@awolant Nice! Thanks, that sounds like the perfect workaround.
I'll leave this issue open because I don't believe that is the intended way to use optimizers with Estimators. At the very least, it's not the way it is done in the TF docs.

I tried running the linked code and I confirm I also see an error:
With some investigation, it looks like running estimator.train() multiple times works fine with the default optimizer, or when the optimizer is specified by a string. I believe this is because estimator.train() creates a new instance of a string-specified optimizer on each call, but reuses the same optimizer object if an instance was passed in (see below).
A possible workaround for the above error could be to modify this function to return a fresh optimizer with the same parameters as the instance that was specified. However, I'm not sure whether this is easily achievable, or whether there are use cases it would break…
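A minimal sketch of that idea, assuming the optimizer exposes `get_config()`/`from_config()` the way tf.keras optimizers do (the `fresh_optimizer` helper and the `FakeAdam` stand-in are hypothetical, not part of the estimator code):

```python
class FakeAdam:
    """Minimal stand-in for a tf.keras optimizer, for illustration only."""
    def __init__(self, learning_rate=0.001):
        self.learning_rate = learning_rate

    def get_config(self):
        return {"learning_rate": self.learning_rate}

    @classmethod
    def from_config(cls, config):
        return cls(**config)

def fresh_optimizer(opt):
    # Sketch of the proposed change: instead of returning the same
    # instance, rebuild the optimizer from its config so every
    # estimator.train() call gets an optimizer whose weights (and
    # `iterations` variable) have not been created yet.
    return type(opt).from_config(opt.get_config())

opt = FakeAdam(learning_rate=0.01)
clone = fresh_optimizer(opt)
assert clone is not opt
assert clone.learning_rate == 0.01
```

This mirrors what already happens for string-specified optimizers (a new instance per call), which is presumably why those don't hit the error; whether cloning silently discards state a user expected to persist (e.g. slot variables) is the open question.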
From /tensorflow/estimator/blob/master/tensorflow_estimator/python/estimator/canned/optimizers.py