tensorflow: TF2.0: Translation model: Error when restoring the saved model: Unresolved object in checkpoint (root).optimizer.iter: attributes
System information
- Have I written custom code (as opposed to using a stock example script provided in TensorFlow): No
- OS Platform and Distribution (e.g., Linux Ubuntu 16.04): macOS Mojave 10.14.5
- Mobile device (e.g. iPhone 8, Pixel 2, Samsung Galaxy) if the issue happens on mobile device: NA
- TensorFlow installed from (source or binary):
- TensorFlow version (use command below): 2.0.0-beta1
- Python version: 3.7
- Bazel version (if compiling from source): NA
- GCC/Compiler version (if compiling from source): NA
- CUDA/cuDNN version: NA
- GPU model and memory: NA
Describe the current behavior
I am trying to restore the checkpoints of an NMT attention model and predict on different sentences. While restoring the checkpoints and predicting, I get gibberish results along with the warning below:
Unresolved object in checkpoint (root).optimizer.iter: attributes { name: "VARIABLE_VALUE" full_name: "Adam/iter" checkpoint_key: "optimizer/iter/.ATTRIBUTES/VARIABLE_VALUE" }
Below are the additional warnings I am getting, along with the results:
WARNING: Logging before flag parsing goes to stderr.
W1008 09:57:52.766877 4594230720 util.py:244] Unresolved object in checkpoint: (root).optimizer.iter
W1008 09:57:52.767037 4594230720 util.py:244] Unresolved object in checkpoint: (root).optimizer.beta_1
W1008 09:57:52.767082 4594230720 util.py:244] Unresolved object in checkpoint: (root).optimizer.beta_2
W1008 09:57:52.767120 4594230720 util.py:244] Unresolved object in checkpoint: (root).optimizer.decay
W1008 09:57:52.767155 4594230720 util.py:244] Unresolved object in checkpoint: (root).optimizer.learning_rate
W1008 09:57:52.767194 4594230720 util.py:244] Unresolved object in checkpoint: (root).decoder.embedding.embeddings
W1008 09:57:52.767228 4594230720 util.py:244] Unresolved object in checkpoint: (root).decoder.gru.state_spec
W1008 09:57:52.767262 4594230720 util.py:244] Unresolved object in checkpoint: (root).decoder.fc.kernel
W1008 09:57:52.767296 4594230720 util.py:244] Unresolved object in checkpoint: (root).decoder.fc.bias
W1008 09:57:52.767329 4594230720 util.py:244] Unresolved object in checkpoint: (root).encoder.embedding.embeddings
W1008 09:57:52.767364 4594230720 util.py:244] Unresolved object in checkpoint: (root).encoder.gru.state_spec
W1008 09:57:52.767396 4594230720 util.py:244] Unresolved object in checkpoint: (root).decoder.gru.cell.kernel
W1008 09:57:52.767429 4594230720 util.py:244] Unresolved object in checkpoint: (root).decoder.gru.cell.recurrent_kernel
W1008 09:57:52.767461 4594230720 util.py:244] Unresolved object in checkpoint: (root).decoder.gru.cell.bias
W1008 09:57:52.767493 4594230720 util.py:244] Unresolved object in checkpoint: (root).decoder.attention.W1.kernel
W1008 09:57:52.767526 4594230720 util.py:244] Unresolved object in checkpoint: (root).decoder.attention.W1.bias
W1008 09:57:52.767558 4594230720 util.py:244] Unresolved object in checkpoint: (root).decoder.attention.W2.kernel
W1008 09:57:52.767590 4594230720 util.py:244] Unresolved object in checkpoint: (root).decoder.attention.W2.bias
W1008 09:57:52.767623 4594230720 util.py:244] Unresolved object in checkpoint: (root).decoder.attention.V.kernel
W1008 09:57:52.767657 4594230720 util.py:244] Unresolved object in checkpoint: (root).decoder.attention.V.bias
W1008 09:57:52.767688 4594230720 util.py:244] Unresolved object in checkpoint: (root).encoder.gru.cell.kernel
W1008 09:57:52.767721 4594230720 util.py:244] Unresolved object in checkpoint: (root).encoder.gru.cell.recurrent_kernel
W1008 09:57:52.767755 4594230720 util.py:244] Unresolved object in checkpoint: (root).encoder.gru.cell.bias
W1008 09:57:52.767786 4594230720 util.py:244] Unresolved object in checkpoint: (root).optimizer's state 'm' for (root).decoder.embedding.embeddings
W1008 09:57:52.767818 4594230720 util.py:244] Unresolved object in checkpoint: (root).optimizer's state 'm' for (root).decoder.fc.kernel
W1008 09:57:52.767851 4594230720 util.py:244] Unresolved object in checkpoint: (root).optimizer's state 'm' for (root).decoder.fc.bias
W1008 09:57:52.767884 4594230720 util.py:244] Unresolved object in checkpoint: (root).optimizer's state 'm' for (root).encoder.embedding.embeddings
W1008 09:57:52.767915 4594230720 util.py:244] Unresolved object in checkpoint: (root).optimizer's state 'm' for (root).decoder.gru.cell.kernel
W1008 09:57:52.767949 4594230720 util.py:244] Unresolved object in checkpoint: (root).optimizer's state 'm' for (root).decoder.gru.cell.recurrent_kernel
W1008 09:57:52.767981 4594230720 util.py:244] Unresolved object in checkpoint: (root).optimizer's state 'm' for (root).decoder.gru.cell.bias
W1008 09:57:52.768013 4594230720 util.py:244] Unresolved object in checkpoint: (root).optimizer's state 'm' for (root).decoder.attention.W1.kernel
W1008 09:57:52.768044 4594230720 util.py:244] Unresolved object in checkpoint: (root).optimizer's state 'm' for (root).decoder.attention.W1.bias
W1008 09:57:52.768077 4594230720 util.py:244] Unresolved object in checkpoint: (root).optimizer's state 'm' for (root).decoder.attention.W2.kernel
W1008 09:57:52.768109 4594230720 util.py:244] Unresolved object in checkpoint: (root).optimizer's state 'm' for (root).decoder.attention.W2.bias
W1008 09:57:52.768143 4594230720 util.py:244] Unresolved object in checkpoint: (root).optimizer's state 'm' for (root).decoder.attention.V.kernel
W1008 09:57:52.768175 4594230720 util.py:244] Unresolved object in checkpoint: (root).optimizer's state 'm' for (root).decoder.attention.V.bias
W1008 09:57:52.768207 4594230720 util.py:244] Unresolved object in checkpoint: (root).optimizer's state 'm' for (root).encoder.gru.cell.kernel
W1008 09:57:52.768239 4594230720 util.py:244] Unresolved object in checkpoint: (root).optimizer's state 'm' for (root).encoder.gru.cell.recurrent_kernel
W1008 09:57:52.768271 4594230720 util.py:244] Unresolved object in checkpoint: (root).optimizer's state 'm' for (root).encoder.gru.cell.bias
W1008 09:57:52.768303 4594230720 util.py:244] Unresolved object in checkpoint: (root).optimizer's state 'v' for (root).decoder.embedding.embeddings
W1008 09:57:52.768335 4594230720 util.py:244] Unresolved object in checkpoint: (root).optimizer's state 'v' for (root).decoder.fc.kernel
W1008 09:57:52.768367 4594230720 util.py:244] Unresolved object in checkpoint: (root).optimizer's state 'v' for (root).decoder.fc.bias
W1008 09:57:52.768399 4594230720 util.py:244] Unresolved object in checkpoint: (root).optimizer's state 'v' for (root).encoder.embedding.embeddings
W1008 09:57:52.768431 4594230720 util.py:244] Unresolved object in checkpoint: (root).optimizer's state 'v' for (root).decoder.gru.cell.kernel
W1008 09:57:52.768463 4594230720 util.py:244] Unresolved object in checkpoint: (root).optimizer's state 'v' for (root).decoder.gru.cell.recurrent_kernel
W1008 09:57:52.768495 4594230720 util.py:244] Unresolved object in checkpoint: (root).optimizer's state 'v' for (root).decoder.gru.cell.bias
W1008 09:57:52.768527 4594230720 util.py:244] Unresolved object in checkpoint: (root).optimizer's state 'v' for (root).decoder.attention.W1.kernel
W1008 09:57:52.768559 4594230720 util.py:244] Unresolved object in checkpoint: (root).optimizer's state 'v' for (root).decoder.attention.W1.bias
W1008 09:57:52.768591 4594230720 util.py:244] Unresolved object in checkpoint: (root).optimizer's state 'v' for (root).decoder.attention.W2.kernel
W1008 09:57:52.768623 4594230720 util.py:244] Unresolved object in checkpoint: (root).optimizer's state 'v' for (root).decoder.attention.W2.bias
W1008 09:57:52.768654 4594230720 util.py:244] Unresolved object in checkpoint: (root).optimizer's state 'v' for (root).decoder.attention.V.kernel
W1008 09:57:52.768686 4594230720 util.py:244] Unresolved object in checkpoint: (root).optimizer's state 'v' for (root).decoder.attention.V.bias
W1008 09:57:52.768718 4594230720 util.py:244] Unresolved object in checkpoint: (root).optimizer's state 'v' for (root).encoder.gru.cell.kernel
W1008 09:57:52.768750 4594230720 util.py:244] Unresolved object in checkpoint: (root).optimizer's state 'v' for (root).encoder.gru.cell.recurrent_kernel
W1008 09:57:52.768782 4594230720 util.py:244] Unresolved object in checkpoint: (root).optimizer's state 'v' for (root).encoder.gru.cell.bias
W1008 09:57:52.768816 4594230720 util.py:252] A checkpoint was restored (e.g. tf.train.Checkpoint.restore or tf.keras.Model.load_weights) but not all checkpointed values were used. See above for specific issues. Use expect_partial() on the load status object, e.g. tf.train.Checkpoint.restore(...).expect_partial(), to silence these warnings, or use assert_consumed() to make the check explicit. See https://www.tensorflow.org/alpha/guide/checkpoints#loading_mechanics for details.
Input: <start> hola <end>
Predicted translation: ? attack now relax hello
The warning at the very end says: ‘A checkpoint was restored (e.g. tf.train.Checkpoint.restore or tf.keras.Model.load_weights) but not all checkpointed values were used…’ What does this mean?
Describe the expected behavior
Checkpoints should be restored properly.
Code to reproduce the issue
import os
import tensorflow as tf

# optimizer, encoder and decoder are the objects built earlier in the training script.
checkpoint_dir = './training_checkpoints'
checkpoint_prefix = os.path.join(checkpoint_dir, "ckpt")
checkpoint = tf.train.Checkpoint(optimizer=optimizer,
                                 encoder=encoder,
                                 decoder=decoder)
checkpoint.restore(tf.train.latest_checkpoint(checkpoint_dir))
Other info / logs
About this issue
- Original URL
- State: closed
- Created 5 years ago
- Reactions: 15
- Comments: 32 (1 by maintainers)
Closing this issue as it has been inactive for more than 15 days. Please add additional comments and we can reopen this issue. Thanks!
The error even shows up in the guide https://www.tensorflow.org/tutorials/keras/save_and_load
Train on 1000 samples
Epoch 1/5
WARNING:tensorflow:Unresolved object in checkpoint: (root).optimizer.iter
WARNING:tensorflow:Unresolved object in checkpoint: (root).optimizer.beta_1
WARNING:tensorflow:Unresolved object in checkpoint: (root).optimizer.beta_2
WARNING:tensorflow:Unresolved object in checkpoint: (root).optimizer.decay
WARNING:tensorflow:Unresolved object in checkpoint: (root).optimizer.learning_rate
WARNING:tensorflow:A checkpoint was restored (e.g. tf.train.Checkpoint.restore or tf.keras.Model.load_weights) but not all checkpointed values were used. See above for specific issues. Use expect_partial() on the load status object, e.g. tf.train.Checkpoint.restore(…).expect_partial(), to silence these warnings, or use assert_consumed() to make the check explicit. See https://www.tensorflow.org/guide/checkpoint#loading_mechanics for details.
WARNING:tensorflow:Unresolved object in checkpoint: (root).optimizer.iter
Hopefully the simplest example that can reproduce the issue
This is my take: the issue is that the new model object does not have the variable iter, which is created when the optimizer is called on training samples. If you add these lines after you have created the new model, then the issue is resolved. Alternatively, by using status.assert_existing_objects_matched() you can avoid the error as well, as it only checks that the variables present in the new model get restored.
Finally found a cause and a solution for this problem. Basically, it's because Adam initializes its attributes like beta_1, beta_2, etc. as plain Python float objects, which are not tracked by TensorFlow. By converting them into tf.Variable, we let TensorFlow track them, and thus we can load the weights without issues.
https://gist.github.com/yoshihikoueno/4ff0694339f88d579bb3d9b07e609122
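For reference, here is a minimal sketch of that workaround (an approximation, not the exact gist code; the hyperparameter values and the encoder/decoder objects are assumptions taken from the report):

import tensorflow as tf

# Wrap Adam's hyperparameters in tf.Variable so the checkpoint machinery
# tracks them instead of storing plain Python floats.
optimizer = tf.keras.optimizers.Adam(
    learning_rate=tf.Variable(1e-3),
    beta_1=tf.Variable(0.9),
    beta_2=tf.Variable(0.999),
    epsilon=tf.Variable(1e-7))
optimizer.iterations                 # accessing this creates the iter variable up front
optimizer.decay = tf.Variable(0.0)   # __init__ stores decay as a float, so convert it afterwards

checkpoint = tf.train.Checkpoint(optimizer=optimizer,
                                 encoder=encoder,
                                 decoder=decoder)
status = checkpoint.restore(tf.train.latest_checkpoint('./training_checkpoints'))
status.assert_existing_objects_matched()  # only checks variables that already exist in these objects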
Please take a look at the similar issue and let me know if it helps! Also, try testing it with TensorFlow 2.0 and let me know if the issue still persists.
I have a similar issue, and couldn’t resolve it. Error log:
I got the same error message and warning message in tensorflow-gpu 2.0
The optimizer is Adam
The model is a subclassed tf.keras.Model
@gowthamkpr I upgraded to TF 2.0 and tested it, but I still get the same error. Following https://github.com/tensorflow/tensorflow/issues/27937, I tried these steps:
Is it possible that the tokenizer is assigning random IDs to tokens when reloading the checkpoint?
I also encountered this issue; however, most of my warnings were for training parameters such as the optimizer state. When looking the issue up, I came across an answer saying this occurs because I was loading weights from a checkpoint and then only using them for inference/prediction, not for training. The training parameters (like the optimizer state) in the checkpoint were therefore unused, hence the warnings.
It would be great if someone could let me know how accurate this explanation is. (I have attached the link below if someone wants to check the answer out.)
https://stackoverflow.com/a/59582698
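If that explanation is right and the checkpoint is only being loaded for inference, the warning itself points at the documented way to silence it. A minimal sketch, reusing the checkpoint_dir, encoder and decoder objects from the report:

# Inference-only restore: leave the optimizer out of the Checkpoint object
# and tell the load status that a partial match is expected.
checkpoint = tf.train.Checkpoint(encoder=encoder, decoder=decoder)
checkpoint.restore(tf.train.latest_checkpoint(checkpoint_dir)).expect_partial()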
For custom TensorFlow models, I found that I needed to feed input into them before loading the weights/checkpoints, because some weight shapes do not get properly configured until input has been fed through the network.
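For example (a hypothetical sketch; model, input_shape and checkpoint_path are placeholders, not names from this thread):

import tensorflow as tf

# A subclassed model only creates its weight variables once it has been
# called, so run a dummy batch through it before loading the checkpoint.
dummy_batch = tf.zeros([1, *input_shape])
_ = model(dummy_batch)                # builds the layers and their weights
model.load_weights(checkpoint_path)   # shapes now match what was saved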
Same problem, please reopen this issue
I have also the same issue, could you please reopen this issue?
I have the same issue, can this please be reopened?
anyone could solve it? i’m using efficientDet
I also found that many people won't even notice this issue, because they are not using the assert_consumed method, which means the checks get postponed. The conversion into tf.Variable will take place once the optimizer actually gets called, and then the weights (beta_1, for example) will be loaded silently.
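As a sketch of what that deferred behaviour looks like in practice (assuming the same checkpoint and checkpoint_dir objects as in the report):

status = checkpoint.restore(tf.train.latest_checkpoint(checkpoint_dir))
# Restoration is deferred: the optimizer's slot and hyperparameter variables
# are only matched once the optimizer has actually been called, e.g. after
# one training step. Only then will assert_consumed() pass.
# ... run at least one training step here ...
status.assert_consumed()  # raises if anything in the checkpoint is still unmatched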