tensorflow: [TF2.0] Not JSON Serializable error was thrown when using tf.keras.activations operators in keras model.

System information

  • Have I written custom code (as opposed to using a stock example script provided in TensorFlow): yes
  • OS Platform and Distribution (e.g., Linux Ubuntu 16.04): Windows 10 x64 1809
  • Mobile device (e.g. iPhone 8, Pixel 2, Samsung Galaxy) if the issue happens on mobile device:
  • TensorFlow installed from (source or binary): pip
  • TensorFlow version (use command below): tensorflow-gpu 2.0.0a0
  • Python version: 3.6.7
  • Bazel version (if compiling from source):
  • GCC/Compiler version (if compiling from source):
  • CUDA/cuDNN version: 10
  • GPU model and memory: Geforce 1070

Describe the current behavior When I used tf.keras.activations operators in my keras model, saving the model failed with a Not JSON Serializable error.

Describe the expected behavior The model should be serialized without any error.

Code to reproduce the issue

from tensorflow import keras
from tensorflow.keras import layers

inputs = keras.Input(shape=(784,), name='digits')
x = layers.Activation('relu')(inputs)
# x = keras.activations.relu(inputs)
outputs = layers.Dense(10, activation='softmax', name='predictions')(x)


model = keras.Model(inputs=inputs, outputs=outputs, name='3_layer_mlp')
model.summary()

model.save('path_to_my_model.h5')

If you change from layers.Activation('relu') to keras.activations.relu (the commented-out line), serialization fails.
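One workaround, not mentioned in the thread but sketched here as an assumption, is to wrap the raw activation call in a keras.layers.Lambda layer, so that Keras records a serializable layer config instead of a raw graph op:

```python
import os
from tensorflow import keras
from tensorflow.keras import layers

inputs = keras.Input(shape=(784,), name='digits')
# Wrapping the functional activation in a Lambda layer gives Keras a
# serializable layer instead of a raw graph op in the model config.
x = layers.Lambda(keras.activations.relu, name='relu_lambda')(inputs)
outputs = layers.Dense(10, activation='softmax', name='predictions')(x)

model = keras.Model(inputs=inputs, outputs=outputs, name='3_layer_mlp')
model.save('path_to_my_model.h5')  # saves without the JSON error
```

The trade-off is that loading a model with a Lambda layer requires the function to be resolvable at load time, which is why a named built-in like keras.activations.relu is used here.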

About this issue

  • Original URL
  • State: closed
  • Created 5 years ago
  • Comments: 15 (4 by maintainers)

Most upvoted comments

Had the problem with callbacks.ModelCheckpoint; setting save_weights_only=True solved the problem for me.

I am facing the same issue. Any suggestions or workarounds for now?


@yassinetb Did you find any solution for the error "TypeError: ('Not JSON Serializable:', tf.float32)" when setting save_weights_only=False?

Closing this issue since it's fixed in the latest tf nightly build '2.0.0-dev20190802'. Thanks!

I have slightly edited your code to be consistent with tf.keras and was able to reproduce the error using TF-GPU 2.0 alpha.

import tensorflow as tf
#from tensorflow import keras
#from tensorflow.keras import layers

inputs = tf.keras.Input(shape=(784,), name='digits')
#x = tf.keras.layers.Activation('relu')(inputs)
x = tf.keras.activations.relu(inputs)
outputs = tf.keras.layers.Dense(10, activation='softmax', name='predictions')(x)

model = tf.keras.Model(inputs=inputs, outputs=outputs, name='3_layer_mlp')
model.summary()

model.save('path_to_my_model.h5')

Output:

---> 13 model.save('path_to_my_model.h5')
TypeError: ('Not JSON Serializable:', b'\n\x06Relu_8\x12\x04Relu\x1a\tdigits_25*\x07\n\x01T\x12\x020\x01')

Having same issue with TF 1.14 and Python 3.6 with tf.transpose and tensorflow.python.keras.layers.concatenate :

W0717 09:24:35.629342 14908 summary_ops_v2.py:1110] Model failed to serialize as JSON. Ignoring... ('Not JSON Serializable:', b'\n\ttranspose\x12\tTranspose\x1a\ndense/Relu\x1a\x0etranspose/perm*\x0b\n\x05Tperm\x12\x020\x03*\x07\n\x01T\x12\x020\x01')

W0717 13:56:19.515430 16984 summary_ops_v2.py:1110] Model failed to serialize as JSON. Ignoring... ('Not JSON Serializable:', b'\n\x05Shape\x12\x05Shape\x1a\x15concatenate_28/concat*\x07\n\x01T\x12\x020\x01*\x0e\n\x08out_type\x12\x020\x03')

Thus callbacks.ModelCheckpoint with save_weights_only=False causes training to stop because it fails to back up the model (I get a useless model file of 6 KB). As @mketcha mentioned, a workaround is to use save_weights_only=True. But this should only be temporary, pending a fix, because saving only the weights forces you to keep the code of the corresponding model somewhere, which is really annoying if the model code changes often.
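The save_weights_only=True workaround can be sketched as follows (a minimal example with an illustrative model and file name, not code from the thread):

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential([
    keras.Input(shape=(8,)),
    layers.Dense(16, activation='relu'),
    layers.Dense(1),
])
model.compile(optimizer='adam', loss='mse')

# save_weights_only=True skips JSON-serializing the model graph,
# so the 'Not JSON Serializable' error is avoided during checkpointing.
ckpt = keras.callbacks.ModelCheckpoint(
    'ckpt.weights.h5', save_weights_only=True)

x = np.random.rand(32, 8).astype('float32')
y = np.random.rand(32, 1).astype('float32')
model.fit(x, y, epochs=1, callbacks=[ckpt], verbose=0)
```

As the comment above notes, the checkpoint then contains only weights, so the model-building code must be kept in sync to restore from it.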

I am having the same issue when using tf.concat in an otherwise keras-only model (can’t use keras Concatenations due to issue #30355 ).

The only workaround I have found so far is to use model.save_weights(). When loading, you need to redefine the model and then use model.load_weights().
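That save_weights / redefine / load_weights pattern looks like this (a sketch with a hypothetical two-layer model, not code from the thread):

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

def build_model():
    # The model definition must be kept around so the same
    # architecture can be rebuilt before load_weights() is called.
    inputs = keras.Input(shape=(4,))
    x = layers.Dense(8, activation='relu')(inputs)
    outputs = layers.Dense(2)(x)
    return keras.Model(inputs, outputs)

model = build_model()
model.save_weights('model.weights.h5')

# Later: redefine the same architecture, then restore the weights.
restored = build_model()
restored.load_weights('model.weights.h5')

x = np.random.rand(3, 4).astype('float32')
assert np.allclose(model.predict(x, verbose=0),
                   restored.predict(x, verbose=0))
```

The restored model must match the saved architecture exactly; a mismatch in layer shapes raises an error at load time.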