tensorflow: Freeze graph: node is not in graph (even though it's been named)

Environment info

Operating System: Ubuntu 14.04 LTS 64-bit

Installed version of CUDA and cuDNN: none

If installed from source, provide

  1. The commit hash (git rev-parse HEAD): fc9162975e52978d3af38549b570cc3cc5f0ab66
  2. The output of bazel version
Build label: 0.3.0
Build target: bazel-out/local-fastbuild/bin/src/main/java/com/google/devtools/build/lib/bazel/BazelServer_deploy.jar
Build time: Fri Jun 10 11:38:23 2016 (1465558703)
Build timestamp: 1465558703
Build timestamp as int: 1465558703

Steps to reproduce

  1. Copy the IPython Notebook for Assignment 6 of Udacity’s deep learning course (here)
  2. Change saved_sample_output = tf.Variable(tf.zeros([1, num_nodes])) to saved_sample_output = tf.Variable(tf.zeros([1, num_nodes]), name="saved_sample_output")
  3. Modify the code like so:
with tf.Session(graph=graph) as session:
  tf.initialize_all_variables().run()
  print('Initialized')
  mean_loss = 0
  # code omitted (no changes)
  # new code below:
  saver = tf.train.Saver(tf.all_variables())
  saver.save(session, '/home/me/Documents/checkpoint.ckpt', write_meta_graph=False)
  tf.train.write_graph(graph.as_graph_def(), '/home/me/Documents', 'graph.pb')
  4. Run.
  5. Verify that checkpoint.ckpt and graph.pb have been successfully created in the directory.
  6. In the tensorflow source directory, run:
bazel build tensorflow/python/tools:freeze_graph && bazel-bin/tensorflow/python/tools/freeze_graph --input_graph=/home/me/Documents/graph.pb --input_checkpoint=/home/me/Documents/checkpoint.ckpt --output_graph=/home/me/Documents/frozen_graph.pb --output_node_names=saved_sample_output
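Before running freeze_graph, it can help to confirm the exact node names in the dumped graph. A minimal sketch, pure Python with no TensorFlow required; the inline pbtxt string is a stand-in for the real text-format graph.pb written by tf.train.write_graph:

```python
import re

# Stand-in for the contents of graph.pb (text format); in practice,
# read it with open('/home/me/Documents/graph.pb').read().
pbtxt = '''
node {
  name: "saved_sample_output"
  op: "Variable"
}
node {
  name: "sample_prediction"
  op: "Softmax"
}
'''

# Text-format GraphDefs list each node's name as  name: "..."  ;
# collect them all so the --output_node_names value can be checked.
node_names = re.findall(r'name:\s*"([^"]+)"', pbtxt)
print(node_names)
```

If the name you plan to pass via --output_node_names is not in this list, freeze_graph will raise exactly the "is not in graph" assertion shown below.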

What have you tried?

Checked the graph.pb file to make sure that node had actually been named properly. Seems like it was:

# other stuff
node {
  name: "saved_sample_output"
  op: "Variable"
  attr {
    key: "container"
    value {
      s: ""
    }
  }
  attr {
    key: "dtype"
    value {
      type: DT_FLOAT
    }
  }
  attr {
    key: "shape"
    value {
      shape {
        dim {
          size: 1
        }
        dim {
          size: 64
        }
      }
    }
  }
  attr {
    key: "shared_name"
    value {
      s: ""
    }
  }
}
# etc.

I’m pretty much stumped on this one, since this issue on StackOverflow says to pass a name parameter for the node you want, which is exactly what I did, to no avail (even without the name parameter, it gave the same error). Edit: I got freeze_graph to run successfully with the sample_prediction node (changed sample_prediction = tf.nn.softmax(tf.nn.xw_plus_b(sample_output, w, b)) to sample_prediction = tf.nn.softmax(tf.nn.xw_plus_b(sample_output, w, b), name="sample_prediction")). However, I still haven’t figured out why that worked and the variable didn’t.

Logs or other output that would be helpful

Traceback (most recent call last):
  File "/home/me/tf_m/tensorflow/bazel-bin/tensorflow/python/tools/freeze_graph.runfiles/org_tensorflow/tensorflow/python/tools/freeze_graph.py", line 134, in <module>
    tf.app.run()
  File "/home/me/tf_m/tensorflow/bazel-bin/tensorflow/python/tools/freeze_graph.runfiles/org_tensorflow/tensorflow/python/platform/app.py", line 30, in run
    sys.exit(main(sys.argv))
  File "/home/me/tf_m/tensorflow/bazel-bin/tensorflow/python/tools/freeze_graph.runfiles/org_tensorflow/tensorflow/python/tools/freeze_graph.py", line 131, in main
    FLAGS.output_graph, FLAGS.clear_devices, FLAGS.initializer_nodes)
  File "/home/me/tf_m/tensorflow/bazel-bin/tensorflow/python/tools/freeze_graph.runfiles/org_tensorflow/tensorflow/python/tools/freeze_graph.py", line 120, in freeze_graph
    sess, input_graph_def, output_node_names.split(","))
  File "/home/me/tf_m/tensorflow/bazel-bin/tensorflow/python/tools/freeze_graph.runfiles/org_tensorflow/tensorflow/python/framework/graph_util.py", line 232, in convert_variables_to_constants
    inference_graph = extract_sub_graph(input_graph_def, output_node_names)
  File "/home/me/tf_m/tensorflow/bazel-bin/tensorflow/python/tools/freeze_graph.runfiles/org_tensorflow/tensorflow/python/framework/graph_util.py", line 156, in extract_sub_graph
    assert d in name_to_node_map, "%s is not in graph" % d
AssertionError: saved_sample_output is not in graph

About this issue

  • Original URL
  • State: closed
  • Created 8 years ago
  • Reactions: 5
  • Comments: 32 (3 by maintainers)

Most upvoted comments

Hi all, I am getting this error

in extract_sub_graph
    assert d in name_to_node_map, "%s is not in graph" % d
AssertionError: predictions is not in graph

What does "sub graph" mean here?

I am sure that the name of my output node is correct.

this is my code for freezing:

dir = os.path.dirname(os.path.realpath(__file__))
tf.train.write_graph(sess.graph_def, '/home/saria/Downloads/sentiment_analysis_tensorflow-master', 'har.pbtxt')
saver.save(sess,save_path = "../har.ckpt")

freeze_graph.freeze_graph(input_graph = "../har.pbtxt",  input_saver = "",
             input_binary = False, input_checkpoint = "../har.ckpt", output_node_names = "predictions",
             restore_op_name = "save/restore_all", filename_tensor_name = "save/Const:0",
             output_graph = "frozen_har.pb", clear_devices = True, initializer_nodes = "")

input_graph_def = tf.GraphDef()

Could someone please help with this problem? I am stuck on this error and cannot find the right answer.

On TF 2.3, my issue was solved by:

import tensorflow.compat.v1 as tf
tf.disable_v2_behavior()

I have no idea why this works. Explanation is welcome.

I faced the same problem, and the saved_model_cli command saved me. (https://www.tensorflow.org/guide/saved_model)

So if you are in a Linux environment and have TensorFlow installed, you can enter the following command.

saved_model_cli show --dir ./ --all

Of course, use your own directory path.

It will show something like below:

outputs['output'] tensor_info:
    dtype: DT_FLOAT
    shape: (-1, 512)
    name: some/kind/of/tensor/Add:0

Then your node name should be some/kind/of/tensor/Add, without the trailing :0 index.
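The conversion described above can be sketched as a one-line helper: saved_model_cli prints tensor names with a ":<output-index>" suffix, while freeze_graph's --output_node_names expects the bare op name.

```python
def tensor_to_node_name(tensor_name: str) -> str:
    """Strip the trailing ':<output-index>' from a tensor name,
    leaving the graph-node (op) name that freeze_graph expects."""
    return tensor_name.rsplit(":", 1)[0] if ":" in tensor_name else tensor_name

print(tensor_to_node_name("some/kind/of/tensor/Add:0"))  # some/kind/of/tensor/Add
print(tensor_to_node_name("Add"))                        # Add (already a node name)
```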

I met a similar issue: even after printing all the node names using print([node.name for node in graph.as_graph_def().node]) and choosing a name from that list, freeze_graph.py still returned AssertionError: MobileNet/Predictions/Softmax is not in graph

Yes, the node name is incorrect according to the .pb file. You have to print the actual name used during saving with print(model.output.op.name) (and likewise print(model.input.op.name) if the same happens with the input), then use the printed name as the argument in the export function. I hope this solves the problem.

I got the same error. When I printed the node names using out_names = [x.op.name for x in model.outputs], I got u'activation_1_1/Sigmoid' as a name, but it did not work. Then I searched the generated tmp.pb file for Sigmoid, and the name I found was "activation_1/Sigmoid". I used it and it worked!
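The search-the-.pb trick above can be scripted: instead of trusting the Python-side tensor name, grep the dumped text-format graph for node names containing a known substring. A small sketch; the inline sample string is a stand-in for reading the real tmp.pb, and "Sigmoid" is just the substring from the comment above:

```python
import re

def find_node_names(pbtxt: str, substring: str):
    """Return node names in a text-format GraphDef that contain substring."""
    names = re.findall(r'name:\s*"([^"]+)"', pbtxt)
    return [n for n in names if substring in n]

# Stand-in for open("tmp.pb").read() on a text-format graph dump.
sample = 'node { name: "activation_1/Sigmoid" op: "Sigmoid" }'
print(find_node_names(sample, "Sigmoid"))  # ['activation_1/Sigmoid']
```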

Just in case: I found that eager execution caused my problem.

Hi all, in my network I name the node with name='prob', as in the code below:

self.prob_output_layer = tf.nn.softmax(self.output_layer, name='prob')

--> so we expect the node name to be 'prob'.

But we need to check the .pb file:

node {
  name: "dnn_01/Softmax"
  op: "Softmax"
  input: "dnn_01/fully_layer03/fully_layer03/Add_1"
  device: "/device:GPU:0"
  attr {
    key: "T"
    value {
      type: DT_FLOAT
    }
  }
}

--> We found that the node name is actually 'dnn_01/Softmax'.

Therefore, we need to use the name 'dnn_01/Softmax' when freezing.

@hoaquocphan

hi @wysohn

It is OK with this model, but when I tried other models from TensorFlow Hub, as below:

saved_model_cli show --all --dir imagenet_mobilenet_v1_025_128_classification_4/
saved_model_cli show --all --dir imagenet_resnet_v1_50_classification_4/
saved_model_cli show --all --dir tf2-preview_inception_v3_classification_4/

the output is as below:

MetaGraphDef with tag-set: 'serve' contains the following SignatureDefs:

signature_def['__saved_model_init_op']:
  The given SavedModel SignatureDef contains the following input(s):
  The given SavedModel SignatureDef contains the following output(s):
    outputs['__saved_model_init_op'] tensor_info:
        dtype: DT_INVALID
        shape: unknown_rank
        name: NoOp
  Method name is: 

Defined Functions:
  Function Name: '__call__'
    Option #1
      Callable with:
        Argument #1
          inputs: TensorSpec(shape=(None, None, None, 3), dtype=tf.float32, name='inputs')
        Argument #2
          DType: bool
          Value: True
        Argument #3
          DType: bool
          Value: True
        Argument #4
          batch_norm_momentum: TensorSpec(shape=(), dtype=tf.float32, name='batch_norm_momentum')
    Option #2
      Callable with:
        Argument #1
          inputs: TensorSpec(shape=(None, None, None, 3), dtype=tf.float32, name='inputs')
        Argument #2
          DType: bool
          Value: True
        Argument #3
          DType: bool
          Value: False
        Argument #4
          batch_norm_momentum: TensorSpec(shape=(), dtype=tf.float32, name='batch_norm_momentum')
    Option #3
      Callable with:
        Argument #1
          inputs: TensorSpec(shape=(None, None, None, 3), dtype=tf.float32, name='inputs')
        Argument #2
          DType: bool
          Value: False
        Argument #3
          DType: bool
          Value: False
        Argument #4
          batch_norm_momentum: TensorSpec(shape=(), dtype=tf.float32, name='batch_norm_momentum')
    Option #4
      Callable with:
        Argument #1
          inputs: TensorSpec(shape=(None, None, None, 3), dtype=tf.float32, name='inputs')
        Argument #2
          DType: bool
          Value: False
        Argument #3
          DType: bool
          Value: True
        Argument #4
          batch_norm_momentum: TensorSpec(shape=(), dtype=tf.float32, name='batch_norm_momentum')

so how can I get the output_node_names? Please help me

For that, I am not really sure how to figure it out. It might not be a named tensor, or something else beyond my knowledge. There is a quote in the documentation, however, that might provide a clue:

Key Point: Unless you need to export your model to an environment other than TensorFlow 2.x with Python, you probably don’t need to export signatures explicitly. If you’re looking for a way of enforcing an input signature for a specific function, see the input_signature argument to tf.function.

I had the same error, but my problem was that I was iterating over tf.GraphDef.node while modifying that list at the same time (inserting and removing nodes), so be careful not to do that.
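That last pitfall generalizes: mutating a repeated field (or any list) while iterating over it skips elements, because removal shifts the remaining items under the running index. A minimal sketch with a plain Python list standing in for tf.GraphDef().node:

```python
nodes = ["a", "b", "c", "d"]

# Buggy: removing while iterating skips the element after each removal.
buggy = list(nodes)
for n in buggy:
    if n in ("a", "b"):
        buggy.remove(n)
print(buggy)  # ['b', 'c', 'd'] -- 'b' survived: removing 'a' shifted it past the index

# Safe: iterate over a snapshot, mutate the original.
safe = list(nodes)
for n in list(safe):
    if n in ("a", "b"):
        safe.remove(n)
print(safe)  # ['c', 'd']
```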