tensorflow: Errors loading inception v3 in iOS example
Environment info
Operating System: Mac OS X / iOS
If installed from source, provide
- The commit hash (git rev-parse HEAD): fc9162975e52978d3af38549b570cc3cc5f0ab66
- The output of bazel version: Build label: 0.3.0-homebrew
Steps to reproduce
- Download the .pb file from https://storage.googleapis.com/download.tensorflow.org/models/inception_dec_2015.zip
- Insert the .pb file in the data folder of the camera iOS project
- Launch the project from Xcode; the console outputs the following errors:
Running model failed:Invalid argument: Session was not created with a graph before Run()!
Running model failed:Invalid argument: No OpKernel was registered to support Op 'DecodeJpeg' with these attrs [[Node: DecodeJpeg = DecodeJpeg[acceptable_fraction=1, channels=3, fancy_upscaling=true, ratio=1, try_recover_truncated=false](DecodeJpeg/contents)]]
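The second error is the key one: the TensorFlow library linked into the iOS camera example only registers the kernels the mobile build needs, and DecodeJpeg is not among them, so the JPEG-decoding preprocessing has to be stripped out of the graph before it can run on the device (which is what the strip_unused step below attempts). To see which ops a downloaded .pb actually uses, and to find its real input and output node names, the GraphDef can be inspected on the desktop. A minimal sketch, assuming a desktop TensorFlow 1.x-era install; the .pb file name is an assumption about what the zip unpacks to:

# inspect_graph.py -- list op types and node names in a frozen GraphDef.
# The file name below is an assumption; adjust it to the file in the zip.
import collections

import tensorflow as tf

GRAPH_PATH = "tensorflow_inception_graph.pb"  # assumed name from the zip

graph_def = tf.GraphDef()
with open(GRAPH_PATH, "rb") as f:
    graph_def.ParseFromString(f.read())

# Count op types to confirm which ops (e.g. DecodeJpeg) the graph depends on.
op_counts = collections.Counter(node.op for node in graph_def.node)
for op, count in op_counts.most_common():
    print("%s x%d" % (op, count))

# The first and last node names usually reveal the input and output layers.
names = [node.name for node in graph_def.node]
print("first nodes:", names[:5])
print("last nodes:", names[-5:])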
What have you tried?
- Ran the following command, referenced in #2883:
bazel build tensorflow/python/tools:strip_unused && \
bazel-bin/tensorflow/python/tools/strip_unused \
  --input_graph=your_retrained_graph.pb \
  --output_graph=stripped_graph.pb \
  --input_node_names=Mul \
  --output_node_names=final_result \
  --input_binary=true
However, I receive the following error:
/tensorflow/bazel-bin/tensorflow/python/tools/strip_unused.runfiles/org_tensorflow/tensorflow/python/framework/graph_util.py", line 156, in extract_sub_graph
    assert d in name_to_node_map, "%s is not in graph" % d
AssertionError: final_result is not in graph
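The assertion means strip_unused could not find a node named final_result: that node only exists in graphs produced by the retraining workflow discussed in #2883, while the stock inception_dec_2015 graph ends in a differently named node (softmax, as far as I can tell; the node listing above shows the actual name). Re-running the command with --output_node_names set to that node should succeed. The same stripping can also be done from Python; the sketch below is an outline under those assumptions, with hypothetical file names:

# strip_preprocessing.py -- a sketch of stripping the DecodeJpeg preprocessing
# out of the stock Inception v3 graph so every remaining op has an iOS kernel.
# File names and the "softmax" output node are assumptions; check them against
# the node listing printed by the previous sketch.
import tensorflow as tf
from tensorflow.python.framework import dtypes
from tensorflow.python.tools import strip_unused_lib

INPUT_GRAPH = "tensorflow_inception_graph.pb"   # assumed name from the zip
OUTPUT_GRAPH = "stripped_graph.pb"
INPUT_NODE = "Mul"        # the node the app will feed image data into
OUTPUT_NODE = "softmax"   # assumed final node of the stock graph

graph_def = tf.GraphDef()
with open(INPUT_GRAPH, "rb") as f:
    graph_def.ParseFromString(f.read())

# Keep only the subgraph between INPUT_NODE and OUTPUT_NODE, replacing the
# input with a float placeholder; DecodeJpeg and the other preprocessing ops
# are dropped in the process.
stripped = strip_unused_lib.strip_unused(
    input_graph_def=graph_def,
    input_node_names=[INPUT_NODE],
    output_node_names=[OUTPUT_NODE],
    placeholder_type_enum=dtypes.float32.as_datatype_enum)

with open(OUTPUT_GRAPH, "wb") as f:
    f.write(stripped.SerializeToString())

Once stripped, the new .pb replaces the original in the camera example's data folder; the input and output layer names configured in the example need to match INPUT_NODE and OUTPUT_NODE for the session to run.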
About this issue
- State: closed
- Created 8 years ago
- Comments: 16 (9 by maintainers)
I’ve got a tutorial explaining how to get this running up at https://petewarden.com/2016/09/27/tensorflow-for-mobile-poets/ now, so I’m going to close this bug. Please open new bugs if there are issues with the process. Thanks @jeffxtang for your work on this too!
For those of you who may be interested, I just posted a blog documenting the whole process of using a retrained inception v3 model for my app above at http://jeffxtang.github.io
No, @shrutisharmavsco, the performance seems to be the same to me. You can check out my recently released iOS app, which uses a quantized model, for comparison: https://itunes.apple.com/us/app/dog-breeds-recognition-powered/id1150923794?mt=8
@shrutisharmavsco cool. I’m able to run the quantized model on my iPhone 6 too without memory warnings or other problems.