tensorflow: DNNClassifier estimator cannot be exported

Please go to Stack Overflow for help and support:

https://stackoverflow.com/questions/tagged/tensorflow

If you open a GitHub issue, here is our policy:

  1. It must be a bug or a feature request.
  2. The form below must be filled out.
  3. It shouldn’t be a TensorBoard issue. Those go here.

Here’s why we have that policy: TensorFlow developers respond to issues. We want to focus on work that benefits the whole community, e.g., fixing bugs and adding features. Support only helps individuals. GitHub also notifies thousands of people when issues are filed. We want them to see you communicating an interesting problem, rather than being redirected to Stack Overflow.


System information

  • Docker container: tensorflow/tensorflow:latest
  • OS: Ubuntu Linux
  • Installed from: pip
  • TensorFlow version: v1.2.0-5-g435cdfc (1.2.1)
  • Python version: 2.7
  • Bazel version (if compiling from source): N/A

Exact command to reproduce

classifier = DNNClassifier(feature_columns=feature_columns,
                           hidden_units=[10, 20, 10],
                           n_classes=3,
                           model_dir=model_path)

classifier.export_savedmodel(MODEL_PATH, script.serving_input_receiver_fn)

Describe the problem

Trying to export a DNNClassifier model throws the following exception:

Exception during training: A default input_alternative must be provided.
 Traceback (most recent call last):
  File "algo.py", line 78, in train
    nn.export_savedmodel(MODEL_PATH, script.serving_input_receiver_fn, default_output_alternative_key=None)
  File "/usr/local/lib/python2.7/dist-packages/tensorflow/contrib/learn/python/learn/estimators/estimator.py", line 1280, in export_savedmodel
    actual_default_output_alternative_key)
  File "/usr/local/lib/python2.7/dist-packages/tensorflow/contrib/learn/python/learn/utils/saved_model_export_utils.py", line 259, in build_all_signature_defs
    raise ValueError('A default input_alternative must be provided.')

The problem happens because the DNNClassifier constructor creates a head whose name is None: https://github.com/tensorflow/tensorflow/blob/master/tensorflow/contrib/learn/python/learn/estimators/dnn.py#L365

About this issue

  • State: closed
  • Created 7 years ago
  • Reactions: 2
  • Comments: 25 (10 by maintainers)

Most upvoted comments

Hello @samithaj

Canned estimators don't have much documentation yet. Here is my code:

import os

import numpy as np
import tensorflow as tf

INPUT_TENSOR_NAME = 'inputs'


def estimator(model_path):
    feature_columns = [tf.feature_column.numeric_column(INPUT_TENSOR_NAME, shape=[4])]
    return tf.estimator.DNNClassifier(feature_columns=feature_columns,
                                      hidden_units=[10, 20, 10],
                                      n_classes=3,
                                      model_dir=model_path)


def serving_input_receiver_fn():
    feature_spec = {INPUT_TENSOR_NAME: tf.FixedLenFeature(dtype=tf.float32, shape=[4])}
    return tf.estimator.export.build_parsing_serving_input_receiver_fn(feature_spec)()


def train_input_fn(training_dir):
    training_set = tf.contrib.learn.datasets.base.load_csv_with_header(
        filename=os.path.join(training_dir, 'iris_training.csv'),
        target_dtype=np.int,
        features_dtype=np.float32)

    return tf.estimator.inputs.numpy_input_fn(
        x={INPUT_TENSOR_NAME: np.array(training_set.data)},
        y=np.array(training_set.target),
        num_epochs=None,
        shuffle=True)()

Hope it helps you!

Cool, @mvsusp .

By the way, if tf.feature_column is used, feature_spec can be generated automatically, like:

    feature_spec = tf.feature_column.make_parse_example_spec(feature_columns)

Hi @facaiy

The issue was solved using tf.estimator.export.build_parsing_serving_input_receiver_fn.

Thank you!

@Anmol-Sharma I guess you need to return export_outputs in your model_fn, like

        return tf.estimator.EstimatorSpec(
            mode=mode,
            predictions=predictions,
            loss=loss,
            train_op=train_op,
            export_outputs=export_outputs)

@AKhilGarg91

I got the 'too many values to unpack' error on 1.4, too.

After a small dig into it, I changed tf.estimator.export.build_parsing_serving_input_receiver_fn to tensorflow.contrib.learn.build_parsing_serving_input_fn.

In tensorflow/contrib/learn/python/learn/utils/saved_model_export_utils.py, lines 157-161:

  if isinstance(input_ops, input_fn_utils.InputFnOps):
    features, unused_labels, default_inputs = input_ops  # <- should go here
    input_alternatives[DEFAULT_INPUT_ALTERNATIVE_KEY] = default_inputs
  else:
    features, unused_labels = input_ops  # <- this line fails

The docstring where InputFnOps is defined says:

Contents of this file are moved to tensorflow/python/estimator/export.py.
InputFnOps is renamed to ServingInputReceiver.
build_parsing_serving_input_fn is renamed to
  build_parsing_serving_input_receiver_fn.
build_default_serving_input_fn is renamed to
  build_raw_serving_input_receiver_fn.

It seems the new class causes the error.
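The mismatch can be reproduced in plain Python with stand-in namedtuples (hypothetical, standing in for the real TF types): both the old contrib InputFnOps and the newer ServingInputReceiver carry three fields, but only InputFnOps passes the isinstance check, so a ServingInputReceiver falls through to the two-name unpack.

```python
from collections import namedtuple

# Hypothetical stand-ins for the real TF types (same field counts,
# simplified values). Only InputFnOps passes the isinstance check in
# build_all_signature_defs; ServingInputReceiver does not.
InputFnOps = namedtuple(
    'InputFnOps', ['features', 'labels', 'default_inputs'])
ServingInputReceiver = namedtuple(
    'ServingInputReceiver',
    ['features', 'receiver_tensors', 'receiver_tensors_alternatives'])


def unpack(input_ops):
    """Mimics the branching in saved_model_export_utils, lines 157-161."""
    if isinstance(input_ops, InputFnOps):
        features, unused_labels, default_inputs = input_ops
    else:
        # A ServingInputReceiver lands here, and its three fields
        # cannot be unpacked into two names.
        features, unused_labels = input_ops
    return features


receiver = ServingInputReceiver({'inputs': 'ph'}, {'examples': 'ph'}, None)
try:
    unpack(receiver)
except ValueError as e:
    print('ValueError:', e)  # too many values to unpack
```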

if isinstance(input_ops, input_fn_utils.InputFnOps):

should change to something like

if isinstance(input_ops, input_fn_utils.InputFnOps) or isinstance(input_ops, export.ServingInputReceiver):

My workaround, for now, is to use tensorflow.contrib.learn.build_parsing_serving_input_fn instead.
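The suggested widened check can be sketched in plain Python with stand-in classes (hypothetical names, not the real TF types); note that isinstance accepts a tuple of types, so the two chained checks collapse into one call.

```python
class InputFnOps(tuple):
    """Stand-in (hypothetical) for contrib's InputFnOps."""


class ServingInputReceiver(tuple):
    """Stand-in (hypothetical) for tf.estimator.export.ServingInputReceiver."""


def is_known_input(input_ops):
    # isinstance takes a tuple of types, so
    # `isinstance(a, X) or isinstance(a, Y)` becomes one call.
    return isinstance(input_ops, (InputFnOps, ServingInputReceiver))


print(is_known_input(InputFnOps()))            # True
print(is_known_input(ServingInputReceiver()))  # True
print(is_known_input(('features', 'labels')))  # False: plain tuple falls through
```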

Thanks @mvsusp. For anyone looking for a complete example, check here: https://github.com/MtDersvan/tf_playground

@mvsusp Can you please post the fixed code for your example? I'm trying to export and serve a DNNLinearCombinedRegressor model, and I can't find any working example.