ranking: When using ELWC - OP_REQUIRES failed at example_parsing_ops.cc:91 : Invalid argument: Could not parse example input
Hello Team,
I trained a TF Ranking model (basing my training on the following example: https://github.com/tensorflow/ranking/blob/master/tensorflow_ranking/examples/tf_ranking_tfrecord.py) and saved it using `estimator.export_saved_model('my_model', serving_input_receiver_fn)`. The model trained successfully and was saved without any warnings or errors.
I deployed the model to a local TensorFlow ModelServer and made a call to it over HTTP using cURL, as described at https://www.tensorflow.org/tfx/serving/api_rest#request_format. Unfortunately, I see the following error after making the request:
```
W external/org_tensorflow/tensorflow/core/framework/op_kernel.cc:1655] OP_REQUIRES failed at example_parsing_ops.cc:91 : Invalid argument: Could not parse example input, value: '
ctx_f0

{ "error": "Could not parse example input, value: \'\n\035\n\021ctx_f0\022\010\022\006\n\004\000\000\340@\'\n\t [[{{node ParseExample/ParseExample}}]]" }
```
I understand this may be a serialization problem, where my input was not properly serialized. However, saving the model by generating the `serving_input_receiver_fn` and using it produced no errors or warnings, so I am not sure where to start looking to resolve this.
I am providing some details below, please let me know if you need more information.
Details
TF framework module versions
```
tensorflow-serving-api==2.0.0
tensorflow==2.0.0
tensorflow-ranking==0.2.0
```
Some training parameters and functions
```python
import tensorflow as tf
import tensorflow_ranking as tfr

_CONTEXT_FEATURES = {'ctx_f0'}
_DOCUMENT_FEATURES = {'f0', 'f1', 'f2'}
_DATA_FORMAT = tfr.data.ELWC
_PADDING_LABEL = -1

def example_feature_columns():
    spec = {}
    for f in _DOCUMENT_FEATURES:
        spec[f] = tf.feature_column.numeric_column(
            f, shape=(1,), default_value=_PADDING_LABEL, dtype=tf.float32)
    return spec

def context_feature_columns():
    spec = {}
    for f in _CONTEXT_FEATURES:
        spec[f] = tf.feature_column.numeric_column(
            f, shape=(1,), default_value=_PADDING_LABEL, dtype=tf.float32)
    return spec
```
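As a side note, here is what one of these columns translates to at parse time; this is standard `tf.feature_column` behavior, shown purely for illustration and not code from the issue:

```python
import tensorflow as tf

# make_parse_example_spec maps each numeric_column to a FixedLenFeature
# parsing entry keyed by the column name.
spec = tf.feature_column.make_parse_example_spec(
    [tf.feature_column.numeric_column(
        'f0', shape=(1,), default_value=-1, dtype=tf.float32)])
print(spec)
# -> {'f0': FixedLenFeature(shape=(1,), dtype=tf.float32, default_value=...)}
```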
Creating the `serving_input_receiver_fn`
```python
context_feature_spec = tf.feature_column.make_parse_example_spec(
    context_feature_columns().values())
example_feature_spec = tf.feature_column.make_parse_example_spec(
    example_feature_columns().values())

serving_input_receiver_fn = tfr.data.build_ranking_serving_input_receiver_fn(
    data_format=_DATA_FORMAT,
    list_size=20,
    default_batch_size=None,
    receiver_name="input_ranking_data",
    context_feature_spec=context_feature_spec,
    example_feature_spec=example_feature_spec)
```
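The export itself then uses this receiver fn, per the call quoted at the top of this issue; a minimal sketch (`estimator` is the trained ranking estimator, not shown here):

```python
# `estimator` is the trained tf.estimator-based ranking model (not shown).
# export_saved_model writes a timestamped version directory, e.g.
# my_model/1587842143, which matches the version used in the URLs below.
export_dir = estimator.export_saved_model('my_model', serving_input_receiver_fn)
```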
When making a REST API call to the local TensorFlow ModelServer using the following cURL request:
```
curl -H "Content-Type: application/json" \
  -X POST http://192.168.99.100:8501/v1/models/my_model/versions/1587842143:regress \
  -d '{"context": {"ctx_f0": 7.2}, "examples":[{"f0":[35.92],"f1":[5.258],"f2":[5.261]},{"f0":[82.337],"f1":[2.06],"f2":[2.068]}]}'
```
The error is as follows:
```
W external/org_tensorflow/tensorflow/core/framework/op_kernel.cc:1655] OP_REQUIRES failed at example_parsing_ops.cc:91 : Invalid argument: Could not parse example input, value: '
ctx_f0

{ "error": "Could not parse example input, value: \'\n\035\n\021ctx_f0\022\010\022\006\n\004\000\000\340@\'\n\t [[{{node ParseExample/ParseExample}}]]" }
```
About this issue
- State: closed
- Created 4 years ago
- Reactions: 2
- Comments: 28 (6 by maintainers)
Hi @ramakumar1729,
Sorry to comment on this closed issue, but I thought I would share my latest (positive and successful) findings regarding making requests to the `predict` serving REST API. Perhaps others may find them useful.
TL;DR
I was able to successfully POST a serialized and base64-encoded ELWC proto to the `predict` REST API and get the expected predictions. These predictions exactly match the predictions I get when making a gRPC request with the same ELWC proto to the TensorFlow model server. This gave me confidence in behavior parity: making inference requests over HTTP vs. gRPC produces consistent results for the same ELWC.
Details
Previously, in https://github.com/tensorflow/ranking/issues/189#issuecomment-620256362, I was creating a `tensor_proto` out of the serialized ELWC proto, then serializing the `tensor_proto` and base64-encoding that, which I was then POSTing to the API. I found out that I should not create a `tensor_proto` out of the serialized ELWC proto string: what needs to be serialized and base64-encoded is the ELWC proto itself. As a result, my cURL looks the same as before; the difference is that the `b64` string now holds the ELWC proto. We can be a little more descriptive by specifying the `signature_name` and the `receiver_name` (whatever was defined when creating the `serving_input_receiver_fn`). The predictions from these cURL requests match the predictions returned for the same ELWC over gRPC; a sketch of both request paths follows.
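Below is a minimal, hedged sketch of the two request paths. The feature names and values come from this thread; the `predict` signature name, the 8500/8501 ports, the `requests` client, and the `make_example` helper are illustrative assumptions, not code from the original comment:

```python
import base64
import json

import grpc
import requests  # assumed HTTP client; curl works equally well
import tensorflow as tf
from tensorflow_serving.apis import (input_pb2, predict_pb2,
                                     prediction_service_pb2_grpc)

def make_example(features):
    """Build a tf.train.Example from a {name: [float, ...]} dict."""
    return tf.train.Example(features=tf.train.Features(feature={
        name: tf.train.Feature(float_list=tf.train.FloatList(value=values))
        for name, values in features.items()}))

# Build the ELWC proto: one context Example plus per-document Examples.
elwc = input_pb2.ExampleListWithContext()
elwc.context.CopyFrom(make_example({"ctx_f0": [7.2]}))
for doc in ({"f0": [35.92], "f1": [5.258], "f2": [5.261]},
            {"f0": [82.337], "f1": [2.06], "f2": [2.068]}):
    elwc.examples.add().CopyFrom(make_example(doc))

serialized_elwc = elwc.SerializeToString()

# REST: base64-encode the serialized ELWC itself (no tensor_proto wrapper)
# and send it under the receiver_name chosen at export time.
body = {
    "signature_name": "predict",
    "instances": [{"input_ranking_data": {
        "b64": base64.b64encode(serialized_elwc).decode("utf-8")}}],
}
rest_response = requests.post(
    "http://192.168.99.100:8501/v1/models/my_model/versions/1587842143:predict",
    data=json.dumps(body))
print(rest_response.json())

# gRPC: the same serialized ELWC, sent as a string tensor of shape [1].
channel = grpc.insecure_channel("192.168.99.100:8500")
stub = prediction_service_pb2_grpc.PredictionServiceStub(channel)
request = predict_pb2.PredictRequest()
request.model_spec.name = "my_model"
request.model_spec.signature_name = "predict"
request.inputs["input_ranking_data"].CopyFrom(
    tf.make_tensor_proto([serialized_elwc], dtype=tf.string))
grpc_response = stub.Predict(request, 10.0)  # 10-second timeout
print(grpc_response.outputs)
```

If the two score lists match, that is the HTTP/gRPC parity described in the TL;DR.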
@azagniotov Thanks for sharing your experience in successfully serving the ranking model with serialized ELWC inputs~ This will be a useful reference for others trying to do the same.
@davidmosca
TL;DR
I took your code as-is and ran it. I was able to make a request to the Docker container over gRPC successfully (i.e., I did not observe your aforementioned error and got back a list of predictions) using TF/TFR `v2.1.0`, and also using `v2.3.0` as another attempt, which are the versions you are on.
Question
Are you pinning the Docker container version when pulling it down, or are you getting the latest? I.e., `docker pull tensorflow/serving:2.3.0` or `docker pull tensorflow/serving`? The latter pulls the `latest` tag, which may or may not be what you want. I remember that some time ago, when I was not pinning the container version, things stopped working for me one day, probably because I had pulled down new functionality or changes from upstream. So I had to pin the Docker version. In my Docker start command, I also explicitly start the version that I am on; an example follows.
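A sketch of what such a pinned pull and start can look like (the model path and name are illustrative, not taken from this thread):

```
docker pull tensorflow/serving:2.3.0
docker run -p 8500:8500 -p 8501:8501 \
  --mount type=bind,source="$(pwd)/my_model",target=/models/my_model \
  -e MODEL_NAME=my_model \
  tensorflow/serving:2.3.0
```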
MORE DETAILS
The following is the code I am using to generate the `ranking_serving_input_receiver_fn` when saving the model. Not sure if this helps, but I thought I would show how I do it.
TF-Ranking model's predict signature
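For a receiver fn built as in the snippet at the top of this issue, the exported model's predict signature can be inspected with `saved_model_cli`; the directory and version below are illustrative:

```
saved_model_cli show --dir my_model/1587842143 \
  --tag_set serve --signature_def predict
```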
@ramakumar1729, I got the request to the REST API to work when hitting the `predict` API. In the cURL call, the `tensor_proto_base64` payload holds the `CAcSBBICCBRCQwpBCg4K.....` string, and the request returns a result set of predictions. A sketch of how this payload was constructed follows.
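A minimal sketch of this construction, with illustrative names; note that the TL;DR comment quoted earlier in this thread later identified exactly this `tensor_proto` wrapping as the mistake (the serialized ELWC itself should be base64-encoded):

```python
import base64

import tensorflow as tf

# `elwc` is an ExampleListWithContext proto, built as in the earlier sketch.
serialized_elwc = elwc.SerializeToString()

# Wrap the serialized ELWC in a string tensor_proto, then serialize and
# base64-encode *that* -- the double wrapping later found to be the culprit.
tensor_proto = tf.make_tensor_proto(serialized_elwc, dtype=tf.string)
tensor_proto_base64 = base64.b64encode(
    tensor_proto.SerializeToString()).decode("utf-8")

# POSTed as, e.g., {"instances": [{"b64": tensor_proto_base64}]}. The server
# cannot recover the original features from this payload, which would explain
# why the predictions below never change with the input.
```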
What's interesting, though, is that I always get this ^ same result set no matter what the input is (i.e., whatever the feature numeric values are). I can make 3 different cURL requests, each with a different `b64` string as the payload, but the predictions will always be as shown above ^. In fact, I can use completely bogus example feature names (i.e., `apples`, `oranges` & `lemons`) when constructing the tensor_proto, and the API is still able to return the same predictions as above. Shouldn't the model complain that the expected features `f0`, `f1` and `f2` are not there? Any thoughts?