tensorflow: Failed to apply delegate: TfLiteGpuDelegate Init: MUL: Expected a 3D tensor of shape HxWxC or a 4D tensor of shape 1xHxWxC but got 98x8 (Android)

System information

Describe the current behavior I upgraded tensorflow-lite and tensorflow-lite-gpu from 2.3.0 to 2.4.0 and now get this error during interpreter initialization:

java.lang.IllegalArgumentException: Internal error: Failed to apply delegate: TfLiteGpuDelegate Init: MUL: Expected a 3D tensor of shape HxWxC or a 4D tensor of shape 1xHxWxC but got 98x8 TfLiteGpuDelegate Prepare: delegate is not initialized Node number 329 (TfLiteGpuDelegateV2) failed to prepare.

```kotlin
val tfliteOptions = Interpreter.Options()
    .setNumThreads(THREADS_COUNT)
    .addDelegate(GpuDelegate())

Interpreter(loadModelFile(context), tfliteOptions)
```

The problem is somewhere in tensorflow-lite-gpu 2.4.0

If I use the same configuration with the older GPU delegate version, everything works fine:

implementation("org.tensorflow:tensorflow-lite:2.4.0")
implementation("org.tensorflow:tensorflow-lite-gpu:2.3.0")

About this issue

  • Original URL
  • State: closed
  • Created 4 years ago
  • Comments: 23 (5 by maintainers)

Most upvoted comments

Internally (at Google), we don’t have those TF versions and we’re always using the master branch. As a non-TF person (believe it or not, I’m not in the TF organization), it’s hard for me to keep track of what is in a specific release, or to know how significant a change a release was. That said, I’m not sure what changes were introduced in 2.4.0, whether in TF, TFLite, TFLite GPU, or tflite_convert. These components are all owned by different teams.

I found a check in the model builder that validates the batch dimension of all tensors; it was also added in 2.4.0. I’m not sure what the purpose of this check is.

Failing for a bad batch size is better than secretly running and producing bad results when the shader is written with a certain assumption.

I also had to change the input tensor from a 1D array to a 2D array with a fake batch dimension.
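
For reference, a minimal sketch of the app-side half of that workaround, assuming a hypothetical model whose input used to be a flat float array; `FEATURE_COUNT`, `OUTPUT_SIZE`, and `interpreter` are placeholder names, not identifiers from the original code, and the model itself would also need its input tensor exported with the matching 1xN shape:

```kotlin
// Hypothetical sizes; substitute your model's real input/output shapes.
val FEATURE_COUNT = 98 * 8
val OUTPUT_SIZE = 10

// Before (rejected by the GPU delegate in 2.4.0): a flat 1D input buffer.
// val input = FloatArray(FEATURE_COUNT)

// After: wrap the same data in a leading batch dimension of 1, so the
// interpreter is fed a 1xN tensor instead of a bare N-element one.
val input = Array(1) { FloatArray(FEATURE_COUNT) }
val output = Array(1) { FloatArray(OUTPUT_SIZE) }

interpreter.run(input, output)
```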

As long as it works, we’re absolutely fine with that solution. In fact, we do that internally all the time too.

Still, it is the TF team who should find a solution to fix this behavior.

Ideally yes, but unfortunately we don’t have enough people to keep up with minor issues that can be worked around.