tensorflow: Internal error: Error applying delegate when trying to use TensorFlow Lite NNAPI delegate on Google Pixel 7

Issue Type

Bug

Have you reproduced the bug with TF nightly?

No

Source

binary

Tensorflow Version

tensorflow-lite 2.12.0

Custom Code

Yes

OS Platform and Distribution

No response

Mobile device

Google Pixel 7

Python version

No response

Bazel version

No response

GCC/Compiler version

No response

CUDA/cuDNN version

No response

GPU model and memory

No response

Current Behaviour?

When I try to use the NNAPI delegate to run the OpenAI Whisper model on a Pixel 7, the app crashes while the TFLite interpreter is being initialized.

Standalone code to reproduce the issue

MainActivity.kt
class MainActivity : ComponentActivity() {
    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)

        setContent {
            NNAPIWhisperTestTheme {
                // A surface container using the 'background' color from the theme
                Surface(
                    modifier = Modifier.fillMaxSize(),
                    color = MaterialTheme.colorScheme.background
                ) {
                    Test(applicationContext)
                }
            }
        }
    }
}

@Composable
fun Test(applicationContext: Context, modifier: Modifier = Modifier, viewModel: NnApiViewModel = viewModel()) {
    Scaffold { innerPadding ->
        Column {
            Button(modifier = Modifier.padding(innerPadding), onClick = { viewModel.initialize(applicationContext) }) {
                Text(text = "INITIALIZE")
            }
            Button(
                onClick = {
                    viewModel.runInference()
                },
                modifier = Modifier.padding(innerPadding)
            ) {
                Text(viewModel.output + viewModel.elapsed)
            }
        }
    }
}



NnApiViewModel.kt
import android.content.Context
import androidx.compose.runtime.getValue
import androidx.compose.runtime.mutableStateOf
import androidx.compose.runtime.setValue
import androidx.lifecycle.ViewModel
import org.tensorflow.lite.DataType
import org.tensorflow.lite.Interpreter
import org.tensorflow.lite.nnapi.NnApiDelegate
import org.tensorflow.lite.support.tensorbuffer.TensorBuffer
import java.io.File

class NnApiViewModel : ViewModel() {

    var options = Interpreter.Options()
    var nnApiDelegate: NnApiDelegate? = null
    var tfLite: Interpreter? = null
    var output by mutableStateOf("")
    var elapsed by mutableStateOf(0L)

    // Initialize interpreter with NNAPI delegate for Android Pie or above
    fun initialize(applicationContext: Context) {
        // Create the NNAPI delegate; NnApiDelegate.Options must be passed to the
        // constructor, otherwise useNnapiCpu has no effect.
        nnApiDelegate = NnApiDelegate(NnApiDelegate.Options().setUseNnapiCpu(false))
        options.setUseNNAPI(true)
        options.addDelegate(nnApiDelegate)

        // Copy the model out of assets into app storage so the Interpreter can open it as a regular file
        val model = applicationContext.assets.open("models/whisper-tiny.tflite")
        val file = model.readBytes()

        val fileName = "whisper-tiny.tflite"
        applicationContext.openFileOutput(fileName, Context.MODE_PRIVATE).use {
            it.write(file)
        }

        val modelFile = File(applicationContext.filesDir,"whisper-tiny.tflite")

        // Initialize TFLite interpreter
        try {
            tfLite = Interpreter(modelFile, options)
        } catch (e: Exception) {
            throw RuntimeException(e)
        }
    }

    fun runInference() {
        try {
            val outputTensor = tfLite?.getOutputTensor(0)
            val input = TensorBuffer.createFixedSize(intArrayOf(1, 80, 3000), DataType.FLOAT32)
            val output = TensorBuffer.createFixedSize(outputTensor?.shape(), DataType.FLOAT32)

            val start = System.currentTimeMillis()
            tfLite?.run(input.buffer, output.buffer)
            elapsed = System.currentTimeMillis() - start
        } catch (e: RuntimeException) {
            throw RuntimeException(e)
        }
    }

    fun unload() {
        tfLite?.close()
        nnApiDelegate?.close()
    }
}

Relevant log output

I  Loaded native library: tensorflowlite_jni
I  Didn't load native library: tensorflowlite_jni_gms_client
I  Initialized TensorFlow Lite runtime.
I  DeviceManager::DeviceManager
I  findAvailableDevices
E  Error opening trace file: No such file or directory (2)
I  Found interface google-edgetpu (version = 2.0)
I  Found interface google-armnn (version = ArmNN)
I  Created TensorFlow Lite delegate for NNAPI.
E  FATAL EXCEPTION: main
   java.lang.RuntimeException: java.lang.IllegalArgumentException: Internal error: Error applying delegate:
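
The IllegalArgumentException above is thrown from the Interpreter constructor while the delegate is being applied. A minimal sketch of a createInterpreter helper (a name introduced here for illustration) that isolates the failure and falls back to the default CPU path, assuming the same whisper-tiny.tflite file that initialize() writes to filesDir:

import android.util.Log
import java.io.File
import org.tensorflow.lite.Interpreter
import org.tensorflow.lite.nnapi.NnApiDelegate

fun createInterpreter(modelFile: File): Interpreter =
    try {
        val delegate = NnApiDelegate(NnApiDelegate.Options().setUseNnapiCpu(false))
        Interpreter(modelFile, Interpreter.Options().addDelegate(delegate))
    } catch (e: IllegalArgumentException) {
        // "Internal error: Error applying delegate" lands here; log it and
        // continue on the default CPU path instead of crashing the app.
        Log.w("NnApiWhisper", "NNAPI delegate could not be applied, falling back to CPU", e)
        Interpreter(modelFile, Interpreter.Options())
    }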

About this issue

  • State: open
  • Created a year ago
  • Comments: 18 (2 by maintainers)

Most upvoted comments

@mitchelldehaven how would you go about creating a tiny model that supports GPU or NNAPI?

You would need to do a couple of things:

  • First, you would need to separate the encoder and decoder and re-implement the control flow in whatever language you are using (presumably Java if you are using NNAPI); a rough sketch of this split follows the list.
  • Second, the particular op NDScatterAndUpdate is not supported. You would need to re-implement this functionality with supported ops (presumably possible, I’m not sure exactly how).
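
For the first point, a minimal sketch of what the split might look like, assuming the model has been exported as two separate files (the names whisper-encoder.tflite and whisper-decoder.tflite are hypothetical) and that the decoder takes the tokens generated so far plus the encoder output. The input/output indices, shapes, token limit, and vocabulary size below are assumptions about such an export, not properties of the original whisper-tiny.tflite:

import org.tensorflow.lite.Interpreter
import java.io.File

class SplitWhisper(encoderFile: File, decoderFile: File, options: Interpreter.Options) {
    private val encoder = Interpreter(encoderFile, options)
    private val decoder = Interpreter(decoderFile, options)

    // Greedy decoding loop implemented in Kotlin, replacing the in-graph control flow.
    // mel is assumed to be a [1, 80, 3000] float spectrogram.
    fun greedyDecode(mel: Array<Array<FloatArray>>, startToken: Int, endToken: Int, maxTokens: Int = 224): List<Int> {
        // Run the encoder once; output shape assumed [1, frames, hiddenDim].
        val encShape = encoder.getOutputTensor(0).shape()
        val encOut = Array(encShape[0]) { Array(encShape[1]) { FloatArray(encShape[2]) } }
        encoder.run(mel, encOut)

        // Autoregressive loop: feed tokens-so-far plus encoder output, take the argmax
        // of the last position, stop at the end-of-text token.
        val tokens = mutableListOf(startToken)
        while (tokens.size < maxTokens) {
            decoder.resizeInput(0, intArrayOf(1, tokens.size))   // assumes input 0 = token ids
            decoder.allocateTensors()
            val logits = Array(1) { Array(tokens.size) { FloatArray(VOCAB_SIZE) } }
            decoder.runForMultipleInputsOutputs(
                arrayOf(arrayOf(tokens.toIntArray()), encOut),   // assumes input 1 = encoder output
                mapOf(0 to logits)
            )
            val next = logits[0].last().withIndex().maxByOrNull { it.value }!!.index
            if (next == endToken) break
            tokens.add(next)
        }
        return tokens
    }

    companion object {
        // Whisper multilingual vocabulary size; adjust to match the actual exported decoder.
        const val VOCAB_SIZE = 51865
    }
}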