mediapipe: Android hand tracking error: using the AAR in my own project

I use my own Android Studio project to get the camera data, convert it into an ARGB_8888 bitmap, and then pass that bitmap to `public void onNewFrame(final Bitmap bitmap, long timestamp)` in FrameProcessor. But I get an error like this:

```
demo E/FrameProcessor: Mediapipe error:
com.google.mediapipe.framework.MediaPipeException: invalid argument: Graph has errors:
Packet type mismatch on calculator outputting to stream "input_video": The Packet stores "mediapipe::ImageFrame", but "mediapipe::GpuBuffer" was requested.
    at com.google.mediapipe.framework.Graph.nativeMovePacketToInputStream(Native Method)
    at com.google.mediapipe.framework.Graph.addConsumablePacketToInputStream(Graph.java:380)
    at com.google.mediapipe.components.FrameProcessor.onNewFrame(FrameProcessor.java:286)
    at org.tensorflow.demo.DetectorActivity.processImage(DetectorActivity.java:509)
    at org.tensorflow.demo.CameraActivity.onPreviewFrame(CameraActivity.java:267)
    at android.hardware.Camera$EventHandler.handleMessage(Camera.java:1124)
    at android.os.Handler.dispatchMessage(Handler.java:105)
    at android.os.Looper.loop(Looper.java:164)
    at android.app.ActivityThread.main(ActivityThread.java:6600)
    at java.lang.reflect.Method.invoke(Native Method)
    at com.android.internal.os.Zygote$MethodAndArgsCaller.run(Zygote.java:240)
    at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:772)
```

So does the graph use the GPU by default? How can I solve this problem?

By the way, in the v0.6.6 multi-hand AAR example, if I simply drop in hand_landmark_3d.tflite instead of hand_landmark.tflite, it goes wrong: the phone just shows a black window. If I rename hand_landmark_3d.tflite to hand_landmark.tflite, it produces the correct 3D result. It seems the code only loads a tflite file named hand_landmark.tflite. I'm not sure whether that is a bug. Can anybody help? Thank you.
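For reference, the call path described above boils down to roughly the sketch below; the wrapper method and variable names are hypothetical, and only `FrameProcessor.onNewFrame(Bitmap, long)` is taken from the log:

```java
// Hypothetical camera-callback handler: the preview frame has already been converted
// into an ARGB_8888 bitmap. The Bitmap overload sends an ImageFrame packet, which a
// GPU graph (expecting GpuBuffer on "input_video") rejects with the error above.
void processImage(Bitmap rgbFrameBitmap) {
    long timestampUs = System.currentTimeMillis() * 1000; // microseconds; the exact clock is an assumption
    processor.onNewFrame(rgbFrameBitmap, timestampUs);
}
```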

About this issue

  • Original URL
  • State: closed
  • Created 5 years ago
  • Comments: 28

Most upvoted comments

@afsaredrisy Thanks a lot for your help. I have solved my problem.

Hi, have a look at https://github.com/afsaredrisy/MediapipeHandtracking_GPU_Bitmap_Input. I uploaded the same work with Bitmap input.

@afsaredrisy Here is my code:

```java
processor = new FrameProcessor(
        this,
        eglManager.getNativeContext(),
        BINARY_GRAPH_NAME,
        INPUT_VIDEO_STREAM_NAME,
        OUTPUT_VIDEO_STREAM_NAME);

renderer = new TextureRenderer();
renderer.setFlipY(FLIP_FRAMES_VERTICALLY);
renderer.setup();

destinationTextureId = ShaderUtil.createRgbaTexture(previewWidth, previewHeight);
outputFrame = new AppTextureFrame(destinationTextureId, previewWidth, previewHeight);
```

When rgbFrameBitmap is available:

```java
renderer.render(outputFrame.getTextureName(), rgbFrameBitmap);
outputFrame.setTimestamp(timestamp);
outputFrame.setInUse();
processor.onNewFrame(outputFrame);
```

Here is the TextureRenderer:

```java
// Imports for a standalone file; the com.google.mediapipe.glutil package for
// ShaderUtil and CommonShaders is assumed from the MediaPipe AAR.
import android.graphics.Bitmap;
import android.opengl.GLES20;
import android.opengl.GLUtils;
import com.google.mediapipe.glutil.CommonShaders;
import com.google.mediapipe.glutil.ShaderUtil;
import java.nio.FloatBuffer;
import java.util.HashMap;
import java.util.Map;

public class TextureRenderer {

private static final FloatBuffer TEXTURE_VERTICES =
        ShaderUtil.floatBuffer(new float[]{0.0F, 0.0F, 1.0F, 0.0F, 0.0F, 1.0F, 1.0F, 1.0F});
private static final FloatBuffer FLIPPED_TEXTURE_VERTICES =
        ShaderUtil.floatBuffer(new float[]{0.0F, 1.0F, 1.0F, 1.0F, 0.0F, 0.0F, 1.0F, 0.0F});
private static final String TAG = "TextureRenderer";
private static final int ATTRIB_POSITION = 1;
private static final int ATTRIB_TEXTURE_COORDINATE = 2;

private int program = 0;
private int frameUniform;
private int textureTransformUniform;
private float[] textureTransformMatrix = new float[16];
private boolean flipY;

private final String vertexShaderCode =
        "uniform mat4 texture_transform;\n"
                + "attribute vec4 position;\n"
                + "attribute mediump vec4 texture_coordinate;\n"
                + "varying mediump vec2 sample_coordinate;\n"
                + "\n"
                + "void main() {\n"
                + "  gl_Position = position;\n"
                + "  sample_coordinate = (texture_transform * texture_coordinate).xy;\n"
                + "}";

private final String fragmentShaderCode =
        "#extension GL_OES_EGL_image_external : require\n"
                + "varying mediump vec2 sample_coordinate;\n"
                + "uniform sampler2D video_frame;\n"
                + "\n"
                + "void main() {\n"
                + "  gl_FragColor = texture2D(video_frame, sample_coordinate);\n"
                + "}";

public TextureRenderer() {
}

public void setup() {
    Map<String, Integer> attributeLocations = new HashMap<>();
    attributeLocations.put("position", ATTRIB_POSITION);
    attributeLocations.put("texture_coordinate", ATTRIB_TEXTURE_COORDINATE);
    this.program = ShaderUtil.createProgram(vertexShaderCode, fragmentShaderCode, attributeLocations);
    this.frameUniform = GLES20.glGetUniformLocation(this.program, "video_frame");
    this.textureTransformUniform = GLES20.glGetUniformLocation(this.program, "texture_transform");
    ShaderUtil.checkGlError("glGetUniformLocation");
}

public void setFlipY(boolean flip) {
    this.flipY = flip;
}

public void render(int textureName, Bitmap bmp) {
    GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);
    GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
    ShaderUtil.checkGlError("glActiveTexture");
    // Bind the destination texture before uploading; texImage2D writes into whatever
    // texture is currently bound to GL_TEXTURE_2D.
    GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureName);
    ShaderUtil.checkGlError("glBindTexture");
    GLUtils.texImage2D(GLES20.GL_TEXTURE_2D, 0, bmp, 0);
    GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER,GLES20.GL_LINEAR);
    GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER,GLES20.GL_LINEAR);
    GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
    GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);
    ShaderUtil.checkGlError("glTexParameteri");
    GLES20.glUseProgram(program);
    ShaderUtil.checkGlError("glUseProgram");
    GLES20.glUniform1i(frameUniform, 0);
    ShaderUtil.checkGlError("glUniform1i");
    GLES20.glUniformMatrix4fv(textureTransformUniform, 1, false, textureTransformMatrix, 0);
    ShaderUtil.checkGlError("glUniformMatrix4fv");
    GLES20.glEnableVertexAttribArray(ATTRIB_POSITION);
    GLES20.glVertexAttribPointer(
            ATTRIB_POSITION, 2, GLES20.GL_FLOAT, false, 0, CommonShaders.SQUARE_VERTICES);
    GLES20.glEnableVertexAttribArray(ATTRIB_TEXTURE_COORDINATE);
    GLES20.glVertexAttribPointer(
            ATTRIB_TEXTURE_COORDINATE,
            2,
            GLES20.GL_FLOAT,
            false,
            0,
            flipY ? FLIPPED_TEXTURE_VERTICES : TEXTURE_VERTICES);
    ShaderUtil.checkGlError("program setup");
    GLES20.glDrawArrays(GLES20.GL_TRIANGLE_STRIP, 0, 4);
    ShaderUtil.checkGlError("glDrawArrays");
    // The texture was bound before the upload above; unbind it now that drawing is done.
    GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, 0);
    ShaderUtil.checkGlError("glBindTexture");
    GLES20.glFinish();
}

public void release() {
    GLES20.glDeleteProgram(this.program);
}

}
```
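Pulling the snippets above together, the per-frame path is roughly the following; the method name is hypothetical, and the key assumption is that it runs on a thread whose EGL context is shared with the one passed to FrameProcessor (`eglManager.getNativeContext()`), since both the texture upload and the graph's GPU access need compatible contexts:

```java
// Hypothetical per-frame entry point, assumed to run on the shared GL thread.
void feedFrame(Bitmap rgbFrameBitmap, long timestampUs) {
    // Upload the bitmap into the RGBA texture backing outputFrame.
    renderer.render(outputFrame.getTextureName(), rgbFrameBitmap);
    // Stamp the frame and hand it to MediaPipe; on the graph side it arrives on
    // "input_video" as a GpuBuffer, which is what the GPU hand-tracking graph expects.
    outputFrame.setTimestamp(timestampUs);
    outputFrame.setInUse();
    processor.onNewFrame(outputFrame);
}
```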

@liaohong I tried converting the bitmap to a texture with the function in ShaderUtil named `public static int createRgbaTexture(Bitmap bitmap)`, then used that texture to construct an AppTextureFrame (which implements TextureFrame), and finally called `public void onNewFrame(final TextureFrame frame)` in FrameProcessor. But now I have a new problem: handPresence and handLandmarks come back empty. I'm not sure whether this approach is right; I'm still checking.
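For reference, a minimal sketch of the flow described in this comment; the wrapper method, variable names, and timestamp handling are assumptions, while `createRgbaTexture(Bitmap)`, `AppTextureFrame`, and `onNewFrame(TextureFrame)` are the calls named in this thread:

```java
// Sketch: turn the ARGB_8888 bitmap into an RGBA texture, wrap it as a TextureFrame,
// and feed it to the graph. Assumed to run on a GL context shared with the
// FrameProcessor's context, otherwise the graph cannot read the texture.
void sendBitmap(Bitmap rgbFrameBitmap, long timestampUs) {
    int textureId = ShaderUtil.createRgbaTexture(rgbFrameBitmap);
    AppTextureFrame frame =
            new AppTextureFrame(textureId, rgbFrameBitmap.getWidth(), rgbFrameBitmap.getHeight());
    frame.setTimestamp(timestampUs); // microseconds; the exact clock used is an assumption
    frame.setInUse();
    processor.onNewFrame(frame);
}
```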