mediapipe: Android hand tracking error: use AAR in my own project
I use my own Android Studio project to get the camera data and then transform it into an ARGB_8888 bitmap.
Finally, I pass that bitmap to the `public void onNewFrame(final Bitmap bitmap, long timestamp)` method in FrameProcessor.class.
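For context, the conversion itself looks roughly like this (a minimal sketch, assuming Camera1 NV21 preview frames; `nv21ToArgb8888` is just my illustrative helper name):

```java
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.graphics.ImageFormat;
import android.graphics.Rect;
import android.graphics.YuvImage;
import java.io.ByteArrayOutputStream;

/** Converts an NV21 Camera1 preview frame to an ARGB_8888 bitmap. */
static Bitmap nv21ToArgb8888(byte[] nv21, int width, int height) {
  YuvImage yuv = new YuvImage(nv21, ImageFormat.NV21, width, height, null);
  ByteArrayOutputStream jpegStream = new ByteArrayOutputStream();
  yuv.compressToJpeg(new Rect(0, 0, width, height), 100, jpegStream);
  byte[] jpegBytes = jpegStream.toByteArray();
  Bitmap decoded = BitmapFactory.decodeByteArray(jpegBytes, 0, jpegBytes.length);
  // Force ARGB_8888, the format I feed to FrameProcessor.onNewFrame(Bitmap, long).
  return decoded.copy(Bitmap.Config.ARGB_8888, /* isMutable= */ false);
}

// Then, in onPreviewFrame:
//   Bitmap bitmap = nv21ToArgb8888(data, previewWidth, previewHeight);
//   processor.onNewFrame(bitmap, System.nanoTime() / 1000); // microsecond timestamp
```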
But I got errors like:

```
demo E/FrameProcessor: Mediapipe error:
com.google.mediapipe.framework.MediaPipeException: invalid argument: Graph has errors:
Packet type mismatch on calculator outputting to stream "input_video": The Packet stores "mediapipe::ImageFrame", but "mediapipe::GpuBuffer" was requested.
    at com.google.mediapipe.framework.Graph.nativeMovePacketToInputStream(Native Method)
    at com.google.mediapipe.framework.Graph.addConsumablePacketToInputStream(Graph.java:380)
    at com.google.mediapipe.components.FrameProcessor.onNewFrame(FrameProcessor.java:286)
    at org.tensorflow.demo.DetectorActivity.processImage(DetectorActivity.java:509)
    at org.tensorflow.demo.CameraActivity.onPreviewFrame(CameraActivity.java:267)
    at android.hardware.Camera$EventHandler.handleMessage(Camera.java:1124)
    at android.os.Handler.dispatchMessage(Handler.java:105)
    at android.os.Looper.loop(Looper.java:164)
    at android.app.ActivityThread.main(ActivityThread.java:6600)
    at java.lang.reflect.Method.invoke(Native Method)
    at com.android.internal.os.Zygote$MethodAndArgsCaller.run(Zygote.java:240)
    at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:772)
```
So does it use the GPU by default?
How can I solve this problem?
By the way, I found that in the v0.6.6 multi-hand AAR example, if I just drop in hand_landmark_3d.tflite in place of hand_landmark.tflite, it goes wrong: the phone shows a black window. But if I rename hand_landmark_3d.tflite to hand_landmark.tflite, I get the correct 3D result.
It seems the code only loads a tflite file named hand_landmark.tflite.
I'm not sure whether this is a bug.
Can anybody help? Thank you.
About this issue
- State: closed
- Created 5 years ago
- Comments: 28
@afsaredrisy Thanks a lot for your help. I have solved my problem.
Hi, have a look at this: https://github.com/afsaredrisy/MediapipeHandtracking_GPU_Bitmap_Input. I uploaded the same work with Bitmap input.
@afsaredrisy Here is my code:
```java
// Set up the FrameProcessor with the graph and its input/output streams.
processor = new FrameProcessor(
    this,
    eglManager.getNativeContext(),
    BINARY_GRAPH_NAME,
    INPUT_VIDEO_STREAM_NAME,
    OUTPUT_VIDEO_STREAM_NAME);

// Renderer used to draw each bitmap into an OpenGL texture.
renderer = new TextureRenderer();
renderer.setFlipY(FLIP_FRAMES_VERTICALLY);
renderer.setup();

// Destination texture that backs the TextureFrame handed to the graph.
destinationTextureId = ShaderUtil.createRgbaTexture(previewWidth, previewHeight);
outputFrame = new AppTextureFrame(destinationTextureId, previewWidth, previewHeight);
```
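For reference, the constants above would be along these lines (values assumed from the stock multi-hand tracking GPU example, not confirmed from my project; `input_video` matches the stream named in the error):

```java
// Assumed values, taken from the stock multi-hand tracking GPU example;
// "input_video" is the stream named in the error message above.
private static final String BINARY_GRAPH_NAME = "multihandtrackinggpu.binarypb";
private static final String INPUT_VIDEO_STREAM_NAME = "input_video";
private static final String OUTPUT_VIDEO_STREAM_NAME = "output_video";
private static final boolean FLIP_FRAMES_VERTICALLY = true;
```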
When rgbFrameBitmap is available:

```java
// Draw the bitmap into the destination texture, then hand the frame to the graph.
renderer.render(outputFrame.getTextureName(), rgbFrameBitmap);
outputFrame.setTimestamp(timestamp);
outputFrame.setInUse();
processor.onNewFrame(outputFrame);
```
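One caveat (an assumption on my part, based on how MediaPipe's ExternalTextureConverter reuses its frames): since the same AppTextureFrame is reused for every frame, you probably have to wait until the graph releases it before rendering the next bitmap into it, e.g.:

```java
// Assumed reuse pattern, mirroring ExternalTextureConverter:
// block until the graph is done with the texture before overwriting it.
try {
  outputFrame.waitUntilReleased();
} catch (InterruptedException e) {
  Thread.currentThread().interrupt();
  return;
}
```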
Here is the TextureRenderer:
```java
public class TextureRenderer {
  private static final FloatBuffer TEXTURE_VERTICES =
      ShaderUtil.floatBuffer(new float[] {0.0F, 0.0F, 1.0F, 0.0F, 0.0F, 1.0F, 1.0F, 1.0F});
  private static final FloatBuffer FLIPPED_TEXTURE_VERTICES =
      ShaderUtil.floatBuffer(new float[] {0.0F, 1.0F, 1.0F, 1.0F, 0.0F, 0.0F, 1.0F, 0.0F});
  private static final String TAG = "TextureRenderer";
  private static final int ATTRIB_POSITION = 1;
  private static final int ATTRIB_TEXTURE_COORDINATE = 2;

  private int program = 0;
  private int frameUniform;
  private int textureTransformUniform;
  private float[] textureTransformMatrix = new float[16];
  private boolean flipY;

  // (setFlipY(), setup(), and render() omitted here.)
}
```
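For completeness, a `render(int, Bitmap)` along these lines would match the call site above (this is my sketch of the omitted method, not the original code; it just uploads the bitmap into the destination texture with GLUtils and ignores the flip vertices):

```java
// Hypothetical sketch of the omitted render() method: upload the bitmap into
// the destination RGBA texture so the graph can sample it as a GpuBuffer.
// Needs: android.graphics.Bitmap, android.opengl.GLES20, android.opengl.GLUtils.
public void render(int destinationTextureId, Bitmap bitmap) {
  GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
  GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, destinationTextureId);
  GLUtils.texSubImage2D(GLES20.GL_TEXTURE_2D, 0, 0, 0, bitmap);
  GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, 0);
  GLES20.glFinish(); // make sure the upload completes before the graph reads it
}
```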
@liaohong I tried transforming a bitmap into a texture using the function in ShaderUtil.class named `public static int createRgbaTexture(Bitmap bitmap)`, and then used that texture to construct an AppTextureFrame (which implements TextureFrame). Finally I called `public void onNewFrame(final TextureFrame frame)` in FrameProcessor.class. But now I have a new problem: I get empty results for handPresence and handLandmarks. I'm not sure whether this approach is right or not; I'm still checking.
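In code, the flow I describe is roughly this (a sketch; `timestampUs` is my placeholder for a microsecond timestamp):

```java
// Sketch of the flow above; assumes an EGL context is current on this thread.
int textureId = ShaderUtil.createRgbaTexture(rgbFrameBitmap);
AppTextureFrame frame =
    new AppTextureFrame(textureId, rgbFrameBitmap.getWidth(), rgbFrameBitmap.getHeight());
frame.setTimestamp(timestampUs); // placeholder microsecond timestamp
processor.onNewFrame(frame);
```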