rpi-object-tracking: RuntimeError:

  • rpi-deep-pantilt version: 1.0.1
  • Python version: 3.7.3
  • TensorFlow version: 2.0.0
  • edgetpu version: 2.15.0
  • tflite-runtime version: 2.1.0.post1
  • Operating System: Raspbian 10 (buster)

Description

I was trying to run the detect application, but I got the error below.

Note: I can run the detect and track commands with no error when I don't pass the --edge-tpu flag.

What I Did

$ rpi-deep-pantilt detect --edge-tpu

Traceback (most recent call last):
  File "/home/pi/.virtualenvs/dl4rpi/bin/rpi-deep-pantilt", line 8, in <module>
    sys.exit(main())
  File "/home/pi/.virtualenvs/dl4rpi/lib/python3.7/site-packages/rpi_deep_pantilt/cli.py", line 107, in main
    cli()
  File "/home/pi/.virtualenvs/dl4rpi/lib/python3.7/site-packages/click/core.py", line 829, in __call__
    return self.main(*args, **kwargs)
  File "/home/pi/.virtualenvs/dl4rpi/lib/python3.7/site-packages/click/core.py", line 782, in main
    rv = self.invoke(ctx)
  File "/home/pi/.virtualenvs/dl4rpi/lib/python3.7/site-packages/click/core.py", line 1259, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/home/pi/.virtualenvs/dl4rpi/lib/python3.7/site-packages/click/core.py", line 1066, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/home/pi/.virtualenvs/dl4rpi/lib/python3.7/site-packages/click/core.py", line 610, in invoke
    return callback(*args, **kwargs)
  File "/home/pi/.virtualenvs/dl4rpi/lib/python3.7/site-packages/rpi_deep_pantilt/cli.py", line 52, in detect
    model = SSDMobileNet_V3_Coco_EdgeTPU_Quant()
  File "/home/pi/.virtualenvs/dl4rpi/lib/python3.7/site-packages/rpi_deep_pantilt/detect/ssd_mobilenet_v3_coco.py", line 56, in __init__
    self.tflite_interpreter.allocate_tensors()
  File "/home/pi/.virtualenvs/dl4rpi/lib/python3.7/site-packages/tensorflow_core/lite/python/interpreter.py", line 244, in allocate_tensors
    return self._interpreter.AllocateTensors()
  File "/home/pi/.virtualenvs/dl4rpi/lib/python3.7/site-packages/tensorflow_core/lite/python/interpreter_wrapper/tensorflow_wrap_interpreter_wrapper.py", line 106, in AllocateTensors
    return _tensorflow_wrap_interpreter_wrapper.InterpreterWrapper_AllocateTensors(self)
RuntimeError: Internal: Unsupported data type in custom op handler: 0Node number 2 (EdgeTpuDelegateForCustomOp) failed to prepare.

I think the problem could originate from some incompatibility among the packages listed above.
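As a quick sanity check for that theory, a generic sketch like the one below (not part of rpi-deep-pantilt; just standard-library tooling) can dump the installed versions of the suspect packages so they can be compared against a known-good combination. Note that importlib.metadata requires Python 3.8+; on Python 3.7 the deprecated pkg_resources API provides the same information.

```python
# Dump installed versions of the packages involved, so the
# combination can be compared against a known-good setup.
from importlib import metadata

PACKAGES = ["rpi-deep-pantilt", "tensorflow", "tflite-runtime"]

for name in PACKAGES:
    try:
        print(f"{name}=={metadata.version(name)}")
    except metadata.PackageNotFoundError:
        # Package is missing from this virtualenv entirely.
        print(f"{name}: not installed")
```

Running this inside the virtualenv and pasting the output into the issue makes it easy for others to spot a mismatched pin.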

Would you mind taking a look?

Thank you in advance!

About this issue

  • Original URL
  • State: open
  • Created 4 years ago
  • Comments: 30 (4 by maintainers)

Most upvoted comments

Jorge,

Thank you for your kindness!

I started looking into FOC controllers because of their almost-zero mechanical latency, and to eliminate the meshing-gear noise typically associated with conventional mechanical gear reduction (such as the Pimoroni hat servo motor noise). Here are a few links you may want to explore.

SimpleBGC32 supports PWM, I2C, and serial input; only the extended version supports 2- to 3-axis closed-loop position control together with a magnetic or rotary encoder, and it's not open source. https://www.basecamelectronics.com/simplebgc32bit/ Reliable hardware vendor: https://shop.iflight-rc.com/index.php?route=product/category&path=34_48 This product is rather pricey, but I used it anyway because it is the only one I could use with PWM output and closed-loop position control.

STorM32 supports PWM, I2C, and serial input; only T-STorM32 supports 2- to 3-axis closed-loop position control and servo motor mode together with a magnetic encoder, and it's open source. T-STorM32 is not readily available yet, and I'm hesitant to order through a small external vendor located in Russia, although it is supposed to have servo motor mode.

Good: These two products are IMU-sensor-based FOC controllers and are great for applications that require 3-axis stabilization, as they were originally designed to stabilize video cameras mounted on drones and are currently widely used as stabilizers in cinema and professional videography. They both have PID controllers.

Bad: Since these two are IMU-sensor-based controllers, they develop drift, and an absolute positioning reference is required to counter that drift (sensor offset), which further requires magnetometer or GPS input. Even with that, it won't be a 100% absolute reference point, because a magnetometer is susceptible to magnetic fields and interference from nearby iron sources, and even GPS has an offset when stationary. Their PID controllers act on physical force orientation and cannot compensate for software-side latency such as vision-processing latency.

Better alternatives for absolute closed-loop positioning FOC controllers:

Pablo (https://www.youtube.com/watch?v=MKNkZOja7-s) has a 2-channel control board, and it's open source. Good: small 2-channel board with high-power motor output. Bad: no PWM input example yet (but I'm sure someone good with Arduino could write this code without a problem; I'm just waiting for Pablo to put out a PWM example). David from the SimpleFOC community told me this board and Pablo's code heat the motor up more, but I have not noticed that yet.

Justine (https://youtu.be/OZvjfbpXpro) also has a 2-channel control board, and it's open source. Good: supports PWM, I2C, and ROS, with example code. Bad: pricey, 16-bit, too big.

SimpleFOC (https://www.simplefoc.com) has a 1-channel stackable control board, and it's open source. Good: Arduino library support, good price point. Bad: only 1 channel; the stackable design gets too tall when multiple boards are stacked; no PWM input support yet (no such Arduino library source).

👏 I hear you all. I'm still researching FOC brushless motor control, so hopefully it'll make rpi-deep-pantilt track objects 360 degrees continuously and in silence. I already see the limitation of using PWM (only up to 180 degrees) for this, so I need to study I2C and SPI, which is a lot for this newbie. Ugh…
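The 180-degree PWM limit mentioned above comes from how hobby servos map pulse width to angle. A minimal sketch (generic arithmetic, not tied to any particular servo library; the 1.0–2.0 ms range is the common convention, and some servos differ) of that mapping:

```python
# Map a target angle to a hobby-servo PWM duty cycle.
# Typical servos take 1.0-2.0 ms pulses at 50 Hz, covering roughly
# 0-180 degrees -- hence the 180-degree limit of plain PWM control.
def angle_to_duty_cycle(angle_deg, freq_hz=50, min_ms=1.0, max_ms=2.0):
    """Return the duty cycle (%) for a 0-180 degree servo angle."""
    if not 0 <= angle_deg <= 180:
        raise ValueError("standard PWM servos only cover 0-180 degrees")
    pulse_ms = min_ms + (max_ms - min_ms) * angle_deg / 180.0
    period_ms = 1000.0 / freq_hz  # 20 ms at 50 Hz
    return 100.0 * pulse_ms / period_ms

print(angle_to_duty_cycle(0))    # 5.0
print(angle_to_duty_cycle(90))   # 7.5
print(angle_to_duty_cycle(180))  # 10.0
```

Continuous 360-degree tracking has no place in this mapping, which is why a closed-loop FOC controller driven over I2C or SPI is the more promising route.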

Jorge,

Looking at my working setup: I have 2 Pi 4s running rpi-deep-pantilt, one using the standard Pimoroni servo hat like you see in Leigh's post, and another using a large brushless gimbal (see https://youtu.be/Ce-c9StqzsE).

My working setup (following Leigh's Towards Data Science and GitHub instructions) is as follows:

  • Pi 4, 8-20-2020 OS (most recent), 8 GB (4 GB works too)
  • rpi_deep_pantilt-1.2.1
  • tflite-runtime==2.1.0
  • TensorFlow 2.2.0
  • libedgetpu1-std:armhf 14.1

As you can see, your setup is somehow not the same as what you get by following Leigh's post (maybe something has changed that I don't know about), but I know that combination works very well, except that when the target moves out of the FOV for a few seconds and comes back, it freezes.
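For reference, the working combination above could be written as a requirements-style pin (a hypothetical fragment assembled from the versions listed in this comment, not an official file from the project; libedgetpu1-std is a Debian package installed via apt, not pip):

```
rpi-deep-pantilt==1.2.1
tflite-runtime==2.1.0
tensorflow==2.2.0
```

Comparing the reporter's versions (TensorFlow 2.0.0, rpi-deep-pantilt 1.0.1, edgetpu 2.15.0) against these pins may be a useful first step in narrowing down the incompatibility.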