frigate: ffmpeg quicksync hw accel not working per documentation

Describe the bug When configuring QuickSync per the documentation, the system does not appear to use hardware acceleration.

Version of frigate 0.8.4-5043040

Config file Include your full config file wrapped in triple back ticks.

detectors:
  cpu1:
    type: cpu
mqtt:
  host: 192.168.50.13
  port: 1883
  topic_prefix: frigate
  client_id: frigate
  user: frigate
  password: Password1
  stats_interval: 60
cameras:
  doorbell:
    ffmpeg:
      hwaccel_args:
        - -hwaccel
        - qsv
        - -qsv_device
        - /dev/dri/renderD128
      inputs:
        - path: rtsp://192.168.50.13:8554/front-doorbell
          roles:
            - detect
            - clips
      output_args:
        detect: -vf transpose=1 -f rawvideo -pix_fmt yuv420p
        clips: -vf transpose=1 -f segment -segment_time 10 -segment_format mp4 -reset_timestamps 1 -strftime 1 -c:v libx264 -an
    objects:
      track:
        - person
        - dog
        - cat
    width: 1296
    height: 1728
    fps: 5
    motion:
      mask:
        - 609,982,0,986,0,0,1126,0,1170,1234
    zones:
      zone_0:
        coordinates: 0,1728,1258,1728,1190,1250,614,966,26,1000
    clips:
      enabled: True
      pre_capture: 5
      post_capture: 15
      objects:
        - person
        - dog
        - cat
      retain:
        default: 1
        objects:
          person: 1

Frigate container logs

frigate    |  * Starting nginx nginx
frigate    |    ...done.
frigate    | frigate.app                    WARNING : Camera doorbell has rtmp enabled, but rtmp is not assigned to an input.
frigate    | Starting migrations
frigate    | peewee_migrate                 INFO    : Starting migrations
frigate    | There is nothing to migrate
frigate    | peewee_migrate                 INFO    : There is nothing to migrate
frigate    | frigate.mqtt                   INFO    : MQTT connected
frigate    | detector.cpu1                  INFO    : Starting detection process: 39
frigate    | frigate.app                    INFO    : Camera processor started for doorbell: 42
frigate    | frigate.app                    INFO    : Capture process started for doorbell: 43

Frigate stats

{
  "detection_fps": 0.3, 
  "detectors": {
    "cpu1": {
      "detection_start": 0.0, 
      "inference_speed": 58.21, 
      "pid": 39
    }
  }, 
  "doorbell": {
    "camera_fps": 5.1, 
    "capture_pid": 43, 
    "detection_fps": 0.3, 
    "pid": 42, 
    "process_fps": 5.1, 
    "skipped_fps": 0.0
  }, 
  "service": {
    "storage": {
      "/dev/shm": {
        "free": 63.2, 
        "mount_type": "tmpfs", 
        "total": 67.1, 
        "used": 3.9
      }, 
      "/media/frigate/clips": {
        "free": 84131.7, 
        "mount_type": "ext4", 
        "total": 105152.2, 
        "used": 15635.0
      }, 
      "/media/frigate/recordings": {
        "free": 84131.7, 
        "mount_type": "ext4", 
        "total": 105152.2, 
        "used": 15635.0
      }, 
      "/tmp/cache": {
        "free": 982.3, 
        "mount_type": "tmpfs", 
        "total": 1000.0, 
        "used": 17.7
      }
    }, 
    "uptime": 451, 
    "version": "0.8.4-5043040"
  }
}

FFprobe from your camera

Run the following command and paste output below

ffprobe version 4.3.1 Copyright (c) 2007-2020 the FFmpeg developers
  built with gcc 9 (Ubuntu 9.3.0-17ubuntu1~20.04)
  configuration: --disable-debug --disable-doc --disable-ffplay --enable-shared --enable-avresample --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-gpl --enable-libfreetype --enable-libvidstab --enable-libmfx --enable-libmp3lame --enable-libopus --enable-libtheora --enable-libvorbis --enable-libvpx --enable-libwebp --enable-libxcb --enable-libx265 --enable-libxvid --enable-libx264 --enable-nonfree --enable-openssl --enable-libfdk_aac --enable-postproc --enable-small --enable-version3 --enable-libzmq --extra-libs=-ldl --prefix=/opt/ffmpeg --enable-libopenjpeg --enable-libkvazaar --enable-libaom --extra-libs=-lpthread --enable-vaapi --extra-cflags=-I/opt/ffmpeg/include --extra-ldflags=-L/opt/ffmpeg/lib
  libavutil      56. 51.100 / 56. 51.100
  libavcodec     58. 91.100 / 58. 91.100
  libavformat    58. 45.100 / 58. 45.100
  libavdevice    58. 10.100 / 58. 10.100
  libavfilter     7. 85.100 /  7. 85.100
  libavresample   4.  0.  0 /  4.  0.  0
  libswscale      5.  7.100 /  5.  7.100
  libswresample   3.  7.100 /  3.  7.100
  libpostproc    55.  7.100 / 55.  7.100
[rtsp @ 0x5650b4f80b80] method SETUP failed: 461 Unsupported Transport
Input #0, rtsp, from 'rtsp://192.168.50.13:8554/front-doorbell':
  Metadata:
    title           : Stream
  Duration: N/A, start: -0.211367, bitrate: N/A
    Stream #0:0: Video: h264, yuv420p(tv, unknown/bt709/unknown, progressive), 1728x1296, 20 fps, 20 tbr, 90k tbn, 40 tbc

Screenshots If applicable, add screenshots to help explain your problem.

Computer Hardware

  • OS: Ubuntu 20.04
  • Install method: docker-compose
  • Virtualization: N/A - Physical
  • Coral Version: None
  • Network Setup: Wired

Camera Info:

  • Manufacturer: Wyze
  • Model: Doorbell Cam
  • Resolution: 1728x1296
  • FPS: 20

Additional context

The Wyze Doorbell Cam stream comes in via wyze-bridge and needs to be rotated 90 degrees. I have this configured, but it results in extremely high CPU usage by ffmpeg. I have Intel Gen 9 QuickSync hardware on the machine, passed through to Frigate. I monitor the GPU using intel_gpu_top and see no indication that the ffmpeg processes running inside Frigate are using HW acceleration for decoding or encoding.
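For reference, a minimal way to check this from the host while Frigate is running (the container name frigate is an assumption based on a typical docker-compose setup):

# on the host: the Video and Render/3D engine rows should show activity if decode is offloaded
sudo intel_gpu_top

# confirm the render node is actually visible inside the container
docker exec frigate ls -l /dev/dri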

About this issue

  • State: closed
  • Created 3 years ago
  • Reactions: 4
  • Comments: 29 (1 by maintainers)

Most upvoted comments

After a good bit of noodling around, I managed to get qsv hardware decode working.

  • All of my cameras output h264
  • Intel i7 Rocket Lake


ffmpeg:
  hwaccel_args:
    - -hwaccel
    - qsv
    - -c:v
    - h264_qsv
  output_args:
    detect: -f rawvideo -vf hwdownload,format=nv12 -pix_fmt yuv420p

I hope this helps.
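For anyone wanting to sanity-check that pipeline outside Frigate first, a rough standalone equivalent of the decode + hwdownload path above would be something like this (the stream URL is a placeholder):

ffmpeg -loglevel verbose -hwaccel qsv -c:v h264_qsv -i rtsp://MYSTREAMHERE -vf hwdownload,format=nv12 -f null -

If that shows activity in intel_gpu_top, the same args should behave the same way inside Frigate.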

I thought I’d share that I also got hardware acceleration to work; however, in my case the problem was the host machine, not the config. I run Proxmox on a headless Lenovo M73 Tiny with two VMs: one for Home Assistant and one for Ubuntu with Docker. I stumbled on a post in another forum and found that the GPU does not kick in unless something is plugged into the video out port. Since I don’t have room for a monitor, I ended up buying a dummy DisplayPort plug from Amazon. As soon as I plugged it in, HW acceleration started working. In previous testing I had no errors when HW acceleration was enabled; I would just get no picture. After plugging in the dummy plug, it started to magically work.

I’ve got some more tweaking to do, but am a lot further than I was a week ago!

Hopefully this helps others out.

@blakeblackshear

I was able to get ffmpeg to work with hw_accel and QuickSync.

To do so I did the following:

  1. Built all of the containers except ffmpeg
  2. Removed copying ffmpeg in Dockerfile.base
  3. Added the following to Dockerfile.amd64

I did this with the 0.9 RC2 code.

RUN apt-get -qq update && \
    apt-get -qq install --no-install-recommends -y software-properties-common && \
    add-apt-repository ppa:savoury1/ffmpeg4 -y && \
    add-apt-repository ppa:savoury1/graphics -y && \
    add-apt-repository ppa:savoury1/multimedia -y && \
    apt -qq install -y ffmpeg && \
    rm -rf /var/lib/apt/lists/* && \
    (apt-get autoremove -y; apt-get autoclean -y)
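Once the image is rebuilt, it is worth confirming that the new ffmpeg actually advertises QSV support before touching the Frigate config; something like this should do it (container name frigate assumed):

docker exec frigate ffmpeg -hide_banner -hwaccels
docker exec frigate ffmpeg -hide_banner -encoders | grep qsv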


detectors:
  coral:
    type: edgetpu
    device: usb
mqtt:
  host: 192.168.50.13
  port: 1883
  topic_prefix: frigate
  client_id: frigate 
  stats_interval: 60
cameras:
  doorbell:
    ffmpeg:
      inputs:
        - path: rtsp://192.168.50.13:8554/front-doorbell
          roles:
            - detect
            - record
      output_args:
        detect: -vf transpose=1 -f rawvideo -pix_fmt yuv420p
        record: -vf transpose=1 -f segment -segment_time 10 -segment_format mp4 -reset_timestamps 1 -strftime 1 -c:v h264_qsv -an

    detect:
      width: 1296
      height: 1728
      fps: 5
      enabled: True
      max_disappeared: 25

    motion:
      mask: 0,900,1080,900,1080,1920,0,1920

    best_image_timeout: 60

    record:
      enabled: True
      retain_days: 0
      events:
        enabled: True
        pre_capture: 5
        post_capture: 5
        objects:
          - person
          - cat
          - dog
        required_zones: []
        retain:
          default: 1
          objects:
            person: 7

    live:
      height: 1728
      quality: 8

    snapshots:
      enabled: False
      timestamp: False
      bounding_box: False
      crop: False
      height: 175
      required_zones: []
      retain:
        default: 10
        objects:
          person: 15

    mqtt:
      enabled: True
      timestamp: True
      bounding_box: True
      crop: True
      height: 270
      quality: 70
      required_zones: []

    objects:
      track:
        - person
        - car
        - dog
        - cat
      filters:
        person:
          min_area: 5000
          max_area: 100000
          min_score: 0.5
          threshold: 0.7
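One note on the config above: it hardware-encodes the record segments with h264_qsv, but no hwaccel_args for decode are shown. If decode acceleration is wanted as well, the args from the earlier working example would presumably go under the same ffmpeg: block, e.g.:

    ffmpeg:
      hwaccel_args:
        # decode acceleration, as in the earlier comment
        - -hwaccel
        - qsv
        - -c:v
        - h264_qsv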

Great to see you did it! 👍

note: the 5 FPS is meant for originally grabbing the stream; 5 FPS should be enough to feed into Frigate. (What CPU are you on for QSV? A more modern one?)

edit (to test the new mermaid markup):

  graph TD;
      TRY-->ERROR;
      REPEAT-->TRY;
      ERROR-->REPEAT;

Alright, sorry for the spam. Hopefully others can do the same and figure it out just like I did, with trial and error. Big shout-out to @ozett and his original full verbose command line to try running inside the container: ffmpeg -loglevel verbose -rtsp_transport tcp -avoid_negative_ts make_zero -fflags nobuffer -flags low_delay -strict experimental -fflags +genpts+discardcorrupt -stimeout 5000000 -use_wallclock_as_timestamps 1 -hwaccel vaapi -hwaccel_device /dev/dri/renderD128 -hwaccel_output_format yuv420p -i rtsp://admin:pass@192.168.20.130/Streaming/channels/101 -f segment -segment_time 10 -segment_format mp4 -reset_timestamps 1 -strftime 1 -c h264_vaapi -an -f null - (from this comment: https://github.com/blakeblackshear/frigate/issues/1607#issuecomment-907804483)

Cutting it down to bare necessities left me with: ffmpeg -loglevel verbose -hwaccel qsv -hwaccel_device /dev/dri/renderD128 -hwaccel_output_format yuv420p -i rtsp://MYSTREAMHERE -c h264_qsv -f null -

Gave error of:

Invalid encoder type 'h264_qsv'

Very misleading. The -c h264_qsv (with no stream specifier) was being applied to the audio stream as well; adding the -an switch to strip audio from the stream fixes this:

ffmpeg -loglevel verbose -hwaccel qsv -hwaccel_device /dev/dri/renderD128 -hwaccel_output_format yuv420p -i rtsp://MYSTREAMHERE -c h264_qsv -an -f null -

I’m now given this error:

[h264_qsv @ 0x5556b2859580] Current frame rate is unsupported
[h264_qsv @ 0x5556b2859580] some encoding parameters are not supported by the QSV runtime. Please double check the input parameters.
Error initializing output stream 0:0 -- Error while opening encoder for output stream #0:0 - maybe incorrect parameters such as bit_rate, rate, width or height

Forcing a supported frame rate with an fps filter gets past that:

ffmpeg -loglevel verbose -hwaccel qsv -hwaccel_device /dev/dri/renderD128 -hwaccel_output_format yuv420p -i rtsp://MYSTREAMHERE -filter:v fps=30 -c h264_qsv -an -f null -

Note that if we switch -c h264_qsv to -c:v h264_qsv, we can drop the -an.
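Putting those two tweaks together, the full test command ends up as (stream URL still a placeholder):

ffmpeg -loglevel verbose -hwaccel qsv -hwaccel_device /dev/dri/renderD128 -hwaccel_output_format yuv420p -i rtsp://MYSTREAMHERE -filter:v fps=30 -c:v h264_qsv -f null -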

And with this, it doesn’t error out and shows GPU utilization!

working

The only remaining question I have is… do I set the FPS for this encode to 30, or to 5, which is the default for Frigate? I’m still a little confused about which value to use.

Unrecognized hwaccel: h264_qsv. Supported hwaccels: vaapi qsv

The problem and solution are right in the error: the -hwaccel arg should just be “qsv” or “vaapi”.
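In other words, the hwaccel method and the decoder name go in different arguments; a decode-only smoke test (stream URL is a placeholder) would look something like:

ffmpeg -hwaccel qsv -c:v h264_qsv -i rtsp://MYSTREAMHERE -f null -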