dlstreamer: OpenVINO not getting installed via Dockerfile.

It seems that OpenVINO is not getting installed in the container via either the provided binaries.Dockerfile or the provided Dockerfile.

For binaries.Dockerfile specifically, I replaced the following line:

https://github.com/opencv/gst-video-analytics/blob/a1f5efb660d3916d63f2a26d23e246f8d8651634/docker/binaries.Dockerfile#L557

with:

RUN wget http://registrationcenter-download.intel.com/akdlm/irc_nas/16612/l_openvino_toolkit_p_${OpenVINO_VERSION}.tgz
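
For context, here is a sketch of how this line might sit alongside the surrounding extract/install steps; the steps below are my reconstruction of a typical silent OpenVINO installer run, not the exact contents of binaries.Dockerfile:

ARG OpenVINO_VERSION=2020.2.120
# Fetch the offline installer from the updated download URL, accept the EULA
# in the silent config, and run the silent install
RUN wget http://registrationcenter-download.intel.com/akdlm/irc_nas/16612/l_openvino_toolkit_p_${OpenVINO_VERSION}.tgz \
    && tar -xzf l_openvino_toolkit_p_${OpenVINO_VERSION}.tgz \
    && cd l_openvino_toolkit_p_${OpenVINO_VERSION} \
    && sed -i 's/decline/accept/g' silent.cfg \
    && ./install.sh -s silent.cfg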

This replacement gave me a successful build, and thereafter a successful container run (including clear output from l_openvino_toolkit_p_2020.2.120/install.sh being run). However, when exploring the file system and following the “Acquire models and videos” tutorial, the following instructions:

source /opt/intel/openvino/bin/setupvars.sh
source ~/gva/gst-video-analytics/scripts/setup_env.sh

# Download models listed in models.lst into ~/intel/dl_streamer/models folder
cd ~/gva/gst-video-analytics/samples
./download_models.sh

do not work, because /opt/intel/openvino does not exist in the container:

root@4aac8f4d2944:~# ls /opt/intel/
dldt  mediasdk

The same happens when building from source via the provided Dockerfile:

root@bf6b3402dee1:~# ls /opt/intel/
dldt  mediasdk
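
For anyone wanting to check where the toolkit actually landed inside the container, a quick probe like this helps (the dldt layout is just what I would expect, not verified):

# Look for an environment script anywhere under /opt/intel
find /opt/intel -maxdepth 4 -name "setupvars.sh" 2>/dev/null
# Inspect what the dldt install actually contains
ls /opt/intel/dldt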

Any ideas what this could be? I’ve been able to create a custom Dockerfile that installs all the required dependencies (GStreamer, DL Streamer, and OpenVINO), but when running GStreamer with the DL Streamer plugins and OpenVINO models I receive multiple warnings, and the output video is identical to the input. My current assumption is that DL Streamer does not support the latest versions of GStreamer and its plugins… but I digress. I’m currently exploring the above route again (using the Dockerfiles provided in the repo); if nothing else, I’ll open a new issue discussing revamping the Dockerfile in this repo.

About this issue

  • State: closed
  • Created 4 years ago
  • Comments: 20 (3 by maintainers)

Most upvoted comments

I have installed OpenVINO on my HOST machine (i.e. outside of the Docker container). Calling “source /opt/intel/openvino/bin/setupvars.sh” exposes the OpenVINO environment variables and paths, e.g. INTEL_OPENVINO_DIR. When downloading the models on the HOST (i.e. outside of the Docker container), set the environment variable, e.g. “export MODEL_PATH=/My/Path/gva/data/models/” (and “mkdir -p /My/Path/gva/data/models/”); the downloaded models will then end up in the subfolders “intel” and “common”.
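
Putting that together, the host-side flow looks roughly like this (the model directory is just an example path, and I am assuming the sample download script honors MODEL_PATH as described above):

# Expose INTEL_OPENVINO_DIR and related variables
source /opt/intel/openvino/bin/setupvars.sh
# Choose a model directory and point MODEL_PATH at it
export MODEL_PATH=/My/Path/gva/data/models/
mkdir -p "$MODEL_PATH"
# Download the models; they should end up under "intel" and "common" subfolders
cd ~/gva/gst-video-analytics/samples
./download_models.sh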

@brmarkus: “I have installed OpenVINO on my HOST machine (i.e. outside of the Docker container).”

Would you know if this is required for it to work? My goal with the Docker container is to eventually deploy it as an API pipeline, so having OpenVINO installed on the host machine isn’t really an option.

Thank you for taking the time to chime in with what worked for you. Honestly appreciated.

@umed the short answer is that our inference elements do not draw boxes; they attach metadata. However, we have gvawatermark (it should be placed after the inference elements), which draws the metadata onto the video frames. If that does not solve your problem, please create a separate issue and we can continue the discussion there.
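
For illustration, a pipeline with that shape; the input file, model path, and sink are placeholders, only the element order (inference first, gvawatermark after) is the point:

gst-launch-1.0 filesrc location=input.mp4 ! decodebin ! \
  gvadetect model=/path/to/detection-model.xml device=CPU ! \
  gvawatermark ! videoconvert ! autovideosink sync=false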

I’ll open a new issue from here then. I’m indeed using gvawatermark (ref), but for whatever reason I’m still getting runtime errors when trying to pipe videos through it, with an empty video output. With @brmarkus’s insights, I think I know why now. Thank you for taking the time to answer, and my apologies for the thread growing so large.

In short, the key things I believe I’ve gathered from the discussion above and other open issues are the following:

  • OpenVINO must be installed on the host machine, while DL Streamer may be installed via Docker.
  • DL Streamer has specific GStreamer dependencies (#56 (comment)); a quick check is shown below.
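
As a quick check of the second point, the following confirms which GStreamer is installed and whether the DL Streamer elements are actually loadable (the exact GStreamer version DL Streamer expects is whatever #56 pins; these are just the probe commands):

gst-inspect-1.0 --version     # installed GStreamer version
gst-inspect-1.0 gvadetect     # fails if the DL Streamer plugins cannot be found or loaded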

I’ll be trying later today and tomorrow to set up a new Dockerfile that installs all dependencies in a single container. I’ll leave the thread open for a day (if that sits well with the maintainers here) until I have a single Dockerfile working with OpenVINO + GStreamer + DL Streamer.

@synchronizing Another project that might be interesting/useful is our VA Serving, which allows you to easily create a pipeline service using DL Streamer: https://github.com/intel/video-analytics-serving. We would be interested in any feedback you have on that as well.

Not able to complete the “Acquire models and videos” step after installing via the Docker method. Can someone guide me?