openvino: [Bug] Cannot run inference on OpenVino CPU after update to OpenVino 2022.2

System information (version)
  • OpenVINO => 2022.2
  • Operating System / Platform => Windows 64 Bit
  • Compiler => Visual Studio 2022
  • Problem classification: Model Inference
  • Framework: ONNX, inferred through OpenCV
  • Model name: Unet/ResNet
Detailed description

Copy of https://github.com/opencv/opencv/issues/22640

We’ve been inferring a ResNet-based DNN for years using OpenCV, with support for different backends. Recently, after upgrading to OpenVino 2022.2 and recompiling OpenCV 4.6, we get an exception when inferring on several Intel CPUs (one i9-gen10, one i5-gen8), while it still works on Intel GPUs: Exception: OpenCV(4.6.0-dev) D:\Dev\opencv\modules\dnn\src\ie_ngraph.cpp:747: error: (-2:Unspecified error) Failed to initialize Inference Engine backend (device = CPU): Cannot get memory! in function 'cv::dnn::InfEngineNgraphNet::initPlugin'

This might be a bug for https://github.com/openvinotoolkit/openvino/ as the exception is triggered here: https://github.com/opencv/opencv/blob/347246901eccabe503985a64f16813ca859af25a/modules/dnn/src/ie_ngraph.cpp#L1070 but with a message from https://github.com/openvinotoolkit/openvino/blob/042bd7274ac36715e16386be5c1f17924d53dede/src/plugins/intel_cpu/src/cpu_memory.h#L206

Issue submission checklist
  • I report the issue, it’s not a question
  • I checked the problem with documentation, FAQ, open issues, Stack Overflow, etc and have not found solution
  • There is reproducer code and related data files: images, videos, models, etc.

About this issue

  • State: closed
  • Created 2 years ago
  • Comments: 26 (14 by maintainers)

Most upvoted comments

@JulienMaille if you have no further questions about OpenVINO CPU or GPU, can I close the ticket?

@liubo-intel I have been bisecting commits to find when it stops working and localized it to the changes made to modules/dnn/src/onnx/onnx_importer.cpp in this commit: https://github.com/opencv/opencv/commit/ed69bcae2d171d9426cd3688a8b0ee14b8a140cd

More specifically, the problem goes away if I revert the replacement made at https://github.com/opencv/opencv/commit/ed69bcae2d171d9426cd3688a8b0ee14b8a140cd#diff-f9fc55b2657f441025bc0adfeb683491b1e3d8275702e159efcb368a1b2c0dd5L3797, i.e. restore dispatch["Add"] = &ONNXImporter::parseBias; instead of &ONNXImporter::parseElementWise;

I’ve also tried these two versions on my side, but hit some ONNX import issues. I suspect it’s an environment setup problem on my end (trying to fix it), so could you please also give it a try on your side?

@liubo-intel were you able to use my reproducer?

@alvoron I already recompiled OpenCV from source against OpenVino (and I’ve done this for months; it worked well until recently). A ticket was opened at opencv/opencv#22640 first and they asked me to move it here.

@JulienMaille : I have also tried your test model “models/test1.onnx” with the OpenVINO internal benchmark_app tool on “OpenVino 2022.2”:
./benchmark_app -m ./models/test1.onnx -d CPU -hint tput -niter 10
It works fine without any such memory issue. So it looks like the OpenVINO 2022.2 CPU plugin does support this model, but something goes wrong when OpenCV integrates “OpenVino 2022.2” as its backend. If we are to look into this case from the OpenVINO side, I think we need the OpenCV team to help reproduce it with OpenVINO components alone (e.g. the openvino benchmark_app tool), i.e. without a dependency on other (OpenCV) components.