openvino: ReadNetwork API is failing while loading models from memory.

System information

  • OpenVINO => 2021.4.752
  • Operating System / Platform => Ubuntu 20.04 LTS

Detailed Description

I am using OpenVINO 2021.4 in my project. I want to use ReadNetwork() to load models from memory and get a CNNNetwork in return. I am able to create the TensorDesc and Blob, but ReadNetwork() throws an error:

unknown file: Failure C++ exception with description "base_inference plugin intitialization failed" thrown in the test body.

I am unsure whether the blob and tensor I am creating are correct, and I suspect that is why this error is thrown. One more thing: when I check the size of the tensor and blob I created (using sizeof()), it does not match the size of the bin vector.

I have tried calling ReadNetwork() in several ways; the code snippets are below. I followed the https://docs.openvino.ai/2021.4/openvino_docs_IE_DG_protecting_model_guide.html guide for my use case.

Can you please help me resolve this?

// These vectors are obtained after decrypting the encrypted model files.
std::vector<uint8_t> weights;
std::vector<uint8_t> model;
// weights and model are not empty here.

std::string strModel(model.begin(), model.end());

InferenceEngine::Core ie;

InferenceEngine::CNNNetwork network = ie.ReadNetwork(
    strModel,
    InferenceEngine::make_shared_blob<uint8_t>(
        {InferenceEngine::Precision::U8, {weights.size()}, InferenceEngine::Layout::C},
        weights.data()));

AND

InferenceEngine::TensorDesc O_tensor(InferenceEngine::Precision::U8, {weights.size()}, InferenceEngine::Layout::ANY);
// Note: sizeof(O_tensor) reports the size of the TensorDesc object itself,
// not the number of weight bytes it describes.
std::cout << "size of tensor: " << sizeof(O_tensor) << std::endl;
InferenceEngine::TBlob<uint8_t>::Ptr wei_blob = InferenceEngine::make_shared_blob<uint8_t>(O_tensor, weights.data());
InferenceEngine::CNNNetwork network = ie.ReadNetwork(strModel, wei_blob);

AND

InferenceEngine::Blob::Ptr blobWts = InferenceEngine::make_shared_blob<uint8_t>(
    {InferenceEngine::Precision::U8, {weights.size()}, InferenceEngine::Layout::C});
blobWts->allocate();
std::memcpy(blobWts->buffer(), weights.data(), weights.size());
InferenceEngine::CNNNetwork network = ie.ReadNetwork(strModel, blobWts);

AND

InferenceEngine::TensorDesc bindesc(InferenceEngine::Precision::U8, {weights.size()}, InferenceEngine::Layout::C);

InferenceEngine::Blob::Ptr blobWts = InferenceEngine::make_shared_blob<uint8_t>(bindesc, weights.data());
InferenceEngine::CNNNetwork network = ie.ReadNetwork(strModel, blobWts);

AND

InferenceEngine::CNNNetwork network = ie.ReadNetwork(
    strModel,
    InferenceEngine::make_shared_blob<uint8_t>(
        {InferenceEngine::Precision::U8, {weights.size()}, InferenceEngine::C},
        weights.data()));

About this issue

  • Original URL
  • State: closed
  • Created 2 years ago
  • Comments: 21 (12 by maintainers)

Most upvoted comments

@ilyachur Yeah, I want to rebuild OpenVINO 2021.4 with a custom protobuf version, and I need help with this. I have installed OpenVINO on my machine from the .tgz file by following the steps on the official webpage.

TF is needed for my application, so without it I won't be able to proceed further.