onnxruntime: Problem with loading converted onnx model

Describe the bug
I have converted the ssdlite_mobilenet_v2_coco model from the TensorFlow detection model zoo (could be found here) to ONNX. Now I'm trying to load the model using ML.NET and get an error.

System information

  • OS Platform and Distribution (e.g., Linux Ubuntu 16.04): macOS 10.13.6
  • ONNX Runtime installed from (source or binary): NuGet
  • ONNX Runtime version: 0.4.0
  • Python version: 3.6.7
  • Visual Studio version (if applicable): 8.0.5
  • GCC/Compiler version (if compiling from source): none
  • CUDA/cuDNN version: none
  • GPU model and memory: none

To Reproduce

```csharp
public struct ImageSettings
{
    public const int ImageWidth = 300;
    public const int ImageHeight = 300;
    public const bool ChannelLast = true;
}

public struct SSDSettings
{
    public const string SSDInput = "image_tensor:0";
    public const string SSDDetectionsOutput = "num_detections:0";
    public const string SSDClassesOutput = "detection_classes:0";
    public const string SSDBoxesOutput = "detection_boxes:0";
    public const string SSDScoresOutput = "detection_scores:0";
}

public PredictionEngine<ImageData, ImagePredictions> LoadModel(string modelLocation,
                                                               string imagesLocation,
                                                               string tagsLocation)
{
    IDataView data = mLContext.Data.LoadFromTextFile<ImageData>(path: tagsLocation,
                                                                hasHeader: false);
    var pipeline = mLContext.Transforms.LoadImages(outputColumnName: "image_tensor:0",
                                                   imageFolder: imagesLocation,
                                                   inputColumnName: nameof(ImageData.ImagePath))
           .Append(mLContext.Transforms.ResizeImages(outputColumnName: "image_tensor:0",
                                                     imageWidth: ImageSettings.ImageWidth,
                                                     imageHeight: ImageSettings.ImageHeight,
                                                     inputColumnName: "image_tensor:0"))
           .Append(mLContext.Transforms.ExtractPixels(outputColumnName: "image_tensor:0",
                                                      interleavePixelColors: ImageSettings.ChannelLast))
           .Append(mLContext.Transforms.ApplyOnnxModel(modelFile: modelLocation,
                                                       outputColumnNames: new[] { SSDSettings.SSDDetectionsOutput,
                                                                                  SSDSettings.SSDClassesOutput,
                                                                                  SSDSettings.SSDBoxesOutput,
                                                                                  SSDSettings.SSDScoresOutput },
                                                       inputColumnNames: new[] { SSDSettings.SSDInput }));

    var model = pipeline.Fit(data);

    var predictionEngine = mLContext.Model.CreatePredictionEngine<ImageData, ImagePredictions>(model);
    return predictionEngine;
}
```
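To isolate whether the failure comes from the ML.NET pipeline or from ONNX Runtime itself, one could try loading the model directly with the `Microsoft.ML.OnnxRuntime` package. This is a minimal sketch, not a definitive repro: the model path is a placeholder, and it assumes the Microsoft.ML.OnnxRuntime NuGet package is referenced.

```csharp
using System;
using Microsoft.ML.OnnxRuntime;

class ModelLoadCheck
{
    static void Main()
    {
        // Placeholder path -- substitute the actual location of the converted model.
        const string modelPath = "/my/path/to/file/ssd_mobilenet.onnx";

        // If this constructor throws, the problem is in ONNX Runtime (or the
        // exported graph itself), not in the ML.NET transform wrapping it.
        using (var session = new InferenceSession(modelPath))
        {
            // List the graph's declared inputs and outputs to double-check the
            // tensor names used in the pipeline ("image_tensor:0", etc.).
            foreach (var input in session.InputMetadata)
                Console.WriteLine($"input:  {input.Key}");
            foreach (var output in session.OutputMetadata)
                Console.WriteLine($"output: {output.Key}");
        }
    }
}
```

If this standalone load fails with the same error, the issue can be reproduced without ML.NET at all, which narrows the report to ONNX Runtime and the converter that produced the model.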

Expected behavior
The model should load correctly.


Additional context
With OnnxRuntime 0.4.0 I got:

```
2019-06-12 14:32:53.802528 [W:onnxruntime:InferenceSession, session_state_initializer.cc:502 SaveInputOutputNamesToNodeMapping] Graph input with name i__19 is not associated with a node.
2019-06-12 14:32:53.802581 [W:onnxruntime:InferenceSession, session_state_initializer.cc:502 SaveInputOutputNamesToNodeMapping] Graph input with name cond__21 is not associated with a node.
2019-06-12 14:32:54.004760 [W:onnxruntime:InferenceSession, session_state_initializer.cc:502 SaveInputOutputNamesToNodeMapping] Graph input with name i__42 is not associated with a node.
2019-06-12 14:32:54.004790 [W:onnxruntime:InferenceSession, session_state_initializer.cc:502 SaveInputOutputNamesToNodeMapping] Graph input with name cond__44 is not associated with a node.
2019-06-12 14:32:54.005072 [W:onnxruntime:InferenceSession, session_state_initializer.cc:502 SaveInputOutputNamesToNodeMapping] Graph input with name i is not associated with a node.
Onnx type not supported
```

With earlier versions I got:

```
Error initializing model :Microsoft.ML.OnnxRuntime.OnnxRuntimeException: [ErrorCode:InvalidGraph] Load model from /my/path/to/file/ssd_mobilenet.onnx failed:Node:Preprocessor/map/strided_slice Node (Preprocessor/map/strided_slice) has input size 4 not in range [min=1, max=1].
   at Microsoft.ML.OnnxRuntime.InferenceSession..ctor(String modelPath, SessionOptions options) in C:\agent\_work\6\s\csharp\src\Microsoft.ML.OnnxRuntime\InferenceSession.cs:line 83
   at Microsoft.ML.OnnxRuntime.InferenceSession..ctor(String modelPath) in C:\agent\_work\6\s\csharp\src\Microsoft.ML.OnnxRuntime\InferenceSession.cs:line 31
   at Microsoft.ML.Transforms.Onnx.OnnxModel..ctor(String modelFile, Nullable`1 gpuDeviceId, Boolean fallbackToCpu)
   at Microsoft.ML.Transforms.Onnx.OnnxTransformer..ctor(IHostEnvironment env, Options options, Byte[] modelBytes)
```

About this issue

  • Original URL
  • State: closed
  • Created 5 years ago
  • Comments: 31 (20 by maintainers)

Most upvoted comments

@antonfr – are you able to build from source and load the model successfully now, as @hariharans29 was able to?