coremltools: Unable to load CoreML.framework. Cannot make predictions.

I’ve got the latest coremltools installed from source.

I can’t run model.predict because it raises an exception:

Exception                                 Traceback (most recent call last)
<ipython-input-32-c70d9e92a144> in <module>
     54 
     55 
---> 56 coreml_model.predict({"input": np.zeros((1, 10, 4, 1, 1))})
     57 

~/.local/share/virtualenvs/FaceBoxes.PyTorch-kg-NzRFz/src/coremltools/coremltools/models/model.py in predict(self, data, useCPUOnly, **kwargs)
    339 
    340             if not _MLModelProxy:
--> 341                 raise Exception('Unable to load CoreML.framework. Cannot make predictions.')
    342             elif _MLModelProxy.maximum_supported_specification_version() < self._spec.specificationVersion:
    343                 engineVersion = _MLModelProxy.maximum_supported_specification_version()

Exception: Unable to load CoreML.framework. Cannot make predictions.

Here is my snippet:

import onnx
import torch
import numpy as np

from torch import nn
from onnx import onnx_pb
from onnx_coreml import convert


class Dummy(nn.Module):
    def __init__(self):
        super().__init__()
        self.priors = torch.randn(10, 4)
        self.variances = [1, 1]
    
    def forward(self, loc):
        boxes = torch.cat([
            self.priors[:, :2] + loc[:, :2] * self.variances[0] * self.priors[:, 2:],
            self.priors[:, 2:] * torch.exp(loc[:, 2:] * self.variances[1])], 1)
        boxes = torch.cat([boxes[:, :2] - boxes[:, 2:] / 2, boxes[:, 2:]], 1)
        boxes = torch.cat([boxes[:, :2], boxes[:, 2:] + boxes[:, :2]], 1)
        return boxes

model = Dummy()

dummy_input = torch.randn(10, 4)

model_name = 'dummy'
input_names = ['input']
output_names = ['out']
torch.onnx.export(model, dummy_input, f"{model_name}.onnx", verbose=False,
                  input_names=input_names, output_names=output_names)

model_in = f"{model_name}.onnx"
model_out = f"{model_name}.mlmodel"

with open(model_in, 'rb') as model_file:
    model_proto = onnx_pb.ModelProto()
    model_proto.ParseFromString(model_file.read())

coreml_model = convert(model_proto)
coreml_model.save(model_out)

# this line gives Exception
coreml_model.predict({"input": np.zeros((1, 10, 4, 1, 1))})
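As an aside, the box-decoding arithmetic in Dummy.forward can be sanity-checked without CoreML at all. Here is a NumPy mirror of the same math (my own sketch, not part of the original repro):

```python
import numpy as np

def decode(loc, priors, variances=(1, 1)):
    # Mirrors Dummy.forward: center-size decoding, then the two
    # corner-conversion concatenations.
    boxes = np.concatenate([
        priors[:, :2] + loc[:, :2] * variances[0] * priors[:, 2:],
        priors[:, 2:] * np.exp(loc[:, 2:] * variances[1])], axis=1)
    boxes = np.concatenate([boxes[:, :2] - boxes[:, 2:] / 2, boxes[:, 2:]], axis=1)
    boxes = np.concatenate([boxes[:, :2], boxes[:, 2:] + boxes[:, :2]], axis=1)
    return boxes

# With zero offsets and unit priors, every row decodes to [0.5, 0.5, 1.5, 1.5].
print(decode(np.zeros((10, 4)), np.ones((10, 4)))[0])
```

If the NumPy output matches what the exported ONNX graph produces, the conversion itself is fine and the failure is purely in loading the CoreML runtime.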

About this issue

  • State: closed
  • Created 5 years ago
  • Reactions: 5
  • Comments: 38 (6 by maintainers)

Most upvoted comments

I’m having the same issue, with coremltools==4.0b2 (also tried down to 3.3).

The library file seems to be in the right place: /lib/python3.7/site-packages/coremltools/libcoremlpython.so

with python=3.7.7, torchvision=0.6.1, and torch=1.5.1

I’m getting this error message:

exception loading model proxy: dlopen(/Users/[…]/miniconda3/envs/[…]/lib/python3.7/site-packages/coremltools/libcoremlpython.so, 2): Symbol not found: _OBJC_CLASS_$_MLModelConfiguration
  Referenced from: /Users/[…]/miniconda3/envs/[…]/lib/python3.7/site-packages/coremltools/libcoremlpython.so (which was built for Mac OS X 10.16)
  Expected in: /System/Library/Frameworks/CoreML.framework/Versions/A/CoreML
 in /Users/[…]/miniconda3/envs/[…]/lib/python3.7/site-packages/coremltools/libcoremlpython.so

CoreML export failure: Unable to load CoreML.framework. Cannot make predictions.

macOS version: 10.13.6
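For anyone comparing versions: the mismatch between the host OS and the OS the wheel was built for can be read straight from Python. A quick check (my own sketch, not from the thread):

```python
import platform

# The dlopen error above says libcoremlpython.so "was built for Mac OS X 10.16";
# compare that against the macOS version this interpreter is actually running on.
host = platform.mac_ver()[0]
print(host if host else "not running on macOS")
```

If the printed version is older than the one the wheel was built for (here, 10.13.6 vs. 10.16), the missing-symbol failure is expected.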

I can’t even get it to work on the example from the coremltools documentation:

import coremltools as ct
import torch
import torchvision

model = torchvision.models.mobilenet_v2()
model.eval()
example_input = torch.rand(1, 3, 256, 256)
traced_model = torch.jit.trace(model, example_input)

input = ct.TensorType(name='input_name', shape=(1, 3, 256, 256))
mlmodel = ct.convert(traced_model, inputs=[input], minimum_deployment_target=ct.target.macOS10_15)
results = mlmodel.predict({"input_name": example_input.numpy()})
print(results['1651']) # 1651 is the node name given by PyTorch's JIT

@motasay - I just tried with Python 3.6 and 3.0b4. You’re right it’s not working. I’m reopening this issue.

Looks like we’ve had a regression. This issue is fixed in 3.0b3.

I have 3.0b4 and am facing the same issue when trying to predict with the model. Python 3.6.9 and macOS 10.14.6.

import coremltools

feature_names = list(map(lambda x: str(x), X_train.columns.values))
output_names = "card_index"
coreml_model = coremltools.converters.sklearn.convert(clf, input_features=feature_names,
                                                      output_feature_names=output_names)

dummy_inputs = {x: 0 for x in feature_names}
coreml_model.predict(dummy_inputs)

exception loading model proxy: dlopen(/Users/motasim/.pyenv/versions/3.6.9/envs/BalootAI-py369/lib/python3.6/site-packages/coremltools/libcoremlpython.so, 2): Symbol not found: _objc_opt_class
  Referenced from: /Users/xxxx/.pyenv/versions/3.6.9/envs/AI-py369/lib/python3.6/site-packages/coremltools/libcoremlpython.so (which was built for Mac OS X 10.15)
  Expected in: /usr/lib/libobjc.A.dylib
 in /Users/xxxx/.pyenv/versions/3.6.9/envs/AI-py369/lib/python3.6/site-packages/coremltools/libcoremlpython.so

---------------------------------------------------------------------------
Exception                                 Traceback (most recent call last)
<ipython-input-20-b61708773172> in <module>
      5 
      6 dummy_inputs = {x:0 for x in feature_names}
----> 7 model.predict(dummy_inputs)

~/.pyenv/versions/3.6.9/envs/AI-py369/lib/python3.6/site-packages/coremltools/models/model.py in predict(self, data, useCPUOnly, **kwargs)
    343 
    344             if not _MLModelProxy:
--> 345                 raise Exception('Unable to load CoreML.framework. Cannot make predictions.')
    346             elif _MLModelProxy.maximum_supported_specification_version() < self._spec.specificationVersion:
    347                 engineVersion = _MLModelProxy.maximum_supported_specification_version()

Exception: Unable to load CoreML.framework. Cannot make predictions.

@pokidyshev - This issue should be fixed now. It looks like you’re using Python 3.9. Coremltools doesn’t support Python 3.9 yet.

I think this is an installation issue: you must have installed coremltools from an egg rather than a wheel, and building libcoremlpython must have failed during installation. Did you see an error message during installation?
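Whether libcoremlpython actually got built and installed can be checked directly. The guarded import below mirrors the check coremltools/models/model.py performs before predict() (a diagnostic sketch, not an official API):

```python
# If this import fails, model.predict() will raise
# "Unable to load CoreML.framework. Cannot make predictions."
try:
    from coremltools.libcoremlpython import _MLModelProxy
    status = "libcoremlpython loaded; predict() should work"
except ImportError as exc:
    status = "libcoremlpython missing or broken: %s" % exc
print(status)
```

Running this in the affected environment shows immediately whether the problem is the compiled extension itself (missing file, wrong-OS build) rather than anything in the converted model.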

I suggest just using Python 3.8 for now.

This appears to be an issue with the latest coremltools under Miniconda on M1 Macs. I am using the following method to install coremltools 4.0 on macOS 11 on an M1 Mac mini:

https://github.com/apple/coremltools/issues/1011#issuecomment-753009335

It still reproduces with 4.0b3

Yes, this is broken again, at least in Python 3.8. It was working properly in 4.0b2. Reopening.