scikit-learn-intelex: Intel oneDAL FATAL ERROR on Windows10

Describe the bug The Windows wheel is unable to load the DLLs needed to run the examples. I tried one of the examples and got the following output when calling a library function:

Intel oneDAL FATAL ERROR: onedal_thread.1.dll. Error code is 0x80096005.
Intel oneDAL FATAL ERROR: onedal_sequential.1.dll. Error code is 0x80096005.
Intel oneDAL FATAL ERROR: Cannot load neither onedal_thread.1.dll nor onedal_sequential.1.dll.
Intel oneDAL FATAL ERROR: onedal_thread.1.dll. Error code is 0x80096005.
Intel oneDAL FATAL ERROR: onedal_sequential.1.dll. Error code is 0x80096005.
Intel oneDAL FATAL ERROR: Cannot load neither onedal_thread.1.dll nor onedal_sequential.1.dll.

To Reproduce Steps to reproduce the behavior:

  1. pip install daal4py pandas lightgbm on Windows 10
  2. Alter the __init__.py to set path_to_libs = os.path.join(os.path.dirname(sys.executable), "..\\Library\\bin") (note the leading ..)
  3. Run the following script
import lightgbm as lgb
import numpy as np
import daal4py as d4p


def get_data(n, m):
    x_train = np.random.randn(n, m).astype(np.float32)
    A = np.random.randint(-5, 5, size=(m, 1))
    y_train = (x_train @ A).astype(np.float32)

    return x_train, y_train

n = 1000
m = 25
x_train, y_train = get_data(n, m)

params = {
    'task': 'train',
    'boosting_type': 'gbdt',
    'objective': 'regression',
    'metric': ['rmse'],
    'device': 'cpu',
    'num_leaves': 31,
    'bagging_fraction': 0.5,
    'feature_fraction': 0.5,
    'learning_rate': 0.001,
    'verbose': 2,
    'max_bin': 255,
}
ds_train = lgb.Dataset(x_train, y_train.ravel())#, free_raw_data=False)
gbm = lgb.train(
    params,
    ds_train,
    num_boost_round=10,
    # keep_training_booster=args.keep_training_booster,
)

print("Converting...", flush=True)
daal_model = d4p.get_gbt_model_from_lightgbm(gbm)
print("Converted...", flush=True)
daal_prediction = d4p.gbt_regression_prediction().compute(x_train, daal_model).prediction
  4. See output
[LightGBM] [Debug] Dataset::GetMultiBinFromAllFeatures: sparse rate 0.000000
[LightGBM] [Debug] init for col-wise cost 0.000021 seconds, init for row-wise cost 0.000844 seconds
[LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.001624 seconds.
You can set `force_col_wise=true` to remove the overhead.
[LightGBM] [Info] Total Bins 6375
[LightGBM] [Info] Number of data points in the train set: 1000, number of used features: 25
[LightGBM] [Info] Start training from score 0.386410
[LightGBM] [Debug] Trained a tree with leaves = 31 and max_depth = 8
[LightGBM] [Debug] Trained a tree with leaves = 31 and max_depth = 9
[LightGBM] [Debug] Trained a tree with leaves = 31 and max_depth = 7
[LightGBM] [Debug] Trained a tree with leaves = 31 and max_depth = 7
[LightGBM] [Debug] Trained a tree with leaves = 31 and max_depth = 9
[LightGBM] [Debug] Trained a tree with leaves = 31 and max_depth = 9
[LightGBM] [Debug] Trained a tree with leaves = 31 and max_depth = 9
[LightGBM] [Debug] Trained a tree with leaves = 31 and max_depth = 6
[LightGBM] [Debug] Trained a tree with leaves = 31 and max_depth = 7
[LightGBM] [Debug] Trained a tree with leaves = 31 and max_depth = 8
Converting...
Converted...
Intel oneDAL FATAL ERROR: onedal_thread.1.dll. Error code is 0x80096005.
Intel oneDAL FATAL ERROR: onedal_sequential.1.dll. Error code is 0x80096005.
Intel oneDAL FATAL ERROR: Cannot load neither onedal_thread.1.dll nor onedal_sequential.1.dll.
Intel oneDAL FATAL ERROR: onedal_thread.1.dll. Error code is 0x80096005.
Intel oneDAL FATAL ERROR: onedal_sequential.1.dll. Error code is 0x80096005.
Intel oneDAL FATAL ERROR: Cannot load neither onedal_thread.1.dll nor onedal_sequential.1.dll.

Expected behavior The LightGBM model is converted successfully and predictions are computed without errors.

Output/Screenshots I would note that I do see the DLLs in the correct folder and that the directory is added to PATH.
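A quick way to sanity-check this is to verify, from inside the Python process itself, both that the files exist and that their directory is actually on the PATH the interpreter sees (which can differ from the shell's). A minimal diagnostic sketch; the Library\bin location mirrors the path used in step 2 above, and the DLL names come from the error output:

```python
import os
import sys


def find_on_path(dll_name, path_env=None):
    """Return the PATH directories that actually contain dll_name."""
    if path_env is None:
        path_env = os.environ.get("PATH", "")
    hits = []
    for d in path_env.split(os.pathsep):
        if d and os.path.isfile(os.path.join(d, dll_name)):
            hits.append(d)
    return hits


# Conda-style Library\bin next to the interpreter (same layout as step 2).
lib_dir = os.path.join(os.path.dirname(sys.executable), "..", "Library", "bin")
for dll in ("onedal_thread.1.dll", "onedal_sequential.1.dll"):
    print(dll,
          "| exists in Library\\bin:", os.path.isfile(os.path.join(lib_dir, dll)),
          "| found on PATH in:", find_on_path(dll))
```

If the files exist but no PATH hit is reported, the environment variable seen by Python does not include the directory, which would explain the loader failure.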

Environment:

  • OS: Windows 10
  • Compiler: ?
  • Version: ?
  • Python: 3.6.6

About this issue

  • Original URL
  • State: closed
  • Created 3 years ago
  • Comments: 15 (8 by maintainers)

Most upvoted comments

@CHDev93 Great news!

Today we have merged the fix for the virtualenv issue.
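For reference, failures like this generally mean the Windows loader cannot resolve the oneDAL runtime DLLs from its search path. A hedged workaround sketch (this is not the merged fix, whose details aren't shown here): explicitly register the directory containing the DLLs before importing daal4py. os.add_dll_directory is the supported mechanism on Windows for Python 3.8+; on older interpreters such as the 3.6.6 used in this report, prepending the directory to PATH is the usual fallback.

```python
import os


def register_dll_dir(lib_dir):
    """Make lib_dir visible to the Windows DLL loader.

    Uses os.add_dll_directory when available (Windows, Python 3.8+);
    otherwise falls back to prepending lib_dir to PATH.
    """
    lib_dir = os.path.abspath(lib_dir)
    if hasattr(os, "add_dll_directory"):
        os.add_dll_directory(lib_dir)
    else:
        os.environ["PATH"] = lib_dir + os.pathsep + os.environ.get("PATH", "")
    return lib_dir


# Usage sketch: register the conda-style Library\bin next to the
# interpreter, then import daal4py afterwards.
# import sys
# register_dll_dir(os.path.join(os.path.dirname(sys.executable),
#                               "..", "Library", "bin"))
# import daal4py as d4p
```

Calling this before the first daal4py import matters, since the DLL search path is consulted at load time.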