xgboost: Jupyter kernel dies / Segmentation fault : 11, when upgrading xgboost to > v0.90 on MacOS M1 Chip
I am training an xgboost model locally. My data is not large, a few thousand rows and 100 columns. I have successfully trained the model using xgboost v0.90 on Python 3.9. I need to upgrade xgboost to v1.0 or newer as the older versions are being deprecated. I ran %pip install xgboost==1.1.0 both inside the Jupyter notebook and from the terminal. After upgrading, the Jupyter kernel dies as soon as I attempt to fit the model.
import pandas as pd
import numpy as np
import xgboost
from xgboost import XGBRegressor
from sklearn.model_selection import train_test_split, RandomizedSearchCV
from scipy.stats import uniform, randint  # used for the parameter distributions below
print(xgboost.__version__)
1.1.0
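Since a native arm64 build and an x86_64 build running under Rosetta can behave very differently on the M1, it is worth confirming which build is actually loaded. A quick sketch of such a check might be:
import platform
import xgboost
# Is the interpreter running natively (arm64) or under Rosetta (x86_64),
# and which installed xgboost wheel is actually being imported?
print(platform.machine())
print(platform.platform())
print(xgboost.__version__, xgboost.__file__)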
# read data
df = pd.read_csv('')
# split df into train and test
X_train, X_test, y_train, y_test = train_test_split(df.iloc[:,0:21], df.iloc[:,-1], test_size=0.1)
X_train.shape
(2000,100)
# xgboost regression model
model = XGBRegressor(objective = 'reg:squarederror')
# Parameter distributions
params = {
"colsample_bytree": uniform(0.5, 0.3),
"gamma": uniform(0, 0.5),
"learning_rate": uniform(0.01, 0.5),
"max_depth": randint(2, 8),
"n_estimators": randint(100, 150),
"subsample": uniform(0.3, 0.6)
}
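For reference, scipy.stats uniform(loc, scale) samples from [loc, loc + scale] and randint(low, high) draws integers in [low, high), so a quick sanity check of what the search will sample could look like:
# Uses the uniform/randint imported above.
print(uniform(0.3, 0.6).rvs(3))   # subsample candidates in [0.3, 0.9]
print(randint(2, 8).rvs(3))       # max_depth candidates in {2, ..., 7}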
This is the step where my kernel dies.
# Hyperparameter tuning
r = RandomizedSearchCV(model, param_distributions=params, n_iter=10, scoring="neg_mean_absolute_error", cv=3, verbose=1, n_jobs=1, return_train_score=True, error_score='raise')
# Fit model
r.fit(X_train.values, y_train.values)
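To isolate where the crash happens, a minimal sketch that skips the search and fits a single regressor on the same split (reusing the variables defined above) would be:
# If this alone segfaults, the problem is in the xgboost build itself,
# not in the RandomizedSearchCV loop.
single = XGBRegressor(objective='reg:squarederror', n_estimators=100, max_depth=4)
single.fit(X_train.values, y_train.values)
print(single.predict(X_test.values)[:5])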
Checking the versions installed via pip and conda:
pip list
xgboost 1.1.0
conda list
xgboost 1.1.0 pypi_0 pypi
I have also tried installing from conda-forge.
conda install -c conda-forge py-xgboost=1.0.2
Collecting package metadata (current_repodata.json): done
Solving environment: failed with initial frozen solve. Retrying with flexible solve.
Collecting package metadata (repodata.json): done
Solving environment: failed with initial frozen solve. Retrying with flexible solve.
Solving environment: -
Found conflicts! Looking for incompatible packages.
This can take several minutes. Press CTRL-C to abort.
failed
UnsatisfiableError: The following specifications were found
to be incompatible with the existing python installation in your environment:
Specifications:
- py-xgboost=1.0.2 -> python[version='>=3.6,<3.7.0a0|>=3.7,<3.8.0a0|>=3.8,<3.9.0a0']
Your python: python=3.9
If python is on the left-most side of the chain, that's the version you've asked for.
When python appears to the right, that indicates that the thing on the left is somehow
not available for the python version you are constrained to. Note that conda will not
change your python version to a different minor version unless you explicitly specify
that.
About this issue
- Original URL
- State: closed
- Created 3 years ago
- Comments: 32 (17 by maintainers)
I have successfully compiled and installed XGBoost v1.3.3 on an M1 in Python, using an M1-native version of Homebrew and Python 3.9 installed from that Homebrew. It compiled with no trouble, though I did compile the native library first (by running mkdir build; cd build; cmake ..; make -j4 in the root of the XGBoost repo) before running python setup.py install from the python-package directory. I did need to install numpy and scipy in the virtual environment first. I had some difficulty making scikit-learn work; it looks like they don't have binaries available for the M1, but I didn't try to figure that out.
I built a test regression using some randomly generated data from numpy on a squared loss, and the xgb.train and predict(DMatrix) interfaces worked fine.
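A test along those lines might look like the following sketch (random data only; the exact script used is not in the thread):
import numpy as np
import xgboost as xgb

# Random regression data: 1000 rows, 20 features, noisy linear target.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 20))
y = X @ rng.normal(size=20) + rng.normal(scale=0.1, size=1000)

dtrain = xgb.DMatrix(X, label=y)
params = {"objective": "reg:squarederror", "max_depth": 4, "eta": 0.1}

# If the native library is built correctly, this trains and predicts
# without crashing the interpreter.
booster = xgb.train(params, dtrain, num_boost_round=50)
preds = booster.predict(xgb.DMatrix(X))
print(preds[:5])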