scikit-learn: Our R^2 makes odd assumptions, doesn't work with LeaveOneOut

# NOTE: sklearn.cross_validation is the pre-0.18 module name; in current
# releases these utilities live in sklearn.model_selection, and
# LeaveOneOut() no longer takes the number of samples as an argument.
from sklearn.datasets import make_regression
from sklearn.cross_validation import cross_val_score, LeaveOneOut
from sklearn.linear_model import Ridge

X, y, coef_ = make_regression(random_state=42, noise=1, n_samples=200, coef=True)
cross_val_score(Ridge(), X, y, cv=LeaveOneOut(len(X)))

array([ 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.])
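The all-zero scores above follow from the definition of R² = 1 − SS_res/SS_tot: with a single test sample, the total sum of squares SS_tot = Σ(yᵢ − ȳ)² is exactly zero, so the score is undefined and scikit-learn falls back to a constant value instead of dividing by zero. A minimal demonstration (in recent releases this emits an `UndefinedMetricWarning`; the fallback value has varied between 0.0 and NaN across versions):

```python
import warnings
from sklearn.metrics import r2_score

# A single-sample "test fold": the mean of y_true equals y_true itself,
# so SS_tot = 0 and R^2 cannot be computed.
with warnings.catch_warnings():
    warnings.simplefilter("ignore")  # silence the undefined-metric warning
    score = r2_score([3.0], [2.5])

print(score)  # a constant fallback, not a meaningful score
```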

Maybe the single test sample in each LOO fold is not being handled correctly? 😕
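The usual workaround is to aggregate the held-out predictions across all folds first and compute a single R² at the end, so the score is never evaluated on a one-sample set. A sketch, assuming the modern `sklearn.model_selection` API:

```python
from sklearn.datasets import make_regression
from sklearn.linear_model import Ridge
from sklearn.metrics import r2_score
from sklearn.model_selection import LeaveOneOut, cross_val_predict

X, y = make_regression(random_state=42, noise=1, n_samples=200)

# cross_val_predict refits the model per LOO fold and collects each fold's
# single held-out prediction into one array aligned with y.
pred = cross_val_predict(Ridge(), X, y, cv=LeaveOneOut())

# One R^2 over all 200 held-out predictions, where SS_tot is well-defined.
print(r2_score(y, pred))
```

This gives one pooled score rather than a per-fold distribution, which is the only meaningful way to use R² with leave-one-out.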

About this issue

  • State: open
  • Created 9 years ago
  • Comments: 20 (19 by maintainers)

Most upvoted comments

Is it already fixed? I am curious why it works for RFECV but not for cross_val_score (both cases use LeaveOneOut cross-validation).