Leave one out cross validation with CalibratedClassifierCV and LinearSVC · Issue #7796 · scikit-learn/scikit-learn · GitHub
Leave one out cross validation with CalibratedClassifierCV and LinearSVC #7796


Closed
macaodha opened this issue Oct 31, 2016 · 1 comment

@macaodha

Hi there,

This could be a usage problem, so I apologize in advance. I'm trying to use LeaveOneOut with CalibratedClassifierCV for two classes; after I fit the model and call predict_proba(), I get a matrix whose rows sum to greater than one. This code reproduces the problem:

from sklearn.calibration import CalibratedClassifierCV
from sklearn import datasets
from sklearn.svm import LinearSVC
from sklearn.model_selection import LeaveOneOut

num_classes = 2
X, y = datasets.make_classification(n_samples=100, n_features=20,
                                    n_informative=18, n_redundant=2,
                                    n_classes=num_classes)
clf = LinearSVC(C=1.0)
clf_prob = CalibratedClassifierCV(clf, method="sigmoid", cv=LeaveOneOut())
clf_prob.fit(X, y)

probs_1 = clf_prob.predict_proba(X)
print(probs_1.sum(1))  # here the probabilities for each example sum to 2

If I instead fit three classes rather than two, I get an error:
index 1 is out of bounds for axis 1 with size 1.

If I replace cv=LeaveOneOut() with cv=KFold() everything works fine.
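As a sketch of that workaround (assuming scikit-learn's current API; KFold's split count below is an arbitrary choice, not from the original report):

```python
from sklearn import datasets
from sklearn.calibration import CalibratedClassifierCV
from sklearn.model_selection import KFold
from sklearn.svm import LinearSVC

X, y = datasets.make_classification(n_samples=100, n_features=20,
                                    n_informative=18, n_redundant=2,
                                    n_classes=2, random_state=0)

# Unlike LeaveOneOut, each KFold training fold contains samples from
# both classes, so the sigmoid calibrator sees two classes as expected.
clf_prob = CalibratedClassifierCV(LinearSVC(C=1.0), method="sigmoid",
                                  cv=KFold(n_splits=5))
clf_prob.fit(X, y)

probs = clf_prob.predict_proba(X)
print(probs.sum(1))  # each row sums to 1
```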

Thanks
