change LogisticRegression default solver to lbfgs and multiclass to multinomial? · Issue #9997 · scikit-learn/scikit-learn · GitHub

change LogisticRegression default solver to lbfgs and multiclass to multinomial? #9997

Closed
jnothman opened this issue Oct 25, 2017 · 11 comments · Fixed by #11905

Comments

@jnothman
Member

The current defaults are what they are for historical reasons. It may make sense to change them to more consistent and theoretically sound options.
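
A minimal sketch (not part of the original issue) of what the proposed change means in practice, using the parameter names as they existed around scikit-learn 0.19: requesting the suggested defaults explicitly versus spelling out the historical ones.

```python
# Contrast the historical defaults (liblinear + one-vs-rest) with the
# proposed ones (lbfgs + multinomial), both written out explicitly.
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression

X, y = load_iris(return_X_y=True)

# Historical defaults, spelled out.
clf_old = LogisticRegression(solver='liblinear', multi_class='ovr').fit(X, y)

# Proposed defaults, spelled out.
clf_new = LogisticRegression(solver='lbfgs', multi_class='multinomial',
                             max_iter=1000).fit(X, y)

print(clf_old.score(X, y), clf_new.score(X, y))
```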

@TomDLT
Member
TomDLT commented Oct 25, 2017

+1

We need to use FutureWarning as it will break some code, but the regularized intercept in solver='liblinear' is really not clean.
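
A small illustration (mine, not from the thread) of the intercept issue mentioned above: liblinear applies the L2 penalty to the intercept term as well, so with strong regularization (small C) the two solvers disagree noticeably on the fitted intercept.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.RandomState(0)
X = rng.randn(200, 2)
# Decision boundary shifted away from the origin, so the true intercept
# is far from zero.
y = (X[:, 0] + X[:, 1] > 2.0).astype(int)

for solver in ('liblinear', 'lbfgs'):
    clf = LogisticRegression(solver=solver, C=0.01).fit(X, y)
    # liblinear shrinks the intercept toward 0; lbfgs leaves it unpenalized.
    print(solver, clf.intercept_)
```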

@thechargedneutron
Contributor

I want to take up this issue, provided it's not too difficult. Could one of you tell me what needs to be done here?

@jnothman
Member Author
jnothman commented Oct 25, 2017 via email

@amueller
Member

While in practice OvR might not be a big deal, I think it's perceived as a red flag by anyone coming from a stats angle.

I prefer deprecations to FutureWarnings. And yes, we need an "auto" solver. We also need benchmarks to see what heuristic would be good for "auto".
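
Purely as an illustration of the kind of heuristic an "auto" option might encode (this is not scikit-learn API, and the thresholds below are invented for the sketch; they are exactly what the benchmarks above would have to pin down):

```python
def choose_solver(n_samples, penalty='l2', multinomial=True):
    """Hypothetical solver heuristic; not scikit-learn API."""
    if penalty == 'l1':
        # Only liblinear and saga handle an L1 penalty; saga scales
        # better to large sample sizes.
        return 'saga' if n_samples > 50_000 else 'liblinear'
    if multinomial:
        # liblinear cannot fit a true multinomial model.
        return 'saga' if n_samples > 100_000 else 'lbfgs'
    return 'lbfgs'

print(choose_solver(n_samples=1_000))                  # 'lbfgs'
print(choose_solver(n_samples=200_000, penalty='l1'))  # 'saga'
```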

@amueller
Member

This might be a good opportunity to also add "auto" to LinearSVC.

@jnothman
Member Author
jnothman commented Oct 25, 2017 via email

@edlee123
edlee123 commented Dec 4, 2017

Read #6595 and agree with TomDLT. NOT regularizing the intercept by default would make it consistent with scikit-learn's Ridge and Lasso.

Also a bit of a gotcha with ElasticNet:
https://stats.stackexchange.com/questions/203816/logistic-regression-scikit-learn-vs-glmnet

Ridge is as one would expect:
https://stackoverflow.com/questions/26126224/scikit-learn-ridge-regression-with-unregularized-intercept-term
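
A quick check (again mine, not from the thread) that Ridge indeed leaves the intercept unpenalized: even with a huge alpha, the fitted intercept stays near the target's offset instead of being shrunk to zero.

```python
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.RandomState(0)
X = rng.randn(100, 3)
y = X @ np.array([1.0, -2.0, 0.5]) + 10.0  # large true offset

ridge = Ridge(alpha=1e6).fit(X, y)
print(ridge.coef_)       # coefficients shrunk toward zero
print(ridge.intercept_)  # stays close to 10, i.e. not penalized
```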

@jnothman
Member Author
jnothman commented Dec 4, 2017 via email

@edlee123
edlee123 commented Dec 5, 2017

Yeah, some days I wish I could get it backwards; it is pretty funny...

@amueller
Member

also see #6830

@jnothman
Member Author

I'm glad we're improving our quality on this front!!
