FIX : broken MultiTaskLasso with warm_start=True by agramfort · Pull Request #12853 · scikit-learn/scikit-learn

Conversation

agramfort (Member)

Related to #12844

warm_start=True is currently broken in the 0.20.X branch but not in master. I'll send a PR to master now to make sure we don't break this in the future.
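For context, a minimal reproduction of the breakage might look like the sketch below (the toy data and comments are illustrative, not taken from the PR): on the broken branch, the very first call to fit with warm_start=True reads self.coef_ before it has ever been assigned.

```python
# Hypothetical reproduction sketch of the 0.20.X regression (toy data).
import numpy as np
from sklearn.linear_model import MultiTaskLasso

rng = np.random.RandomState(0)
X = rng.randn(10, 5)   # 10 samples, 5 features
Y = rng.randn(10, 3)   # 3 tasks

clf = MultiTaskLasso(alpha=0.1, warm_start=True)
# On the broken branch this first fit fails with an AttributeError,
# because fit() dereferences self.coef_ before ever setting it.
clf.fit(X, Y)
```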

```diff
@@ -1796,7 +1796,7 @@ def fit(self, X, y):
         X, y, X_offset, y_offset, X_scale = _preprocess_data(
             X, y, self.fit_intercept, self.normalize, copy=False)

-        if not self.warm_start or self.coef_ is None:
```
agramfort (Member Author)

We were trying to access a fitted attribute before actually setting it.
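In other words, the guard has to test for the attribute's existence instead of dereferencing it. A toy sketch of the pattern (a hypothetical estimator, not scikit-learn code; the zero initialization is only illustrative):

```python
import numpy as np

class ToyWarmStartEstimator:
    """Hypothetical estimator illustrating the warm-start guard discussed above."""

    def __init__(self, warm_start=False):
        self.warm_start = warm_start

    def fit(self, X, y):
        # Buggy variant: `self.coef_ is None` raises AttributeError on the
        # very first fit, since coef_ does not exist yet.
        # Fixed variant: check for existence before touching the attribute.
        if not self.warm_start or not hasattr(self, "coef_"):
            self.coef_ = np.zeros(X.shape[1])  # cold start
        # ...run the solver starting from self.coef_...
        return self
```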

@qinhanmin2014 (Member) left a comment

Maybe we can backport #12361? The test there seems better.

@jnothman added the Bug label on Dec 30, 2018
@jnothman (Member)

Are we sure we'd rather not allow coef_=None initially?

@agramfort (Member Author)

@jnothman what do you mean by "initially"? What is sure is that right now the code breaks out of the box if you call fit with warm_start=True.

@jnothman modified the milestones: 0.21, 0.20.3 on Jan 8, 2019
@jnothman (Member) left a comment

I just meant that we could use getattr(self, 'coef_', None) is None rather than not hasattr to make it a little more general
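To illustrate the difference between the two checks (on a hypothetical object, outside any estimator): they agree before the first fit and only diverge if someone has explicitly set coef_ = None.

```python
class Dummy:
    pass

est = Dummy()

# Before the first fit, both checks detect a cold start:
print(not hasattr(est, "coef_"))            # True  -> cold start
print(getattr(est, "coef_", None) is None)  # True  -> cold start

# If a user has worked around the bug by assigning coef_ = None themselves:
est.coef_ = None
print(not hasattr(est, "coef_"))            # False -> would try to warm-start from None
print(getattr(est, "coef_", None) is None)  # True  -> still treated as a cold start
```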

```diff
@@ -1796,7 +1796,7 @@ def fit(self, X, y):
         X, y, X_offset, y_offset, X_scale = _preprocess_data(
             X, y, self.fit_intercept, self.normalize, copy=False)

-        if not self.warm_start or self.coef_ is None:
+        if not self.warm_start or not hasattr(self, "coef_"):
```
agramfort (Member Author)

@jnothman shall I replace not hasattr(self, "coef_") with a getattr and we're good?

Member

I'm okay with it either way, actually. I just never know how flexible we should make these kinds of changes... I mean if someone's gone and changed their code now to set coef_ = None so that it works, using getattr would make sure it continues to work in the next release... but it's also hacky.

@ogrisel (Member) commented Jan 14, 2019

As @qinhanmin2014 suggested on Gitter, I think we need a new 0.20.3 section in the changelog for this fix.

@ogrisel (Member) commented Jan 14, 2019

+1 also for backporting #12361 to 0.20.X

@qinhanmin2014 (Member)

> As @qinhanmin2014 suggested on Gitter, I think we need a new 0.20.3 section in the changelog for this fix.

There's already an entry in 0.21; I'm not sure whether we should have two entries or move the entry to 0.20.3.

> +1 also for backporting #12361 to 0.20.X

I prefer the test in #12361; I think it'll be better to keep consistency between master and 0.20.X (if possible).

@agramfort (Member Author)

This has been closed without being fixed.

The test test_multi_task_lasso_warm_start still fails on the 0.20.X branch.

Did I miss something, @jnothman?
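For reference, a regression test in the spirit of test_multi_task_lasso_warm_start could be as small as the sketch below (this is not the actual test from #12361, just an illustration of what it needs to cover):

```python
import numpy as np
from sklearn.linear_model import MultiTaskLasso

def test_multi_task_lasso_warm_start_sketch():
    # Before the fix, the very first fit with warm_start=True raised an
    # AttributeError because coef_ did not exist yet.
    rng = np.random.RandomState(42)
    X = rng.randn(20, 5)
    Y = rng.randn(20, 3)

    clf = MultiTaskLasso(alpha=0.1, warm_start=True)
    clf.fit(X, Y)                      # must not raise
    first_coef = clf.coef_.copy()

    clf.fit(X, Y)                      # second fit warm-starts from the previous coef_
    assert clf.coef_.shape == first_coef.shape
```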

@qinhanmin2014 (Member)

@agramfort 0.20.3 is not released yet. We need to backport #12361 and #12984 to 0.20.X and release 0.20.3 (both PRs are tagged 0.20.3).

@agramfort (Member Author)

OK, I had missed that, sorry. It was not referenced in the PR apparently.

@agramfort closed this on Feb 17, 2019
@jnothman (Member) commented Feb 17, 2019 via email
