MAINT Make fastica call FastICA #14987
Conversation
what bug?
this one #14988
Thanks @agramfort I updated the PR according to your suggestions
also please check that you added the attributes properly + add a test to make sure that we have always components = unmixing * whitener etc. thanks !
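The invariant @agramfort asks for can be sketched as a test against the estimator. This is an illustrative check only, not the test eventually merged; it assumes the private `_unmixing` attribute introduced in this PR:

```python
import numpy as np
from sklearn.decomposition import FastICA

# Illustrative check of the invariant discussed above: with whitening
# enabled, components_ should equal unmixing @ whitening_.
rng = np.random.RandomState(0)
X = rng.randn(100, 5)

ica = FastICA(n_components=3, random_state=0).fit(X)
# _unmixing is the private attribute added in this PR
assert np.allclose(ica.components_, ica._unmixing @ ica.whitening_)
```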
sklearn/decomposition/fastica_.py
Outdated
The unmixing matrix.
The linear operator to apply to the data to get the independent
sources. This is equal to the unmixing matrix when ``whiten`` is
False, and equal to ``np.dot(unmixing_matrix, whitening_)`` when
Suggested change:
- False, and equal to ``np.dot(unmixing_matrix, whitening_)`` when
+ False, and equal to ``np.dot(unmixing_matrix_, whitening_)`` when
The unmixing matrix is a private attribute
alpha = fun_args.get('alpha', 1.0)
if not 1 <= alpha <= 2:
    raise ValueError('alpha must be in [1,2]')
can you say what value you got?
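What the reviewer is asking for (reporting the offending value in the error message) could look like the sketch below; `check_alpha` is a hypothetical helper name used here for illustration, not code from the PR:

```python
def check_alpha(fun_args):
    """Validate the ``alpha`` parameter of the logcosh contrast function."""
    alpha = fun_args.get('alpha', 1.0)
    if not 1 <= alpha <= 2:
        # Include the offending value, as requested in review
        raise ValueError('alpha must be in [1, 2] but got %r' % alpha)
    return alpha
```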
I didn't touch the code logic at all, only copy-pasted.
sklearn/decomposition/fastica_.py
Outdated
% self.fun
)

n, p = X.shape
a good occasion to change this to
n_samples, n_features = X.shape
OK, but I'd rather not change the code; this is out of scope for this PR.
The current tests would fail if anything went wrong. I'd rather not add new tests here, this is not the goal of the PR.
The linear operator to apply to the data to get the independent
sources. This is equal to the unmixing matrix when ``whiten`` is
False, and equal to ``np.dot(unmixing_matrix, self.whitening_)`` when
``whiten`` is True.

mixing_ : array, shape (n_features, n_components)
you're going to hate me @NicolasHug but now mixing_ is not the inverse of unmixing in your description. unmixing is always square, whereas here mixing_ is not.
Also, why have mixing_ as a public attribute and not unmixing_?
How would you fix it? Change the docstring?
This issue is not caused by the changes: mixing_ has never been the inverse of the unmixing matrix.
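The shape argument can be made concrete with a small NumPy sketch (made-up matrices, not actual FastICA output): `mixing_` is the pseudo-inverse of the rectangular `components_` matrix, so when `n_components < n_features` it cannot be the inverse of the square unmixing matrix.

```python
import numpy as np

rng = np.random.RandomState(0)
n_components, n_features = 3, 5

unmixing = rng.randn(n_components, n_components)  # always square
whitening = rng.randn(n_components, n_features)   # (n_components, n_features)
components = unmixing @ whitening                 # (n_components, n_features)
mixing = np.linalg.pinv(components)               # (n_features, n_components)

# mixing is only a one-sided (right) inverse of the rectangular
# components matrix; it is not the inverse of the square unmixing matrix.
assert mixing.shape == (n_features, n_components)
assert np.allclose(components @ mixing, np.eye(n_components))
```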
else:
-   self.components_ = unmixing
+   self.components_ = W

self.mixing_ = linalg.pinv(self.components_)
cf here
@agramfort it seems to me that the issues you're pointing out pre-exist the proposed changes. I'd rather just focus on fixing the calling order.
@adrinjalali @glemaitre @thomasjpfan, easy and boring again?
@NicolasHug I pushed to your branch. I am happy now :)
Not too familiar with FastICA, but LGTM aside from the following comments.
sklearn/decomposition/fastica_.py
Outdated
else:
-   return None, W, S, None
+   return None, sources, None
This is changing the number of output arguments, and est._unmixing is missing?
self.mean_ = X_mean
-   self.whitening_ = whitening
+   self.whitening_ = K
Maybe keep previous names as more explicit for W and K? Those are defined just a few lines above.
…stica_proper_order
For some reason I can't comment on the reviews
I'd rather not change the original code of
@rth merge if you're happy
Thanks @NicolasHug !
I merged too fast. The decrease in coverage is genuine and should be addressed IMO. @NicolasHug would you be able to make a PR with an additional test?
Sigh... I'll do it. There's no decrease in coverage. This is just a side effect of the changes.
NOMRG, I think there's a bug in fastica that needs to be fixed first

Closes #14988

This PR makes `fastica` call `FastICA`. Following the discussion in #14988 I have created a private `_unmixing` attribute and clarified the docstring of the `components_` attribute.
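Once the function delegates to the estimator, both entry points should produce identical sources for the same seed. A minimal sanity check (assuming default parameters and the standard `fastica` return values `K, W, S`):

```python
import numpy as np
from sklearn.decomposition import FastICA, fastica

rng = np.random.RandomState(0)
X = rng.randn(200, 4)

# The function delegates to the estimator after this PR, so both code
# paths should return the same sources for the same random_state.
K, W, S = fastica(X, n_components=2, random_state=0)
S_est = FastICA(n_components=2, random_state=0).fit_transform(X)

assert np.allclose(S, S_est)
```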