[MRG+1] Make return_std faster in Gaussian Processes #9236
Conversation
flake8 failures. Please mark as MRG when fixed, unless you feel this requires more tests.
sklearn/gaussian_process/gpr.py (outdated diff)
L_inv = solve_triangular(self.L_.T, np.eye(self.L_.shape[0]))
K_inv = L_inv.dot(L_inv.T)
# cache result of K_inv computation
if not hasattr(self, 'K_inv_'):
If we're going to memoize this, we should not be storing a new public attribute at predict time. I'd be more comfortable if this were a private attribute, i.e. _K_inv.
LGTM, thanks!
# decomposition L and its inverse L_inv
L_inv = solve_triangular(self.L_.T,
                         np.eye(self.L_.shape[0]))
self._K_inv = L_inv.dot(L_inv.T)
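As context for the snippet above: the inverse is recovered from the Cholesky factor by a triangular solve rather than a direct matrix inversion. A minimal standalone sketch with a toy matrix (not sklearn's actual code):

```python
import numpy as np
from scipy.linalg import cholesky, solve_triangular

# Toy positive-definite "kernel" matrix standing in for K(X_train, X_train).
rng = np.random.RandomState(0)
A = rng.randn(5, 5)
K = A @ A.T + 5 * np.eye(5)

L = cholesky(K, lower=True)                        # corresponds to self.L_
# Solve L.T @ X = I, i.e. X = L^{-T}; a triangular solve is cheaper and
# more stable than inverting K directly.
L_inv = solve_triangular(L.T, np.eye(L.shape[0]))
K_inv = L_inv.dot(L_inv.T)                         # K^{-1} = L^{-T} @ L^{-1}

assert np.allclose(K_inv, np.linalg.inv(K))
```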
self._K_inv needs to be deleted if self.L_ is changed during a second fit.
Hi. Thank you. How do I test if self.L_ has changed?
self.L_ is changed only in this line, so you should add, for example, self._K_inv = None just after it, and change the condition to if self._K_inv is None.
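The invalidation pattern suggested here can be sketched with a hypothetical toy class (CachedInverseSketch is invented for illustration, not sklearn code): fit resets the private cache right after self.L_ changes, and predict-time code fills it lazily.

```python
import numpy as np

class CachedInverseSketch:
    """Hypothetical sketch of the suggested caching pattern."""

    def fit(self, K):
        # Corresponds to computing self.L_ in GaussianProcessRegressor.fit.
        self.L_ = np.linalg.cholesky(K)
        # Invalidate the cache immediately after L_ changes, so a second
        # fit can never reuse a stale inverse.
        self._K_inv = None
        return self

    def get_K_inv(self):
        # Lazily compute and cache K^{-1} on first use.
        if self._K_inv is None:
            L_inv = np.linalg.solve(self.L_.T, np.eye(self.L_.shape[0]))
            self._K_inv = L_inv.dot(L_inv.T)
        return self._K_inv

rng = np.random.RandomState(0)
A = rng.randn(4, 4)
model = CachedInverseSketch().fit(A @ A.T + 4 * np.eye(4))

K_inv = model.get_K_inv()
assert model.get_K_inv() is K_inv        # second call reuses the cache

model.fit(A @ A.T + 8 * np.eye(4))       # refitting resets the cache
assert model.get_K_inv() is not K_inv    # stale inverse is never reused
```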
Apologies for the delay. I've made the changes.
Thanks for the update. Could you also add a test in … After that it should be ready for merging.
Hi. I have added a new test.
Maybe a better test would be that doing …
Needs a whatsnew entry?
@amueller I have updated my test. Could you please have a look and let me know if it is right?
LGTM
sklearn/gaussian_process/gpr.py (outdated diff)
# Compute variance of predictive distribution
y_var = self.kernel_.diag(X)
y_var -= np.einsum("ij,ij->i", np.dot(K_trans, K_inv), K_trans)
sum1 = np.dot(K_trans, self._K_inv).T
y_var -= np.einsum("ki,ik->k", K_trans, sum1)
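For reference, both the original einsum line and the sum1 variant compute only the diagonal of K_trans @ K_inv @ K_trans.T, which is all the predictive variance needs, without materializing the full product. A small sketch of the equivalence with toy arrays (not sklearn code):

```python
import numpy as np

rng = np.random.RandomState(0)
K_trans = rng.randn(6, 4)   # cross-kernel between 6 test and 4 train points
M = rng.randn(4, 4)
K_inv = M @ M.T             # stand-in for the cached inverse (SPD)

full = K_trans @ K_inv @ K_trans.T                             # 6x6 product
diag1 = np.einsum("ij,ij->i", np.dot(K_trans, K_inv), K_trans) # diagonal only
sum1 = np.dot(K_trans, K_inv).T
diag2 = np.einsum("ki,ik->k", K_trans, sum1)                   # sum1 variant

assert np.allclose(diag1, np.diag(full))
assert np.allclose(diag1, diag2)
```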
Is there a reason you changed this? I thought this PR was only about caching self._K_inv. As @TomDLT mentioned in #9234 (comment), the einsum optimization was already done in #8591.
Reverted the change. Thank you @lesteve.
Force-pushed from 0169d2a to d698126.
flake8 is failing
fix white line
Force-pushed from ea05b00 to 2a37685.
Fixed flake8
@minghui-liu Can you add an entry in doc/whats_new.rst?
Rebased, added whats new and merged as: 6e01fef.
Reference Issue
Issue #9234
What does this implement/fix? Explain your changes.
Caches the computation of K_inv in a private attribute self._K_inv, so that repeated calls to predict with return_std=True do not recompute it.
Any other comments?
No.
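As a hedged usage sketch, this is the workload the PR speeds up: repeated predict calls with return_std=True after a single fit. The toy data, kernel choice, and alpha below are illustrative assumptions, not taken from the PR.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Toy 1-D regression problem (illustrative only).
X = np.linspace(0, 1, 20).reshape(-1, 1)
y = np.sin(2 * np.pi * X).ravel()

# alpha adds diagonal jitter for numerical stability; the values here
# are assumptions chosen for the toy data.
gpr = GaussianProcessRegressor(kernel=RBF(length_scale=0.5), alpha=1e-6,
                               random_state=0).fit(X, y)

# Before the change, every return_std=True call recomputed the inverse from
# the Cholesky factor; with the cached private attribute, calls after the
# first skip that work.
for _ in range(3):
    mean, std = gpr.predict(X, return_std=True)

assert mean.shape == (20,) and std.shape == (20,)
```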