Qda lda consistent #8000
Conversation
Also check the test failures.
self.priors = np.asarray(priors) if priors is not None else None
self.reg_param = reg_param
self.store_covariances = store_covariances
#self.store_covariances = store_covariances
remove this line
@@ -623,11 +623,15 @@ class QuadraticDiscriminantAnalysis(BaseEstimator, ClassifierMixin):
Discriminant Analysis
"""

def __init__(self, priors=None, reg_param=0., store_covariances=False,
             tol=1.0e-4):
def __init__(self, priors=None, reg_param=0.,store_covariance=False, tol=1.0e-4,store_covariances=None):
this is not pep8
if store_covariances is not None:
    warnings.warn("'store_covariances' was renamed to 'store_covariance' in version 0.17 and "
                  "will be removed in later versions.", DeprecationWarning)
    self.store_covariance = store_covariances
You are not storing store_covariance
But in the docs it was mentioned that if we want to deprecate a parameter, and it is not None, then we need to assign the value of the old parameter to the new parameter.
I think you should handle the deprecation in fit, not here. In scikit-learn, __init__ is just used to store parameters; otherwise, in the general case, we will break parameter searches. In fit, set a local copy of store_covariance on the basis of both old and new params.
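A minimal sketch of that pattern, with an illustrative class name, parameter defaults, and warning text (not the actual scikit-learn source):

import warnings


class ExampleEstimator:
    def __init__(self, store_covariance=False, store_covariances=None):
        # __init__ only records the parameters, unmodified.
        self.store_covariance = store_covariance
        self.store_covariances = store_covariances

    def fit(self, X, y):
        # Resolve the deprecated alias into a local variable; the attributes
        # set in __init__ are left untouched, so parameter searches keep working.
        store_covariance = self.store_covariance
        if self.store_covariances is not None:
            warnings.warn("'store_covariances' is deprecated; use "
                          "'store_covariance' instead.", DeprecationWarning)
            store_covariance = self.store_covariances
        # ... fit the model, consulting the local store_covariance flag ...
        return self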
@chinmay0301 I'm not sure how your comment relates to mine. You are not storing store_covariance. So if someone uses the new parameter, nothing happens.
So should I implement the change I made in __init__ in the fit function instead, and assign the value of store_covariances (if specified) to store_covariance in the fit function?
Do not store in an attribute of the object, but as a local variable.
def fit(self, X, y, store_covariance=None, store_covariances=None, tol=None):
    if store_covariances:
        warnings.warn("The parameter 'store_covariances' is deprecated as "
                      "store_covariance of version 0.17 and will be removed "
                      "in 0.19 parameter is no longer necessary because the "
                      "value is set via the estimator initialisation or "
                      "set_params method.", DeprecationWarning)
        store_covariance = store_covariances

Is this correct? Or should I store the value of the old parameter (if specified) in a different variable?
no. read the parameters from self, but don't modify what's stored there.
Sorry, I am not quite understanding what is to be done exactly :/
Look for other deprecated parameters in the codebase.
Implement the changes I've mentioned in a serial order and you'll be done with your first PR. :)
@@ -623,11 +623,15 @@ class QuadraticDiscriminantAnalysis(BaseEstimator, ClassifierMixin):
Discriminant Analysis
"""

def __init__(self, priors=None, reg_param=0., store_covariances=False,
             tol=1.0e-4):
def __init__(self, priors=None, reg_param=0.,store_covariance=False, tol=1.0e-4,store_covariances=None):
@chinmay0301 Just be calm, I'll guide you through the process.
Firstly, change this to:
def __init__(self, priors=None, reg_param=0., store_covariance=False,
             tol=1.0e-4, store_covariances=False):
This differs from the one you wrote because it follows the PEP8 guidelines. Please go through the Contributor's Guide for more information.
if store_covariances is not None:
    warnings.warn("'store_covariances' was renamed to 'store_covariance' in version 0.17 and "
                  "will be removed in later versions.", DeprecationWarning)
    self.store_covariance = store_covariances
You need to make the value of the parameter store_covariance equal to that of store_covariances if it is passed. This means you need to replace:
self.store_covariance = store_covariances
with:
store_covariance = store_covariances
self.priors = np.asarray(priors) if priors is not None else None
self.reg_param = reg_param
self.store_covariances = store_covariances
#self.store_covariances = store_covariances
This is where you store the value of store_covariance. You need to change this commented line to:
self.store_covariance = store_covariance
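Taken together, and assuming the deprecated constructor parameter defaults to None (as in the submitted code) so the warning only fires when it is passed explicitly, the __init__ sketched by these steps would look roughly as follows (warnings and np are already imported in the module; the exact warning wording is refined in the comments below, and later comments in this thread move the check into fit):

def __init__(self, priors=None, reg_param=0., store_covariance=False,
             tol=1.0e-4, store_covariances=None):
    if store_covariances is not None:
        # Deprecation notice; exact wording and version numbers are
        # discussed in the review comments below.
        warnings.warn("'store_covariances' was renamed to 'store_covariance'",
                      DeprecationWarning)
        store_covariance = store_covariances
    self.priors = np.asarray(priors) if priors is not None else None
    self.reg_param = reg_param
    self.store_covariance = store_covariance
    self.tol = tol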
tol=1.0e-4):
def __init__(self, priors=None, reg_param=0.,store_covariance=False, tol=1.0e-4,store_covariances=None):
if store_covariances is not None:
    warnings.warn("'store_covariances' was renamed to 'store_covariance' in version 0.17 and "
Change this to: "in version 0.19 and"
def __init__(self, priors=None, reg_param=0.,store_covariance=False, tol=1.0e-4,store_covariances=None):
    if store_covariances is not None:
        warnings.warn("'store_covariances' was renamed to 'store_covariance' in version 0.17 and "
                      "will be removed in later versions.", DeprecationWarning)
Change this to: "will be removed in version 0.21."
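With both wording fixes applied, the warning call would read roughly as follows (a sketch; the exact line wrapping is immaterial):

warnings.warn("'store_covariances' was renamed to 'store_covariance' in "
              "version 0.19 and will be removed in version 0.21.",
              DeprecationWarning)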
if self.store_covariances:
    self.covariances_ = cov
if self.store_covariance:
    self.covariance_ = cov
You don't need to change this line. Revert this back to covariances_.
Thanks a lot! Will do this asap :D
Hi, I made the suggested changes. Please have a look @dalmia to check whether it's correct.
@@ -623,11 +623,28 @@ class QuadraticDiscriminantAnalysis(BaseEstimator, ClassifierMixin):
Discriminant Analysis
"""

def __init__(self, priors=None, reg_param=0., store_covariances=False,
             tol=1.0e-4):
<<<<<<< HEAD
oops :)
Everything looks fine apart from the fact that your merge wasn't very clean. Remove lines:
- 626
- 637-647
- 721
- 723-725
yeah, corrected that.
warnings.warn("'store_covariances' was renamed to 'store_covariance'" | ||
"in version 0.19 and will be removed in the " | ||
"version 0.21.", DeprecationWarning) | ||
store_covariance = store_covariances |
Indentation error!
tol=1.0e-4):
def __init__(self, priors=None, reg_param=0., store_covariance=False,
             tol=1.0e-4, store_covariances=False):
if store_covariances is not None:
this should be done in fit.
So should I leave the __init__ function as it was in the master branch?
No, you need to add store_covariance and store it.
Since store_covariances is deprecated in fit and will be removed as a parameter in 0.19, do we need to do this in fit? @amueller
I'm not talking about the parameter passed into fit. I'm talking about the parameter passed in __init__. That should be validated in fit, not in __init__.
Oh I get you now. Since 0.19 deprecates store_covariances, I don't think we need to add this particular deprecation warning; the following change in fit should suffice:
self.store_covariance = store_covariances
tol=1.0e-4):
def __init__(self, priors=None, reg_param=0., store_covariance=False,
             tol=1.0e-4, store_covariances=False):
if store_covariances is not None:
@chinmay0301 remove this check entirely here and change this in fit:
self.store_covariances = store_covariances
to:
self.store_covariance = store_covariances
@dalmia, I don't really think it's appropriate to make statements like "you'll be done with your first PR."
Perhaps we don't give contributors enough assurances about how much work they'll need to complete a PR, but that's because sometimes we find that it is more work than initially anticipated. It's better to instruct, or point to precedent of similar changes, with the hope that all will proceed smoothly, than to make promises!
@jnothman I am sorry. I was just trying to reassure, but yes, I shouldn't be making promises. Thank you for letting me know. I'll keep that in mind from next time onwards.
@chinmay0301 do you want to work on the issue further?
Yes. I was out of town. Will get back to it tomorrow.
"is set via the estimator initialisation or " | ||
"set_params method.", DeprecationWarning) | ||
self.store_covariances = store_covariances | ||
warnings.warn("'store_covariances' was renamed to 'store_covariance'" |
You shouldn't change the warning message here. Carefully reading the earlier message shows that this statement won't be there from 0.19 onwards and hence, this is just added to ensure compatibility until 0.19. So, your warning message is not valid. The best you could do is add a note alongside the one already present as:
Also, 'store_covariances' is renamed to 'store_covariance' in 0.19.
Also, change the docstring and the example written along with it to use store_covariance now.
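For example, the parameter entry in the class docstring might then read along these lines (illustrative wording only, keeping the covariances_ attribute name used elsewhere in this thread):

store_covariance : boolean, optional (default=False)
    If True, the class covariance matrices are explicitly computed and
    stored in the self.covariances_ attribute.

and any usage example would construct the estimator with the new name, e.g. QuadraticDiscriminantAnalysis(store_covariance=True).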
You need to update your copy of master and merge with it. You'll find this code path has disappeared.
The errors are: Exception occurred:
File "/home/ubuntu/scikit-learn/doc/sphinxext/sphinx_gallery/docs_resolv.py", line 48, in _get_data
with open(url, 'r') as fid:
IOError: [Errno 2] No such file or directory: '/home/ubuntu/scikit-learn/doc/_build/html/stable/modules/generated/sklearn.feature_selection.SelectKBest.rst.html'
The full traceback has been saved in /tmp/sphinx-err-9IJ5UT.log, if you want to report the issue to the developers.
Please also report this if it was a user error, so that a better error message can be provided next time.
A bug report can be filed in the tracker at <https://github.com/sphinx-doc/sphinx/issues>. Thanks!
Embedding documentation hyperlinks in examples..
processing: feature_stacker.html
make: *** [html-noplot] Error 1
./build_tools/circle/build_doc.sh returned exit code 2
Action failed: ./build_tools/circle/build_doc.sh
@jnothman Should rebuilding remove the error?
Not rebuilding, but merging with updated master will remove the error.
@chinmay0301, this remains quite broken. Are you still working on it?
No, I think I will try some other issue later, as this is getting too delayed; and as it's pretty easy, it should not be a bottleneck.
Okay. I'll close then.
Reference Issue
#7998
What does this implement/fix? Explain your changes.
Renamed the parameter store_covariances to store_covariance and tried to add a deprecation cycle in the __init__ function. Please suggest corrections, as this is my first attempt at adding a deprecation cycle.
Any other comments?