moved class_prior in NB to __init__ by mrshu · Pull Request #1499 · scikit-learn/scikit-learn · GitHub

moved class_prior in NB to __init__ #1499


Closed
mrshu wants to merge 9 commits into scikit-learn:master from mrshu:class_prior

Conversation

@mrshu (Contributor) commented Dec 30, 2012

This pull request is trying to fix issue #1486.

Please let me know if this is the way you expected it to be.

@amueller (Member)

Thanks for working on this.
In general it looks good but we don't usually just remove parameters. Instead we deprecate them and remove them ~2 versions later. So could you please add the parameter to fit back in and raise a deprecation warning if it is not none?

You also need to adjust the doctest (look at the failed Travis tests).

I'd like to get feedback from the other devs on whether the parameter should be called class_priors or class_weights.

@mrshu (Contributor, Author) commented Dec 30, 2012

@amueller Thanks for your comments!

The deprecation warnings should be there for sure.

I'm not sure if this is the correct way of doing it. Please comment!

Thanks.

if class_prior is not None:
    warnings.warn('class_prior is deprecated in fit function. Use it'
                  ' in __init__ instead.')
    self.class_prior = class_prior
Member

You should not overwrite parameters to __init__ (which would mean that one call to fit would influence the next call).
So instead of self.class_prior = class_prior do

else:
    class_prior = self.class_prior

and then use the local variable class_prior in the following - unless it is also used in other places than fit. Please check that. If it is also used in predict, then you need to store it in self.class_prior_.

Please also add the version in which it will be removed to the deprecation warning, which should be 0.15 I guess.
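Putting the review suggestions together, the recommended pattern might look like this (a sketch with a simplified class; the real naive Bayes estimators have more attributes and a full fit implementation):

```python
import warnings


class MultinomialNBSketch:
    """Simplified sketch of the suggested deprecation pattern."""

    def __init__(self, class_prior=None):
        self.class_prior = class_prior

    def fit(self, X, y, class_prior=None):
        if class_prior is not None:
            warnings.warn('class_prior in fit is deprecated and will be '
                          'removed in 0.15. Pass it to __init__ instead.',
                          DeprecationWarning)
        else:
            # Fall back to the constructor value without overwriting it,
            # so one call to fit cannot influence the next.
            class_prior = self.class_prior
        # ... use the local variable class_prior from here on ...
        self.class_prior_ = class_prior  # fitted attribute, usable by predict
        return self
```

Storing the result in the trailing-underscore attribute self.class_prior_ keeps the constructor parameter untouched, which is what makes repeated fit calls independent.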

Contributor (Author)

Thanks, will do!

@mrshu (Contributor, Author) commented Dec 30, 2012

Ok, I updated what I could.

I am not sure what you meant by the predict part. Could you be more specific please?

@amueller (Member)

Never mind, I think in principle it should be OK now. There is a failing test, though, so I guess there is some minor bug somewhere.

@mrshu (Contributor, Author) commented Dec 30, 2012

The tests seem to work.

I believe it's ready to be merged.

@amueller (Member)

Why did you need to change the test? It should still work with the deprecated fit parameter, right?

@mrshu (Contributor, Author) commented Dec 31, 2012

You are right, I forgot to use the local variable.

The test should now pass.

@amueller (Member)

OK, thanks :). Looks good to me.

@larsmans @ogrisel wdyt?

@amueller (Member) commented Jan 2, 2013

So @larsmans said it would be a good thing to directly rename the new parameter to class_weight. Could you please do that?

@mrshu (Contributor, Author) commented Jan 3, 2013

Sure thing!

Do you think it's OK this way?

@ogrisel (Member) commented Jan 3, 2013

Looks good. Are the BernoulliNB and MultinomialNB classes now covered by the test_common test that checks the consistency of the class_weight constructor param?
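The kind of consistency check that test_common performs on constructor parameters can be sketched like this (an illustration only, not the actual scikit-learn test code):

```python
def check_constructor_param_roundtrip(Estimator, name, value):
    # Sketch of the convention test_common enforces: a constructor
    # parameter must be stored on the instance unmodified, so that
    # get_params/set_params (and hence clone) round-trip correctly.
    est = Estimator(**{name: value})
    stored = getattr(est, name)
    assert stored is value, (
        "%s.__init__ must store %s without modification"
        % (Estimator.__name__, name))
```

Run against an estimator that copies or transforms the parameter in __init__, a check like this fails, which is why moving class_prior/class_weight into __init__ also brings it under the common tests.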

@mrshu (Contributor, Author) commented Jan 3, 2013

There is one test in test_naive_bayes.py here, but I'm not sure it's sufficient.

That test also throws the warning.

@amueller (Member) commented Jan 3, 2013

Yes, it should be covered. I'll do the merging in a bit.


@amueller (Member) commented Jan 3, 2013

Merged, thanks :)

@amueller (Member) commented Jan 3, 2013

@ogrisel The class_weight here now doesn't have the same form as for SGD and SVM. There it is a dict, here it is a list. I think we should move this to a list, but will not do that in my PR.

@ogrisel (Member) commented Jan 3, 2013

Ok then. Thanks for checking.

@mrshu mrshu deleted the class_prior branch January 3, 2013 12:48
@amueller (Member) commented Jan 3, 2013

Err, it should become a dict is what I meant. I'll try to do that ASAP (before the release).
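For illustration, the two forms differ roughly like this (a hypothetical helper, not part of the scikit-learn API): converting the positional list form used here into the {class_label: weight} dict form that SGD and SVM accept.

```python
def class_weight_list_to_dict(weights, classes):
    """Convert a positional list of per-class weights (the naive Bayes
    form discussed above) into the {class_label: weight} dict form used
    by the SGD and SVM estimators."""
    if len(weights) != len(classes):
        raise ValueError("need exactly one weight per class")
    return dict(zip(classes, weights))
```

The dict form is the more robust of the two, since it ties each weight to an explicit class label instead of relying on class ordering.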

@coveralls

Coverage Status

Changes unknown when pulling 1844820 on mrshu:class_prior into scikit-learn:master.

4 participants