8000 ENH Add custom loss support for HistGradientBoosting by gbolmier · Pull Request #16908 · scikit-learn/scikit-learn · GitHub

Merged

Conversation

gbolmier
Contributor

Reference Issues/PRs

Resolves: #15841

What does this implement/fix? Explain your changes.

Add custom loss support for `HistGradientBoostingClassifier` and
`HistGradientBoostingRegressor` as a private, undocumented API. A
`BaseLoss` object can now be passed as the `loss` parameter.

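For readers unfamiliar with how gradient boosting consumes a loss object: at each iteration the estimator asks the loss for per-sample gradients and hessians of the raw predictions, and fits the next tree to them. The sketch below illustrates that contract with half squared error. It is only an illustration under assumptions: the real private classes live in `sklearn.ensemble._hist_gradient_boosting.loss`, and the class and method names here are hypothetical stand-ins, not the exact private interface.

```python
import numpy as np

# Hypothetical stand-in for a BaseLoss-style object; the actual private
# interface in sklearn.ensemble._hist_gradient_boosting.loss may differ.
class SquaredErrorLoss:
    """Half squared error: loss(y, p) = 0.5 * (p - y) ** 2."""

    def pointwise_loss(self, y_true, raw_predictions):
        # Per-sample loss values.
        return 0.5 * (raw_predictions - y_true) ** 2

    def update_gradients_and_hessians(self, gradients, hessians,
                                      y_true, raw_predictions):
        # First derivative of the loss w.r.t. the raw prediction...
        gradients[:] = raw_predictions - y_true
        # ...and the second derivative, constant for squared error.
        hessians[:] = 1.0

# Toy data: three samples, all with the same raw prediction.
y = np.array([1.0, 2.0, 3.0])
raw = np.array([1.5, 1.5, 1.5])
grad = np.empty_like(y)
hess = np.empty_like(y)

loss = SquaredErrorLoss()
loss.update_gradients_and_hessians(grad, hess, y, raw)
per_sample = loss.pointwise_loss(y, raw)
```

A tree fitted to `grad` (weighted by `hess`) then nudges each raw prediction toward its target, which is the core loop any custom loss plugs into.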
@gbolmier gbolmier changed the title Add custom loss support for HistGradientBoosting [MRG] Add custom loss support for HistGradientBoosting Apr 13, 2020
@NicolasHug (Member) left a comment

Thanks @gbolmier !

@thomasjpfan (Member) left a comment

Looks like the issue discussion is okay with making this private (for now).

LGTM

@thomasjpfan thomasjpfan changed the title [MRG] Add custom loss support for HistGradientBoosting ENH Add custom loss support for HistGradientBoosting Apr 15, 2020
@thomasjpfan thomasjpfan merged commit 9d366a4 into scikit-learn:master Apr 15, 2020
gio8tisu pushed a commit to gio8tisu/scikit-learn that referenced this pull request May 15, 2020
viclafargue pushed a commit to viclafargue/scikit-learn that referenced this pull request Jun 26, 2020
@gbolmier gbolmier deleted the histgradientboosting_custom_loss branch November 15, 2020 00:52
@Sandy4321 left a comment
People really need this feature:
https://stackoverflow.com/questions/54267745/implementing-custom-loss-function-in-scikit-learn

It's tricky, but you can do it...

  1. Open up your classifier. Let's use an RFC for example: https://scikit-learn.org/stable/modules/generated/sklearn.ensemble.RandomForestClassifier.html

  2. click [source]

  3. See how it's inheriting from ForestClassifier? Right there in the class definition. Click that word to jump to its parent definition.

  4. See how this new object is inheriting from ClassifierMixin? Click that.

  5. See how the bottom of that ClassifierMixin class says this?

from .metrics import accuracy_score
return accuracy_score(y, self.predict(X), sample_weight=sample_weight)
That's your model being scored on accuracy. You need to inject at this point if you want to train your model to be a "recall model" or a "precision model" or whatever model. This accuracy metric is baked into sklearn. Some day, a better man than I will make this a parameter which models accept; in the meantime, you have to go into your sklearn installation and tweak this accuracy_score to be whatever you want.

@NicolasHug (Member) left a comment

@Sandy4321, please see #15841 (comment). The instructions copy-pasted from Stack Overflow above are incorrect.

Development

Successfully merging this pull request may close these issues.

HistGradientBoosting: Implement custom loss function like LightGBM permits it
4 participants