ENH Add custom loss support for HistGradientBoosting #16908
Conversation
Add custom loss support for `HistGradientBoostingClassifier` and `HistGradientBoostingRegressor` as a private, undocumented API: a `BaseLoss` object can now be passed as the `loss` parameter. Resolves: scikit-learn#15841
Thanks @gbolmier !
Looks like the issue discussion is okay with making this private (for now).
LGTM
Is it possible to add a custom loss function to classification trees as well? People really need it.

> It's tricky, but you can do it... `from .metrics import accuracy_score` ...

@Sandy4321 please see #15841 (comment). The instructions copy-pasted from SO above are incorrect.