[Feature Request] Custom loss functions for GradientBoostingRegressor/Classifier and HistGradientBoostingRegressor/Classifier #17659
Comments
Using custom loss functions was discussed in #15841. Support for this feature is summarized here: #15841 (comment)
Oh awesome! I'm sorry I missed that, that's very cool!
Is it possible to add a custom loss function to a classification tree as well?
People really need it.

It's tricky, but you can do it...
from sklearn.metrics import accuracy_score
@Sandy4321 please see #15841 (comment). The instructions copy-pasted from SO above are incorrect.
Describe the workflow you want to enable
In Xgboost and LightGBM, you can specify a custom objective function (as well as gradients for that objective): https://xgboost.readthedocs.io/en/latest/tutorials/custom_metric_obj.html
It would be pretty cool if scikit-learn supported that too.
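For reference, here is a minimal sketch of what this looks like with XGBoost's native API, adapted from the squared-log-error example in the tutorial linked above (`X_train` and `y_train` are placeholder data, not from this thread):

```python
import numpy as np
import xgboost as xgb

def squared_log(preds, dtrain):
    """Gradient and hessian of squared log error w.r.t. the raw predictions."""
    labels = dtrain.get_label()
    preds = np.maximum(preds, -1 + 1e-6)  # keep log1p well-defined
    grad = (np.log1p(preds) - np.log1p(labels)) / (preds + 1)
    hess = (-np.log1p(preds) + np.log1p(labels) + 1) / (preds + 1) ** 2
    return grad, hess

# X_train / y_train are placeholders for your own data
dtrain = xgb.DMatrix(X_train, label=y_train)
booster = xgb.train({"tree_method": "hist"}, dtrain,
                    num_boost_round=10, obj=squared_log)
```

The key point is that the booster only needs the first and second derivatives of the loss with respect to the raw scores; the loss itself never has to be one of the built-ins.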
Describe your proposed solution
Use Xgboost / LightGBM instead of sklearn, but who wants to do that 😁
Describe alternatives you've considered, if relevant
Xgboost / LightGBM
Additional context
If you have a problem-specific objective function, it can be really useful to boost directly on that function.
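LightGBM's scikit-learn wrapper likewise accepts a callable objective, which also covers the classification case asked about in the comments above. A minimal sketch, assuming binary 0/1 labels and hand-derived log-loss gradients (`X_train`/`y_train` are again placeholders):

```python
import numpy as np
from lightgbm import LGBMClassifier

def logistic_obj(y_true, raw_score):
    # gradient and hessian of binary log loss w.r.t. the raw scores
    p = 1.0 / (1.0 + np.exp(-raw_score))
    grad = p - y_true
    hess = p * (1.0 - p)
    return grad, hess

clf = LGBMClassifier(objective=logistic_obj)
clf.fit(X_train, y_train)  # X_train / y_train are placeholders
# caveat: with a custom objective, predictions come back as raw
# scores, so probabilities need a manual sigmoid transform
```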