FIX Corrects negative gradient of AdaBoost loss in GBDT (scikit-learn#22050)

Commit f45b09f

glemaitre authored and lorentzenchr committed
Co-authored-by: Christian Lorentzen <lorentzen.ch@gmail.com>
1 parent 690f15d commit f45b09f

3 files changed: +22 −2 lines

doc/whats_new/v1.1.rst (5 additions, 0 deletions)

@@ -211,6 +211,11 @@ Changelog
   :class:`ensemble.ExtraTreesClassifier`.
   :pr:`20803` by :user:`Brian Sun <bsun94>`.

+- |Fix| Solve a bug in :class:`ensemble.GradientBoostingClassifier` where the
+  exponential loss was computing the positive gradient instead of the
+  negative one.
+  :pr:`22050` by :user:`Guillaume Lemaitre <glemaitre>`.
+
 :mod:`sklearn.feature_extraction.text`
 ......................................

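For reference, the sign fix follows from the standard derivation (notation mine, not part of the commit): with labels encoded as $\tilde{y} = 2y - 1 \in \{-1, 1\}$, the exponential (AdaBoost) loss and its gradient with respect to the raw prediction $f$ are:

```latex
L(y, f) = \exp(-\tilde{y} f), \qquad \tilde{y} = 2y - 1 \in \{-1, 1\}

\frac{\partial L}{\partial f} = -\tilde{y}\, e^{-\tilde{y} f}
\quad\Longrightarrow\quad
-\frac{\partial L}{\partial f} = \tilde{y}\, e^{-\tilde{y} f}
```

Gradient boosting fits each new tree to the *negative* gradient, so the correct target is $\tilde{y}\, e^{-\tilde{y} f}$; the pre-fix code returned its negation, i.e. the positive gradient.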
sklearn/ensemble/_gb_losses.py (2 additions, 2 deletions)

@@ -935,8 +935,8 @@ def negative_gradient(self, y, raw_predictions, **kargs):
             The raw predictions (i.e. values from the tree leaves) of the
             tree ensemble at iteration ``i - 1``.
         """
-        y_ = -(2.0 * y - 1.0)
-        return y_ * np.exp(y_ * raw_predictions.ravel())
+        y_ = 2.0 * y - 1.0
+        return y_ * np.exp(-y_ * raw_predictions.ravel())
 
     def _update_terminal_region(
         self,
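To see why the patched version is the right sign, here is a minimal standalone sketch (not scikit-learn's actual `ExponentialLoss` class, which also handles sample weights and 2-D raw predictions) that checks the corrected negative gradient against a finite-difference approximation of the loss:

```python
import numpy as np

def exponential_loss(y, raw_pred):
    """Per-sample exponential (AdaBoost) loss with labels y in {0, 1}."""
    y_ = 2.0 * y - 1.0  # map {0, 1} -> {-1, 1}
    return np.exp(-y_ * raw_pred)

def negative_gradient(y, raw_pred):
    """Negative gradient of the loss w.r.t. raw_pred, as in the patched code."""
    y_ = 2.0 * y - 1.0
    return y_ * np.exp(-y_ * raw_pred)

# Finite-difference check: -dL/df should match negative_gradient.
y = np.array([0.0, 1.0, 0.0, 1.0])
f = np.array([0.5, -0.3, 1.2, 2.0])
eps = 1e-6
fd = -(exponential_loss(y, f + eps) - exponential_loss(y, f - eps)) / (2 * eps)
assert np.allclose(fd, negative_gradient(y, f), atol=1e-6)
```

With the pre-fix code (`y_ = -(2.0 * y - 1.0)` and `exp(y_ * f)`), the same check fails: every entry comes out with the opposite sign, so each boosting iteration would step *up* the loss surface instead of down it.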

sklearn/ensemble/tests/test_gradient_boosting_loss_functions.py (15 additions, 0 deletions)

@@ -320,3 +320,18 @@ def test_lad_equals_quantiles(seed, alpha):
         y_true, raw_predictions, sample_weight=weights, alpha=alpha
     )
     assert pbl_weighted_loss == approx(ql_weighted_loss)
+
+
+def test_exponential_loss():
+    """Check that we compute the negative gradient of the exponential loss.
+
+    Non-regression test for:
+    https://github.com/scikit-learn/scikit-learn/issues/9666
+    """
+    loss = ExponentialLoss(n_classes=2)
+    y_true = np.array([0])
+    y_pred = np.array([0])
+    # we expect to have loss = exp(0) = 1
+    assert loss(y_true, y_pred) == pytest.approx(1)
+    # we expect to have negative gradient = -1 * (1 * exp(0)) = -1
+    assert_allclose(loss.negative_gradient(y_true, y_pred), -1)
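The affected code path is only exercised when the estimator is configured with the exponential loss. A quick sketch of how an end user would hit it (synthetic data and parameter values are my own, not from the commit):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier

# loss="exponential" makes GradientBoostingClassifier boost on the
# AdaBoost loss, routing gradients through the fixed negative_gradient.
X, y = make_classification(n_samples=200, random_state=0)
clf = GradientBoostingClassifier(loss="exponential", random_state=0)
clf.fit(X, y)
acc = clf.score(X, y)
print(acc)
```

Before the fix, each stage was fit to the positive gradient, so boosting moved against the descent direction on this loss.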
