[MRG+1] Corrected sign error in QuantileLossFunction (#6429) · NelleV/scikit-learn@285364b · GitHub

Commit 285364b

AlexisMignon authored and NelleV committed
[MRG+1] Corrected sign error in QuantileLossFunction (scikit-learn#6429)
* Corrected sign error in QuantileLossFunction
1 parent 3502b1d commit 285364b

File tree

3 files changed: +22 additions, −5 deletions

doc/whats_new.rst

Lines changed: 9 additions & 3 deletions

@@ -98,7 +98,7 @@ Enhancements
 - Added ability to set ``n_jobs`` parameter to :func:`pipeline.make_union`.
   A ``TypeError`` will be raised for any other kwargs. :issue:`8028`
   by :user:`Alexander Booth <alexandercbooth>`.
-
+
 - Added type checking to the ``accept_sparse`` parameter in
   :mod:`sklearn.utils.validation` methods. This parameter now accepts only
   boolean, string, or list/tuple of strings. ``accept_sparse=None`` is deprecated

@@ -140,12 +140,18 @@ Bug fixes
   where the ``perplexity`` method was returning incorrect results because
   the ``transform`` method returns normalized document topic distributions
   as of version 0.18. :issue:`7954` by :user:`Gary Foreman <garyForeman>`.
-
+
 - Fix a bug where :class:`sklearn.ensemble.GradientBoostingClassifier` and
   :class:`sklearn.ensemble.GradientBoostingRegressor` ignored the
   ``min_impurity_split`` parameter.
   :issue:`8006` by :user:`Sebastian Pölsterl <sebp>`.

+- Fix a bug where
+  :class:`sklearn.ensemble.gradient_boosting.QuantileLossFunction` computed
+  negative errors for negative values of ``ytrue - ypred`` leading to
+  wrong values when calling ``__call__``.
+  :issue:`8087` by :user:`Alexis Mignon <AlexisMignon>`
+
 API changes summary
 -------------------

@@ -154,7 +160,7 @@ API changes summary
   ensemble estimators (deriving from :class:`ensemble.BaseEnsemble`)
   now only have ``self.estimators_`` available after ``fit``.
   :issue:`7464` by `Lars Buitinck`_ and `Loic Esteve`_.
-
+
 - Deprecate the ``doc_topic_distr`` argument of the ``perplexity`` method
   in :class:`sklearn.decomposition.LatentDirichletAllocation` because the
   user no longer has access to the unnormalized document topic distribution

sklearn/ensemble/gradient_boosting.py

Lines changed: 2 additions & 2 deletions

@@ -418,10 +418,10 @@ def __call__(self, y, pred, sample_weight=None):

         mask = y > pred
         if sample_weight is None:
-            loss = (alpha * diff[mask].sum() +
+            loss = (alpha * diff[mask].sum() -
                     (1.0 - alpha) * diff[~mask].sum()) / y.shape[0]
         else:
-            loss = ((alpha * np.sum(sample_weight[mask] * diff[mask]) +
+            loss = ((alpha * np.sum(sample_weight[mask] * diff[mask]) -
                     (1.0 - alpha) * np.sum(sample_weight[~mask] * diff[~mask])) /
                     sample_weight.sum())
         return loss
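The one-character fix above is easiest to see on a standalone version of the quantile ("pinball") loss. The sketch below is an illustration, not the scikit-learn source (the name `pinball_loss` is hypothetical): for observations with `y <= pred`, `diff = y - pred` is non-positive, so the `(1 - alpha)` term must be subtracted to contribute a non-negative penalty; with the old `+`, the two branches partially cancelled.

```python
import numpy as np

def pinball_loss(y, pred, alpha):
    """Mean quantile ('pinball') loss at quantile level alpha.

    Illustrative re-implementation of the corrected expression;
    not the scikit-learn source.
    """
    diff = y - pred
    mask = y > pred                      # under-predictions: positive residuals
    # Corrected sign: diff[~mask] <= 0, so subtracting it adds a
    # non-negative penalty for over-predictions.
    return (alpha * diff[mask].sum() -
            (1.0 - alpha) * diff[~mask].sum()) / y.shape[0]

y = np.array([-1.0, 0.0, 1.0])
print(pinball_loss(y, np.zeros_like(y), alpha=0.9))  # mean of [0.1, 0.0, 0.9] ≈ 0.3333
```

With the pre-fix `+` sign, the same inputs would give `(0.9 + 0.1 * (-1)) / 3 ≈ 0.2667`, silently understating the loss whenever any residual is negative.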

sklearn/ensemble/tests/test_gradient_boosting_loss_functions.py

Lines changed: 11 additions & 0 deletions

@@ -15,6 +15,7 @@
 from sklearn.ensemble.gradient_boosting import RegressionLossFunction
 from sklearn.ensemble.gradient_boosting import LOSS_FUNCTIONS
 from sklearn.ensemble.gradient_boosting import _weighted_percentile
+from sklearn.ensemble.gradient_boosting import QuantileLossFunction


 def test_binomial_deviance():

@@ -141,6 +142,16 @@ def test_weighted_percentile_zero_weight():
     assert score == 1.0


+def test_quantile_loss_function():
+    # Non-regression test for the QuantileLossFunction object
+    # There was a sign problem when evaluating the function
+    # for negative values of 'ytrue - ypred'
+    x = np.asarray([-1.0, 0.0, 1.0])
+    y_found = QuantileLossFunction(1, 0.9)(x, np.zeros_like(x))
+    y_expected = np.asarray([0.1, 0.0, 0.9]).mean()
+    np.testing.assert_allclose(y_found, y_expected)
+
+
 def test_sample_weight_deviance():
     # Test if deviance supports sample weights.
     rng = check_random_state(13)
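The `sample_weight` branch changed by the same single character. As a sanity check, here is a hedged sketch (the name `pinball_loss_weighted` is hypothetical, not library code) showing two properties the corrected weighted form should satisfy: with uniform weights it reproduces the unweighted mean, and rescaling all weights by a constant leaves the loss unchanged because the weight sum in the denominator rescales too.

```python
import numpy as np

def pinball_loss_weighted(y, pred, alpha, sample_weight):
    """Weighted quantile loss; illustrative sketch of the corrected form."""
    diff = y - pred
    mask = y > pred
    return ((alpha * np.sum(sample_weight[mask] * diff[mask]) -
             (1.0 - alpha) * np.sum(sample_weight[~mask] * diff[~mask])) /
            sample_weight.sum())

y = np.array([-1.0, 0.0, 1.0])
pred = np.zeros_like(y)
w = np.ones_like(y)
print(pinball_loss_weighted(y, pred, 0.9, w))        # ≈ 0.3333, matches the unweighted mean
print(pinball_loss_weighted(y, pred, 0.9, 2.0 * w))  # unchanged: uniform rescaling cancels
```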

Comments (0)