DOC Add example showcasing HGBT regression (#26991) · scikit-learn/scikit-learn@c826fec · GitHub


Commit c826fec

ArturoAmorQ, lorentzenchr, and glemaitre authored
DOC Add example showcasing HGBT regression (#26991)
Co-authored-by: ArturoAmorQ <arturo.amor-quiroz@polytechnique.edu>
Co-authored-by: Christian Lorentzen <lorentzen.ch@gmail.com>
Co-authored-by: Guillaume Lemaitre <g.lemaitre58@gmail.com>
1 parent 77a63e7 commit c826fec

12 files changed: +465 -8 lines changed

doc/modules/ensemble.rst

Lines changed: 9 additions & 1 deletion
@@ -80,7 +80,8 @@ are not yet supported, for instance some loss functions.
 
 .. topic:: Examples:
 
- * :ref:`sphx_glr_auto_examples_inspection_plot_partial_dependence.py`
+ * :ref:`sphx_glr_auto_examples_inspection_plot_partial_dependence.py`
+ * :ref:`sphx_glr_auto_examples_ensemble_plot_forest_hist_grad_boosting_comparison.py`
 
 Usage
 ^^^^^
@@ -129,6 +130,8 @@ Note that for technical reasons, using a callable as a scorer is significantly slower
 than using the loss. By default, early-stopping is performed if there are at least
 10,000 samples in the training set, using the validation loss.
 
+.. _nan_support_hgbt:
+
 Missing values support
 ^^^^^^^^^^^^^^^^^^^^^^
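The early-stopping behaviour described in this hunk's context can be tried directly. A minimal sketch, assuming a synthetic dataset and illustrative settings (none of this code is part of the commit):

    from sklearn.datasets import make_regression
    from sklearn.ensemble import HistGradientBoostingRegressor

    # Synthetic data above the 10,000-sample threshold mentioned in the docs.
    X, y = make_regression(n_samples=20_000, n_features=10, noise=10.0, random_state=0)

    # Early stopping is "auto" by default; it is spelled out here for clarity.
    hgbt = HistGradientBoostingRegressor(
        max_iter=1_000,
        early_stopping=True,
        scoring="loss",           # monitor the loss (faster than a callable scorer)
        validation_fraction=0.1,  # held-out split used for early stopping
        n_iter_no_change=10,
        random_state=0,
    )
    hgbt.fit(X, y)
    print(f"stopped after {hgbt.n_iter_} boosting iterations (max_iter={hgbt.max_iter})")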

@@ -167,6 +170,10 @@ If no missing values were encountered for a given feature during training,
 then samples with missing values are mapped to whichever child has the most
 samples.
 
+.. topic:: Examples:
+
+ * :ref:`sphx_glr_auto_examples_ensemble_plot_hgbt_regression.py`
+
 .. _sw_hgbdt:
 
 Sample weight support
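The missing-value behaviour this hunk documents (and the new `nan_support_hgbt` anchor points to) can be illustrated with a short sketch on made-up data; nothing below comes from the commit itself:

    import numpy as np
    from sklearn.ensemble import HistGradientBoostingRegressor

    rng = np.random.RandomState(0)
    X = rng.uniform(size=(200, 2))
    y = 3 * X[:, 0] + X[:, 1] + rng.normal(scale=0.1, size=200)

    # Punch holes in the first feature: HGBT consumes NaN directly, learning at
    # each split whether missing samples go to the left or the right child.
    X[rng.choice(200, size=40, replace=False), 0] = np.nan

    hgbt = HistGradientBoostingRegressor(random_state=0).fit(X, y)
    print(hgbt.predict([[np.nan, 0.5]]))  # NaN is also accepted at predict time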
@@ -331,6 +338,7 @@ Also, monotonic constraints are not supported for multiclass classification.
 .. topic:: Examples:
 
  * :ref:`sphx_glr_auto_examples_ensemble_plot_monotonic_constraints.py`
+ * :ref:`sphx_glr_auto_examples_ensemble_plot_hgbt_regression.py`
 
 .. _interaction_cst_hgbt:
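The monotonic-constraint support whose example list this hunk extends looks roughly as follows; a hedged sketch with synthetic data, not code from the referenced example:

    import numpy as np
    from sklearn.ensemble import HistGradientBoostingRegressor

    rng = np.random.RandomState(0)
    X = rng.uniform(size=(500, 2))
    # The target increases with feature 0 and decreases with feature 1.
    y = 5 * X[:, 0] - 3 * X[:, 1] + rng.normal(scale=0.5, size=500)

    # monotonic_cst: 1 forces an increasing effect, -1 a decreasing one,
    # and 0 leaves the feature unconstrained.
    hgbt = HistGradientBoostingRegressor(monotonic_cst=[1, -1], random_state=0)
    hgbt.fit(X, y)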

doc/templates/index.html

Lines changed: 2 additions & 2 deletions
@@ -70,8 +70,8 @@ <h4 class="sk-landing-subheader text-white font-italic mb-3">Machine Learning in
 and <a href="supervised_learning.html#supervised-learning">more...</a></p>
 </div>
 <div class="overflow-hidden mx-2 text-center flex-fill">
-<a href="auto_examples/ensemble/plot_adaboost_regression.html" aria-label="Regression">
-<img src="_images/sphx_glr_plot_adaboost_regression_thumb.png" class="sk-index-img" alt="Decision Tree Regression with AdaBoost">
+<a href="auto_examples/ensemble/plot_hgbt_regression.html" aria-label="Regression">
+<img src="_images/sphx_glr_plot_hgbt_regression_002.png" class="sk-index-img" alt="Decision Tree Regression with HGBT">
 </a>
 </div>
 <a href="auto_examples/index.html#examples" class="sk-btn-primary btn text-white btn-block" role="button">Examples</a>

examples/ensemble/plot_adaboost_regression.py

Lines changed: 4 additions & 0 deletions
@@ -9,6 +9,10 @@
 regressor. As the number of boosts is increased the regressor can fit more
 detail.
 
+See :ref:`sphx_glr_auto_examples_ensemble_plot_hgbt_regression.py` for an
+example showcasing the benefits of using more efficient regression models such
+as :class:`~ensemble.HistGradientBoostingRegressor`.
+
 .. [1] `H. Drucker, "Improving Regressors using Boosting Techniques", 1997.
     <https://citeseerx.ist.psu.edu/doc_view/pid/8d49e2dedb817f2c3330e74b63c5fc86d2399ce3>`_
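For context, the AdaBoost example this note is appended to fits a small, noisy 1D regression problem; a hedged sketch of swapping in HistGradientBoostingRegressor on similar (assumed, not identical) data:

    import numpy as np
    from sklearn.ensemble import HistGradientBoostingRegressor

    rng = np.random.RandomState(1)
    X = np.sort(5 * rng.rand(200, 1), axis=0)
    y = np.sin(X).ravel() + rng.normal(scale=0.1, size=200)

    # Drop-in replacement for the boosted decision-tree regressor of the example.
    hgbt = HistGradientBoostingRegressor(max_iter=100, random_state=1).fit(X, y)
    print(f"training R^2: {hgbt.score(X, y):.3f}")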

examples/ensemble/plot_forest_hist_grad_boosting_comparison.py

Lines changed: 3 additions & 1 deletion
@@ -22,7 +22,9 @@
 the predicted value. RFs, on the other hand, are based on bagging and use a
 majority vote to predict the outcome.
 
-For more information on ensemble models, see the :ref:`User Guide <ensemble>`.
+See the :ref:`User Guide <ensemble>` for more information on ensemble models or
+see :ref:`sphx_glr_auto_examples_ensemble_plot_hgbt_regression.py` for an
+example showcasing some other features of HGBT models.
 """
 
 # Author: Arturo Amor <david-arturo.amor-quiroz@inria.fr>
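A quick, hedged sketch of the random-forest-versus-HGBT comparison the amended docstring refers to (dataset, sizes, and settings below are illustrative only, not taken from the example):

    from sklearn.datasets import make_regression
    from sklearn.ensemble import HistGradientBoostingRegressor, RandomForestRegressor
    from sklearn.model_selection import cross_val_score

    X, y = make_regression(n_samples=5_000, n_features=20, noise=20.0, random_state=0)

    # Compare the two ensemble families under the same cross-validation split.
    for name, model in [
        ("Random Forest", RandomForestRegressor(n_estimators=100, random_state=0)),
        ("HGBT", HistGradientBoostingRegressor(max_iter=100, random_state=0)),
    ]:
        scores = cross_val_score(model, X, y, cv=3, scoring="r2")
        print(f"{name}: mean R^2 = {scores.mean():.3f}")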

examples/ensemble/plot_gradient_boosting_categorical.py

Lines changed: 4 additions & 0 deletions
@@ -21,6 +21,10 @@
 We will work with the Ames Iowa Housing dataset which consists of numerical
 and categorical features, where the houses' sales prices is the target.
 
+See :ref:`sphx_glr_auto_examples_ensemble_plot_hgbt_regression.py` for an
+example showcasing some other features of
+:class:`~ensemble.HistGradientBoostingRegressor`.
+
 """
 
 # %%
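As a rough illustration of the native categorical support this example exercises on the Ames data, here is a hedged sketch; the synthetic integer-encoded category below is an assumption made for brevity:

    import numpy as np
    from sklearn.ensemble import HistGradientBoostingRegressor

    rng = np.random.RandomState(0)
    n_samples = 1_000
    # Feature 0 is an integer-encoded category in {0, 1, 2, 3}; feature 1 is numeric.
    X = np.column_stack([rng.randint(0, 4, size=n_samples), rng.uniform(size=n_samples)])
    y = np.array([0.0, 2.0, -1.0, 4.0])[X[:, 0].astype(int)] + X[:, 1]

    # Declaring feature 0 as categorical lets splits treat its values as unordered.
    hgbt = HistGradientBoostingRegressor(categorical_features=[0], random_state=0)
    hgbt.fit(X, y)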

examples/ensemble/plot_gradient_boosting_quantile.py

Lines changed: 3 additions & 1 deletion
@@ -4,7 +4,9 @@
 =====================================================
 
 This example shows how quantile regression can be used to create prediction
-intervals.
+intervals. See :ref:`sphx_glr_auto_examples_ensemble_plot_hgbt_regression.py`
+for an example showcasing some other features of
+:class:`~ensemble.HistGradientBoostingRegressor`.
 
 """
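The prediction-interval idea also carries over to HGBT through its quantile loss; a minimal sketch assuming a simple synthetic signal and a scikit-learn version that supports loss="quantile" for HGBT (not the example's own data or code):

    import numpy as np
    from sklearn.ensemble import HistGradientBoostingRegressor

    rng = np.random.RandomState(42)
    X = rng.uniform(0, 10, size=(1_000, 1))
    y = np.sin(X).ravel() + rng.normal(scale=0.3, size=1_000)

    # One model per quantile yields an approximate 90% prediction interval.
    lower = HistGradientBoostingRegressor(loss="quantile", quantile=0.05, random_state=0).fit(X, y)
    upper = HistGradientBoostingRegressor(loss="quantile", quantile=0.95, random_state=0).fit(X, y)

    X_test = np.linspace(0, 10, num=5).reshape(-1, 1)
    print(np.column_stack([lower.predict(X_test), upper.predict(X_test)]))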

examples/ensemble/plot_gradient_boosting_regression.py

Lines changed: 4 additions & 1 deletion
@@ -11,7 +11,10 @@
 and 500 regression trees of depth 4.
 
 Note: For larger datasets (n_samples >= 10000), please refer to
-:class:`~sklearn.ensemble.HistGradientBoostingRegressor`.
+:class:`~sklearn.ensemble.HistGradientBoostingRegressor`. See
+:ref:`sphx_glr_auto_examples_ensemble_plot_hgbt_regression.py` for an example
+showcasing some other advantages of
+:class:`~ensemble.HistGradientBoostingRegressor`.
 
 """
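Following the note's recommendation, a hedged sketch of fitting HistGradientBoostingRegressor on a larger synthetic dataset; the sizes and parameters are placeholders, not taken from the example:

    from sklearn.datasets import make_regression
    from sklearn.ensemble import HistGradientBoostingRegressor
    from sklearn.metrics import mean_squared_error
    from sklearn.model_selection import train_test_split

    # Above the ~10,000-sample mark where the note recommends HGBT.
    X, y = make_regression(n_samples=50_000, n_features=20, noise=10.0, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    hgbt = HistGradientBoostingRegressor(max_iter=500, max_depth=4, random_state=0)
    hgbt.fit(X_train, y_train)
    print(f"test MSE: {mean_squared_error(y_test, hgbt.predict(X_test)):.2f}")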

0 commit comments
