@@ -1384,6 +1384,14 @@ class GradientBoostingClassifier(BaseGradientBoosting, ClassifierMixin):
 The collection of fitted sub-estimators. ``loss_.K`` is 1 for binary
 classification, otherwise n_classes.

+ Notes
+ -----
+ The features are always randomly permuted at each split. Therefore,
+ the best found split may vary, even with the same training data and
+ ``max_features=n_features``, if the improvement of the criterion is
+ identical for several splits enumerated during the search of the best
+ split. To obtain a deterministic behaviour during fitting,
+ ``random_state`` has to be fixed.

 See also
 --------
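As a quick illustration of the note added in this hunk (not part of the diff itself), the following sketch shows that fixing ``random_state`` makes two fits of ``GradientBoostingClassifier`` reproducible; the dataset and hyperparameters here are arbitrary choices for demonstration.

```python
# Sketch: with a fixed random_state, repeated fits are deterministic even
# though features are randomly permuted at each split during tree building.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier

X, y = make_classification(n_samples=200, random_state=0)

# Two fits with the same seed yield identical models, hence identical predictions.
a = GradientBoostingClassifier(n_estimators=20, random_state=42).fit(X, y)
b = GradientBoostingClassifier(n_estimators=20, random_state=42).fit(X, y)
assert (a.predict(X) == b.predict(X)).all()
```

Without a fixed ``random_state``, ties between equally good splits may be broken differently across runs, which is exactly what the note warns about.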
@@ -1727,7 +1735,7 @@ class GradientBoostingRegressor(BaseGradientBoosting, RegressorMixin):
 warm_start : bool, default: False
 When set to ``True``, reuse the solution of the previous call to fit
 and add more estimators to the ensemble, otherwise, just erase the
- previous solution.
+ previous solution.

 random_state : int, RandomState instance or None, optional (default=None)
 If int, random_state is the seed used by the random number generator;
@@ -1770,6 +1779,15 @@ class GradientBoostingRegressor(BaseGradientBoosting, RegressorMixin):
 estimators_ : ndarray of DecisionTreeRegressor, shape = [n_estimators, 1]
 The collection of fitted sub-estimators.

+ Notes
+ -----
+ The features are always randomly permuted at each split. Therefore,
+ the best found split may vary, even with the same training data and
+ ``max_features=n_features``, if the improvement of the criterion is
+ identical for several splits enumerated during the search of the best
+ split. To obtain a deterministic behaviour during fitting,
+ ``random_state`` has to be fixed.
+
 See also
 --------
 DecisionTreeRegressor, RandomForestRegressor
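The ``warm_start`` behaviour documented in the second hunk can be illustrated with a small sketch (not part of the diff); the dataset and estimator counts are arbitrary choices for demonstration.

```python
# Sketch: with warm_start=True, a second call to fit keeps the already
# fitted trees and only appends new ones up to the new n_estimators.
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor

X, y = make_regression(n_samples=200, random_state=0)

est = GradientBoostingRegressor(n_estimators=50, warm_start=True, random_state=0)
est.fit(X, y)                     # fits 50 trees
est.set_params(n_estimators=100)
est.fit(X, y)                     # appends 50 more instead of refitting from scratch
assert est.estimators_.shape[0] == 100
```

With ``warm_start=False`` (the default), the second ``fit`` would discard the previous solution and train all 100 trees from scratch.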