@@ -432,8 +432,11 @@ def predict(self, X):
 
     @property
     def feature_importances_(self):
-        """Return the feature importances (the higher, the more important the
-        feature).
+        """Return the feature importances.
+
+        The importance of a feature is computed as the
+        (normalized) total reduction of the criterion brought by that
+        feature. It is also known as the Gini importance [4]_.
 
         Returns
         -------
@@ -506,10 +509,9 @@ class DecisionTreeClassifier(BaseDecisionTree, ClassifierMixin):
         output (for multi-output problems).
 
     `feature_importances_` : array of shape = [n_features]
-        The feature importances
-        (the higher, the more important the feature).
+        The feature importances. The higher, the more important the feature.
 
         The importance of a feature is computed as the
-        (normalized) total reduction of error brought by that
+        (normalized) total reduction of the criterion brought by that
         feature. It is also known as the Gini importance [4]_.
 
     See also
@@ -692,10 +694,9 @@ class DecisionTreeRegressor(BaseDecisionTree, RegressorMixin):
         The underlying Tree object.
 
     `feature_importances_` : array of shape = [n_features]
-        The feature importances
-        (the higher, the more important the feature).
+        The feature importances. The higher, the more important the feature.
 
         The importance of a feature is computed as the
-        (normalized) total reduction of error brought by that
+        (normalized) total reduction of the criterion brought by that
        feature. It is also known as the Gini importance [4]_.
 
     See also
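The new wording says the importance of a feature is the (normalized) total reduction of the criterion brought by that feature. A minimal sketch of what that means in practice, not part of the patch itself: fit a tree, check that the importances are normalized, and recompute them by hand from the fitted `tree_` object (the iris dataset and the variable names `clf`, `manual` are illustrative assumptions):

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
clf = DecisionTreeClassifier(random_state=0).fit(X, y)

# The importances are normalized, so they sum to 1.
assert np.isclose(clf.feature_importances_.sum(), 1.0)

# Recompute by hand: for every internal node, credit its split feature
# with the weighted decrease in impurity (the criterion) at that split.
t = clf.tree_
manual = np.zeros(X.shape[1])
for node in range(t.node_count):
    left, right = t.children_left[node], t.children_right[node]
    if left == -1:  # leaf node, no split
        continue
    w = t.weighted_n_node_samples
    decrease = (w[node] * t.impurity[node]
                - w[left] * t.impurity[left]
                - w[right] * t.impurity[right])
    manual[t.feature[node]] += decrease
manual /= manual.sum()  # normalize

assert np.allclose(manual, clf.feature_importances_)
```

With the default `criterion="gini"` this total criterion reduction is exactly the Gini importance the docstring refers to; with `criterion="entropy"` the same computation applies to the entropy decrease instead.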