take glouppe comment · johannah/scikit-learn@317281b · GitHub

Commit 317281b

take glouppe comment

1 parent 392ef50 commit 317281b

File tree

2 files changed: +12 −10 lines

  • examples/tree
  • sklearn/tree


examples/tree/plot_structure.py — 11 additions, 9 deletions

@@ -3,12 +3,12 @@
 Understanding the decision tree structure
 =========================================
 
-The decision tree structure could be analysed to gain further insight on the
+The decision tree structure can be analysed to gain further insight on the
 relation between the features and the target to predict. In this example, we
 show how to retrieve:
 - the binary tree structure;
-- the nodes that were reaches by a sample using the decision_paths method;
-- the leaf that was reaches by a sample using the apply method;
+- the nodes that were reached by a sample using the decision_paths method;
+- the leaf that was reached by a sample using the apply method;
 - the rules that were used to predict a sample;
 - the decision path shared by a group of samples.
@@ -29,7 +29,7 @@
 estimator.fit(X_train, y_train)
 
 # The decision estimator has an attribute called tree_ which stores the entire
-# tree structure and allow to access to low level attribute. The binary tree
+# tree structure and allows access to low level attributes. The binary tree
 # tree_ is represented as a number of parallel arrays. The i-th element of each
 # array holds information about the node `i`. Node 0 is the tree's root. NOTE:
 # Some of the arrays only apply to either leaves or split nodes, resp. In this
@@ -42,17 +42,18 @@
 # - threshold, threshold value at the node
 #
 
-# Using those array, we can parse the tree structure:
+# Using those arrays, we can parse the tree structure:
 
 print("The binary tree structure has %s nodes and has "
       "the following tree structure:"
       % estimator.tree_.node_count)
 
-for i in np.arange(estimator.tree_.node_count):
+for i in range(estimator.tree_.node_count):
     if estimator.tree_.children_left[i] == estimator.tree_.children_right[i]:
         print("node=%s leaf node." % i)
     else:
-        print("node=%s test node: go to node %s if X[:, %s] <= %ss else %s."
+        print("node=%s test node: go to node %s if X[:, %s] <= %ss else to "
+              "node %s."
               % (i,
                  estimator.tree_.children_left[i],
                  estimator.tree_.feature[i],
@@ -63,7 +64,7 @@
 
 # First let's retrieve the decision path of each sample. The decision_paths
 # method allows to retrieve the node indicator function. A non zero elements at
-# position (i, j) indicates that the sample i goes # through the node j.
+# position (i, j) indicates that the sample i goes sthrough the node j.
 
 node_indicator = estimator.decision_paths(X_test)
 
@@ -89,8 +90,9 @@
     else:
         threshold_sign = ">"
 
-    print("rule %s : (X[%s, %s] (= %s) %s %s)"
+    print("rule %s from node %s : (X[%s, %s] (= %s) %s %s)"
           % (i,
+             node_id,
             sample_id,
             estimator.tree_.feature[node_id],
             X_test[i, estimator.tree_.feature[node_id]],
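The node-walking loop this hunk edits relies on the `tree_` parallel arrays described in the comments above (`children_left`, `children_right`, `feature`, `threshold`). A runnable sketch of that loop against a fitted tree — the iris dataset and `max_leaf_nodes=3` are illustrative choices here, not values taken from the example file:

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# max_leaf_nodes=3 keeps the printed structure small (illustrative choice).
estimator = DecisionTreeClassifier(max_leaf_nodes=3, random_state=0)
estimator.fit(X_train, y_train)

tree = estimator.tree_
print("The binary tree structure has %s nodes and has "
      "the following tree structure:" % tree.node_count)
for i in range(tree.node_count):
    # In tree_, node i is a leaf iff children_left[i] == children_right[i]
    # (both are -1 for leaves).
    if tree.children_left[i] == tree.children_right[i]:
        print("node=%s leaf node." % i)
    else:
        print("node=%s test node: go to node %s if X[:, %s] <= %s else to "
              "node %s."
              % (i, tree.children_left[i], tree.feature[i],
                 tree.threshold[i], tree.children_right[i]))
```

With 3 leaves, a binary tree has exactly 5 nodes, so the loop prints two test nodes and three leaf nodes.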

sklearn/tree/tree.py — 1 addition, 1 deletion

@@ -472,7 +472,7 @@ def decision_paths(self, X, check_input=True):
         -------
         indicator : sparse csr array, shape = [n_samples, n_nodes]
             Return a node indicator matrix where non zero elements
-            indicates that the samples goes through the samples.
+            indicates that the samples goes through the nodes.
 
         """
         X = self._validate_X_predict(X, check_input)
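The docstring fixed here describes a sparse CSR node indicator matrix: a non-zero entry at (i, j) means sample i passes through node j. In released scikit-learn (0.18+) this method landed under the singular spelling `decision_path` — using that name below is an assumption about the API outside this branch. A minimal sketch of reading one sample's path from the indicator, together with `apply` for the leaf ids:

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
estimator = DecisionTreeClassifier(max_leaf_nodes=3, random_state=0)
estimator.fit(X_train, y_train)

# Sparse CSR indicator, shape (n_samples, n_nodes): entry (i, j) is
# non-zero when sample i traverses node j on its way to a leaf.
# NOTE: spelled decision_path in released scikit-learn, not decision_paths.
node_indicator = estimator.decision_path(X_test)
# apply() returns, for each sample, the id of the leaf it ends up in.
leaf_id = estimator.apply(X_test)

sample_id = 0
# The CSR indices of one row are the node ids on that sample's path; node
# ids grow along any root-to-leaf path, so they come out in visit order.
node_index = node_indicator.indices[
    node_indicator.indptr[sample_id]:node_indicator.indptr[sample_id + 1]]
print("nodes visited by sample %d: %s" % (sample_id, node_index))
```

Each row of the indicator starts at the root (node 0) and ends at the leaf that `apply` reports for that sample.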

0 commit comments