[MRG + 1] DOC refer to code elements in nested CV example description… · maskani-moh/scikit-learn@2ee491b · GitHub


Commit 2ee491b

jnothman authored and maskani-moh committed
[MRG + 1] DOC refer to code elements in nested CV example description (scikit-learn#7949)
1 parent f716d90 commit 2ee491b

File tree

1 file changed (+7, −5 lines)

examples/model_selection/plot_nested_cross_validation_iris.py

Lines changed: 7 additions & 5 deletions
@@ -17,11 +17,13 @@
 [1]_ for an analysis of these issues.

 To avoid this problem, nested CV effectively uses a series of
-train/validation/test set splits. In the inner loop, the score is approximately
-maximized by fitting a model to each training set, and then directly maximized
-in selecting (hyper)parameters over the validation set. In the outer loop,
-generalization error is estimated by averaging test set scores over several
-dataset splits.
+train/validation/test set splits. In the inner loop (here executed by
+:class:`GridSearchCV <sklearn.model_selection.GridSearchCV>`), the score is
+approximately maximized by fitting a model to each training set, and then
+directly maximized in selecting (hyper)parameters over the validation set. In
+the outer loop (here in :func:`cross_val_score
+<sklearn.model_selection.cross_val_score>`), generalization error is estimated
+by averaging test set scores over several dataset splits.

 The example below uses a support vector classifier with a non-linear kernel to
 build a model with optimized hyperparameters by grid search. We compare the
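For reference, a minimal sketch of the pattern the revised docstring describes: GridSearchCV runs the inner hyperparameter search and cross_val_score wraps it for the outer estimate of generalization error. The parameter grid and CV settings below are illustrative assumptions, not taken from this commit; the full version lives in plot_nested_cross_validation_iris.py.

# Sketch of nested CV: inner loop = GridSearchCV, outer loop = cross_val_score.
# The grid values and fold counts here are illustrative only.
from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV, KFold, cross_val_score
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)

# Inner loop: select (hyper)parameters by maximizing the score on validation splits.
p_grid = {"C": [1, 10, 100], "gamma": [0.01, 0.1]}
inner_cv = KFold(n_splits=4, shuffle=True, random_state=0)
clf = GridSearchCV(estimator=SVC(kernel="rbf"), param_grid=p_grid, cv=inner_cv)

# Outer loop: estimate generalization error of the whole search procedure
# by averaging test-set scores over several dataset splits.
outer_cv = KFold(n_splits=4, shuffle=True, random_state=1)
nested_scores = cross_val_score(clf, X, y, cv=outer_cv)
print("Nested CV accuracy: %0.3f" % nested_scores.mean())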

0 commit comments