1 file changed: +7 −5 lines changed
[1]_ for an analysis of these issues.

To avoid this problem, nested CV effectively uses a series of
- train/validation/test set splits. In the inner loop, the score is approximately
- maximized by fitting a model to each training set, and then directly maximized
- in selecting (hyper)parameters over the validation set. In the outer loop,
- generalization error is estimated by averaging test set scores over several
- dataset splits.
+ train/validation/test set splits. In the inner loop (here executed by
+ :class:`GridSearchCV <sklearn.model_selection.GridSearchCV>`), the score is
+ approximately maximized by fitting a model to each training set, and then
+ directly maximized in selecting (hyper)parameters over the validation set. In
+ the outer loop (here in :func:`cross_val_score
+ <sklearn.model_selection.cross_val_score>`), generalization error is estimated
+ by averaging test set scores over several dataset splits.

The example below uses a support vector classifier with a non-linear kernel to
build a model with optimized hyperparameters by grid search. We compare the
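
For reference, the pattern the revised paragraph describes can be sketched as
follows. This is a minimal, hypothetical snippet rather than the example's
exact code: it assumes the iris dataset and an illustrative SVC parameter
grid, with ``GridSearchCV`` running the inner hyperparameter search and
``cross_val_score`` wrapping it as the outer loop to estimate generalization
error::

    from sklearn.datasets import load_iris
    from sklearn.model_selection import GridSearchCV, KFold, cross_val_score
    from sklearn.svm import SVC

    X, y = load_iris(return_X_y=True)

    # Illustrative grid; the actual example may tune different values.
    param_grid = {"C": [1, 10, 100], "gamma": [0.01, 0.1]}

    inner_cv = KFold(n_splits=4, shuffle=True, random_state=0)
    outer_cv = KFold(n_splits=4, shuffle=True, random_state=0)

    # Inner loop: GridSearchCV selects (hyper)parameters on each
    # training/validation split.
    clf = GridSearchCV(SVC(kernel="rbf"), param_grid=param_grid, cv=inner_cv)

    # Outer loop: cross_val_score averages test-set scores over several
    # dataset splits to estimate generalization error.
    nested_scores = cross_val_score(clf, X, y, cv=outer_cv)
    print(nested_scores.mean())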