@@ -43,7 +43,7 @@ data in *folds* that we use for training and testing::

 .. currentmodule:: sklearn.cross_validation

-This is called a :class:`KFold` cross validation
+This is called a :class:`KFold` cross validation

 .. _cv_generators_tut:

@@ -64,7 +64,7 @@ of indices for this purpose::
     Train: [0 1 4 5] | test: [2 3]
     Train: [0 1 2 3] | test: [4 5]

-The cross-validation can then be implemented easily::
+The cross-validation can then be implemented easily::

     >>> kfold = cross_validation.KFold(len(X_digits), k=3)
     >>> [svc.fit(X_digits[train], y_digits[train]).score(X_digits[test], y_digits[test])
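For reference, the loop shown in this hunk can be reproduced against the current scikit-learn API (a sketch: the diff itself targets the old `sklearn.cross_validation` module, where `KFold(n, k)` took the dataset size; since 0.18 the equivalent lives in `sklearn.model_selection` and takes `n_splits`):

```python
# Runnable sketch of the KFold scoring loop from the hunk above,
# ported to sklearn.model_selection (post-0.18 API).
from sklearn import datasets, svm
from sklearn.model_selection import KFold

X_digits, y_digits = datasets.load_digits(return_X_y=True)
svc = svm.SVC(C=1, kernel='linear')

kfold = KFold(n_splits=3)  # replaces cross_validation.KFold(len(X_digits), k=3)
scores = [svc.fit(X_digits[train], y_digits[train])
             .score(X_digits[test], y_digits[test])
          for train, test in kfold.split(X_digits)]
print(scores)  # one accuracy value per fold
```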
@@ -85,7 +85,7 @@ of the computer.

 .. list-table::

-   *
+   *

    - :class:`KFold` **(n, k)**

@@ -95,7 +95,7 @@ of the computer.

    - :class:`LeaveOneLabelOut` **(labels)**

-   *
+   *

    - Split it K folds, train on K-1, test on left-out

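The generators tabulated in these hunks all yield train/test index arrays. A small sketch of the label-based splitter, using the modern name `LeaveOneGroupOut` (the old `LeaveOneLabelOut` shown in the table was renamed in scikit-learn 0.18):

```python
# LeaveOneGroupOut: one fold per distinct group (formerly "label").
import numpy as np
from sklearn.model_selection import LeaveOneGroupOut

X = np.arange(12).reshape(6, 2)
groups = [1, 1, 2, 2, 3, 3]        # three groups -> three splits
for train, test in LeaveOneGroupOut().split(X, groups=groups):
    print("Train:", train, "| test:", test)
```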
@@ -116,11 +116,11 @@ of the computer.
    :class: green

    On the digits dataset, plot the cross-validation score of a :class:`SVC`
-   estimator with an RBF kernel as a function of parameter `C` (use a
+   estimator with an RBF kernel as a function of parameter `C` (use a
    logarithmic grid of points, from `1` to `10`).

    .. literalinclude:: ../../auto_examples/exercises/plot_cv_digits.py
-       :lines: 13-23
+       :lines: 13-23

    Solution: :download:`../../auto_examples/exercises/plot_cv_digits.py`

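The exercise in this hunk can be sketched as follows (an assumption-laden outline, not the referenced `plot_cv_digits.py` solution itself: compute the mean cross-validation score for each `C` on a logarithmic grid from 1 to 10, then plot):

```python
# Mean cross-validation score of an RBF SVC as a function of C.
import numpy as np
from sklearn import datasets, svm
from sklearn.model_selection import cross_val_score

X, y = datasets.load_digits(return_X_y=True)
C_s = np.logspace(0, 1, 10)            # 10 log-spaced points from 1 to 10
scores = [cross_val_score(svm.SVC(kernel='rbf', C=C), X, y).mean()
          for C in C_s]
# scores can then be plotted against C_s on a logarithmic x-axis
```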
@@ -141,7 +141,7 @@ estimator during the construction and exposes an estimator API::

     >>> from sklearn.grid_search import GridSearchCV
     >>> gammas = np.logspace(-6, -1, 10)
-    >>> clf = GridSearchCV(estimator=svc, param_grid=dict(gamma=gammas),
+    >>> clf = GridSearchCV(estimator=svc, param_grid=dict(gamma=gammas),
     ...                    n_jobs=-1)
     >>> clf.fit(X_digits[:1000], y_digits[:1000]) # doctest: +ELLIPSIS
     GridSearchCV(cv=None,...
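The grid search in this hunk, redone with the current API (a sketch: `sklearn.grid_search` from the diff was folded into `sklearn.model_selection` in 0.18):

```python
# Grid search over gamma for an RBF SVC, post-0.18 import path.
import numpy as np
from sklearn import datasets, svm
from sklearn.model_selection import GridSearchCV

X_digits, y_digits = datasets.load_digits(return_X_y=True)
svc = svm.SVC()
gammas = np.logspace(-6, -1, 10)
clf = GridSearchCV(estimator=svc, param_grid=dict(gamma=gammas), n_jobs=-1)
clf.fit(X_digits[:1000], y_digits[:1000])
print(clf.best_params_, clf.best_score_)
```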
@@ -165,7 +165,7 @@ a stratified 3-fold.

     >>> cross_validation.cross_val_score(clf, X_digits, y_digits)
     array([ 0.98497496,  0.97829716,  0.97996661])
-
+
 Two cross-validation loops are performed in parallel: one by the
 :class:`GridSearchCV` estimator to set `gamma`, the other one by
 `cross_val_score` to measure the prediction performance of the
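The nested scheme this hunk describes can be sketched with the modern API (subsampled grid and data here are my own choices, to keep the run short): `cross_val_score` drives the outer loop while the `GridSearchCV` estimator tunes `gamma` in an inner loop on each outer training set.

```python
# Nested cross-validation: outer loop scores, inner loop tunes gamma.
import numpy as np
from sklearn import datasets, svm
from sklearn.model_selection import GridSearchCV, cross_val_score

X_digits, y_digits = datasets.load_digits(return_X_y=True)
gammas = np.logspace(-6, -1, 5)
clf = GridSearchCV(svm.SVC(), param_grid=dict(gamma=gammas), cv=3)
scores = cross_val_score(clf, X_digits[:600], y_digits[:600], cv=3)
print(scores)  # one score per outer fold, each from a freshly tuned model
```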
@@ -198,8 +198,8 @@ automatically by cross-validation::
         n_alphas=100, normalize=False, precompute='auto', tol=0.0001,
         verbose=False)
     >>> # The estimator chose automatically its lambda:
-    >>> lasso.alpha
-    0.013180196198701137
+    >>> lasso.alpha  # doctest: +ELLIPSIS
+    0.01318...

 These estimators are called similarly to their counterparts, with 'CV'
 appended to their name.
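The `LassoCV` pattern from this hunk, as a runnable sketch against current scikit-learn (where the selected regularization strength is exposed as the fitted attribute `alpha_`, with a trailing underscore, rather than the `alpha` shown in the diff):

```python
# LassoCV picks its own regularization strength by internal CV.
from sklearn import datasets
from sklearn.linear_model import LassoCV

X_diabetes, y_diabetes = datasets.load_diabetes(return_X_y=True)
lasso = LassoCV(cv=3).fit(X_diabetes, y_diabetes)
print(lasso.alpha_)  # the alpha chosen on the automatic grid
```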