8000 Fix doctest failure. · bennihepp/scikit-learn@a8b5110 · GitHub

Commit a8b5110: Fix doctest failure.
Parent: 8a6fe20

File tree

1 file changed: +10 −10 lines

doc/tutorial/statistical_inference/model_selection.rst

Lines changed: 10 additions & 10 deletions
@@ -43,7 +43,7 @@ data in *folds* that we use for training and testing::
 
 .. currentmodule:: sklearn.cross_validation
 
-This is called a :class:`KFold` cross validation
+This is called a :class:`KFold` cross validation
 
 .. _cv_generators_tut:
 

@@ -64,7 +64,7 @@ of indices for this purpose::
     Train: [0 1 4 5] | test: [2 3]
     Train: [0 1 2 3] | test: [4 5]
 
-The cross-validation can then be implemented easily::
+The cross-validation can then be implemented easily::
 
     >>> kfold = cross_validation.KFold(len(X_digits), k=3)
    >>> [svc.fit(X_digits[train], y_digits[train]).score(X_digits[test], y_digits[test])
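The fold layout printed in the doctest above can be sketched in plain Python. This is a simplified stand-in for the splitting rule, not sklearn's actual `KFold` implementation:

```python
def kfold_indices(n, k):
    """Split range(n) into k consecutive folds and yield (train, test)
    index lists, with each fold serving as the test set exactly once.
    """
    # Distribute n samples over k folds, giving the first n % k folds
    # one extra sample each.
    fold_sizes = [n // k + (1 if i < n % k else 0) for i in range(k)]
    start = 0
    for size in fold_sizes:
        stop = start + size
        test = list(range(start, stop))
        train = [i for i in range(n) if i < start or i >= stop]
        yield train, test
        start = stop

# Reproduces the fold layout shown in the tutorial for 6 samples, 3 folds:
for train, test in kfold_indices(6, 3):
    print("Train:", train, "| test:", test)
```

For `n=6, k=3` the second fold gives `Train: [0, 1, 4, 5] | test: [2, 3]`, matching the tutorial output.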
@@ -85,7 +85,7 @@ of the computer.
 
 .. list-table::
 
-   *
+   *
 
     - :class:`KFold` **(n, k)**

@@ -95,7 +95,7 @@ of the computer.
 
     - :class:`LeaveOneLabelOut` **(labels)**
 
-   *
+   *
 
     - Split it K folds, train on K-1, test on left-out

@@ -116,11 +116,11 @@ of the computer.
    :class: green
 
    On the digits dataset, plot the cross-validation score of a :class:`SVC`
-   estimator with an RBF kernel as a function of parameter `C` (use a
+   estimator with an RBF kernel as a function of parameter `C` (use a
    logarithmic grid of points, from `1` to `10`).
 
    .. literalinclude:: ../../auto_examples/exercises/plot_cv_digits.py
-      :lines: 13-23
+      :lines: 13-23
 
    Solution: :download:`../../auto_examples/exercises/plot_cv_digits.py`
 

@@ -141,7 +141,7 @@ estimator during the construction and exposes an estimator API::
 
     >>> from sklearn.grid_search import GridSearchCV
     >>> gammas = np.logspace(-6, -1, 10)
-    >>> clf = GridSearchCV(estimator=svc, param_grid=dict(gamma=gammas),
+    >>> clf = GridSearchCV(estimator=svc, param_grid=dict(gamma=gammas),
    ...                    n_jobs=-1)
     >>> clf.fit(X_digits[:1000], y_digits[:1000]) # doctest: +ELLIPSIS
     GridSearchCV(cv=None,...
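The core idea the hunk above relies on, exhaustively scoring every candidate parameter and keeping the best, can be sketched without sklearn. This is a minimal stand-in for what `GridSearchCV` does over its `param_grid`, minus refitting and the parallel cross-validation machinery; `score_fn` is a hypothetical callback that stands in for a cross-validated score, not a sklearn API:

```python
import math

def grid_search(candidates, score_fn):
    """Score every candidate value and return the best (value, score)."""
    best_value, best_score = None, float("-inf")
    for value in candidates:
        score = score_fn(value)
        if score > best_score:
            best_value, best_score = value, score
    return best_value, best_score

# A logarithmic grid in the spirit of np.logspace(-6, -1, 10), and a toy
# score that peaks at gamma = 1e-3 (an assumption for illustration only):
gammas = [10.0 ** e for e in range(-6, 0)]
best_gamma, best_score = grid_search(
    gammas, lambda g: -(math.log10(g) + 3.0) ** 2)
```

With the toy score above, the search lands on the grid point closest to `1e-3`.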
@@ -165,7 +165,7 @@ a stratified 3-fold.
 
     >>> cross_validation.cross_val_score(clf, X_digits, y_digits)
     array([ 0.98497496, 0.97829716, 0.97996661])
-
+
 Two cross-validation loops are performed in parallel: one by the
 :class:`GridSearchCV` estimator to set `gamma`, the other one by
 `cross_val_score` to measure the prediction performance of the
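The two nested loops described in that context, an inner one selecting the parameter and an outer one measuring performance, can be sketched as follows. Both score callbacks are hypothetical stand-ins for fitting and scoring on fold subsets, not sklearn APIs:

```python
def nested_cv(outer_folds, candidates, inner_score, test_score):
    """For each outer fold, pick the best candidate parameter using an
    inner cross-validated score computed on that fold's training portion,
    then evaluate the chosen parameter on the held-out fold. The inner
    loop mirrors GridSearchCV; the outer loop mirrors cross_val_score.
    """
    scores = []
    for fold in outer_folds:
        best = max(candidates, key=lambda c: inner_score(c, fold))
        scores.append(test_score(best, fold))
    return scores
```

Keeping the evaluation fold out of the parameter selection is what makes the outer scores an unbiased estimate of the tuned model's performance.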
@@ -198,8 +198,8 @@ automatically by cross-validation::
            n_alphas=100, normalize=False, precompute='auto', tol=0.0001,
            verbose=False)
     >>> # The estimator chose automatically its lambda:
-    >>> lasso.alpha
-    0.013180196198701137
+    >>> lasso.alpha # doctest: +ELLIPSIS
+    0.01318...
 
 These estimators are called similarly to their counterparts, with 'CV'
 appended to their name.
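The selection rule behind such 'CV' estimators, trying each regularization strength and keeping the one with the lowest mean validation error across folds, can be sketched in a few lines. `fold_error(alpha, fold)` is a hypothetical callback assumed to fit on the remaining folds and score on `fold`; the toy error below is illustrative only and unrelated to the tutorial's actual fitted value:

```python
def choose_alpha(alphas, folds, fold_error):
    """Return the alpha with the lowest mean validation error over folds."""
    def mean_error(alpha):
        return sum(fold_error(alpha, f) for f in folds) / len(folds)
    return min(alphas, key=mean_error)

# Toy error surface dipping at alpha = 0.01 (assumption for illustration):
best = choose_alpha([1e-4, 1e-2, 1.0], range(3),
                    lambda a, f: (a - 0.01) ** 2 + 0.001 * f)
```

The chosen value is then exposed on the fitted estimator, as `lasso.alpha` is in the doctest above.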

0 commit comments
