fix doctests · scikit-learn/scikit-learn@b8a118a · GitHub
Commit b8a118a

fix doctests
1 parent 57b8364 commit b8a118a

File tree

- doc/modules/ensemble.rst
- doc/tutorial/statistical_inference/supervised_learning.rst

2 files changed: +8 -5 lines

doc/modules/ensemble.rst

Lines changed: 6 additions & 4 deletions
@@ -964,7 +964,8 @@ The following example shows how to fit the majority rule classifier::
     >>> iris = datasets.load_iris()
     >>> X, y = iris.data[:, 1:3], iris.target
 
-    >>> clf1 = LogisticRegression(random_state=1)
+    >>> clf1 = LogisticRegression(solver='lbfgs', multi_class='multinomial',
+    ...                           random_state=1)
     >>> clf2 = RandomForestClassifier(n_estimators=50, random_state=1)
     >>> clf3 = GaussianNB()
 

@@ -973,10 +974,10 @@ The following example shows how to fit the majority rule classifier::
     >>> for clf, label in zip([clf1, clf2, clf3, eclf], ['Logistic Regression', 'Random Forest', 'naive Bayes', 'Ensemble']):
     ...     scores = cross_val_score(clf, X, y, cv=5, scoring='accuracy')
     ...     print("Accuracy: %0.2f (+/- %0.2f) [%s]" % (scores.mean(), scores.std(), label))
-    Accuracy: 0.90 (+/- 0.05) [Logistic Regression]
+    Accuracy: 0.95 (+/- 0.04) [Logistic Regression]
     Accuracy: 0.94 (+/- 0.04) [Random Forest]
     Accuracy: 0.91 (+/- 0.04) [naive Bayes]
-    Accuracy: 0.95 (+/- 0.05) [Ensemble]
+    Accuracy: 0.95 (+/- 0.04) [Ensemble]
 
 
 Weighted Average Probabilities (Soft Voting)
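
Both hunks above patch the same doctest. For readers skimming the diff, here is a self-contained sketch of the majority-rule example it exercises. The imports and the `eclf` construction sit between the two hunks in the surrounding doc section and are assumed here (with voting='hard', per the section's "majority rule" framing), not shown by this commit:

    from sklearn import datasets
    from sklearn.ensemble import RandomForestClassifier, VotingClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score
    from sklearn.naive_bayes import GaussianNB

    iris = datasets.load_iris()
    X, y = iris.data[:, 1:3], iris.target

    clf1 = LogisticRegression(solver='lbfgs', multi_class='multinomial',
                              random_state=1)
    clf2 = RandomForestClassifier(n_estimators=50, random_state=1)
    clf3 = GaussianNB()

    # Assumed from the surrounding doc section (not in this diff):
    # majority rule voting over predicted class labels = voting='hard'.
    eclf = VotingClassifier(estimators=[('lr', clf1), ('rf', clf2),
                                        ('gnb', clf3)], voting='hard')

    for clf, label in zip([clf1, clf2, clf3, eclf],
                          ['Logistic Regression', 'Random Forest',
                           'naive Bayes', 'Ensemble']):
        scores = cross_val_score(clf, X, y, cv=5, scoring='accuracy')
        print("Accuracy: %0.2f (+/- %0.2f) [%s]"
              % (scores.mean(), scores.std(), label))
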
@@ -1049,7 +1050,8 @@ The `VotingClassifier` can also be used together with `GridSearch` in order
 to tune the hyperparameters of the individual estimators::
 
     >>> from sklearn.model_selection import GridSearchCV
-    >>> clf1 = LogisticRegression(random_state=1)
+    >>> clf1 = LogisticRegression(solver='lbfgs', multi_class='multinomial',
+    ...                           random_state=1)
     >>> clf2 = RandomForestClassifier(random_state=1)
     >>> clf3 = GaussianNB()
     >>> eclf = VotingClassifier(estimators=[('lr', clf1), ('rf', clf2), ('gnb', clf3)], voting='soft')
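
The patched section goes on to tune the ensemble with GridSearchCV, which falls outside this hunk. As a hedged sketch of how the grid reaches the sub-estimators: scikit-learn addresses them as '<name>__<param>', where <name> is the label given in `estimators`. The grid values below are illustrative, not taken from this diff:

    from sklearn.datasets import load_iris
    from sklearn.ensemble import RandomForestClassifier, VotingClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import GridSearchCV
    from sklearn.naive_bayes import GaussianNB

    X, y = load_iris(return_X_y=True)

    clf1 = LogisticRegression(solver='lbfgs', multi_class='multinomial',
                              random_state=1)
    clf2 = RandomForestClassifier(random_state=1)
    clf3 = GaussianNB()
    eclf = VotingClassifier(estimators=[('lr', clf1), ('rf', clf2),
                                        ('gnb', clf3)], voting='soft')

    # Sub-estimator parameters are reached as '<name>__<param>';
    # these grid values are illustrative only.
    params = {'lr__C': [1.0, 100.0], 'rf__n_estimators': [20, 200]}
    grid = GridSearchCV(estimator=eclf, param_grid=params, cv=5)
    grid.fit(X, y)
    print(grid.best_params_)
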

doc/tutorial/statistical_inference/supervised_learning.rst

Lines changed: 2 additions & 1 deletion
@@ -368,7 +368,8 @@ function or **logistic** function:
 
 ::
 
-    >>> logistic = linear_model.LogisticRegression(C=1e5)
+    >>> logistic = linear_model.LogisticRegression(solver='lbfgs', C=1e5,
+    ...                                            multi_class='multinomial')
     >>> logistic.fit(iris_X_train, iris_y_train)
     LogisticRegression(C=100000.0, class_weight=None, dual=False,
         fit_intercept=True, intercept_scaling=1, max_iter=100,
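
Here `iris_X_train` and `iris_y_train` come from earlier in that tutorial and are not shown in this hunk. A self-contained sketch of the patched snippet, assuming a simple hold-out split in place of the tutorial's own:

    import numpy as np
    from sklearn import datasets, linear_model

    iris = datasets.load_iris()
    iris_X, iris_y = iris.data, iris.target

    # Illustrative hold-out split; the tutorial defines its own earlier on.
    rng = np.random.RandomState(0)
    indices = rng.permutation(len(iris_X))
    iris_X_train, iris_y_train = iris_X[indices[:-10]], iris_y[indices[:-10]]
    iris_X_test, iris_y_test = iris_X[indices[-10:]], iris_y[indices[-10:]]

    # C=1e5 means very weak regularization; solver and multi_class match
    # the options pinned by this commit.
    logistic = linear_model.LogisticRegression(solver='lbfgs', C=1e5,
                                               multi_class='multinomial')
    logistic.fit(iris_X_train, iris_y_train)
    print(logistic.score(iris_X_test, iris_y_test))
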
