[MRG] Changed examples so they produce the same values on OS X by georgipeev · Pull Request #11289 · scikit-learn/scikit-learn

Merged
Changes from all commits
26 commits
773ae4c
DOC add missing requirements for building docs (#11292)
twosigmajab Jun 15, 2018
47891de
DOC: replace TODO with link to the glossary (#11279)
tsdlovell Jun 15, 2018
7613559
FIX #11215 : Changing return in docstring to yields for generator fun…
Andrew-peng Jun 15, 2018
9395424
DOC replace OpenHub/ohloh badge with star button (#11288)
twosigmajab Jun 16, 2018
e751d65
MAINT skip dataset downloading doctest (#11284)
aozgaa Jun 17, 2018
13b33ed
DOC: add references for CD in LASSO and duality gap criterion (#11302)
agramfort Jun 17, 2018
cb5ec0a
Add sparse efficiency warning to randomized_svd for dok_matrix / lil_…
scottgigante Jun 17, 2018
0badbea
FIX Uses self.scoring for score function (#11192)
thomasjpfan Jun 17, 2018
c80665f
BLD fix sphx gallery errors (#11307)
agramfort Jun 18, 2018
5a063ed
proposal to use .joblib file extension (#11230)
yufengg Jun 18, 2018
2aee027
DOC: use .joblib file extension rather than .pkl
lesteve Jun 18, 2018
877ab46
DOC Add libraries.io and changelog links (#11298)
twosigmajab Jun 18, 2018
c9e48bf
DOC reorganize datasets documentation page (#11180)
jeremiedbb Jun 19, 2018
b67149e
ENH Add refit_time_ attribute to model selection (#11310)
mfeurer Jun 20, 2018
9566738
MAINT Fix #9350: Enable has_fit_parameter() and fit_score_takes_y() t…
markroth8 Jun 20, 2018
2ce21c2
Fix skipping in conftest.py (#11318)
jnothman Jun 20, 2018
786c94d
MAINT clarifications in ColumnTransformer._update_transformers (#11323)
jnothman Jun 20, 2018
caa426f
COSMIT fix syntax quirk
jnothman Jun 20, 2018
580026e
changed examples so they produce the same values on MacOS
Jun 15, 2018
ca6adf7
switched two constants to scientific notation
Jun 15, 2018
f473b29
use ellipsis for example values that depend on RNG library
Jun 18, 2018
58161fc
added more ellipses
Jun 18, 2018
6492d8a
removed extra dot
Jun 19, 2018
e4d2ce0
Merge remote-tracking branch 'upstream/master'
Jun 20, 2018
ff42981
Merge branch 'master' into fix-doctest-for-LinearSVC-and-LinearSVR
Jun 20, 2018
723cd09
nitpicks
glemaitre Jul 14, 2018
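
Taken together, the commits above apply the standard doctest remedy for platform-dependent floating-point output: keep only the leading digits that are stable everywhere and let the `ELLIPSIS` directive match the rest, while a tighter solver tolerance keeps those leading digits themselves from drifting. Below is a minimal, self-contained sketch of the ellipsis mechanism only; the `ellipsis_demo` function and its hard-coded value are illustrative and not part of this PR.

```python
import doctest


def ellipsis_demo():
    """Expected output keeps only digits that are stable across platforms.

    >>> value = 0.2841806586  # stands in for a fitted intercept_
    >>> print([value])  # doctest: +ELLIPSIS
    [0.284...]
    """


# With ELLIPSIS enabled, "..." in the expected output matches the
# platform-dependent trailing digits, so the example passes unchanged
# on Linux and OS X alike.
doctest.run_docstring_examples(
    ellipsis_demo, {}, name="ellipsis_demo", optionflags=doctest.ELLIPSIS
)
```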
6 changes: 3 additions & 3 deletions doc/modules/model_evaluation.rst
@@ -98,9 +98,9 @@ Usage examples:
  >>> from sklearn.model_selection import cross_val_score
  >>> iris = datasets.load_iris()
  >>> X, y = iris.data, iris.target
- >>> clf = svm.SVC(gamma='scale', probability=True, random_state=0)
- >>> cross_val_score(clf, X, y, scoring='neg_log_loss') # doctest: +ELLIPSIS
- array([-0.10..., -0.16..., -0.07...])
+ >>> clf = svm.SVC(gamma='scale', random_state=0)
+ >>> cross_val_score(clf, X, y, scoring='recall_macro') # doctest: +ELLIPSIS
+ array([0.980..., 0.960..., 0.979...])
  >>> model = svm.SVC()
  >>> cross_val_score(model, X, y, scoring='wrong_choice')
  Traceback (most recent call last):
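
The change above drops `probability=True` and switches the scoring from `neg_log_loss` to `recall_macro`, whose cross-validated values are stable enough on OS X to pin in a doctest. A plain-script version of the updated example, for reproducing it outside the doctest runner (the number of folds and the trailing digits depend on the scikit-learn version and platform, which is why the docstring keeps only three digits before each ellipsis):

```python
from sklearn import datasets, svm
from sklearn.model_selection import cross_val_score

# Same setup as the updated doctest in doc/modules/model_evaluation.rst.
iris = datasets.load_iris()
X, y = iris.data, iris.target

clf = svm.SVC(gamma='scale', random_state=0)
scores = cross_val_score(clf, X, y, scoring='recall_macro')

# Roughly array([0.980..., 0.960..., 0.979...]) per fold with the
# 3-fold CV default this PR's doctest was written against.
print(scores)
```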
19 changes: 9 additions & 10 deletions sklearn/svm/classes.py
@@ -116,16 +116,15 @@ class LinearSVC(BaseEstimator, LinearClassifierMixin,
  >>> from sklearn.svm import LinearSVC
  >>> from sklearn.datasets import make_classification
  >>> X, y = make_classification(n_features=4, random_state=0)
- >>> clf = LinearSVC(random_state=0)
+ >>> clf = LinearSVC(random_state=0, tol=1e-5)
  >>> clf.fit(X, y)
  LinearSVC(C=1.0, class_weight=None, dual=True, fit_intercept=True,
  intercept_scaling=1, loss='squared_hinge', max_iter=1000,
- multi_class='ovr', penalty='l2', random_state=0, tol=0.0001,
- verbose=0)
+ multi_class='ovr', penalty='l2', random_state=0, tol=1e-05, verbose=0)
  >>> print(clf.coef_)
- [[0.08551385 0.39414796 0.49847831 0.37513797]]
+ [[0.085... 0.394... 0.498... 0.375...]]
  >>> print(clf.intercept_)
- [0.28418066]
+ [0.284...]
  >>> print(clf.predict([[0, 0, 0, 0]]))
  [1]

@@ -327,17 +326,17 @@ class LinearSVR(LinearModel, RegressorMixin):
  >>> from sklearn.svm import LinearSVR
  >>> from sklearn.datasets import make_regression
  >>> X, y = make_regression(n_features=4, random_state=0)
- >>> regr = LinearSVR(random_state=0)
+ >>> regr = LinearSVR(random_state=0, tol=1e-5)
  >>> regr.fit(X, y)
  LinearSVR(C=1.0, dual=True, epsilon=0.0, fit_intercept=True,
  intercept_scaling=1.0, loss='epsilon_insensitive', max_iter=1000,
- random_state=0, tol=0.0001, verbose=0)
+ random_state=0, tol=1e-05, verbose=0)
  >>> print(regr.coef_)
- [16.35750999 26.91499923 42.30652207 60.47843124]
+ [16.35... 26.91... 42.30... 60.47...]
  >>> print(regr.intercept_)
- [-4.29756543]
+ [-4.29...]
  >>> print(regr.predict([[0, 0, 0, 0]]))
- [-4.29756543]
+ [-4.29...]

  See also
  --------
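
For `LinearSVC` and `LinearSVR` the fix is two-fold: the printed `coef_` and `intercept_` values are truncated with ellipses, and the examples pass `tol=1e-5` so the liblinear solver converges tightly enough for the remaining leading digits to agree across platforms. A rough way to check that locally is sketched below; it is illustrative only, not part of the PR, and the exact trailing digits will depend on the scikit-learn build.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import LinearSVC

# Same data and seed as the LinearSVC docstring example.
X, y = make_classification(n_features=4, random_state=0)

# tol=1e-5 is what this PR adds, so the solver converges more tightly
# before the fitted parameters are printed.
clf = LinearSVC(random_state=0, tol=1e-5).fit(X, y)

# The doctest only pins the first few digits ([[0.085... 0.394... ...]]
# and [0.284...]); everything beyond that is allowed to differ.
print(np.round(clf.coef_, 3))
print(np.round(clf.intercept_, 3))
```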