Commit 6f42105 · scikit-learn/scikit-learn · GitHub

Commit 6f42105

jrbourbeau authored and jnothman committed
DOC roc_auc_score and average_precision_score explicit about binary input (#9557)
1 parent ee2025f commit 6f42105

File tree

1 file changed: +4 −4 lines changed


sklearn/metrics/ranking.py

Lines changed: 4 additions & 4 deletions
@@ -116,7 +116,7 @@ def average_precision_score(y_true, y_score, average="macro",
     Parameters
     ----------
     y_true : array, shape = [n_samples] or [n_samples, n_classes]
-        True binary labels in binary label indicators.
+        True binary labels (either {0, 1} or {-1, 1}).
 
     y_score : array, shape = [n_samples] or [n_samples, n_classes]
         Target scores, can either be probability estimates of the positive
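The change above makes explicit that `average_precision_score` expects binary ground-truth labels in {0, 1} or {-1, 1}. A minimal sketch of that binary usage (the labels and scores here are illustrative, not from the commit):

```python
from sklearn.metrics import average_precision_score

# Binary ground truth in {0, 1}, as the revised docstring requires
y_true = [0, 0, 1, 1]
# Classifier scores, e.g. from decision_function or predict_proba[:, 1]
y_score = [0.1, 0.4, 0.35, 0.8]

ap = average_precision_score(y_true, y_score)
print(ap)  # 0.8333333333333333
```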
@@ -200,7 +200,7 @@ def roc_auc_score(y_true, y_score, average="macro", sample_weight=None):
     Parameters
     ----------
     y_true : array, shape = [n_samples] or [n_samples, n_classes]
-        True binary labels in binary label indicators.
+        True binary labels (either {0, 1} or {-1, 1}).
 
     y_score : array, shape = [n_samples] or [n_samples, n_classes]
         Target scores, can either be probability estimates of the positive
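Likewise, `roc_auc_score` accepts binary labels only in {0, 1} or {-1, 1} form. A short sketch with illustrative data:

```python
from sklearn.metrics import roc_auc_score

y_true = [0, 0, 1, 1]            # binary labels in {0, 1}
y_score = [0.1, 0.4, 0.35, 0.8]  # classifier scores

auc = roc_auc_score(y_true, y_score)
print(auc)  # 0.75
```

Here 3 of the 4 (negative, positive) pairs are ranked correctly by score, giving an AUC of 3/4 = 0.75.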
@@ -438,8 +438,8 @@ def roc_curve(y_true, y_score, pos_label=None, sample_weight=None,
     ----------
 
     y_true : array, shape = [n_samples]
-        True binary labels in range {0, 1} or {-1, 1}. If labels are not
-        binary, pos_label should be explicitly given.
+        True binary labels. If labels are not either {-1, 1} or {0, 1}, then
+        pos_label should be explicitly given.
 
     y_score : array, shape = [n_samples]
         Target scores, can either be probability estimates of the positive
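Unlike the two scorers above, `roc_curve` can handle non-{0, 1}/{-1, 1} labels as long as `pos_label` identifies the positive class, which is what the reworded docstring spells out. A sketch with illustrative {1, 2} labels:

```python
from sklearn.metrics import roc_curve

# Labels are {1, 2}, not {0, 1} or {-1, 1}, so pos_label must be given
y_true = [1, 1, 2, 2]
y_score = [0.1, 0.4, 0.35, 0.8]

fpr, tpr, thresholds = roc_curve(y_true, y_score, pos_label=2)
# The ROC curve always ends at the point (fpr=1, tpr=1)
print(fpr[-1], tpr[-1])  # 1.0 1.0
```

Omitting `pos_label` with labels like these raises an error, since scikit-learn cannot infer which class is positive.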
