DOC Clearer and more illustrative description for f_beta_score (#25548) · scikit-learn/scikit-learn@6d356dd
Commit 6d356dd

pnucci and thomasjpfan authored
DOC Clearer and more illustrative description for f_beta_score (#25548)
Co-authored-by: Thomas J. Fan <thomasjpfan@gmail.com>
1 parent 86541f2 commit 6d356dd

1 file changed: +6 additions, -4 deletions

sklearn/metrics/_classification.py

Lines changed: 6 additions & 4 deletions
@@ -1274,10 +1274,12 @@ def fbeta_score(
     The F-beta score is the weighted harmonic mean of precision and recall,
     reaching its optimal value at 1 and its worst value at 0.
 
-    The `beta` parameter determines the weight of recall in the combined
-    score. ``beta < 1`` lends more weight to precision, while ``beta > 1``
-    favors recall (``beta -> 0`` considers only precision, ``beta -> +inf``
-    only recall).
+    The `beta` parameter represents the ratio of recall importance to
+    precision importance. `beta > 1` gives more weight to recall, while
+    `beta < 1` favors precision. For example, `beta = 2` makes recall twice
+    as important as precision, while `beta = 0.5` does the opposite.
+    Asymptotically, `beta -> +inf` considers only recall, and `beta -> 0`
+    only precision.
 
     Read more in the :ref:`User Guide <precision_recall_f_measure_metrics>`.
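As a quick illustration of the behaviour the reworded docstring describes (a sketch, not part of this commit's diff): F-beta equals (1 + beta^2) * precision * recall / (beta^2 * precision + recall), so a larger beta pulls the score toward recall and a smaller beta toward precision. The toy comparison below assumes scikit-learn is installed and uses a prediction whose precision is 1.0 and recall is 0.5.

# Sketch, not part of this commit: how `beta` shifts fbeta_score between
# precision and recall for a toy binary prediction.
from sklearn.metrics import fbeta_score, precision_score, recall_score

y_true = [0, 1, 1, 1, 0, 1]
y_pred = [0, 1, 0, 1, 0, 0]  # 2 true positives, 0 false positives, 2 false negatives

print(precision_score(y_true, y_pred))        # 1.0
print(recall_score(y_true, y_pred))           # 0.5
print(fbeta_score(y_true, y_pred, beta=0.5))  # ~0.833, leans toward precision
print(fbeta_score(y_true, y_pred, beta=1.0))  # ~0.667, plain harmonic mean (F1)
print(fbeta_score(y_true, y_pred, beta=2.0))  # ~0.556, leans toward recall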
