DOC Clearer and more illustrative description for f_beta_score by pnucci · Pull Request #25548 · scikit-learn/scikit-learn


Merged · 8 commits merged into scikit-learn:main on May 22, 2023

Conversation

@pnucci (Contributor) commented on Feb 6, 2023

A clearer and more illustrative description of the `beta` parameter. The original text mentions the "weight of recall" without making clear that it is a ratio. Two intuitive examples are added instead of a discussion of the asymptotic behaviour of beta.
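As a minimal sketch of that intuition (not part of the PR itself; the toy labels below are made up purely for illustration), `fbeta_score` can be compared for a few values of `beta`:

```python
from sklearn.metrics import fbeta_score, precision_score, recall_score

# Hypothetical toy labels: two of the four actual positives are found,
# and one of the three positive predictions is wrong.
y_true = [1, 1, 1, 1, 0, 0]
y_pred = [1, 1, 0, 0, 1, 0]

print(precision_score(y_true, y_pred))        # 0.67  (2 of 3 predicted positives are correct)
print(recall_score(y_true, y_pred))           # 0.50  (2 of 4 actual positives are found)
print(fbeta_score(y_true, y_pred, beta=2.0))  # ~0.53, pulled toward the (lower) recall
print(fbeta_score(y_true, y_pred, beta=0.5))  # ~0.62, pulled toward the (higher) precision
print(fbeta_score(y_true, y_pred, beta=1.0))  # ~0.57, the plain F1 score
```

With `beta > 1` the score is pulled toward recall, with `beta < 1` toward precision, and `beta = 1` recovers F1.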

Reference Issues/PRs

What does this implement/fix? Explain your changes.

A simple improvement to the documentation of f_beta_score, making it more accurate and pragmatic.

Any other comments?

Clearer and more illustrative description about the `beta` parameter
@pnucci closed this on Feb 6, 2023
@pnucci reopened this on Feb 6, 2023
@pnucci changed the title from "[MRG] Better documentation for f_beta_score" to "DOC Clearer and more illustrative description for f_beta_score" on Feb 6, 2023
@thomasjpfan (Member) left a comment


Thank you for the PR, @pnucci!

Co-authored-by: Thomas J. Fan <thomasjpfan@gmail.com>
@pnucci (Contributor, Author) left a comment


@thomasjpfan just one thing: with that suggestion we lost the word "importance", and now we are saying "ratio of recall to precision". I still believe we should say something like
"ratio of recall importance to precision importance"
or
"importance ratio of recall to precision",
since beta is not a ratio of the metrics themselves, only a ratio of their importance/weights.
What are your thoughts?
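For context (not part of the thread itself), the formula makes this distinction concrete: `beta` only rescales the weight placed on recall relative to precision inside a weighted harmonic mean, rather than relating the two metric values directly:

```latex
% F-beta as a weighted harmonic mean of precision P and recall R;
% beta rescales the weight placed on recall, not the metrics themselves.
F_\beta = (1 + \beta^2)\,\frac{P \cdot R}{\beta^2 P + R}
```

As beta grows, F_beta approaches recall; as beta approaches 0, it approaches precision; beta = 1 gives the usual F1.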

@thomasjpfan (Member) commented on Feb 8, 2023

I was trying to think of a way not to say importance twice. In any case, I am okay with your original suggestion:

ratio of recall importance to precision importance

@pnucci (Contributor, Author) commented on Apr 17, 2023

Hi @thomasjpfan, are you OK with the last proposal? Are we good to approve it?
Cheers

@thomasjpfan (Member) left a comment


Thank you for the update! LGTM

@thomasjpfan added the labels "Quick Review (For PRs that are quick to review)" and "Waiting for Second Reviewer (First reviewer is done, need a second one!)" on May 16, 2023
@ArturoAmorQ (Member) left a comment


LGTM as it is. Thanks @pnucci!

@thomasjpfan merged commit 6d356dd into scikit-learn:main on May 22, 2023
REDVM pushed a commit to REDVM/scikit-learn that referenced this pull request Nov 16, 2023
Labels
Documentation · module:metrics · Quick Review (For PRs that are quick to review) · Waiting for Second Reviewer (First reviewer is done, need a second one!)
3 participants