Make _binary_clf_curve a "public" method? #16470
Comments
Maybe I'm missing something, but doesn't `sklearn.metrics._ranking.roc_curve` also return what you need?
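(For reference, a small illustration of what `roc_curve` exposes: it returns false/true positive *rates* and thresholds, not the raw per-threshold counts this issue asks for. The toy data below is invented for the example.)

```python
import numpy as np
from sklearn.metrics import roc_curve

# Tiny invented example: two negatives, two positives.
y_true = np.array([0, 0, 1, 1])
y_score = np.array([0.1, 0.4, 0.35, 0.8])

# roc_curve returns rates (fpr, tpr), so absolute FP/TP/FN/TN counts
# are not directly recoverable from it without the class totals.
fpr, tpr, thresholds = roc_curve(y_true, y_score)
print(fpr)
print(tpr)
print(thresholds)
```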
So you're really looking for the confusion matrix reported over changing thresholds?
Exactly, very well put.
Kornel
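(Editorial sketch of what "the confusion matrix over changing thresholds" looks like in practice, assuming the private helper keeps its current signature: `_binary_clf_curve` returns cumulative false-positive and true-positive counts per decreasing score threshold, from which the other two cells follow.)

```python
import numpy as np
from sklearn.metrics._ranking import _binary_clf_curve

# Tiny invented example: two negatives, two positives.
y_true = np.array([0, 0, 1, 1])
y_score = np.array([0.1, 0.4, 0.35, 0.8])

# fps/tps are cumulative counts as the threshold decreases.
fps, tps, thresholds = _binary_clf_curve(y_true, y_score)

P = tps[-1]   # total positives
N = fps[-1]   # total negatives
fns = P - tps  # false negatives at each threshold
tns = N - fps  # true negatives at each threshold

# At every threshold the four cells sum to the sample size.
print(fps + tps + fns + tns)
```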
I would be in favor of this being a public method. Having access to the raw per-threshold counts would be useful.
Same here. Making it public would add flexibility for creating custom plots and metrics to supplement ROC and Precision-Recall.
@jnothman any thoughts on the above?
Currently, the three public curve functions all use `_binary_clf_curve` internally. I needed it for a homework assignment, so I think this internal API should be made public.
@scikit-learn/core-devs there have been requests to make this public every now and then. WDYT?
I am not against making it public. However, I don't like the current name (`_binary_clf_curve`) if it goes public.
I am okay with making it public with the name |
@adrinjalali I am interested in working on this |
Sure, give it a go @SuccessMoses |
Currently, `sklearn.metrics.ranking._binary_clf_curve` is (the way I understand the underscore) an internal API method. Whenever you need to work with a different tradeoff than precision/recall or ROC, or need custom metrics across all thresholds, this method is a perfect fit, and the underscore in front of it makes me wonder whether I can be confident it will not change in future versions :-)
I need to compute, for instance, (FP+TN)/(TN+FN+FP) at different thresholds, and there are other use cases as well.
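(A hedged sketch of that use case, computing (FP+TN)/(TN+FN+FP) at every threshold via the private helper; the data is invented, and the import path assumes a recent scikit-learn where the module is `sklearn.metrics._ranking`.)

```python
import numpy as np
from sklearn.metrics._ranking import _binary_clf_curve

# Invented example: three positives, two negatives.
y_true = np.array([0, 1, 0, 1, 1])
y_score = np.array([0.2, 0.9, 0.4, 0.6, 0.3])

# Cumulative FP/TP counts at each decreasing threshold.
fps, tps, thresholds = _binary_clf_curve(y_true, y_score)
fns = tps[-1] - tps  # false negatives per threshold
tns = fps[-1] - fps  # true negatives per threshold

# The custom metric from the issue, evaluated at every threshold.
custom = (fps + tns) / (tns + fns + fps)
print(thresholds)
print(custom)
```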
I think making this method part of the public API would be beneficial for the community.
p.s.
TensorFlow used to have e.g.
tensorflow.contrib.metrics.python.ops.metric_ops.precision_recall_at_equal_thresholds
now they have https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/ops/metrics_impl.py#L1792 etc.