kaggle AUC != sklearn AUC #6711
Comments
Maybe it's due to #3864, i.e. it relates to the handling of small differences between scores.
Or interpolation? But probably not. It would be nice to know the definition of AUC that Kaggle uses without diving into the code...
I think small differences are to be expected, although here the difference seems large. BTW, in our unit tests, we test our implementation against this alternative implementation:
The alternative gives an AUC which is very close to Kaggle's.
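For reference, a tie-aware rank-based (Mann-Whitney) computation gives the same quantity as ROC AUC; this is only an illustrative sketch on synthetic data, not the exact alternative implementation referenced above:

```python
import numpy as np
from scipy.stats import rankdata
from sklearn.metrics import roc_auc_score

def rank_auc(y_true, y_score):
    """ROC AUC via the Mann-Whitney U statistic (average ranks handle tied scores)."""
    y_true = np.asarray(y_true)
    y_score = np.asarray(y_score)
    n_pos = np.sum(y_true == 1)
    n_neg = np.sum(y_true == 0)
    ranks = rankdata(y_score)  # average ranks by default
    # Sum of ranks of the positives, shifted by the smallest possible rank sum.
    u = ranks[y_true == 1].sum() - n_pos * (n_pos + 1) / 2.0
    return u / (n_pos * n_neg)

rng = np.random.RandomState(0)
y_true = rng.randint(0, 2, size=1000)
y_score = rng.rand(1000)

print(rank_auc(y_true, y_score))       # rank-based estimate
print(roc_auc_score(y_true, y_score))  # sklearn's trapezoidal ROC AUC
```

On data without pathological ties the two numbers should agree to many decimal places, which is why a large discrepancy is suspicious.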
Could you plot the ROC curve?
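For anyone following along, the curve can be inspected with `roc_curve` and matplotlib; a minimal sketch with synthetic placeholder data:

```python
import numpy as np
import matplotlib.pyplot as plt
from sklearn.metrics import roc_curve, auc

rng = np.random.RandomState(0)
y_true = rng.randint(0, 2, size=500)
# Noisy scores loosely correlated with the labels, purely for illustration.
y_score = y_true * 0.5 + rng.rand(500)

fpr, tpr, thresholds = roc_curve(y_true, y_score)
plt.plot(fpr, tpr, label="ROC (AUC = %.4f)" % auc(fpr, tpr))
plt.plot([0, 1], [0, 1], "--", label="chance")
plt.xlabel("False positive rate")
plt.ylabel("True positive rate")
plt.legend(loc="lower right")
plt.show()
```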
I will take a look at this.
This issue seems to be fixed now with sklearn version 0.18.
Try to find what commit introduced this change. Also, are there test failures when you comment out these lines?
You might have read the comment I made (which was incorrect) prior to my editing the post, since it shows as replied from mail.
@chenhe95 just to be sure, you retracted and no longer stand by your comment that blamed some difference on drop_intermediate?
Yes. drop_intermediate was not the cause.
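A quick way to check this: `drop_intermediate` only thins out suboptimal thresholds from the returned curve, so the trapezoidal area should not change. A sketch on synthetic data:

```python
import numpy as np
from sklearn.metrics import roc_curve, auc

rng = np.random.RandomState(0)
y_true = rng.randint(0, 2, size=1000)
y_score = rng.rand(1000)

# Same data, with and without dropping of suboptimal intermediate thresholds.
fpr_d, tpr_d, _ = roc_curve(y_true, y_score, drop_intermediate=True)
fpr_f, tpr_f, _ = roc_curve(y_true, y_score, drop_intermediate=False)

print(len(fpr_d), len(fpr_f))  # the thinned curve has fewer points...
print(auc(fpr_d, tpr_d))       # ...but the trapezoidal area
print(auc(fpr_f, tpr_f))       # is the same either way
```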
I've written code to compare Kaggle's and sklearn's ROC AUC, and they appear to be very different.
You can find the code to reproduce my results here:
https://github.com/IraKorshunova/metrics_test
It gives:
The AUC package in R gives the same score as Kaggle.
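The linked repository contains the actual comparison code; as a rough sketch of the kind of check involved (the file names below are hypothetical, not the repository's):

```python
import numpy as np
from sklearn.metrics import roc_auc_score

# Hypothetical files: the real data layout lives in the linked repository.
y_true = np.loadtxt("labels.txt")        # 0/1 ground-truth labels
y_score = np.loadtxt("predictions.txt")  # model scores / probabilities

print("sklearn ROC AUC: %.6f" % roc_auc_score(y_true, y_score))
# Compare this value against the Kaggle leaderboard score for the same predictions.
```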