GridSearchCV should report the average values of arbitrary scoring functions · Issue #3575 · scikit-learn/scikit-learn

Closed
briandastous opened this issue Aug 18, 2014 · 1 comment

Comments

@briandastous

Right now I'm using GridSearchCV to determine the optimal classifier parameters for a binary classification problem. I'm using the Matthews correlation coefficient as the scorer for GridSearchCV. However, I also need to know the average precision and recall of the best classifier (as determined by the Matthews correlation coefficient) on the validation data.

Given the way that GridSearchCV works, after calling it I have to make two calls to cross_validation.cross_val_score to get the precision and recall. Under the hood this involves repeating the same time-consuming cross-validation fits three times over: once in GridSearchCV and once in each cross_val_score call. It's true that by rolling my own cross-validation code I could reduce this duplication to a factor of 2 (rather than 3), but fundamentally I should be able to get this information from GridSearchCV. Perhaps its initializer should take a parameter named something like report_scorers accepting a list of scoring functions, and instances could have a corresponding list attribute report_scores_ into which the average values would be written?
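
For concreteness, a minimal sketch of the workaround described above. The dataset, classifier, and parameter grid are placeholders (not from the original report), and the imports use current module paths; at the time of this issue, GridSearchCV and cross_val_score lived in sklearn.grid_search and sklearn.cross_validation respectively.

```python
from sklearn.datasets import make_classification
from sklearn.svm import SVC
from sklearn.model_selection import GridSearchCV, cross_val_score
from sklearn.metrics import make_scorer, matthews_corrcoef

# Placeholder binary classification data and classifier.
X, y = make_classification(n_samples=200, random_state=0)
clf = SVC()
param_grid = {"C": [0.1, 1, 10]}  # hypothetical grid

# 1) Pick the best parameters by Matthews correlation coefficient.
mcc_scorer = make_scorer(matthews_corrcoef)
search = GridSearchCV(clf, param_grid, scoring=mcc_scorer, cv=5)
search.fit(X, y)

# 2) Repeat essentially the same cross-validated fits twice more,
#    solely to obtain the average precision and recall of the winner.
best = search.best_estimator_
precision = cross_val_score(best, X, y, scoring="precision", cv=5).mean()
recall = cross_val_score(best, X, y, scoring="recall", cv=5).mean()
print(precision, recall)
```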

@jnothman
Member

Duplicate of #1850. See also #2759.
