Request more criterion for random forest regression #5368
Comments
Is that the same as mean squared relative error? Do you have a reference?
New criteria can be supported fairly easily. They need to be added to …
@amueller https://www.kaggle.com/c/rossmann-store-sales/details/evaluation defines the RMSPE criterion.
Not sure that is the most common name for it (also, the formula is not really in terms of percentages).
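For reference, the metric from the evaluation page linked above can be sketched in plain Python. This is a sketch, not part of scikit-learn; excluding zero targets is an assumption that follows the competition's convention of skipping zero-sales days:

```python
from math import sqrt

def rmspe(y_true, y_pred):
    """Root Mean Square Percentage Error:
    sqrt(mean(((y - y_hat) / y)^2)), computed over nonzero targets.

    Skipping y == 0 avoids division by zero (the Rossmann
    competition's convention; an assumption here).
    """
    terms = [((t - p) / t) ** 2 for t, p in zip(y_true, y_pred) if t != 0]
    return sqrt(sum(terms) / len(terms))

# A 10% error on each of two samples gives an RMSPE of 0.1.
print(rmspe([100.0, 200.0], [110.0, 180.0]))  # 0.1
```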
It is easy once you have understood the codebase, but it is true that people may not know where to look when they arrive. Maybe we could make it possible to pass a Criterion object directly, as we do for Splitter? Then one would not have to hack around.
This is true. What do you mean by passing a Criterion object directly?
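To illustrate the idea being discussed (this is a toy sketch, not scikit-learn's actual Criterion API): a split finder can be parameterized by an arbitrary impurity function, so supporting a new criterion means passing a new function rather than editing the tree code. The data and function names below are hypothetical:

```python
from statistics import mean, median

def mse_impurity(ys):
    # Within-node MSE: mean squared deviation from the node mean.
    m = mean(ys)
    return sum((y - m) ** 2 for y in ys) / len(ys)

def mae_impurity(ys):
    # Within-node MAE about the median (the L1-optimal constant).
    m = median(ys)
    return sum(abs(y - m) for y in ys) / len(ys)

def best_split(xs, ys, criterion):
    """Pick the threshold on a single feature that minimizes the
    weighted child impurity under the supplied criterion."""
    best_t, best_score = None, float("inf")
    for t in sorted(set(xs))[:-1]:
        left = [y for x, y in zip(xs, ys) if x <= t]
        right = [y for x, y in zip(xs, ys) if x > t]
        score = (len(left) * criterion(left)
                 + len(right) * criterion(right)) / len(ys)
        if score < best_score:
            best_t, best_score = t, score
    return best_t, best_score

xs = [1.0, 2.0, 3.0, 10.0, 11.0, 12.0]
ys = [1.0, 1.1, 0.9, 5.0, 5.2, 4.8]
# Both criteria split between the two clusters of targets.
print(best_split(xs, ys, mse_impurity)[0])  # 3.0
```

In scikit-learn the real criteria are Cython classes for speed, but the pluggability question is the same: the tree builder only needs something that scores candidate children.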
So when will it be done?
(Replying by email on Wed, Oct 21, 2015 at 6:58 PM to Jacob Schreiber, notifications@github.com)
I think that is kind of the wrong attitude to have. If there's a particular feature you'd like, you should attempt to submit a PR incorporating it, or be very specific about which criterion you'd like, and maybe someone will take it up for you.
If there is a concrete need/idea for a new criterion, I'd be interested in doing the coding work to implement it (for my education about how the decision tree internals work).
There was an attempt to implement MAE at #6039. I think this criterion really is missing at the moment and could be used on many occasions.
As a follow-up, how much effort would it be to implement the set of criteria currently available in GradientBoostingRegressor, namely the LAD and Huber losses?
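For readers unfamiliar with the Huber loss mentioned here, a minimal sketch (the threshold `delta=1.0` is an arbitrary choice for illustration): it is quadratic for small residuals and linear for large ones, which is what makes it less outlier-sensitive than squared error.

```python
def huber_loss(residual, delta=1.0):
    """Huber loss: quadratic for |r| <= delta, linear beyond,
    with the two pieces matched so the loss is smooth at |r| = delta."""
    r = abs(residual)
    if r <= delta:
        return 0.5 * r * r
    return delta * (r - 0.5 * delta)

# Small residual: behaves like squared error (0.5 * 0.5^2 = 0.125).
print(huber_loss(0.5))  # 0.125
# Large residual: grows only linearly (1.0 * (2.0 - 0.5) = 1.5).
print(huber_loss(2.0))  # 1.5
```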
Hi - I've written up a Cython extension for a LAD (L1 norm) criterion that plugs into the tree-based estimators. Would this be of wider interest to include back into sklearn?
Could you make a PR? (An attempt at …
The implementation there is for MAD, namely the L1 norm about the mean of each node. This is a LAD implementation that uses medians.
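The distinction drawn above matters because the median, not the mean, is the constant that minimizes the sum of absolute deviations, which is what a LAD criterion should measure. A small numeric check with hypothetical data:

```python
from statistics import mean, median

def total_abs_dev(ys, c):
    # Sum of absolute deviations of ys from the constant c.
    return sum(abs(y - c) for y in ys)

ys = [1.0, 2.0, 3.0, 100.0]  # one large outlier
# L1 deviation about the median (2.5) is smaller than about the mean (26.5).
print(total_abs_dev(ys, median(ys)))  # 100.0
print(total_abs_dev(ys, mean(ys)))    # 147.0
```

So computing "the L1 norm about the mean" (MAD) overestimates the node impurity a true LAD criterion would report.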
Ah okay! Thanks for the clarification :) I'm pretty much new to all this ;)
The current random forest regressor only supports 'mse'. Could more criteria, such as mean squared percentage error, be supported by scikit-learn?