[MRG + 1] Set min_impurity_split in gradient boosting models #8007
Conversation
Can you please add a test to check that it now works as expected?
Force-pushed from f1c355e to 50e30b2.
@amueller I added a test function to the PR
@@ -961,6 +961,18 @@ def test_max_leaf_nodes_max_depth():
     assert_equal(tree.max_depth, 1)


 def test_min_impurity_split():
     # Test precedence of max_leaf_nodes over max_depth.
perhaps change this comment? Thanks for looking into this!
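For reference, a minimal sketch of what such a test could look like, assuming the scikit-learn API at the time of this PR (around 0.18, when gradient boosting still exposed `min_impurity_split`; the parameter has since been deprecated and removed). The dataset, threshold value, and estimator settings below are illustrative assumptions, not the exact test merged here.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier, GradientBoostingRegressor

def test_min_impurity_split():
    # Sketch: check that min_impurity_split set on the gradient boosting
    # estimator is forwarded to every per-stage DecisionTreeRegressor.
    X, y = make_classification(n_samples=100, random_state=1)
    for GBEstimator in (GradientBoostingRegressor, GradientBoostingClassifier):
        est = GBEstimator(min_impurity_split=0.1, n_estimators=5,
                          random_state=1).fit(X, y)
        # estimators_ is a 2-D numpy array of fitted regression trees.
        for tree in est.estimators_.flat:
            assert tree.min_impurity_split == 0.1
```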
Force-pushed from 50e30b2 to 5385982.
Good catch, I updated the comment.
Can you please add an entry to whatsnew.rst to the bugfix section? Otherwise LGTM.
Force-pushed from 5385982 to c901cc5.
Done, I added an entry to whats_new.
self.min_impurity_split should be passed to DecisionTreeRegressor in BaseGradientBoosting._fit_stage. Fixes scikit-learn#8006
Force-pushed from c901cc5 to b48fb08.
LGTM
Reference Issue
Fixes #8006
What does this implement/fix? Explain your changes.
Pass `self.min_impurity_split` to `DecisionTreeRegressor` in `BaseGradientBoosting._fit_stage`.
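As a rough illustration of the fix (a sketch under assumptions, not the verbatim scikit-learn source): the `DecisionTreeRegressor` constructed for each boosting stage in `_fit_stage` needs to receive the estimator's `min_impurity_split` alongside the other tree parameters. The helper name `build_stage_tree` and the abridged parameter list are hypothetical; forwarding `min_impurity_split` is the point.

```python
from sklearn.tree import DecisionTreeRegressor

def build_stage_tree(gb_estimator, random_state=None):
    """Hypothetical stand-in for the tree construction step of
    BaseGradientBoosting._fit_stage (parameter list abridged)."""
    return DecisionTreeRegressor(
        criterion=gb_estimator.criterion,
        max_depth=gb_estimator.max_depth,
        min_samples_split=gb_estimator.min_samples_split,
        min_samples_leaf=gb_estimator.min_samples_leaf,
        # Previously omitted; forwarding it here is the fix for #8006.
        min_impurity_split=gb_estimator.min_impurity_split,
        max_features=gb_estimator.max_features,
        random_state=random_state,
    )
```

With the fix in place, `min_impurity_split` set on a `GradientBoostingRegressor` or `GradientBoostingClassifier` actually constrains the splits of every underlying tree instead of silently falling back to the tree's own default.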