DOC highlights: stacking estimators #15414
Conversation
@thebooort sorry I saw your message too late :/
from sklearn.ensemble import StackingClassifier
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
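For context, a minimal sketch of how this snippet could continue into a runnable example; the base estimators (RandomForestClassifier, LogisticRegression) and the split parameters are illustrative assumptions, not necessarily the ones used in the PR:

from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier, StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# Base estimators whose predictions feed the final (meta) estimator.
estimators = [
    ("rf", RandomForestClassifier(n_estimators=10, random_state=42)),
    ("lr", LogisticRegression(max_iter=1000)),
]
clf = StackingClassifier(estimators=estimators,
                         final_estimator=LogisticRegression())
clf.fit(X_train, y_train)
print(clf.score(X_test, y_test))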
For the docstring it was enough, but do you think that in the highlights we should use another dataset than iris?
I think the highlights should be as simple as the docstrings; the user guide is where the examples can be more meaningful. We can link to the user guide for more info, though.
OK I see LGTM then
# allows to use the strength of each individual estimator by using their output
# as input of a final estimator.
# Note that ``estimators_`` are fitted on the full ``X`` while
# ``final_estimator``_ is trained using cross-validated predictions of the
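A hedged sketch of the fitting behaviour described in this comment, continuing the illustrative clf and train split from the sketch above:

# After fitting, the base estimators are refit on the full training data and
# exposed as ``estimators_``; ``final_estimator_`` is the meta-model trained
# on their cross-validated predictions.
clf.fit(X_train, y_train)
print(clf.estimators_)       # list of fitted base estimators
print(clf.final_estimator_)  # fitted final (meta) estimator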
Suggested change:
- # ``final_estimator``_ is trained using cross-validated predictions of the
+ # ``final_estimator_`` is trained using cross-validated predictions of the
############################################################################
# Stacking Classifier (and Regressor)
# -----------------------------------
# These new estimators are a stack of estimators with a final classifier or
Remove the first line and link to the estimators?
StackingClassifier and StackingRegressor implement stacked generalization, which consists of stacking the output...
The `.. currentmodule:: sklearn` at the top allows us to omit the `sklearn.` prefix in references in this file.
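To illustrate, a sketch (not the actual file contents, and with a placeholder title) of how the directive shortens references in a sphinx-gallery example header:

"""
==================
Release Highlights
==================

.. currentmodule:: sklearn

Thanks to the directive above, :class:`ensemble.StackingClassifier` resolves
without repeating the ``sklearn.`` prefix in every cross-reference.
"""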
@@ -12,7 +12,7 @@

 To install the latest version (with pip)::

-    pip install -U scikit-learn --upgrade
+    pip install -U scikit-learn
why?
I'm confused by this. As I understand it, `-U` is the same as `--upgrade`, isn't it?
lol I thought it was for `--user` (which I've never ever used), but you're right, let's keep `--upgrade` and remove `-U` then.
Towards #15152
This is the example we already have in its docstring.