DOC Fixed typo, added missing comma in plot_forest_hist_grad_boosting_comparison example by Tialo · Pull Request #26954 · scikit-learn/scikit-learn

Merged
@@ -12,7 +12,7 @@
 trees according to each estimator:
 
 - `n_estimators` controls the number of trees in the forest. It's a fixed number.
-- `max_iter` is the the maximum number of iterations in a gradient boosting
+- `max_iter` is the maximum number of iterations in a gradient boosting
   based model. The number of iterations corresponds to the number of trees for
   regression and binary classification problems. Furthermore, the actual number
   of trees required by the model depends on the stopping criteria.
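
The hunk above contrasts `n_estimators` (forest) with `max_iter` (gradient boosting). As a minimal sketch outside this PR's diff, assuming synthetic data from `make_regression` and otherwise default hyper-parameters, the two parameters can be seen side by side like this:

# Sketch only: shows where `n_estimators` and `max_iter` live on the two
# estimators; the dataset and parameter values are arbitrary placeholders.
from sklearn.datasets import make_regression
from sklearn.ensemble import HistGradientBoostingRegressor, RandomForestRegressor

X, y = make_regression(n_samples=1_000, n_features=20, random_state=0)

# `n_estimators` fixes the number of trees in the forest up front.
rf = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)
print(len(rf.estimators_))  # always exactly n_estimators trees

# `max_iter` only caps the number of boosting iterations; with early stopping
# the fitted model can contain fewer trees than this upper bound.
hgbt = HistGradientBoostingRegressor(max_iter=100, early_stopping=True,
                                     random_state=0).fit(X, y)
print(hgbt.n_iter_)  # actual number of iterations performed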
@@ -210,7 +210,7 @@
 # models uniformly dominate the Random Forest models in the "test score vs
 # training speed trade-off" (the HGBDT curve should be on the top left of the RF
 # curve, without ever crossing). The "test score vs prediction speed" trade-off
-# can also be more disputed but it's most often favorable to HGBDT. It's always
+# can also be more disputed, but it's most often favorable to HGBDT. It's always
 # a good idea to check both kinds of model (with hyper-parameter tuning) and
 # compare their performance on your specific problem to determine which model is
 # the best fit but **HGBT almost always offers a more favorable speed-accuracy
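
The recommendation above, to tune and compare both model families on one's own data, can be sketched with `cross_validate`, which reports both fit times and test scores. This is not the example's actual benchmark; the dataset, CV setup, and untuned defaults are placeholders:

# Sketch only: rough "test score vs training speed" comparison on placeholder data.
from sklearn.datasets import make_regression
from sklearn.ensemble import HistGradientBoostingRegressor, RandomForestRegressor
from sklearn.model_selection import cross_validate

X, y = make_regression(n_samples=5_000, n_features=20, random_state=0)

for name, model in [
    ("Random Forest", RandomForestRegressor(random_state=0)),
    ("HGBT", HistGradientBoostingRegressor(random_state=0)),
]:
    cv = cross_validate(model, X, y, cv=5)
    print(f"{name}: mean R^2 = {cv['test_score'].mean():.3f}, "
          f"mean fit time = {cv['fit_time'].mean():.2f}s")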