DOC Inspection Examples links in User Guide by preyasshah9 · Pull Request #30752 · scikit-learn/scikit-learn

Merged · 5 commits · Feb 13, 2025
Binary file added bench_num_threads.parquet
Binary file not shown.
3 changes: 3 additions & 0 deletions examples/inspection/plot_partial_dependence.py
@@ -365,8 +365,11 @@
# However, it is worth noting that we are creating potential meaningless
# synthetic samples if features are correlated.
#
# .. _ice-vs-pdp:
#
# ICE vs. PDP
# ~~~~~~~~~~~
#
# PDP is an average of the marginal effects of the features. We are averaging the
# response of all samples of the provided set. Thus, some effects could be hidden. In
# this regard, it is possible to plot each individual response. This representation is
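Note: a minimal sketch of the ICE-vs-PDP contrast that the new `ice-vs-pdp` anchor points to, assuming a recent scikit-learn with `PartialDependenceDisplay`; the dataset and feature choice are illustrative and are not taken from the example itself.

```python
import matplotlib.pyplot as plt
from sklearn.datasets import make_friedman1
from sklearn.ensemble import HistGradientBoostingRegressor
from sklearn.inspection import PartialDependenceDisplay

X, y = make_friedman1(n_samples=500, random_state=0)
est = HistGradientBoostingRegressor(random_state=0).fit(X, y)

# kind="both" overlays the averaged PDP curve on the per-sample ICE curves
# for feature 0; kind="average" or kind="individual" shows either one alone.
PartialDependenceDisplay.from_estimator(est, X, features=[0], kind="both")
plt.show()
```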
8 changes: 6 additions & 2 deletions sklearn/ensemble/_hist_gradient_boosting/gradient_boosting.py
@@ -1563,6 +1563,8 @@ class HistGradientBoostingRegressor(RegressorMixin, BaseHistGradientBoosting):
and specifies that each branch of a tree will either only split
on features 0 and 1 or only split on features 2, 3 and 4.

See :ref:`this example<ice-vs-pdp>` on how to use `interaction_cst`.

.. versionadded:: 1.2

warm_start : bool, default=False
@@ -1908,8 +1910,8 @@ class HistGradientBoostingClassifier(ClassifierMixin, BaseHistGradientBoosting):
.. versionchanged:: 1.4
Added `"from_dtype"` option.

.. versionchanged::1.6
The default will changed from `None` to `"from_dtype"`.
.. versionchanged:: 1.6
The default value changed from `None` to `"from_dtype"`.
Review comment on lines +1913 to +1914 (Member): make it consistent with the other class
monotonic_cst : array-like of int of shape (n_features) or dict, default=None
Monotonic constraint to enforce on each feature are specified using the
@@ -1950,6 +1952,8 @@ class HistGradientBoostingClassifier(ClassifierMixin, BaseHistGradientBoosting):
and specifies that each branch of a tree will either only split
on features 0 and 1 or only split on features 2, 3 and 4.

See :ref:`this example<ice-vs-pdp>` on how to use `interaction_cst`.

.. versionadded:: 1.2

warm_start : bool, default=False
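Note: a minimal sketch of the `interaction_cst` usage that the new cross-reference is about, assuming scikit-learn >= 1.2; the dataset is illustrative and the constraint mirrors the "features 0 and 1 / features 2, 3 and 4" grouping described in the docstring.

```python
from sklearn.datasets import make_friedman1
from sklearn.ensemble import HistGradientBoostingRegressor

X, y = make_friedman1(n_samples=500, n_features=5, random_state=0)

# Each branch may split either only on features {0, 1} or only on
# features {2, 3, 4}, matching the constraint described in the docstring.
est = HistGradientBoostingRegressor(
    interaction_cst=[{0, 1}, {2, 3, 4}], random_state=0
).fit(X, y)
print(est.score(X, y))
```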
4 changes: 3 additions & 1 deletion sklearn/inspection/_partial_dependence.py
@@ -385,7 +385,9 @@ def partial_dependence(
the average response of an estimator for each possible value of the
feature.

Read more in the :ref:`User Guide <partial_dependence>`.
Read more in
:ref:`sphx_glr_auto_examples_inspection_plot_partial_dependence.py`
and the :ref:`User Guide <partial_dependence>`.

.. warning::

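Note: a minimal sketch of calling `partial_dependence` directly, as a companion to the docstring change above; it assumes scikit-learn >= 1.1, where the grid is returned under the `grid_values` key, and the dataset is illustrative.

```python
from sklearn.datasets import make_friedman1
from sklearn.ensemble import HistGradientBoostingRegressor
from sklearn.inspection import partial_dependence

X, y = make_friedman1(n_samples=500, random_state=0)
est = HistGradientBoostingRegressor(random_state=0).fit(X, y)

# Average (PDP) response of the model for feature 0 over a grid of its range.
result = partial_dependence(est, X, features=[0], kind="average")
print(result["average"].shape)        # (n_outputs, n_grid_points)
print(result["grid_values"][0][:5])   # first grid positions for feature 0
```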
4 changes: 3 additions & 1 deletion sklearn/inspection/_plot/partial_dependence.py
@@ -284,7 +284,9 @@ def from_estimator(
marks on the x-axes for one-way plots, and on both axes for two-way
plots.

Read more in the :ref:`User Guide <partial_dependence>`.
Read more in
:ref:`sphx_glr_auto_examples_inspection_plot_partial_dependence.py`
and the :ref:`User Guide <partial_dependence>`.

.. note::

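Note: a minimal sketch of `PartialDependenceDisplay.from_estimator`, the plotting entry point whose docstring gains the example link above, combining one-way plots with a two-way plot; the dataset is illustrative.

```python
import matplotlib.pyplot as plt
from sklearn.datasets import make_friedman1
from sklearn.ensemble import HistGradientBoostingRegressor
from sklearn.inspection import PartialDependenceDisplay

X, y = make_friedman1(n_samples=500, random_state=0)
est = HistGradientBoostingRegressor(random_state=0).fit(X, y)

# One-way partial dependence for features 0 and 1, plus a two-way plot for
# the pair (0, 1); tick marks along the axes show the feature-value deciles.
PartialDependenceDisplay.from_estimator(est, X, features=[0, 1, (0, 1)])
plt.show()
```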