DOC: Correct links and typos in 6.6 Random Projections by star1327p · Pull Request #30848 · scikit-learn/scikit-learn · GitHub
DOC: Correct links and typos in 6.6 Random Projections #30848


Merged · 1 commit · Feb 18, 2025
10 changes: 5 additions & 5 deletions doc/modules/random_projection.rst
@@ -28,7 +28,7 @@ technique for distance based method.
Kaufmann Publishers Inc., San Francisco, CA, USA, 143-151.

* Ella Bingham and Heikki Mannila. 2001.
-  `Random projection in dimensionality reduction: applications to image and text data. <https://citeseerx.ist.psu.edu/doc_view/pid/aed77346f737b0ed5890b61ad02e5eb4ab2f3dc6>`_
+  `Random projection in dimensionality reduction: applications to image and text data. <https://cs-people.bu.edu/evimaria/cs565/kdd-rp.pdf>`_
In Proceedings of the seventh ACM SIGKDD international conference on
Knowledge discovery and data mining (KDD '01). ACM, New York, NY, USA,
245-250.
@@ -84,7 +84,7 @@ bounded distortion introduced by the random projection::

* Sanjoy Dasgupta and Anupam Gupta, 1999.
`An elementary proof of the Johnson-Lindenstrauss Lemma.
-  <https://citeseerx.ist.psu.edu/doc_view/pid/95cd464d27c25c9c8690b378b894d337cdf021f9>`_
+  <https://cseweb.ucsd.edu/~dasgupta/papers/jl.pdf>`_
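As background for the Johnson-Lindenstrauss bound discussed in this hunk (not part of the PR's diff): scikit-learn exposes the bound via `johnson_lindenstrauss_min_dim`, which returns the minimum number of components that keeps the pairwise-distance distortion within `eps`. A minimal sketch:

```python
from sklearn.random_projection import johnson_lindenstrauss_min_dim

# Minimum safe number of components for 1e6 samples at a maximum
# distortion of eps=0.5, per the Johnson-Lindenstrauss lemma.
n = johnson_lindenstrauss_min_dim(n_samples=1e6, eps=0.5)
print(n)  # 663
```

Tightening `eps` raises the required dimensionality sharply, since the bound scales roughly as 1/eps².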

.. _gaussian_random_matrix:

@@ -95,7 +95,7 @@ dimensionality by projecting the original input space on a randomly generated
matrix where components are drawn from the following distribution
:math:`N(0, \frac{1}{n_{components}})`.

-Here a small excerpt which illustrates how to use the Gaussian random
+Here is a small excerpt which illustrates how to use the Gaussian random
projection transformer::

>>> import numpy as np
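The doctest excerpt above is truncated in this diff view. A self-contained sketch of the same idea, with an explicit `n_components` (an illustrative choice; the docs' excerpt may use the default `'auto'`) so the output shape is deterministic:

```python
import numpy as np
from sklearn.random_projection import GaussianRandomProjection

rng = np.random.RandomState(42)
X = rng.rand(100, 10000)

# Project 10000-dimensional data onto a 500-dimensional random subspace;
# the projection matrix entries are drawn from N(0, 1/n_components).
transformer = GaussianRandomProjection(n_components=500, random_state=42)
X_new = transformer.fit_transform(X)
print(X_new.shape)  # (100, 500)
```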
@@ -136,7 +136,7 @@ where :math:`n_{\text{components}}` is the size of the projected subspace.
By default the density of non zero elements is set to the minimum density as
recommended by Ping Li et al.: :math:`1 / \sqrt{n_{\text{features}}}`.

-Here a small excerpt which illustrates how to use the sparse random
+Here is a small excerpt which illustrates how to use the sparse random
projection transformer::

>>> import numpy as np
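This excerpt is likewise truncated here. A sketch under the same assumptions as above (illustrative `n_components=500`), showing the default density recommended by Ping Li et al.:

```python
import numpy as np
from sklearn.random_projection import SparseRandomProjection

rng = np.random.RandomState(42)
X = rng.rand(100, 10000)

# density='auto' (the default) uses the recommended 1/sqrt(n_features)
# density, so only ~1% of the projection matrix entries are non-zero here.
transformer = SparseRandomProjection(n_components=500, random_state=42)
X_new = transformer.fit_transform(X)
print(X_new.shape)           # (100, 500)
print(transformer.density_)  # 0.01, i.e. 1/sqrt(10000)
```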
@@ -179,7 +179,7 @@ been computed during fit, they are reused at each call to ``inverse_transform``.
Otherwise they are recomputed each time, which can be costly. The result is always
dense, even if ``X`` is sparse.

-Here a small code example which illustrates how to use the inverse transform
+Here is a small code example which illustrates how to use the inverse transform
feature::

>>> import numpy as np