Add prominent mention of Laplacian Eigenmaps being the algorithm that… · scikit-learn/scikit-learn@5c9a6c4 · GitHub
Commit 5c9a6c4
Add prominent mention of Laplacian Eigenmaps being the algorithm that Spectral Embedding implements
1 parent ca687ba commit 5c9a6c4

2 files changed (+11, −8 lines)


doc/modules/manifold.rst

Lines changed: 6 additions & 7 deletions
@@ -305,11 +305,11 @@ The overall complexity of standard HLLE is
 Spectral Embedding
 ====================

-Spectral Embedding (also known as Laplacian Eigenmaps) is one method
-to calculate non-linear embedding. It finds a low dimensional representation
-of the data using a spectral decomposition of the graph Laplacian.
-The graph generated can be considered as a discrete approximation of the
-low dimensional manifold in the high dimensional space. Minimization of a
+Spectral Embedding is an approach to calculating a non-linear embedding.
+Scikit-learn implements Laplacian Eigenmaps, which finds a low dimensional
+representation of the data using a spectral decomposition of the graph
+Laplacian. The graph generated can be considered as a discrete approximation of
+the low dimensional manifold in the high dimensional space. Minimization of a
 cost function based on the graph ensures that points close to each other on
 the manifold are mapped close to each other in the low dimensional space,
 preserving local distances. Spectral embedding can be performed with the
@@ -319,7 +319,7 @@ function :func:`spectral_embedding` or its object-oriented counterpart
 Complexity
 ----------

-The Spectral Embedding algorithm comprises three stages:
+The Laplacian Eigenmaps algorithm comprises three stages:

 1. **Weighted Graph Construction**. Transform the raw input data into
    graph representation using affinity (adjacency) matrix representation.
@@ -640,4 +640,3 @@ Tips on practical use
    :ref:`random_trees_embedding` can also be useful to derive non-linear
    representations of feature space, also it does not perform
    dimensionality reduction.
-
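The user-guide passage above describes the object-oriented entry point for Laplacian Eigenmaps. As a minimal sketch of how that estimator is used (the class and parameter names below come from the scikit-learn API at the time; exact signatures may differ across versions, and the toy data is purely illustrative):

```python
import numpy as np
from sklearn.manifold import SpectralEmbedding

# Illustrative data: 100 points on a noisy circle embedded in 3-D.
rng = np.random.RandomState(0)
theta = rng.uniform(0, 2 * np.pi, 100)
X = np.column_stack([np.cos(theta), np.sin(theta), 0.1 * rng.randn(100)])

# Laplacian Eigenmaps via the SpectralEmbedding estimator: build a
# nearest-neighbors affinity graph, then embed using the spectral
# decomposition of its graph Laplacian.
embedding = SpectralEmbedding(n_components=2, affinity="nearest_neighbors",
                              n_neighbors=10, random_state=0)
X_2d = embedding.fit_transform(X)
print(X_2d.shape)  # (100, 2): one 2-D coordinate per input point
```

The `affinity="nearest_neighbors"` choice corresponds to the weighted-graph-construction stage listed in the Complexity section of the diff above.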
sklearn/manifold/spectral_embedding_.py

Lines changed: 5 additions & 1 deletion
@@ -149,6 +149,8 @@ def spectral_embedding(adjacency, n_components=8, eigen_solver=None,
     However care must taken to always make the affinity matrix symmetric
     so that the eigenvector decomposition works as expected.

+    Note : Laplacian Eigenmaps is the actual algorithm implemented here.
+
     Read more in the :ref:`User Guide <spectral_embedding>`.

     Parameters
@@ -189,7 +191,7 @@ def spectral_embedding(adjacency, n_components=8, eigen_solver=None,

     Notes
     -----
-    Spectral embedding is most useful when the graph has one connected
+    Laplacian Eigenmaps is most useful when the graph has one connected
     component. If there graph has many components, the first few eigenvectors
     will simply uncover the connected components of the graph.
@@ -329,6 +331,8 @@ class SpectralEmbedding(BaseEstimator):
     The resulting transformation is given by the value of the
     eigenvectors for each data point.

+    Note : Laplacian Eigenmaps is the actual algorithm implemented here.
+
     Read more in the :ref:`User Guide <spectral_embedding>`.

     Parameters
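The docstring edited above belongs to the functional interface, which takes a precomputed affinity (adjacency) matrix directly. A hedged sketch of calling it, honoring the docstring's requirement that the matrix be symmetric (function and import path taken from the public scikit-learn API; its location has shifted between versions):

```python
import numpy as np
from sklearn.manifold import spectral_embedding
from sklearn.metrics.pairwise import rbf_kernel

rng = np.random.RandomState(0)
X = rng.randn(60, 3)

# Build an affinity (adjacency) matrix; an RBF kernel is one common choice.
affinity = rbf_kernel(X, gamma=1.0)
# The docstring warns that the matrix must be symmetric for the
# eigenvector decomposition to behave as expected; enforce it explicitly.
affinity = 0.5 * (affinity + affinity.T)

coords = spectral_embedding(affinity, n_components=2, random_state=0)
print(coords.shape)  # (60, 2)
```

A fully-connected RBF graph has a single connected component, which matches the Notes section's caveat: with many components, the first eigenvectors merely indicate component membership rather than geometry.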
