@@ -305,11 +305,11 @@ The overall complexity of standard HLLE is
Spectral Embedding
====================

- Spectral Embedding (also known as Laplacian Eigenmaps) is one method
- to calculate non-linear embedding. It finds a low dimensional representation
- of the data using a spectral decomposition of the graph Laplacian.
- The graph generated can be considered as a discrete approximation of the
- low dimensional manifold in the high dimensional space. Minimization of a
+ Spectral Embedding is an approach to calculating a non-linear embedding.
+ Scikit-learn implements Laplacian Eigenmaps, which finds a low dimensional
+ representation of the data using a spectral decomposition of the graph
+ Laplacian. The graph generated can be considered as a discrete approximation of
+ the low dimensional manifold in the high dimensional space. Minimization of a
cost function based on the graph ensures that points close to each other on
the manifold are mapped close to each other in the low dimensional space,
preserving local distances. Spectral embedding can be performed with the
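The embedding described above can be tried through scikit-learn's object-oriented interface. A minimal sketch; the dataset and parameter values here are illustrative choices, not recommendations:

```python
import numpy as np
from sklearn.datasets import make_s_curve
from sklearn.manifold import SpectralEmbedding

# Illustrative 3D manifold data (an S-curve).
X, _ = make_s_curve(n_samples=300, random_state=0)

# n_components is the target dimensionality; n_neighbors controls the
# nearest-neighbors affinity graph used to build the graph Laplacian.
embedding = SpectralEmbedding(n_components=2, n_neighbors=10, random_state=0)
X_2d = embedding.fit_transform(X)
print(X_2d.shape)  # (300, 2)
```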
@@ -319,7 +319,7 @@ function :func:`spectral_embedding` or its object-oriented counterpart
Complexity
----------

- The Spectral Embedding algorithm comprises three stages:
+ The Laplacian Eigenmaps algorithm comprises three stages:

1. **Weighted Graph Construction**. Transform the raw input data into
   graph representation using affinity (adjacency) matrix representation.
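The graph-construction stage above can be sketched by hand. This is an illustrative approximation built from a k-nearest-neighbors affinity matrix and the normalized graph Laplacian, not scikit-learn's internal implementation; the parameter values are arbitrary:

```python
import numpy as np
from scipy.sparse.csgraph import laplacian
from scipy.sparse.linalg import eigsh
from sklearn.datasets import make_s_curve
from sklearn.neighbors import kneighbors_graph

X, _ = make_s_curve(n_samples=100, random_state=0)

# Stage 1: affinity (adjacency) matrix from a k-NN graph, symmetrized
# so the resulting Laplacian is symmetric.
A = kneighbors_graph(X, n_neighbors=10, mode="connectivity")
A = 0.5 * (A + A.T)

# Normalized graph Laplacian, whose spectral decomposition yields the embedding.
L = laplacian(A, normed=True)

# Eigenvectors for the smallest eigenvalues give the embedding coordinates;
# the first (constant) eigenvector is discarded.
vals, vecs = eigsh(L, k=3, which="SM")
order = np.argsort(vals)
Y = vecs[:, order[1:]]
print(Y.shape)  # (100, 2)
```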
@@ -640,4 +640,3 @@ Tips on practical use
:ref:`random_trees_embedding` can also be useful to derive non-linear
representations of feature space, though it does not perform
dimensionality reduction.
-
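A minimal sketch of deriving such a representation with :class:`RandomTreesEmbedding`; the data and parameter values are illustrative:

```python
import numpy as np
from sklearn.ensemble import RandomTreesEmbedding

# Illustrative 2D input data.
X = np.random.RandomState(0).randn(50, 2)

# Each sample is encoded by the leaves it lands in across the forest,
# producing a high-dimensional sparse representation (no reduction).
hasher = RandomTreesEmbedding(n_estimators=10, random_state=0)
X_sparse = hasher.fit_transform(X)
print(X_sparse.shape[0])  # 50
```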