From b34318f09a0c4f615d94bfb2b43d3fe8e54dcae7 Mon Sep 17 00:00:00 2001
From: Samson
Date: Thu, 5 Jan 2017 22:57:34 +0800
Subject: [PATCH 1/2] Add prominent mention of Laplacian Eigenmaps being the
 algorithm that Spectral Embedding implements

---
 doc/modules/manifold.rst                | 13 ++++++-------
 sklearn/manifold/spectral_embedding_.py |  6 +++++-
 2 files changed, 11 insertions(+), 8 deletions(-)

diff --git a/doc/modules/manifold.rst b/doc/modules/manifold.rst
index 3c003e0d0cb50..ddc82fed31b1f 100644
--- a/doc/modules/manifold.rst
+++ b/doc/modules/manifold.rst
@@ -305,11 +305,11 @@ The overall complexity of standard HLLE is
 Spectral Embedding
 ====================
 
-Spectral Embedding (also known as Laplacian Eigenmaps) is one method
-to calculate non-linear embedding. It finds a low dimensional representation
-of the data using a spectral decomposition of the graph Laplacian.
-The graph generated can be considered as a discrete approximation of the
-low dimensional manifold in the high dimensional space. Minimization of a
+Spectral Embedding is an approach to calculating a non-linear embedding.
+Scikit-learn implements Laplacian Eigenmaps, which finds a low dimensional
+representation of the data using a spectral decomposition of the graph
+Laplacian. The graph generated can be considered as a discrete approximation of
+the low dimensional manifold in the high dimensional space. Minimization of a
 cost function based on the graph ensures that points close to each other
 on the manifold are mapped close to each other in the low dimensional
 space, preserving local distances. Spectral embedding can be performed with the
@@ -319,7 +319,7 @@ function :func:`spectral_embedding` or its object-oriented counterpart
 Complexity
 ----------
 
-The Spectral Embedding algorithm comprises three stages:
+The Spectral Embedding (Laplacian Eigenmaps) algorithm comprises three stages:
 
 1. **Weighted Graph Construction**. Transform the raw input data into
    graph representation using affinity (adjacency) matrix representation.
@@ -640,4 +640,3 @@ Tips on practical use
 :ref:`random_trees_embedding` can also be useful to derive non-linear
 representations of feature space, also it does not perform dimensionality
 reduction.
-

diff --git a/sklearn/manifold/spectral_embedding_.py b/sklearn/manifold/spectral_embedding_.py
index c2fc878693c93..b12c628dd1a91 100644
--- a/sklearn/manifold/spectral_embedding_.py
+++ b/sklearn/manifold/spectral_embedding_.py
@@ -149,6 +149,8 @@ def spectral_embedding(adjacency, n_components=8, eigen_solver=None,
     However care must taken to always make the affinity matrix symmetric
     so that the eigenvector decomposition works as expected.
 
+    Note : Laplacian Eigenmaps is the actual algorithm implemented here.
+
     Read more in the :ref:`User Guide `.
 
     Parameters
@@ -189,7 +191,7 @@ def spectral_embedding(adjacency, n_components=8, eigen_solver=None,
 
     Notes
     -----
-    Spectral embedding is most useful when the graph has one connected
+    Spectral Embedding (Laplacian Eigenmaps) is most useful when the graph has one connected
     component. If there graph has many components, the first few eigenvectors
     will simply uncover the connected components of the graph.
 
@@ -329,6 +331,8 @@ class SpectralEmbedding(BaseEstimator):
     The resulting transformation is given by the value of the eigenvectors
     for each data point.
 
+    Note : Laplacian Eigenmaps is the actual algorithm implemented here.
+
     Read more in the :ref:`User Guide `.
 
     Parameters

From 740b78bdf35b80703973ff3cc24317379b38ba80 Mon Sep 17 00:00:00 2001
From: Joel Nothman
Date: Wed, 18 Jan 2017 12:35:33 +1100
Subject: [PATCH 2/2] Line length

---
 sklearn/manifold/spectral_embedding_.py | 6 +++---
 1 file changed, 3 insertions(+), 3 deletions(-)

diff --git a/sklearn/manifold/spectral_embedding_.py b/sklearn/manifold/spectral_embedding_.py
index b12c628dd1a91..39a7355cc8f51 100644
--- a/sklearn/manifold/spectral_embedding_.py
+++ b/sklearn/manifold/spectral_embedding_.py
@@ -191,9 +191,9 @@ def spectral_embedding(adjacency, n_components=8, eigen_solver=None,
 
     Notes
     -----
-    Spectral Embedding (Laplacian Eigenmaps) is most useful when the graph has one connected
-    component. If there graph has many components, the first few eigenvectors
-    will simply uncover the connected components of the graph.
+    Spectral Embedding (Laplacian Eigenmaps) is most useful when the graph
+    has one connected component. If there graph has many components, the first
+    few eigenvectors will simply uncover the connected components of the graph.
 
 References
 ----------
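The documentation changes above describe Laplacian Eigenmaps as three stages: weighted graph construction, graph Laplacian construction, and a partial eigendecomposition. As a reading aid, here is a minimal NumPy-only sketch of those three stages. It is not scikit-learn's actual implementation (which supports normalized Laplacians, sparse affinity matrices, and several eigensolvers); the function name `laplacian_eigenmaps` and the binary k-NN weighting are choices made for this illustration.

```python
import numpy as np

def laplacian_eigenmaps(X, n_components=2, n_neighbors=5):
    """Minimal sketch of the three stages of Laplacian Eigenmaps."""
    n = X.shape[0]
    # Stage 1: weighted graph construction -- binary k-nearest-neighbor
    # affinity matrix (a common simple choice; heat-kernel weights are
    # another option).
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
    idx = np.argsort(d2, axis=1)[:, 1:n_neighbors + 1]  # skip self (index 0)
    W = np.zeros((n, n))
    W[np.repeat(np.arange(n), n_neighbors), idx.ravel()] = 1.0
    # Symmetrize, as the docstring warns, so the eigendecomposition
    # behaves as expected.
    W = np.maximum(W, W.T)
    # Stage 2: unnormalized graph Laplacian L = D - W.
    L = np.diag(W.sum(axis=1)) - W
    # Stage 3: partial eigendecomposition -- keep the eigenvectors of the
    # smallest nonzero eigenvalues, skipping the trivial constant one.
    vals, vecs = np.linalg.eigh(L)
    return vecs[:, 1:n_components + 1]

# Toy usage: points on a circle.
rng = np.random.RandomState(0)
theta = rng.uniform(0, 2 * np.pi, 100)
X = np.column_stack([np.cos(theta), np.sin(theta)])
Y = laplacian_eigenmaps(X, n_components=2, n_neighbors=5)
print(Y.shape)  # (100, 2)
```

`sklearn.manifold.SpectralEmbedding(n_components=2, affinity="nearest_neighbors")` plays a similar role to this sketch, which is exactly why the patch adds the "Laplacian Eigenmaps is the actual algorithm implemented here" notes to its docstrings.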