DOC: document better similarity matrix of spectral clustering · seckcoder/scikit-learn@0bc70b1 · GitHub

Commit 0bc70b1

DOC: document better similarity matrix of spectral clustering
1 parent 0c054d7 commit 0bc70b1

File tree

1 file changed: +16 -1 lines changed


scikits/learn/cluster/spectral.py

Lines changed: 16 additions & 1 deletion
@@ -15,7 +15,7 @@
 
 def spectral_embedding(adjacency, k=8, mode=None):
     """ Spectral embedding: project the sample on the k first
-        eigen vectors of the graph laplacian.
+        eigen vectors of the normalized graph Laplacian.
 
     Parameters
     -----------
@@ -123,6 +123,9 @@ def spectral_clustering(adjacency, k=8, mode=None):
     ------
     The graph should contain only one connected component;
     otherwise the results make little sense.
+
+    This algorithm solves the normalized cut for k=2: it is a
+    normalized spectral clustering.
     """
     maps = spectral_embedding(adjacency, k=k, mode=mode)
     maps = maps[1:]
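The hunk above documents that spectral_clustering solves the normalized cut for k=2. A minimal sketch of that usage, assuming the scikits.learn-era API shown in this diff (a square, non-negative similarity matrix in, one cluster label per sample out); the toy matrix is made up for illustration:

    import numpy as np

    # Import path matches the file being patched
    # (scikits/learn/cluster/spectral.py); in later releases this
    # function moved to sklearn.cluster.
    from scikits.learn.cluster import spectral_clustering

    # Toy similarity matrix for six samples: entries are zero or
    # positive, high within each group of three, low across groups.
    similarity = np.array([
        [1.0, 0.9, 0.8, 0.1, 0.0, 0.1],
        [0.9, 1.0, 0.9, 0.0, 0.1, 0.0],
        [0.8, 0.9, 1.0, 0.1, 0.0, 0.1],
        [0.1, 0.0, 0.1, 1.0, 0.9, 0.8],
        [0.0, 0.1, 0.0, 0.9, 1.0, 0.9],
        [0.1, 0.0, 0.1, 0.8, 0.9, 1.0],
    ])

    # k=2 asks for the two-way normalized cut described in the docstring.
    labels = spectral_clustering(similarity, k=2)
    print(labels)  # expected grouping [0 0 0 1 1 1]; label order may swap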
@@ -172,9 +175,21 @@ def fit(self, X, **params):
         -----------
         X: array-like or sparse matrix, shape: (p, p)
             The adjacency matrix of the graph to embed.
+            X is an adjacency matrix of a similarity graph: its
+            entries must be positive or zero. Zero means that
+            elements have nothing in common, whereas high values
+            mean that elements are strongly similar.
 
         Notes
         ------
+        If you have an affinity matrix, such as a distance matrix,
+        for which 0 means identical elements and high values mean
+        very dissimilar elements, it can be transformed into a
+        similarity matrix that is well suited for the algorithm by
+        applying the heat kernel::
+
+            np.exp(-X ** 2 / (2. * delta ** 2))
+
         If the pyamg package is installed, it is used. This
         greatly speeds up computation.
         """
