Eigenvalues, the Laplacian, and spectral clustering
Spectral graph theory studies the relationship between graph structure and the eigenvalues of associated matrices. By encoding a graph as a matrix, we can use powerful tools from linear algebra to uncover hidden properties.
The adjacency, degree, and Laplacian matrices each capture different aspects of a graph. Their eigenvalues reveal connectivity, expansion properties, and natural partitions that are invisible from the combinatorial perspective alone.
Every graph has associated matrices that encode its structure. The adjacency matrix A records which vertices are connected, the degree matrix D captures vertex degrees along the diagonal, and the Laplacian L = D - A combines both to encode connectivity.
Key insight: All three matrices are symmetric for undirected graphs, guaranteeing real eigenvalues — a prerequisite for the spectral analysis that follows.
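The three matrices can be built directly from an edge list. A minimal sketch in NumPy, using a 4-vertex path graph as the example:

```python
import numpy as np

# Adjacency matrix A of the path graph 0-1-2-3
A = np.array([
    [0, 1, 0, 0],
    [1, 0, 1, 0],
    [0, 1, 0, 1],
    [0, 0, 1, 0],
], dtype=float)

D = np.diag(A.sum(axis=1))  # degree matrix: row sums on the diagonal
L = D - A                   # graph Laplacian

# For an undirected graph, L is symmetric, so its eigenvalues are real.
assert np.allclose(L, L.T)
eigvals = np.linalg.eigvalsh(L)  # eigvalsh exploits symmetry; returns sorted real values
```

Note that `eigvalsh` (rather than `eig`) is the right call here: it assumes a symmetric matrix and returns real eigenvalues in ascending order, which is exactly the guarantee symmetry provides.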
The eigenvalues of the Laplacian matrix reveal graph structure. The number of zero eigenvalues equals the number of connected components, and the second-smallest eigenvalue λ₂ (the algebraic connectivity) tells us how well-connected the graph is.
Key insight: A graph is connected if and only if λ₂ > 0. The larger λ₂ is, the harder it is to disconnect the graph by removing edges — this is why it is called the algebraic connectivity.
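Both facts can be checked numerically. A small sketch (the two-triangle graph below is an illustrative choice, not from the text): counting near-zero eigenvalues gives the number of components, and λ₂ = 0 flags a disconnected graph.

```python
import numpy as np

def laplacian_spectrum(A):
    """Sorted Laplacian eigenvalues of an undirected graph with adjacency matrix A."""
    L = np.diag(A.sum(axis=1)) - A
    return np.linalg.eigvalsh(L)  # ascending, real, since L is symmetric

# Two disjoint triangles: a graph with two connected components
A = np.zeros((6, 6))
for i, j in [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5)]:
    A[i, j] = A[j, i] = 1

vals = laplacian_spectrum(A)
n_components = int(np.sum(np.isclose(vals, 0)))  # multiplicity of eigenvalue 0
lambda2 = vals[1]                                # algebraic connectivity
# n_components == 2 and lambda2 ≈ 0: the graph is disconnected
```

Adding any edge between the two triangles would make λ₂ strictly positive, matching the connectivity criterion above.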
Spectral clustering uses the Fiedler vector (the eigenvector associated with λ₂) to partition graphs. Vertices are assigned to clusters based on the sign of their Fiedler vector component, which tends to cut few edges.
Key insight: The Fiedler vector solves a continuous relaxation of the NP-hard minimum bisection problem — spectral methods turn a combinatorial optimization into a linear algebra computation.
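The sign-based partition can be sketched in a few lines. The example graph (two triangles joined by a single bridge edge, an illustrative choice not taken from the text) has an obvious best cut, and the Fiedler vector recovers it:

```python
import numpy as np

def fiedler_partition(A):
    """Split vertices by the sign of the Fiedler vector (eigenvector of lambda_2)."""
    L = np.diag(A.sum(axis=1)) - A
    vals, vecs = np.linalg.eigh(L)  # eigenvalues ascending, eigenvectors as columns
    fiedler = vecs[:, 1]            # eigenvector for the second-smallest eigenvalue
    return fiedler >= 0             # boolean cluster labels

# Two triangles {0,1,2} and {3,4,5} joined by the single bridge edge (2, 3)
A = np.zeros((6, 6))
for i, j in [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]:
    A[i, j] = A[j, i] = 1

labels = fiedler_partition(A)
# The two triangles land on opposite sides, cutting only the bridge edge
```

Thresholding at zero is the simplest rounding of the relaxed solution; practical implementations sometimes sweep over thresholds or use k-means on several eigenvectors instead.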