Eigenvalues, the Laplacian, and spectral clustering
Spectral graph theory studies the relationship between graph structure and the eigenvalues of associated matrices. By encoding a graph as a matrix, we can use powerful tools from linear algebra to uncover hidden properties!
In this section, you'll explore the adjacency and Laplacian matrices, learn how eigenvalues reveal connectivity, and see how spectral methods enable graph clustering.
Every graph has associated matrices that encode its structure. The adjacency matrix A, the degree matrix D, and the Laplacian L = D - A reveal deep properties of the graph.
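As a concrete illustration, here is a minimal NumPy sketch that builds all three matrices for a small example graph (the specific graph — a triangle with one pendant vertex — is chosen here for illustration, not taken from the text):

```python
import numpy as np

# Small undirected graph: a triangle 0-1-2 plus a pendant vertex 3 attached to 2.
edges = [(0, 1), (0, 2), (1, 2), (2, 3)]
n = 4

# Adjacency matrix A: A[i][j] = 1 if {i, j} is an edge (symmetric for undirected graphs).
A = np.zeros((n, n))
for i, j in edges:
    A[i, j] = A[j, i] = 1

# Degree matrix D: diagonal matrix of vertex degrees (the row sums of A).
D = np.diag(A.sum(axis=1))

# Laplacian L = D - A. Note that every row of L sums to zero.
L = D - A
print(L)
```

The zero row sums are not an accident: they are exactly why the all-ones vector is always an eigenvector of L with eigenvalue 0, which is the starting point for everything below.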
The eigenvalues of the Laplacian matrix reveal graph structure. The second-smallest eigenvalue λ₂ (the algebraic connectivity) is positive exactly when the graph is connected!
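This connectivity test is easy to verify numerically. The sketch below (the `laplacian` helper and the two example graphs are my own, assumed for illustration) compares a connected path with a graph that splits into two components:

```python
import numpy as np

def laplacian(edges, n):
    """Build the Laplacian L = D - A for an undirected graph on n vertices."""
    A = np.zeros((n, n))
    for i, j in edges:
        A[i, j] = A[j, i] = 1
    return np.diag(A.sum(axis=1)) - A

# Connected graph: the path 0-1-2-3.
L_path = laplacian([(0, 1), (1, 2), (2, 3)], 4)
# Disconnected graph: two separate edges, 0-1 and 2-3.
L_split = laplacian([(0, 1), (2, 3)], 4)

for name, L in [("path", L_path), ("two edges", L_split)]:
    vals = np.linalg.eigvalsh(L)  # sorted ascending; valid since L is symmetric
    # The number of (numerically) zero eigenvalues equals the number of components.
    components = int(np.sum(np.abs(vals) < 1e-9))
    print(f"{name}: λ₂ = {vals[1]:.4f}, components = {components}")
```

For the path, λ₂ is strictly positive; for the two disjoint edges, 0 appears as a double eigenvalue, so λ₂ = 0 and the zero count correctly reports two components.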
Spectral clustering uses the Fiedler vector (eigenvector of λ₂) to partition graphs. Vertices are assigned to clusters based on the sign of their Fiedler vector component!
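The sign-based partition can be sketched in a few lines. In the example below (the "two triangles joined by a bridge" graph is an assumed illustration, not from the text), the Fiedler vector's signs recover the two obvious clusters:

```python
import numpy as np

# Two dense clusters {0,1,2} and {3,4,5} joined by a single "bridge" edge 2-5.
edges = [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5), (2, 5)]
n = 6

A = np.zeros((n, n))
for i, j in edges:
    A[i, j] = A[j, i] = 1
L = np.diag(A.sum(axis=1)) - A

# Eigendecomposition of the symmetric Laplacian; columns of vecs are eigenvectors
# ordered by ascending eigenvalue, so column 1 belongs to λ₂.
vals, vecs = np.linalg.eigh(L)
fiedler = vecs[:, 1]

# Partition vertices by the sign of their Fiedler vector component.
positive = [i for i in range(n) if fiedler[i] > 0]
negative = [i for i in range(n) if fiedler[i] <= 0]
print("Fiedler vector:", np.round(fiedler, 3))
print("clusters:", positive, negative)
```

The overall sign of an eigenvector is arbitrary, so which cluster comes out "positive" can vary; what matters is that the bridge edge 2-5 is the only edge crossing the sign boundary.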
1. Graph Matrices: Every graph has associated matrices — the adjacency matrix A, degree matrix D, and Laplacian L = D - A. These encode the graph structure in a form amenable to linear algebra.
2. Laplacian Eigenvalues: The eigenvalues of L reveal graph properties. The number of zero eigenvalues equals the number of connected components. The second-smallest eigenvalue λ₂ (algebraic connectivity) is positive iff the graph is connected.
3. Spectral Clustering: The eigenvector of λ₂ (the Fiedler vector) provides a natural way to partition the graph. Vertices with the same sign in the Fiedler vector tend to form clusters, approximately minimizing the number of edges cut.
Next Up: In the next section, you'll explore random graphs and networks — Erdős-Rényi models, small-world phenomena, and scale-free networks!