Why do we care about eigenvalues of graphs?

Of course, any novel question in mathematics is interesting, but there is an entire discipline of mathematics devoted to studying these eigenvalues, so they must be important.

I always assumed that spectral graph theory extends graph theory by providing tools to prove things we couldn't otherwise, somewhat like how representation theory extends finite group theory. But most results I see in spectral graph theory seem to concern eigenvalues not as means to an end, but as objects of interest in their own right.

I also considered practical value as motivation, e.g. using a given set of eigenvalues to put bounds on essential properties of graphs, such as maximum vertex degree. But I can't imagine a situation in which I would have access to a graph's eigenvalues before I would know much more elementary information like maximum vertex degree.

(EDIT: for example, dtldarek points out that $\lambda_2$ is related to diameter, but then why would we need $\lambda_2$ when we already have diameter? Is this somehow conceptually beneficial?)

So, what is the meaning of graph spectra intuitively? And for what practical purposes are they used? Why is finding the eigenvalues of a graph's adjacency/Laplacian matrices more than just a novel problem?

Alexander Gruber
    There is a several-century-old tradition of studying geometrical objects spectrally, through their Laplacians and other related objects; this was initiated by the Greeks and their study of music, or something like that. Spectral graph theory is just the discretization of that. – Mariano Suárez-Álvarez Feb 20 '13 at 09:37

6 Answers


This question already has a number of nice answers; I want to emphasize the breadth of this topic.

Graphs can be represented by matrices - adjacency matrices and various flavours of Laplacian matrices. This almost immediately raises the question of what connections hold between the spectra of these matrices and the properties of the graphs. Let's call the study of these connections "the theory of graph spectra". (But I am not entirely happy with this definition, see below.) It is tempting to view the map from graphs to eigenvalues as a kind of Fourier theory, but there are difficulties with this analogy. First, graphs in general are not determined by their eigenvalues. Second, which of the many adjacency matrices should we use?

The earliest work on graph spectra was carried out in the context of the Hückel molecular orbital theory in Quantum Chemistry. This led, among other things, to work on the matching polynomial; this gives us eigenvalues without adjacency matrices (which is why I feel the above definition of the topic is unsatisfactory). A more recent manifestation of this stream of ideas is the work on the spectra of fullerenes.

The second source of the topic arises in Seidel's work on regular two-graphs, which started with questions about regular simplices in real projective space and led to extraordinarily interesting questions about sets of equiangular lines in real space. The complex analogs of these questions are now of interest to quantum physicists - see SIC-POVMs. (It is not clear what role graph theory can play here.) In parallel with Seidel's work was the fundamental paper by Hoffman and Singleton on Moore graphs of diameter two. In both cases, the key observation was that certain extremal classes of graphs could be characterized very naturally by conditions on their spectra. This work gained momentum because a number of sporadic simple groups were first constructed as automorphism groups of graphs. For graph theorists it flowered into the theory of distance-regular graphs, starting with the work of Biggs and his students, and still very active.

One feature of the paper of Hoffman and Singleton is that its conclusion makes no reference to spectra. So it offers an important graph theoretical result for which the "book proof" uses eigenvalues. Many of the results on distance-regular graphs preserve this feature.

Hoffman is also famous for his eigenvalue bounds on chromatic numbers, and related bounds on the maximum size of independent sets and cliques. This is closely related to Lovász's work on Shannon capacity. Both the Erdős-Ko-Rado theorem and many of its analogs can now be obtained using extensions of these techniques.

Physicists have proposed algorithms for graph isomorphism based on the spectra of matrices associated to discrete and continuous walks. The connections between continuous quantum walks and graph spectra are very strong.

Chris Godsil

I can't speak much to what traditional Spectral Graph Theory is about, but my personal research has included the study of what I call "Spectral Realizations" of graphs. A spectral realization is a special geometric realization (vertices are not-necessarily-distinct points, edges are not-necessarily-non-degenerate line segments, in some $\mathbb{R}^n$) derived from the eigenvectors of a graph's adjacency matrix.

In particular, if the rows of a matrix constitute a basis for some eigenspace of the adjacency matrix of a graph $G$, then the columns of that matrix are coordinate vectors of (a projection of) a spectral realization.

A spectral realization of a graph has two nice properties:

  • It's harmonious: Every graph automorphism induces a rigid isometry of the realization; you can see the graph's automorphic structure!
  • It's eigenic: Moving each vertex to the vector-sum of its immediate neighbors is equivalent to scaling the figure; the scale factor is the corresponding eigenvalue.
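
The eigenic property is easy to verify directly. As a minimal numpy sketch (my own illustration, not taken from the linked paper): the 5-cycle has eigenvalue $2\cos(2\pi/5)$, and a basis for the corresponding eigenspace realizes the graph as a regular pentagon.

```python
import numpy as np

n = 5
A = np.zeros((n, n))                  # adjacency matrix of the 5-cycle
for i in range(n):
    A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1

# Planar coordinates built from the eigenspace of lambda = 2*cos(2*pi/5):
# row j is the position of vertex j (vertices of a regular pentagon).
theta = 2 * np.pi * np.arange(n) / n
X = np.column_stack([np.cos(theta), np.sin(theta)])

lam = 2 * np.cos(2 * np.pi / 5)

# "Eigenic": sending each vertex to the vector-sum of its neighbours
# just scales the whole figure by the eigenvalue.
print(np.allclose(A @ X, lam * X))    # True
```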

Well, the properties are nice in theory. Usually, a spectral realization is a jumble of collapsed segments, or is embedded in high-dimensional space; such circumstances make a realization difficult to "see". Nevertheless, a spectral realization can be a helpful first pass at visualizing a graph. Moreover, a graph with a high degree of symmetry can admit some visually-interesting low-dimensional spectral realizations; for example, the skeleton of the truncated octahedron has this modestly-elaborate collection:

Spectral Realizations of the Truncated Octahedron

For a gallery of hundreds of these things, see the PDF linked at my Bloog post, "Spectral Realizations of Graphs".

Since many mathematical objects decompose into eigen-objects, it probably comes as no surprise that any geometric realization of a graph is the sum of spectral realizations of that graph. (Simply decomposing the realization's coordinate matrix into eigen-matrices gets most of the way to that result, although the eigen-matrices themselves usually represent "affine images" of properly-spectral realizations. The fact that affine images decompose into a sum of similar images takes an extension of a theorem of Barlotti.) There's likely something interesting to be said about how each spectral component influences the properties of the combined figure.

Anyway ... That's why I care about the eigenvalues of graphs.

    That is very neat. I am going to leave the question open because I would still like to know the answer for traditional spectral graph theory, but I appreciate your post. I'll read your paper. – Alexander Gruber Feb 21 '13 at 20:20

Spectral graph theory is a discrete analogue of spectral geometry, with the Laplacian on a graph being a discrete analogue of the Laplace-Beltrami operator on a Riemannian manifold. The Laplacian $\Delta$ can be used to write down three important differential equations, on a graph just as on a Riemannian manifold:

  • The heat equation $\frac{\partial u}{\partial t} = \Delta u$, which describes how heat propagates on the graph / manifold,
  • The wave equation $\frac{\partial^2 u}{\partial t^2} = \Delta u$, which describes how waves propagate on the graph / manifold,
  • The Schrödinger equation $i \frac{\partial u}{\partial t} = \Delta u$, which describes how quantum particles propagate on the graph / manifold.

The behavior of solutions to these equations is controlled by the eigenvalues of the Laplacian. For the heat equation these eigenvalues control the rate at which a given heat distribution decays to a stationary distribution, and for the wave and Schrödinger equations these eigenvalues control the rate at which standing wave solutions oscillate (this gives some intuition as to why the eigenvalues should be related to the connectivity of the graph). So these eigenvalues describe some important physical properties of the graph / manifold.
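
To make the heat-equation picture concrete, here is a small numpy sketch (my own example, using the sign convention $\Delta = -L$ with $L = D - A$ the positive-semidefinite graph Laplacian): expanding the initial condition in the eigenbasis of $L$ shows each mode decaying at rate $\lambda_k$, so the heat distribution equilibrates to the average.

```python
import numpy as np

# Path graph on 4 vertices; graph Laplacian L = D - A.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
L = np.diag(A.sum(axis=1)) - A

# Solve u' = -L u by expanding u(0) in the eigenbasis of L:
#   u(t) = sum_k exp(-lambda_k t) <v_k, u(0)> v_k
lam, V = np.linalg.eigh(L)
u0 = np.array([1.0, 0.0, 0.0, 0.0])        # all heat starts at vertex 0

def u(t):
    return V @ (np.exp(-lam * t) * (V.T @ u0))

# Only the lambda = 0 mode (the constant vector) survives as t grows,
# so the heat distribution converges to the uniform average 0.25.
print(np.round(u(100.0), 6))   # -> [0.25 0.25 0.25 0.25]
```

The smallest nonzero eigenvalue (here $2 - \sqrt{2}$) sets the slowest decay rate, which is the sense in which the spectral gap controls how fast the graph "mixes".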

Qiaochu Yuan
    (revisiting after a year) Thanks again. Since asking this question, this is the answer that has stuck most in my head. I have a followup question. I can see how wave propagation on graphs could be useful in applied contexts, for computable approximations of continuous structures, and other things. Does wave propagation on (finite/discrete) graphs occur naturally in other non-applied areas of mathematics? – Alexander Gruber Mar 18 '14 at 21:44
    @Alexander: there are a number of results in spectral graph theory describing how the spectrum of the Laplacian controls random walks on a graph, and I think morally speaking the heat equation is the "reason" that these results exist. – Qiaochu Yuan Mar 30 '14 at 01:31
  • @QiaochuYuan, Interestingly, not only morally but literally, the random walk is related to [Heat Equation](https://en.wikipedia.org/wiki/Heat_equation#General_description). – SddS Nov 22 '17 at 08:35

Some random facts:

  • The largest eigenvalue $\lambda_1$ is closely related to average degree.
  • The second largest eigenvalue $\lambda_2$ is closely related to connectivity; that is, graphs with small $\lambda_2$ have small diameter.
  • A graph $G$ is bipartite if and only if for every eigenvalue $\lambda$, $-\lambda$ is also an eigenvalue.
  • For connected $G$, the chromatic number satisfies $\chi(G) \leq \lambda_1+1$, with equality exactly when $G$ is complete or an odd cycle.
  • Some use spectrum for clustering.

I hope it helps ;-)
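
The bipartite fact in particular is easy to check numerically. A quick numpy sketch (the graphs are chosen purely for illustration):

```python
import numpy as np

def adjacency_spectrum(edges, n):
    """Sorted adjacency eigenvalues of an undirected graph on n vertices."""
    A = np.zeros((n, n))
    for i, j in edges:
        A[i, j] = A[j, i] = 1
    return np.sort(np.linalg.eigvalsh(A))

# K_{2,3} is bipartite, so its spectrum is symmetric about 0;
# the triangle C_3 is not bipartite, and its spectrum is not.
k23 = adjacency_spectrum([(0, 2), (0, 3), (0, 4), (1, 2), (1, 3), (1, 4)], 5)
tri = adjacency_spectrum([(0, 1), (1, 2), (0, 2)], 3)

print(np.allclose(k23, -k23[::-1]))   # True  (spectrum is +/- sqrt(6), 0, 0, 0)
print(np.allclose(tri, -tri[::-1]))   # False (spectrum is 2, -1, -1)
```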


Expanding on a random fact mentioned by dtldarek: People frequently use spectral graph theory in clustering.

The general situation is that you have a collection of data points $x_1, \dots, x_n$ that belong to some unknown clusters $A_1, \dots, A_k$. You want to identify which points lie in which cluster. You have some sort of connection graph, and the hope is that connections tend to occur within clusters -- e.g. the density of edges within $A_1$ is much higher than the density of edges connecting $A_1$ and $A_2$. One way to (try to) find a cluster, then, is to search for a subset $A$ such that $$\frac{\textrm{the number of edges between } A \textrm{ and the rest of the graph}}{\textrm{the total number of edges involving } A}$$ is unusually small.

More generally, we can imagine a weighted graph where between each pair of vertices we put a weight corresponding to how "close" the vertices are in some sense. The goal is then to find an $A$ such that the total weight of edges between $A$ and the rest of the graph is much smaller than the total weight of edges involving $A$.

The catch is that if $G$ has $n$ vertices, there are $2^n$ choices for $A$, so we can't just search them all if $n$ is large. But what often turns out to be a good approximation is to choose $A$ based on the eigenvectors of the Laplacian matrix of your (weighted) connection graph (in particular the eigenvectors corresponding to the smallest nonzero eigenvalues of $L$).

The idea is that if you take a vector $x$ and a graph with weights $w_{ij}$ on the edges, then $$x^T L x = \sum_{\textrm{edges } (i,j) } w_{ij} (x_i-x_j)^2.$$ In the special case where $x_i=1$ for $i$ in $A$ and $0$ for $i$ not in $A$, this reduces to the statement that $x^T L x$ is the total weight of edges connecting $A$ with the rest of the graph. Eigenvectors of $L$ with small eigenvalues correspond to choices of $x$ which make $x^T L x$ small.
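
A tiny numpy sketch of that identity (the weights and the cluster are made up for illustration):

```python
import numpy as np

# Weighted graph on 4 vertices: heavy edges inside {0,1} and {2,3},
# light edges across the cut.
edges = {(0, 1): 3.0, (2, 3): 2.0, (1, 2): 0.5, (0, 2): 0.5}
n = 4
W = np.zeros((n, n))
for (i, j), w in edges.items():
    W[i, j] = W[j, i] = w
L = np.diag(W.sum(axis=1)) - W          # weighted graph Laplacian L = D - W

x = np.array([1.0, 1.0, 0.0, 0.0])      # indicator vector of A = {0, 1}

# x^T L x = total weight of edges between A and the rest: 0.5 + 0.5
print(x @ L @ x)   # -> 1.0
```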

For more on this, see this tutorial by Ulrike von Luxburg.

Kevin P. Costello

Another reason the eigenvalues are used is for approximating certain functions of a graph that would be too complex to compute otherwise. For instance, in social networks one often wants to count the number of walks of length $k$ between two nodes $u$ and $v$ (this is what powers of the adjacency matrix count; vertices may repeat). In a large graph, this may be too complex to compute exactly. One way to approximate it consists in considering the $k^{\mathrm{th}}$ power of the adjacency matrix $\mathbf A$. However, even if $\mathbf A$ is sparse, $\mathbf A^k$ is not, even for small $k$ (remember the "six degrees of separation" – the diameter of a social network is low). Thus, a method to compute entries of $\mathbf A^k$ approximately uses the eigenvalue decomposition of $\mathbf A$. If the graph is undirected then we can write

$$ \mathbf A = \mathbf U \mathbf{\Lambda} \mathbf U^{\mathrm T} $$ with $\mathbf U$ and $\mathbf{\Lambda}$ being $n \times n$ matrices. A rank-reduced eigenvalue decomposition can be computed efficiently, i.e., one where $\mathbf U$ is replaced by an $n \times r$ matrix $\mathbf {\tilde U}$, and $\mathbf{\Lambda}$ by an $r \times r$ matrix $\mathbf{\tilde \Lambda}$, keeping only $r$ eigenpairs. Then, $\mathbf A^k$ can be approximated by

$$ \mathbf A^k \approx \mathbf{\tilde U} \mathbf{\tilde \Lambda}^k \mathbf{\tilde U}^{\mathrm T} $$

and to approximate only $(\mathbf A^k)_{uv}$ (i.e., the number of length-$k$ walks between nodes $u$ and $v$), we only need to compute $\mathbf{\tilde U}_{u\bullet} \mathbf{\tilde \Lambda}^k \mathbf{\tilde U}_{v\bullet}^{\mathrm T}$. This is a case where one is not interested in the eigenvalues or eigenvectors themselves, but uses the eigenvalue decomposition for approximation.

The same technique can be used to approximate other functions of $\mathbf A$, for instance the matrix exponential.
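
As a sketch of the technique (my own illustration; the graph, rank, and walk length are arbitrary): for a symmetric $\mathbf A$ the identity $\mathbf A^k = \mathbf U \mathbf{\Lambda}^k \mathbf U^{\mathrm T}$ holds exactly, and keeping only the eigenvalues of largest magnitude already captures most of the walk counts, because raising to the $k^{\mathrm{th}}$ power amplifies the dominant eigenvalues.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 100, 0.3
# Random undirected 0/1 adjacency matrix (illustrative graph only).
A = (rng.random((n, n)) < p).astype(float)
A = np.triu(A, 1)
A = A + A.T

lam, U = np.linalg.eigh(A)              # A = U diag(lam) U^T

k = 6                                   # walk length
exact = np.linalg.matrix_power(A, k)    # entry (u, v) counts u-v walks of length k

# Rank-reduced decomposition: keep the r eigenvalues of largest magnitude.
r = 10
idx = np.argsort(-np.abs(lam))[:r]
Ur, lr = U[:, idx], lam[idx]
approx = Ur @ np.diag(lr**k) @ Ur.T

rel_err = np.linalg.norm(exact - approx) / np.linalg.norm(exact)
print(rel_err)   # small: lambda^k is dominated by the top eigenvalues
```

Only the $r$ kept columns of $\mathbf U$ are needed to estimate any single entry of $\mathbf A^k$, which is the point of the method for large sparse networks.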