I thought "Wow!" when I learned that the eigenvectors of the adjacency matrix of a cycle graph $C_n$ corresponding to the second-largest eigenvalue give the coordinates of the vertices when they are equally distributed on the unit circle: the $n$-th roots of unity (from complex analysis) come up in a completely discrete context! What's more: the coordinates of the eigenvectors can be "interpreted" straightforwardly when assigned properly to the vertices.
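This is easy to check numerically; here is a small numpy sketch (with $n = 8$ as an arbitrary choice), using the fact that the real and imaginary parts of the $n$-th roots of unity form an eigenvector pair for the second-largest eigenvalue $2\cos(2\pi/n)$:

```python
import numpy as np

n = 8
# Adjacency matrix of the cycle C_n
A = np.zeros((n, n))
for i in range(n):
    A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1

# Real and imaginary parts of the n-th roots of unity ...
angles = 2 * np.pi * np.arange(n) / n
x, y = np.cos(angles), np.sin(angles)

# ... are an eigenvector pair for the second-largest eigenvalue.
lam = 2 * np.cos(2 * np.pi / n)
assert np.allclose(A @ x, lam * x)
assert np.allclose(A @ y, lam * y)
# Plotting (x[i], y[i]) hence places vertex i on the unit circle.
```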

There's another straightforward interpretation of an adjacency eigenvector: the eigenvector corresponding to the largest eigenvalue gives the relative importances of the vertices, each vertex's importance being proportional to the sum of the importances of its neighbors.
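This is the idea behind eigenvector centrality. A small sketch via power iteration (the 4-vertex "hub" graph is my own toy example, not from the question):

```python
import numpy as np

# Toy graph: vertex 0 is a hub joined to 1, 2, 3; edge 1-2 as well.
A = np.array([[0, 1, 1, 1],
              [1, 0, 1, 0],
              [1, 1, 0, 0],
              [1, 0, 0, 0]], dtype=float)

# Power iteration converges to the Perron eigenvector: each entry is
# proportional to the sum of its neighbors' entries.
v = np.ones(4)
for _ in range(200):
    v = A @ v
    v /= np.linalg.norm(v)

lam = v @ A @ v               # Rayleigh quotient ~ largest eigenvalue
assert np.allclose(A @ v, lam * v)
assert v[0] == max(v)         # the hub comes out as most "important"
```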

Question: Can more - or possibly all - adjacency eigenvectors be sensibly "interpreted"?

Or does it in general depend on the type of graph whether and how the eigenvectors can be interpreted, so that the first example above is just a curiosity?

What about the interpretation of the eigenvalues? Do at least some of them "mean" something conceivable?

You are asking for something not covered in the usual textbooks about spectral graph theory?
– Mariano Suárez-Alvarez♦ May 17 '10 at 8:31

Cvetković/Doob/Sachs's "Spectra of Graphs" is the textbook I have at hand. Maybe it's covered, but not - as I would have expected - in a general context, and not in an introductory and "motivating" fashion. Can you recommend another textbook?
– Hans Stricker May 17 '10 at 8:50

I'm not sure whether this is satisfying, but Theorem 1.2 gives an interpretation for the characteristic polynomial of the ordinary spectrum. The Laplacian Spectrum is covered by Theorem 1.4. I'm not aware of a direct interpretation of the eigenvectors, but possibly Hückel theory is your friend, see the bottom of page 230 of Spectra of Graphs.
– Martin Rubey May 17 '10 at 9:13

7 Answers

If the graph has an eigenspace with dimension greater than one, then it is going
to be difficult to relate properties of eigenvectors to properties of the graph.
One way to get around this is to work with the orthogonal projections onto
the eigenspace. If $A$ is the adjacency matrix then
$$
A^r =\sum_\theta \theta^r E_\theta
$$
where $\theta$ runs over the distinct eigenvalues and $E_\theta$ is the projection
on the $\theta$-eigenspace. From this we see that the eigenvalues along with the entries of $E_\theta$ give precise information about walks on the graph, and general information about expansion properties.

There are many natural ways of representing a graph by a matrix. For example
you might argue that we should use $-A$ in place of $A$. This suggests that
assigning a meaning to eigenvalues might be difficult, but it makes it less surprising
that there are useful bounds on the size of cliques and on the chromatic number involving
ratios of eigenvalues (see "Hoffman bounds").
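As an illustration of one such bound (the form $\chi \geq 1 + \lambda_{\max}/(-\lambda_{\min})$, checked here on the Petersen graph, whose spectrum is $3, 1^5, (-2)^4$):

```python
import numpy as np

# Petersen graph: outer 5-cycle 0..4, inner pentagram 5..9, spokes i -- i+5.
n = 10
A = np.zeros((n, n))
for i in range(5):
    for a, b in [(i, (i + 1) % 5),          # outer cycle
                 (i, i + 5),                # spoke
                 (i + 5, 5 + (i + 2) % 5)]: # inner pentagram
        A[a, b] = A[b, a] = 1

vals = np.linalg.eigvalsh(A)
lam_max, lam_min = vals[-1], vals[0]        # 3 and -2 here
hoffman = 1 + lam_max / (-lam_min)          # lower bound on chromatic number
assert np.isclose(hoffman, 2.5)             # chi(Petersen) = 3 >= 2.5
```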

For many planar graphs (for example, fullerenes), the image of the projection of the standard basis onto the sum of the second through fourth eigenspaces is a polytope whose 1-skeleton is often the
original graph. This is related to Tutte's "How to draw a graph" and to work
on the Colin de Verdiere number, but it is fair to say that we cannot really explain
what is going on here.

When the graph has a high degree of symmetry, these realizations --which I call "spectral"[*]-- have a great visual appeal; in general, though, these realizations are jumbles of points in one-dimensional space. In most cases, multiple vertices (and edges) are collapsed to single points, so that the realizations aren't faithful.
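A sketch of this phenomenon for the cube graph (a highly symmetric case: its second-largest eigenvalue $1$ is already 3-fold, so that one eigenspace supplies all three drawing coordinates, and the recovered points form a cube up to rotation and scale):

```python
import numpy as np
from itertools import product

# 1-skeleton of the cube: vertices are bit-triples, edges join
# triples differing in exactly one bit.
verts = list(product([0, 1], repeat=3))
n = len(verts)
A = np.zeros((n, n))
for i, u in enumerate(verts):
    for j, w in enumerate(verts):
        if sum(a != b for a, b in zip(u, w)) == 1:
            A[i, j] = 1

vals, vecs = np.linalg.eigh(A)         # ascending: -3, -1,-1,-1, 1,1,1, 3
assert np.allclose(vals[4:7], 1)
coords = vecs[:, 4:7]                  # eigenvalue-1 eigenspace: drawing coordinates

# All vertices land at the same distance from the origin, and all
# 12 graph edges get the same length -- a cube, up to rotation/scale.
norms = np.linalg.norm(coords, axis=1)
edge_lens = [np.linalg.norm(coords[i] - coords[j])
             for i in range(n) for j in range(i) if A[i, j]]
assert np.allclose(norms, norms[0])
assert np.allclose(edge_lens, edge_lens[0])
```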

Sorry for not having brought together your recent answer with my current question. I wasn't aware that my - seemingly unrelated - questions admitted the very same answer!
– Hans Stricker May 17 '10 at 10:23

@Hans No apology necessary. This result is a distillation of the core of my Masters Thesis from a while back. I'm kinda proud of it, so I'm more than happy to repeat it at every opportunity. :)
– Blue May 17 '10 at 10:31

Another amusing case is that of the truncated-icosahedral graph (chemically referred to as Buckminsterfullerene when realized as a 60-atom carbon cluster). Here the second eigenvalue is 3-fold degenerate, and if x, y, z denote 3 real equi-norm orthogonal eigenvectors for this eigenvalue, then the 60 triples of corresponding components locate the vertices as embedded in Euclidean space so as to manifest the icosahedral symmetry. See: D. E. Manolopoulos & P. W. Fowler, J. Chem. Phys. 96 (1992) 7603-7615. In fact, these authors go on to treat other "fullerenes" similarly (polyhedra with degree-3 vertices whose faces are either pentagons or hexagons). In general the requisite eigenvalues are not degenerate, but are those whose eigenvectors have components dividing the graph into exactly 2 connected regions of different signs; some scaling of the components by appropriate functions of the different eigenvalues is also used. There has been some further work on such ideas for other suitable graphs - in the non-regular case the Laplacian matrix might plausibly be preferred over the adjacency matrix.
What is going on can be intuitively "understood", best in terms of the Laplacian, which is a discrete analog of the classical Laplacian from analysis. For this classical Laplacian, eigenvectors correspond to standing waves, with energies corresponding to the eigenvalues. The long-wavelength waves which have a single internal node partition the region (i.e., the graph in our analogy), with the amplitudes giving the distances from the node.
There is also W. T. Tutte's "classic" scheme for "drawing" the Schlegel diagram of a polyhedral graph: Proc. London Math. Soc. 13 (1963) 743-767. For a more recent survey on "Bridges between Geometry and Graph Theory" see T. Pisanski & M. Randić, pages 174-194 in Geometry at Work, ed. C. A. Gorini (MAA, Wash. DC, 2000).
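A minimal illustration of the standing-wave picture (my sketch, using the Laplacian of a path graph): the eigenvector of the second-smallest eigenvalue, the Fiedler vector, is a sampled half-period cosine with a single internal node:

```python
import numpy as np

n = 10
# Laplacian L = D - A of the path graph on n vertices
A = np.diag(np.ones(n - 1), 1) + np.diag(np.ones(n - 1), -1)
L = np.diag(A.sum(axis=1)) - A

vals, vecs = np.linalg.eigh(L)
fiedler = vecs[:, 1]   # eigenvector of the second-smallest eigenvalue

# A sampled half-period cosine: exactly one sign change, splitting
# the path into two connected halves.
signs = np.sign(fiedler)
changes = int(sum(signs[i] != signs[i + 1] for i in range(n - 1)))
assert changes == 1
```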

Regarding interpretations of eigenvalues, I recall that the spectrum of a graph can tell you whether it is bipartite. I think a connected graph is bipartite if and only if $-\lambda$ is one of its eigenvalues (here $\lambda$ is the largest eigenvalue). Please correct me if I've muddled the theorem.

Stronger: in a bipartite graph, $-\lambda$ is an eigenvalue whenever $\lambda$ is. Proof via "spectral" realizations (see my earlier answer): If $\lambda$ is an eigenvalue, then there's a corresponding spectral realization of the graph in coordinate space. Such a realization is "eigenic": moving each vertex to the vector sum of its neighbors is the same as scaling the figure by a factor of $\lambda$. If the graph is bipartite, we can take a 2-coloring and move each vertex of one color to that vertex's reflection in the origin, arriving at an eigenic realization with scale factor (and, thus, eigenvalue) $-\lambda$.
– Blue Jul 22 '10 at 4:14

One way of making the last statement in Chris Godsil's answer precise is as follows: Consider a planar graph which is $3$-edge-connected. By a famous theorem, it is then the 1-skeleton of a $3$-dimensional polytope. Choose positive weights on the edges in such a way that the resulting adjacency matrix plus a suitable diagonal matrix (a Schrödinger operator) has an eigenspace of dimension $3$ for the second-smallest eigenvalue. We need moreover a slightly technical stability condition: the set of all such Schrödinger operators on the graph has to be transversal, at the chosen matrix, to the set of all symmetric matrices whose second-smallest eigenvalue has multiplicity $3$ (this is a stability condition allowing for small perturbations). The coordinates of any basis of the eigenspace of the second-smallest eigenvalue are then the coordinates of the vertices of a $3$-dimensional polytope realizing the graph. (Such a basis should be close to the $2$nd through $4$th smallest eigenvectors, whence Chris Godsil's remark.)

There are also partial generalizations of this construction related to graphs with Colin de Verdière number higher than $3$.

Interesting structural features in graphs are often reflected in eigenvectors (how's that for a generality?). It is often a matter of skill which eigenvector to take. For example, a cycle graph has multiplicity 2 for the next-to-largest eigenvalue. Take a 6-cycle (since it is easy). The 2nd eigenvalue is 1, and a generic eigenvector (in cyclic order) is
(1, t, t-1, -1, -t, 1-t). So (looking for relatively few distinct values)
(1, 1/2, -1/2, -1, -1/2, 1/2) and (1, 1, 0, -1, -1, 0) (and their cyclic shifts) seem nice. Rotating the latter to (0, 1, 1, 0, -1, -1), multiplying by $\frac{\sqrt{-3}}{2}$ and adding this to (1, 1/2, -1/2, -1, -1/2, 1/2) does give a nice eigenvector with values on the unit circle - the sixth roots of unity - but it is a complex rather than a real eigenvector.
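These claims are easy to check numerically; a sketch (the value of $t$ is arbitrary):

```python
import numpy as np

# Adjacency matrix of the 6-cycle
n = 6
A = np.zeros((n, n))
for i in range(n):
    A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1

t = 0.3   # any t works: the eigenvalue-1 eigenspace is 2-dimensional
v = np.array([1, t, t - 1, -1, -t, 1 - t])
assert np.allclose(A @ v, v)

# The complex combination: sixth roots of unity on the unit circle.
u = (np.array([1, .5, -.5, -1, -.5, .5])
     + 1j * np.sqrt(3) / 2 * np.array([0, 1, 1, 0, -1, -1]))
assert np.allclose(A @ u, u)
assert np.allclose(u, np.exp(2j * np.pi * np.arange(6) / 6))
```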

Here is another way to see that $-\lambda$ is an eigenvalue of a bipartite graph when $\lambda$ is: Take an eigenvector for $\lambda$ and replace all the values in one part by their negatives. This shows (since the largest eigenvalue is unique in that it has an all-positive eigenvector) that the smallest eigenvalue, in the bipartite case, is unique in that its eigenvector is positive on one half of the graph and negative on the other, so it reveals the bipartition. One might guess that in a general graph the smallest eigenvalue might have some eigenvectors which partition the vertices into two classes (positive and negative) in a way which minimizes the number of edges connecting vertices of the same sign. Several theorems of Miroslav Fiedler are relevant to these considerations.
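A quick numerical check of the sign-flip argument (on $K_{2,3}$, my own choice of example):

```python
import numpy as np

# Complete bipartite graph K_{2,3}: parts {0, 1} and {2, 3, 4}
A = np.zeros((5, 5))
A[:2, 2:] = 1
A[2:, :2] = 1

vals, vecs = np.linalg.eigh(A)
lam, v = vals[-1], vecs[:, -1]   # largest (Perron) eigenvalue and eigenvector

# Negate the entries on one side of the bipartition ...
w = v.copy()
w[:2] *= -1
# ... and the result is an eigenvector for -lambda.
assert np.allclose(A @ w, -lam * w)
```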

I'll mention too that I've found it useful to consider the subspace spanned by (the eigenvectors for) several eigenvalues. Consider the skeleton of a cube. The sum of the first and second eigenspaces is spanned by the characteristic vectors of the 6 faces. The sum of the first, second and third eigenspaces is spanned by the characteristic vectors of the 12 edges. The sum of the first four eigenspaces (i.e. everything!) is spanned by the characteristic vectors of the 8 vertices. This is true (mutatis mutandis) much more generally (Hamming graphs, finite nets, Johnson graphs and other graphs arising from highly regular geometries).
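The face claim can be verified numerically; the sketch below checks that each face's characteristic vector lies in the sum of the first two eigenspaces (eigenvalues $3$ and $1$) of the cube:

```python
import numpy as np
from itertools import product

# Cube skeleton on vertex set {0,1}^3
verts = list(product([0, 1], repeat=3))
n = len(verts)
A = np.zeros((n, n))
for i, u in enumerate(verts):
    for j, w in enumerate(verts):
        if sum(a != b for a, b in zip(u, w)) == 1:
            A[i, j] = 1

vals, vecs = np.linalg.eigh(A)   # ascending: -3, -1,-1,-1, 1,1,1, 3
top4 = vecs[:, 4:]               # eigenvectors for eigenvalues 1 and 3
P = top4 @ top4.T                # projection onto the first two eigenspaces

# Each face of the cube fixes one coordinate; its characteristic
# vector is fixed by the projection, i.e. lies in that 4-dim subspace.
for k in range(3):
    for b in (0, 1):
        f = np.array([1.0 if v[k] == b else 0.0 for v in verts])
        assert np.allclose(P @ f, f)
```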