I have some finite graph $G$ with $n$ vertices and adjacency matrix $A$. Let $D$ be the $n \times n$ matrix with the degree of vertex $i$ in the $(i,i)$ entry and 0's everywhere else. Finally, let $L = D - A$ be the (unnormalized) graph Laplacian of $G$. Next, fix some collection of eigenvectors and eigenvalues of $L$.
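(For concreteness, this setup is easy to play with numerically. Here is a minimal sketch, using a toy example of my own choosing, the 4-cycle $C_4$:)

```python
import numpy as np

# Toy example (my own, not essential to the question): the 4-cycle C_4.
A = np.array([
    [0, 1, 0, 1],
    [1, 0, 1, 0],
    [0, 1, 0, 1],
    [1, 0, 1, 0],
])
D = np.diag(A.sum(axis=1))   # degree matrix: degrees on the diagonal
L = D - A                    # unnormalized graph Laplacian

# L is real symmetric, so eigh returns real eigenvalues in ascending order
# together with an orthonormal basis of eigenvectors (the columns of V).
evals, V = np.linalg.eigh(L)
print(np.round(evals, 6))    # C_4 has Laplacian eigenvalues 0, 2, 2, 4
```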

My big-picture question is: Under what conditions are there other graphs which share those eigenvectors/values?

With a little more precision: Approximately how many eigenvectors & eigenvalues can be specified before the answer is no? About how many graphs are there when the answer is yes?

It seems likely that the answer would be a little complicated. I know a few special cases (e.g. 2 eigenvectors/values determine cycles completely; on the other hand, as long as the average degree is fairly large, there are generally very many graphs with the same bottom eigenvector). I certainly appreciate hearing about conditions which aren't tight, as long as they are at least a little broad.

I'm interested in the situation where the eigenvectors DON'T determine the graph, so I would also appreciate any literature pointers to `relaxations' of this idea. For example, one could imagine requiring that the Laplacian contracts the eigenvectors by at least a certain amount (this certainly allows many graphs, but that space is pretty big). In another direction, it seems plausible there is some analogue in the language of graphons.

What do the eigenvectors look like for isospectral graphs?
– Douglas Zare, Aug 18 '12 at 21:57

Re: Douglas Zare: I'm not sure. The few `classical' examples I've looked at seem to have pretty different eigenvectors. Until your question, I hadn't thought about this avenue - I thought they were of interest if you want different eigenvectors. Could you let me know any intuition about why these might be good candidates?
– floc, Aug 18 '12 at 22:25

It's just that in isospectral graphs, the eigenvalues aren't different, so if you hope to distinguish graphs by their eigenvalues and eigenvectors then in those examples the eigenvectors must be enough.
– Douglas Zare, Aug 18 '12 at 23:02


It seems plausible to me that for a generic connected graph, a small collection of eigenvectors and eigenvalues (maybe $O(\log n)$ of them?) will already generate the rest of them under the action of the corresponding Galois group.
– Qiaochu Yuan, Aug 18 '12 at 23:18

Re Douglas: Thank you. Re Qiaochu: Interesting comment. I wouldn't be surprised if a small number of eigenvectors were essentially always enough, but am not familiar with what you mean by the `corresponding Galois group'. A quick Google search only found me articles that seemed focused on graphs with at least some symmetry. (I should also mention: Even ~log(n) is interesting to me. I'd be happy to know that for large, fairly dense graphs, there is generically freedom to fix ~10 eigenvectors! )
– floc, Aug 19 '12 at 0:02

2 Answers

The experimental evidence for adjacency matrices suggests that, for a random graph, the characteristic polynomial is irreducible over the rationals. I would expect the characteristic polynomial of the Laplacian of a random graph on $n$ vertices to have one irreducible factor of degree $n-1$ (the remaining factor being $z$, from the eigenvalue $0$). However, while it is easy to convince yourself of this by testing random graphs, absolutely nothing has been proved.

If we move away from the generic case, things become much more complicated.
All circulant graphs can be assumed to have the same orthogonal basis of eigenvectors
(a Vandermonde matrix) and there are examples of cospectral circulants. (Note
that for regular graphs, the adjacency matrix and the Laplacian have the same eigenvectors, and provide the same spectral information.) So now we
have nonisomorphic graphs with the same eigenvalues and eigenvectors. Hence we must assume that we are given pairs (eigenvalue, eigenvector). This is not quite enough, because our eigenvalues need not be simple. It would seem that our data
will be pairs consisting of an eigenvalue and the matrix representing orthogonal
projection onto the corresponding eigenspace.
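(To make the "eigenvalue plus projection" data concrete, here is a small sketch with an example of my own: the 4-cycle, whose Laplacian has eigenvalue 2 with multiplicity 2. Unlike any particular choice of eigenvectors, the projection onto the eigenspace is canonical.)

```python
import numpy as np

# My own example: Laplacian of the 4-cycle, whose eigenvalue 2 is not simple.
A = np.array([[0, 1, 0, 1], [1, 0, 1, 0], [0, 1, 0, 1], [1, 0, 1, 0]])
L = np.diag(A.sum(axis=1)) - A
evals, V = np.linalg.eigh(L)

cols = np.isclose(evals, 2)   # the two columns spanning the 2-eigenspace
B = V[:, cols]
P = B @ B.T                   # orthogonal projection onto the eigenspace

# P does not depend on the choice of orthonormal basis: replacing B by B Q
# for any orthogonal Q gives (B Q)(B Q)^T = B B^T = P.
theta = 0.7
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
assert np.allclose((B @ Q) @ (B @ Q).T, P)
assert np.allclose(L @ P, 2 * P)   # P maps everything into the 2-eigenspace
```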

Let $M$ be a $d\times m$ binary matrix with linearly independent rows.
Let $X(M)$ be the graph with the elements of $\mathbb{Z}_2^d$ as its vertices,
two adjacent if and only if their difference is a column of $M$. Such a graph
is a Cayley graph for $\mathbb{Z}_2^d$, and is known as a cubelike graph.
The characters of $\mathbb{Z}_2^d$ are eigenvectors, and the eigenvalues
are integers. The eigenvalues and their multiplicities are determined by the weight enumerator of the binary code generated by the rows of $M$ - a code word of weight $k$ gives an eigenvalue $m-2k$. Since there are nonisomorphic binary codes with the same weight enumerator, there are cospectral cubelike graphs. I do not know
how much eigendata two cubelike graphs may have in common, but this would
seem a good place to look.
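(A sketch of the construction and the eigenvalue formula; the particular matrix $M$ below, with $d = 3$ and $m = 4$, is my own small example.)

```python
import itertools
import numpy as np

# A small binary matrix M of my own choosing: d = 3, m = 4,
# rows linearly independent over GF(2).
M = np.array([
    [1, 0, 0, 1],
    [0, 1, 0, 1],
    [0, 0, 1, 1],
])
d, m = M.shape

# Vertices of X(M) are the elements of Z_2^d; x ~ y iff x + y is a column of M.
verts = list(itertools.product([0, 1], repeat=d))
idx = {v: i for i, v in enumerate(verts)}
A = np.zeros((2 ** d, 2 ** d))
for x in verts:
    for c in M.T:
        y = tuple(int(t) for t in (np.array(x) + c) % 2)
        A[idx[x], idx[y]] = 1

# Each a in Z_2^d gives a character chi_a(x) = (-1)^{a.x}, an eigenvector of
# A with eigenvalue m - 2k, where k is the Hamming weight of the codeword aM.
for a in verts:
    chi = np.array([(-1) ** (np.dot(a, x) % 2) for x in verts])
    k = int(((np.array(a) @ M) % 2).sum())
    assert np.allclose(A @ chi, (m - 2 * k) * chi)
```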

EDIT: Let $G$ be a graph with vertex set $\{1,\ldots,n\}$. Let $H$ be a graph with
vertex set $\{\pm1,\ldots,\pm n\}$; if $i$ and $j$ are adjacent in $G$, let
$(i,j)$ and $(-i,-j)$ be edges in $H$, and if $i$ and $j$ are not adjacent
in $G$, let $(i,-j)$ and $(-i,j)$ be edges. (Note that $H$ is regular
with valency $n-1$.) Call $H$ the two-graph constructed from $G$.

Let $\pi$ be the partition of $V(H)$ with cells $\{i,-i\}$.
Then the subspace of $\mathbb{R}^{2n}$ formed by the vectors constant on cells
of $\pi$ is invariant under the adjacency matrix of $H$. It is the direct
sum of the span of the constant vectors (which has dimension 1) and the
vectors constant on cells that sum to 0 over $V(H)$. The second subspace is
an eigenspace for $A(H)$, with eigenvalue $-1$. Since the space of vectors
constant on cells does not depend on the structure of $G$, distinct two-graphs
on $2n$ vertices have $n$ eigenvectors and eigenvalues in common. (The spectrum of
a two-graph is the union of the spectrum of $K_n$ and the spectrum of $2A(G)+I-J$).
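(Here is a sketch verifying both claims on a small $G$ of my own choosing, the path on $n = 3$ vertices; index $i$ stands for the vertex $+(i+1)$ and $n+i$ for $-(i+1)$.)

```python
import numpy as np

# My own small example: G is the path on n = 3 vertices.
n = 3
AG = np.zeros((n, n))
for i, j in [(0, 1), (1, 2)]:
    AG[i, j] = AG[j, i] = 1

AH = np.zeros((2 * n, 2 * n))
for i in range(n):
    for j in range(i + 1, n):
        if AG[i, j]:
            AH[i, j] = AH[j, i] = 1                  # edge (i, j)
            AH[n + i, n + j] = AH[n + j, n + i] = 1  # edge (-i, -j)
        else:
            AH[i, n + j] = AH[n + j, i] = 1          # edge (i, -j)
            AH[n + i, j] = AH[j, n + i] = 1          # edge (-i, j)

# A vector constant on the cells {i, -i} whose cell values sum to zero is an
# eigenvector with eigenvalue -1, regardless of the structure of G.
u = np.zeros(2 * n)
u[0] = u[n] = 1
u[1] = u[n + 1] = -1
assert np.allclose(AH @ u, -u)

# The spectrum of H is the union of the spectra of K_n and 2A(G) + I - J.
J = np.ones((n, n))
spec_H = np.sort(np.linalg.eigvalsh(AH))
spec_union = np.sort(np.concatenate([
    np.linalg.eigvalsh(J - np.eye(n)),
    np.linalg.eigvalsh(2 * AG + np.eye(n) - J),
]))
assert np.allclose(spec_H, spec_union)
```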

Dear Chris, Thanks for the pointer! That seems like a fairly large family, and between this and Qiaochu's comment it seems clear that there is only something nontrivial to say about very special graphs. I'm inclined to accept unless someone comes by soon with a miraculous pointer to a literature on relaxations.
– floc, Aug 19 '12 at 14:35

@floc: just added a family of examples.
– Chris Godsil, Aug 19 '12 at 16:05

Those are interesting examples. A similar construction to the second is to add a point connected to everything to a collection of $k$-regular bipartite graphs. Many of the eigenvectors for the eigenvalue $-k$ are constant on the parts of the components, and thus don't reveal any of the structure within the components.
– Douglas Zare, Aug 20 '12 at 4:41


The common thread is an equitable partition: if a graph has an equitable partition with $m$ cells, then the eigenvectors constant on cells of the partition are determined by the multigraph we get by quotienting over the cells. In my example the cells are the pairs $\pm i$; in Douglas's they are the bipartite graphs and the new vertex.
– Chris Godsil, Aug 20 '12 at 13:02

Everything I'm about to say depends on knowing the eigenvectors and eigenvalues exactly (in terms of algebraic numbers).

Lemma: Let $L$ be an integer square matrix and $\lambda$ an eigenvalue of $L$. Then the $\lambda$-eigenspace of $L$ has a basis consisting of vectors with entries in $\mathbb{Q}(\lambda)$.

Proof. Use row reduction to compute the kernel of $L - \lambda I$. At every step, the entries of everything in sight stay in $\mathbb{Q}(\lambda)$.
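(An exact check of the lemma with SymPy, on an example of my own: the Laplacian of the path on 4 vertices, which has the irrational eigenvalue $2 + \sqrt{2}$.)

```python
import sympy as sp

# My own example: Laplacian of the path on 4 vertices.
L = sp.Matrix([
    [ 1, -1,  0,  0],
    [-1,  2, -1,  0],
    [ 0, -1,  2, -1],
    [ 0,  0, -1,  1],
])
lam = 2 + sp.sqrt(2)

# nullspace() is computed by row reduction, so the basis it returns has
# entries in Q(lam) = Q(sqrt(2)), exactly as in the proof of the lemma.
basis = (L - lam * sp.eye(4)).nullspace()
assert len(basis) == 1
v = basis[0]
assert sp.simplify(L * v - lam * v) == sp.zeros(4, 1)
```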

If $G$ is a graph with Laplacian $\Delta$, let $K$ denote the splitting field of the characteristic polynomial $f(z)$ of $\Delta$. This is a Galois extension, and $\text{Gal}(K/\mathbb{Q})$ acts on both the eigenvalues and the eigenvectors of $\Delta$. Explicitly, if $\sigma \in \text{Gal}(K/\mathbb{Q})$, then
$$\Delta v = \lambda v \Rightarrow \Delta \sigma(v) = \sigma(\lambda) \sigma(v)$$

where $\sigma(v)$ describes the operation of applying $\sigma$ to every component of $v$ (possible by the lemma). Consequently, if $\lambda$ is not rational, knowing the eigenpair $(\lambda, v)$ gives you potentially many other eigenpairs $(\sigma(\lambda), \sigma(v))$. The total number of eigenpairs you get from knowing $(\lambda, v)$ (including $(\lambda, v)$ itself) is in fact precisely the degree of the minimal polynomial of $\lambda$, and the corresponding eigenvalues you get are the other roots of the minimal polynomial.
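(A sketch of this action on an example of my own, the Laplacian of the path on 4 vertices, where $\sigma : \sqrt{2} \mapsto -\sqrt{2}$ generates $\text{Gal}(\mathbb{Q}(\sqrt{2})/\mathbb{Q})$ and carries the eigenpair for $2 + \sqrt{2}$ to the one for $2 - \sqrt{2}$.)

```python
import sympy as sp

# My own example: Laplacian of the path on 4 vertices,
# with irrational eigenvalues 2 + sqrt(2) and 2 - sqrt(2).
L = sp.Matrix([
    [ 1, -1,  0,  0],
    [-1,  2, -1,  0],
    [ 0, -1,  2, -1],
    [ 0,  0, -1,  1],
])
lam = 2 + sp.sqrt(2)
v = (L - lam * sp.eye(4)).nullspace()[0]

# Apply sigma to the eigenvalue and entrywise to the eigenvector;
# the result is again an eigenpair of L.
sigma_lam = lam.subs(sp.sqrt(2), -sp.sqrt(2))   # = 2 - sqrt(2)
sigma_v = v.subs(sp.sqrt(2), -sp.sqrt(2))
assert sp.simplify(L * sigma_v - sigma_lam * sigma_v) == sp.zeros(4, 1)
```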

So if $f(z)$ factors into $c$ irreducible factors over $\mathbb{Z}$, then in fact $c$ carefully chosen eigenpairs (one coming from each irreducible factor) suffice to completely determine the eigenpairs of $\Delta$, hence to determine $\Delta$ itself. (The same remarks apply to arbitrary integer matrices and we have not used the fact that $\Delta$ is a Laplacian at all.)
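(A concrete instance, with an example of my own: for the Laplacian of the path on 4 vertices, $f(z)$ has $c = 3$ irreducible factors over $\mathbb{Z}$, so 3 well-chosen eigenpairs determine everything.)

```python
import sympy as sp

# My own example: Laplacian of the path on 4 vertices.
L = sp.Matrix([
    [ 1, -1,  0,  0],
    [-1,  2, -1,  0],
    [ 0, -1,  2, -1],
    [ 0,  0, -1,  1],
])
z = sp.symbols('z')
f = (z * sp.eye(4) - L).det()
factors = sp.factor_list(f)[1]
# the irreducible factors are z, z - 2 and z**2 - 4*z + 2 (in some order)
print([fac for fac, mult in factors])
```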

So how many irreducible factors should we expect $f(z)$ to have if $G$ is sufficiently generic? I don't know. But here are some thoughts. $f(z)$ has a factor of $z$ for every connected component of $G$. I believe a lot is known about the distribution of the number of connected components of random graphs, so let me ignore these factors in what follows.

The reason I said I expected $O(\log n)$ (carefully chosen) eigenpairs to suffice is that the expected number of irreducible factors of a random monic polynomial over $\mathbb{F}_q$ of degree $n$ is about $\log n$, at least if $q$ is large compared to $n$. This is the function field analogue of the Hardy-Ramanujan theorem. But actually it seems that we can do substantially better; in an appropriate sense, a random integer polynomial is irreducible, so if we expect $f(z)$ to behave like such a polynomial (up to its factors of $z$) then we should expect to be able to use even fewer eigenpairs (one for every connected component of $G$ and then maybe $O(1)$ more).
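(A rough empirical check of the $\log n$ heuristic; the particular choices $q = 101$, degree 20, and 50 trials are mine.)

```python
import random
import sympy as sp

# Count irreducible factors (with multiplicity) of random monic
# polynomials of degree 20 over F_101; my own choice of parameters.
random.seed(0)
x = sp.symbols('x')
q, deg, trials = 101, 20, 50

counts = []
for _ in range(trials):
    coeffs = [1] + [random.randrange(q) for _ in range(deg)]  # monic, degree 20
    p = sp.Poly(coeffs, x, modulus=q)
    counts.append(sum(mult for _, mult in p.factor_list()[1]))

avg = sum(counts) / trials
print(avg)   # expected to come out near log(20), i.e. around 3
```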

Anyway, all of this suggests that if you want to look for cases where non-uniqueness occurs, you want to restrict your attention to integer eigenvalues.