Right, I thought immediately of the lovely manuscript by Babai and Frankl. Too bad it is not more easily available: I was lucky enough to take a combinatorics class from Babai in 1998 and thereby acquire a copy. I still have it! (Do you know why they have apparently lost interest in publishing it?)
–
Pete L. Clark Mar 4 '10 at 0:12

I don't know why Babai and Frankl have lost interest in publishing it, but the preliminary manuscript is still readily available from the link above.
–
Richard Stanley Mar 4 '10 at 0:49

@Richard - Much thanks! There is a lot of good information in your post.
–
Tony Huynh Mar 4 '10 at 2:24


I know someone tried to contact the UChicago department several times about getting that manuscript without luck, so here is a scanned copy: ifile.it/6wl3uv2
–
Steven Sam Mar 4 '10 at 17:05

Hoffman and Singleton proved that a regular graph with girth 5 and diameter 2 must have degree 2, 3, 7, or 57. The proof uses spectral properties of the adjacency matrix $A$: for degree $k$ it satisfies $A^2 + A - (k-1)I = J$, so the eigenvalues other than $k$ are $(-1 \pm \sqrt{4k-3})/2$, and requiring their multiplicities to be integers leaves exactly these four degrees.

There are unique examples of the first three cases: degree 2 is a pentagon, degree 3 is the Petersen graph, and degree 7 is the Hoffman-Singleton graph. The existence of the degree 57 graph is still open (as far as I know).
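The integrality argument is easy to check by machine. Here is a minimal sketch in Python (the function name is mine): it scans all degrees up to a bound and keeps those for which the eigenvalue multiplicities come out as nonnegative integers.

```python
import math

def moore_degrees(limit):
    """Degrees k for which the eigenvalue multiplicities of a Moore graph
    of girth 5 and diameter 2 (on k^2 + 1 vertices) are integers.
    Besides k itself, the eigenvalues are (-1 +- sqrt(4k-3))/2 with
    multiplicities m1, m2 satisfying m1 + m2 = k^2 (counting vertices)
    and m1 - m2 = (k^2 - 2k)/sqrt(4k-3) (zero trace)."""
    good = []
    for k in range(2, limit + 1):
        d = 4 * k - 3
        s = math.isqrt(d)
        if s * s != d:
            # sqrt(4k-3) irrational: m1 = m2 is forced, which gives k^2 = 2k
            if k == 2:
                good.append(k)  # the pentagon
        else:
            # need (k^2 - 2k)/s to be an integer of the right parity
            diff, rem = divmod(k * k - 2 * k, s)
            if rem == 0 and (k * k + diff) % 2 == 0:
                good.append(k)
    return good

print(moore_degrees(100000))  # → [2, 3, 7, 57]
```

No further solutions appear however high the bound is pushed: the divisibility condition forces $\sqrt{4k-3}$ to divide 15.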

The Lindström-Gessel-Viennot lemma uses a sign-reversing involution (swapping the tails of a pair of intersecting paths, a reflection-style argument on $S_n$) to show that the number of nonintersecting families of lattice paths in the plane equals the determinant of the matrix whose $(i,j)$ entry is the number of paths from the $i$th source to the $j$th sink.

The proof itself is not linear-algebraic. However, this determinant can be used to enumerate plane partitions inside an $a\times b\times c$ box, to $q$-enumerate plane partitions by weight, and to count domino tilings of the Aztec diamond. The resulting determinants can be manipulated and evaluated in ways that are natural in linear algebra but much less clear at the level of the objects, such as factoring the matrices. These enumerations can thus be viewed as applications of simple results in linear algebra.

Notes:

Lattice paths are defined, and the sources and sinks arranged, so that any nonintersecting family must join sources to sinks by one fixed permutation (usually the identity); the signed sum in the determinant then reduces to a plain count.
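As a concrete check of the plane-partition application, here is a small Python sketch (function names are mine) comparing the LGV determinant $\det\left[\binom{a+b}{a+i-j}\right]_{1\le i,j\le c}$ against MacMahon's box product formula:

```python
from fractions import Fraction
from math import comb

def det(M):
    """Determinant via Gaussian elimination over exact rationals."""
    n = len(M)
    A = [[Fraction(x) for x in row] for row in M]
    sign = 1
    for i in range(n):
        piv = next((r for r in range(i, n) if A[r][i] != 0), None)
        if piv is None:
            return 0
        if piv != i:
            A[i], A[piv] = A[piv], A[i]
            sign = -sign
        for r in range(i + 1, n):
            f = A[r][i] / A[i][i]
            for c in range(i, n):
                A[r][c] -= f * A[i][c]
    out = Fraction(sign)
    for i in range(n):
        out *= A[i][i]
    return int(out)

def pp_lgv(a, b, c):
    """Plane partitions in an a x b x c box via the LGV determinant:
    entry (i, j) = C(a+b, a+i-j) counts paths from source i to sink j."""
    return det([[comb(a + b, a + i - j) if a + i - j >= 0 else 0
                 for j in range(c)] for i in range(c)])

def pp_macmahon(a, b, c):
    """MacMahon's box product formula, for comparison."""
    num = den = 1
    for i in range(1, a + 1):
        for j in range(1, b + 1):
            for k in range(1, c + 1):
                num *= i + j + k - 1
                den *= i + j + k - 2
    return num // den

print(pp_lgv(2, 2, 2), pp_macmahon(2, 2, 2))  # → 20 20
```

The two agree on every box one cares to test, e.g. both give 980 for the $3\times 3\times 3$ box.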

Some facts - and proofs! - in combinatorics can be interpreted as linear algebra over the "field with one element". In this very nicely written article, Henry Cohn gives a concrete meaning to this and shows how to turn a proof from linear algebra into a proof of a combinatorial statement by rephrasing it in axiomatic projective geometry.

(By the way, Lior's answer is an instance of linear algebra over the field with one element.)

These references may be more shallow than you desired, but they are both fun and lucid.

1) Noga Alon's Tools From Higher Algebra contains many things (or at least references to them) that at heart only require linear algebra, such as Rayleigh's Principle.

2) A Course in Combinatorics by van Lint and Wilson is laced with gems in self-contained sections, such that each page is an adventure. You'll find lots of techniques here that only require linear algebra, including the awkward-looking "interlacing property" of eigenvalues, which has popped up too often for me to ignore by now.
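As a quick illustration of that interlacing property (Cauchy interlacing), here is a sketch assuming numpy is available: deleting a row and the matching column of a symmetric matrix produces eigenvalues that interlace those of the original.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 6
A = rng.standard_normal((n, n))
A = (A + A.T) / 2        # a random symmetric matrix
B = A[:-1, :-1]          # principal submatrix: delete last row and column

a = np.linalg.eigvalsh(A)  # eigenvalues in ascending order
b = np.linalg.eigvalsh(B)

# Cauchy interlacing: a[0] <= b[0] <= a[1] <= b[1] <= ... <= a[n-1]
ok = all(a[i] <= b[i] + 1e-9 and b[i] <= a[i + 1] + 1e-9
         for i in range(n - 1))
print(ok)  # → True
```

The small tolerance only absorbs floating-point error; the interlacing inequalities themselves are exact.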

My favorite is actually the aforementioned Babai-Frankl manuscript, which is still very readable and useful. In theory you can still get it; in practice it is more difficult. The first time I tried to order it, I didn't get a reply at all.

Here is an example I learned about this month: the edges of the complete graph $K_n$ cannot be partitioned into fewer than $n-1$ complete bipartite graphs (the Graham-Pollak theorem). Apparently the only known proofs involve linear algebra.
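While the known proofs are linear-algebraic, the statement itself can be sanity-checked by brute force for tiny $n$. A sketch in Python (the search strategy and function names are mine):

```python
from itertools import combinations

def bicliques(n):
    """All complete bipartite subgraphs of K_n, each as a frozenset of edges."""
    out = set()
    for r in range(1, n):
        for A in combinations(range(n), r):
            rest = [v for v in range(n) if v not in A]
            for s in range(1, len(rest) + 1):
                for B in combinations(rest, s):
                    out.add(frozenset((min(a, b), max(a, b))
                                      for a in A for b in B))
    return list(out)

def min_biclique_partition(n):
    """Brute-force the least number of bicliques partitioning E(K_n)."""
    all_edges = frozenset((i, j) for i in range(n) for j in range(i + 1, n))
    parts = bicliques(n)
    best = [n]  # n-1 stars always work, so n is a safe strict upper bound
    def search(remaining, used):
        if used >= best[0]:
            return
        if not remaining:
            best[0] = used
            return
        e = min(remaining)  # always cover the smallest uncovered edge next
        for p in parts:
            if e in p and p <= remaining:
                search(remaining - p, used + 1)
    search(all_edges, 0)
    return best[0]

print([min_biclique_partition(n) for n in range(2, 6)])  # → [1, 2, 3, 4]
```

The stars centered at $n-1$ of the vertices show that the bound $n-1$ is always attained.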

I should add that Richard Brualdi, from whom I learned this, will be writing a book over the course of the next year largely on the connections between linear algebra and graph theory.
–
Tracy Hall Jul 27 '10 at 13:27

Let $P$ be a finite set ("points"), and let $L\subset 2^P$ ("lines") be such that distinct lines intersect in at most one point and any two distinct points are contained in a line. Let $V$ be the real vector space with basis $P$, $W$ the vector space with basis $L$. There are natural linear maps $T\colon V\to W$ and $S\colon W\to V$ mapping every point to the sum of the lines containing it, and every line to the sum of the points in it. Then $ST = J+D-I$ where $J$ is the all-ones matrix (through every two distinct points there is a unique line), $I$ the identity matrix and $D$ is diagonal with entries counting the lines through each point.

Assume that not all points are collinear. Then every point lies on at least two lines, so all the diagonal entries of $D-I$ are at least one; it is then easy to verify that $ST = J + (D-I)$ is positive definite, hence invertible. Since $ST$ factors through $W$, its rank is at most $|L|$, and we conclude that $|L| \geq |P|$ (this is the de Bruijn-Erdős theorem).
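A quick numerical check on the smallest interesting example, the Fano plane (a sketch; the matrix setup mirrors the maps $S$ and $T$ above, and the variable names are mine):

```python
from fractions import Fraction

# Points 0..6 and the seven lines of the Fano plane: any two points lie
# on exactly one line, and distinct lines meet in exactly one point.
lines = [{0, 1, 2}, {0, 3, 4}, {0, 5, 6}, {1, 3, 5},
         {1, 4, 6}, {2, 3, 6}, {2, 4, 5}]
P = 7

# Incidence matrix N (points x lines); M = N N^T has (p, q) entry equal
# to the number of lines through both p and q, i.e. M = J + D - I.
N = [[1 if p in l else 0 for l in lines] for p in range(P)]
M = [[sum(N[p][k] * N[q][k] for k in range(len(lines)))
      for q in range(P)] for p in range(P)]
deg = [sum(row) for row in N]
assert all(M[p][q] == (deg[p] if p == q else 1)
           for p in range(P) for q in range(P))

def det(A):
    """Determinant via Gaussian elimination over exact rationals."""
    n = len(A)
    B = [[Fraction(x) for x in row] for row in A]
    sign = 1
    for i in range(n):
        piv = next((r for r in range(i, n) if B[r][i] != 0), None)
        if piv is None:
            return 0
        if piv != i:
            B[i], B[piv] = B[piv], B[i]
            sign = -sign
        for r in range(i + 1, n):
            f = B[r][i] / B[i][i]
            for c in range(i, n):
                B[r][c] -= f * B[i][c]
    out = Fraction(sign)
    for i in range(n):
        out *= B[i][i]
    return int(out)

d = det(M)
print(d)  # → 576 (= 9 * 2^6, since every point has degree 3, so M = J + 2I)
```

A nonzero determinant means $M$ has rank 7, so there must be at least 7 lines; the Fano plane achieves equality with exactly 7.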

This is a crosspost from Why linear algebra is fun!(or ?), suggested by Kevin O'Bryant. I think it's relevant here. Everything below is verbatim from the earlier post.

My favorite application of linear algebra, as introduced to me by Fan Chung, is Oddtown (which I learned about from a manuscript of Lovász, but which may not be due to him).

The $n$ residents of Oddtown love to form clubs; call the family of these $\mathcal{F}$. If $F_1$ and $F_2$ are in $\mathcal{F}$, then $|F_1|$ must be odd (this is Oddtown!) and $|F_1 \cap F_2|$ must be even unless $F_1 = F_2$ ($\scriptsize{go\;Oddtown?}$). The question is, how many clubs may these $n$ people form?
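Brute force for small $n$ suggests the answer (a sketch; the function name is mine):

```python
from itertools import combinations

def max_oddtown_clubs(n):
    """Brute-force the largest family of odd-size subsets of {0..n-1}
    with pairwise even intersections (the Oddtown rules)."""
    odd_sets = [frozenset(c) for r in range(1, n + 1, 2)
                for c in combinations(range(n), r)]
    best = 0
    def extend(family, start):
        nonlocal best
        best = max(best, len(family))
        for i in range(start, len(odd_sets)):
            s = odd_sets[i]
            # every pairwise intersection with the current family must be even
            if all(len(s & f) % 2 == 0 for f in family):
                extend(family + [s], i + 1)
    extend([], 0)
    return best

print([max_oddtown_clubs(n) for n in range(1, 6)])  # → [1, 2, 3, 4, 5]
```

This matches the classical bound of at most $n$ clubs, which the $n$ singleton clubs attain; the linear-algebra proof observes that the clubs' characteristic vectors are linearly independent over $\mathbb{F}_2$.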

Variants of the Erdős-Ko-Rado (EKR) theorem offer a wide class of examples. This page has a nice list going, by the way.

A friend of mine once made the outrageous claim -- but hear me out -- that most "linear algebra proofs" in combinatorics are not truly using linear algebra. I think he was getting at such arguments' reliance on a preferred basis (think positive or $\{0,1\}$-matrices), when linear algebra should, in its purest sense, be basis-independent. In the standard proof of Fisher's inequality, you set up and compute a determinant, conclude that some matrix has full rank, and then deduce the inequality; but there are no linear transformations in sight (debatable). He conceded that, for instance, Perron-Frobenius belongs to "matrix analysis", but not "linear algebra"!

I argued a little, but then eventually kind of saw his point. I guess I now really appreciate the obvious (but fortunate) fact that inner products count something!