As addressed in this past question, there are many applications of linear algebra to combinatorics. What about examples of applications of exterior algebras? Part 4 here is one such example. What are the heuristics for when to apply them and the intuition for how to do so?

To the extent that "combinatorics" includes the theory of representations of the symmetric group, exterior algebra must be relevant because the easiest irreducible representations to describe are the exterior powers of the $n-1$ dimensional representation corresponding to the partition $n-1, 1$.
– Noam D. Elkies, Jun 2 '12 at 19:43

6 Answers

This is a bit of cheating (since it requires knowing a bit of homological algebra), but it is too nice not to be mentioned:

Let's notice that the generating function for the dimensions of the graded components of the exterior algebra on elements of degrees $a_1,\ldots,a_k,\ldots$ is $\prod_i (1+t^{a_i})$;

Let's recall that for a Lie algebra $\mathfrak{g}$ the exterior algebra $\Lambda^\bullet(\mathfrak{g})$ is equipped with the Chevalley--Eilenberg differential, making it into a chain complex. If $\mathfrak{g}$ is graded with basis elements of degrees $a_1,\ldots,a_k,\ldots$, the generating function of the Euler characteristics of the graded components of this chain complex is obtained from the previous formula by incorporating signs, giving $\prod_i (1-t^{a_i})$;

Of course, if we know the homology of the Chevalley--Eilenberg complex, we can compute that generating function in a different way and obtain a combinatorial identity. Some famous identities arise this way for infinite-dimensional Lie algebras. For instance, one can prove Euler's pentagonal number theorem by considering $\mathfrak{g}=L_1(1)$, the algebra of vector fields on the line that vanish twice at zero. More generally, for nilpotent subalgebras of Kac--Moody algebras one obtains Macdonald's identities; see e.g. "Lie algebra homology and the Macdonald-Kac formulas" by Garland and Lepowsky.
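(The pentagonal number theorem mentioned above is easy to sanity-check by machine. The following sketch, not part of the original answer, expands $\prod_{n\ge 1}(1-t^n)$ as a truncated formal power series and compares it with $\sum_{k\in\mathbb{Z}} (-1)^k t^{k(3k-1)/2}$.)

```python
N = 50  # truncation order for formal power series in t

# Left side: expand prod_{n>=1} (1 - t^n) modulo t^(N+1).
lhs = [0] * (N + 1)
lhs[0] = 1
for n in range(1, N + 1):
    # multiply the current series by (1 - t^n), in place, high degree first
    for d in range(N, n - 1, -1):
        lhs[d] -= lhs[d - n]

# Right side: Euler's pentagonal number theorem,
#   sum over k in Z of (-1)^k t^{k(3k-1)/2}.
rhs = [0] * (N + 1)
for k in range(-N, N + 1):
    e = k * (3 * k - 1) // 2
    if 0 <= e <= N:
        rhs[e] += -1 if k % 2 else 1

print(lhs == rhs)  # True: the two series agree up to order N
```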

The exterior algebra arose naturally in a problem about enumerating perfect matchings of a planar graph (domino/lozenge tilings). In some situations, those correspond to families of nonintersecting lattice paths between a set of sources $S$ and a set of sinks $T$. The Lindström--Gessel--Viennot lemma says that the number of these families equals the determinant of a matrix $M$ whose $(i,j)$ entry is the number of paths from the $i$th element of $S$ to the $j$th element of $T$.
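(As a concrete illustration, not part of the original answer: lozenge tilings of a hexagon with sides $a,b,c$ correspond to $a$ nonintersecting lattice paths, and one standard choice of sources and sinks leads to the path-count matrix with entries $\binom{b+c}{c+i-j}$, whose determinant agrees with MacMahon's box formula. The function name below is my own.)

```python
from fractions import Fraction
from math import comb

def det(M):
    """Determinant by Gaussian elimination over exact rationals."""
    n = len(M)
    A = [[Fraction(x) for x in row] for row in M]
    sign = 1
    for i in range(n):
        p = next((r for r in range(i, n) if A[r][i] != 0), None)
        if p is None:
            return Fraction(0)
        if p != i:
            A[i], A[p] = A[p], A[i]
            sign = -sign
        for r in range(i + 1, n):
            f = A[r][i] / A[i][i]
            for c in range(i, n):
                A[r][c] -= f * A[i][c]
    result = Fraction(sign)
    for i in range(n):
        result *= A[i][i]
    return result

def hexagon_tilings(a, b, c):
    """Lindström–Gessel–Viennot count of lozenge tilings of an (a,b,c)-hexagon:
    determinant of the matrix of path counts C(b+c, c+i-j), i, j = 0..a-1."""
    M = [[comb(b + c, c + i - j) for j in range(a)] for i in range(a)]
    return det(M)

print(hexagon_tilings(1, 1, 1))  # 2
print(hexagon_tilings(2, 2, 2))  # 20, matching MacMahon's box formula
```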

Suppose you want to count the domino tilings of a modified region in which only some subset of $S$ is present, along with some subset of $T$ of the same size. This happens if you cut a large region into pieces along $S$ and $T$ and want to enumerate tilings of the larger region by the transfer-matrix method. If you index a matrix by subsets of $S$ and subsets of $T$, and put in each entry the number of domino tilings of the corresponding region, the result is the matrix of the action of $M$ on the exterior algebra.
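(Concretely, the action of $M$ on the $k$-th exterior power is the $k$-th compound matrix: its $(I,J)$ entry is the minor of $M$ on rows $I$ and columns $J$. The sketch below, not part of the original answer, builds it and checks functoriality, $\Lambda^k(AB)=\Lambda^k(A)\,\Lambda^k(B)$, which is the Cauchy--Binet formula; function names are my own.)

```python
from fractions import Fraction
from itertools import combinations

def minor(M, rows, cols):
    """Determinant of the submatrix of M on the given rows and columns."""
    sub = [[Fraction(M[r][c]) for c in cols] for r in rows]
    n = len(sub)
    sign = 1
    for i in range(n):
        p = next((r for r in range(i, n) if sub[r][i] != 0), None)
        if p is None:
            return Fraction(0)
        if p != i:
            sub[i], sub[p] = sub[p], sub[i]
            sign = -sign
        for r in range(i + 1, n):
            f = sub[r][i] / sub[i][i]
            for c in range(i, n):
                sub[r][c] -= f * sub[i][c]
    d = Fraction(sign)
    for i in range(n):
        d *= sub[i][i]
    return d

def exterior_power(M, k):
    """k-th compound matrix: rows/columns indexed by k-subsets, entries = minors."""
    idx = list(combinations(range(len(M)), k))
    return [[minor(M, I, J) for J in idx] for I in idx]

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

A = [[1, 2, 0], [0, 1, 3], [4, 0, 1]]
B = [[2, 1, 1], [0, 1, 0], [1, 0, 2]]

# Functoriality (Cauchy–Binet): the compound of a product is the
# product of the compounds.
print(exterior_power(matmul(A, B), 2)
      == matmul(exterior_power(A, 2), exterior_power(B, 2)))  # True
```

For $k$ equal to the size of the matrix the compound is the $1\times 1$ matrix $[\det M]$, which is why determinant identities fall out of this formalism.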

The main example of Koszul duality is the duality between the exterior algebra and the polynomial algebra; it is enough to obtain the classical MacMahon identity, as well as its generalization to certain matrices with non-commuting entries (Manin matrices).
There are further $q$- and super-analogs: everything is the same, one just needs to consider the appropriate versions of the exterior algebra.

Let me sketch the idea of the application, which is quite simple; it is related to Vladimir's answer above.

One knows that the Euler characteristic can be calculated either from the complex itself or from the (co)homology of the complex, with the same result.
This generalizes to the trace of an arbitrary operator acting on the complex, provided it commutes with the differential.

So the proof of the MacMahon formula exploits this idea for the Koszul complex, whose cohomology consists of $\mathbb{C}$ alone. The trace taken over the cohomology is therefore $1$, while the trace calculated via the complex itself gives $\det(1-A)\cdot\big(\sum_k \operatorname{Tr} S^k A\big)$.
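(A small numeric sanity check of the graded form of this identity, $\det(1-tA)\cdot\sum_k \operatorname{Tr}(S^k A)\,t^k = 1$, not part of the original answer. The traces of symmetric powers are computed independently from the power sums $p_j=\operatorname{Tr} A^j$ via the Newton-type recurrence $k\,h_k=\sum_{j=1}^{k} p_j h_{k-j}$; the example is restricted to a $2\times 2$ matrix so that $\det(1-tA)=1-\operatorname{Tr}(A)\,t+\det(A)\,t^2$ can be written down directly.)

```python
from fractions import Fraction

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

def trace(A):
    return sum(A[i][i] for i in range(len(A)))

# Any 2x2 matrix with commuting (here: rational) entries.
A = [[Fraction(1), Fraction(2)], [Fraction(3), Fraction(4)]]
N = 12  # truncation order in t

# Power sums p_j = Tr A^j, j = 1..N.
p = [Fraction(0)]  # p[0] unused
P = A
for j in range(1, N + 1):
    p.append(trace(P))
    P = matmul(P, A)

# Traces of symmetric powers h_k = Tr S^k A via the recurrence
#   k * h_k = sum_{j=1}^{k} p_j * h_{k-j}.
h = [Fraction(1)]
for k in range(1, N + 1):
    h.append(sum(p[j] * h[k - j] for j in range(1, k + 1)) / k)

# det(1 - tA) for a 2x2 matrix: 1 - Tr(A) t + det(A) t^2.
charpoly = [Fraction(1), -trace(A), A[0][0] * A[1][1] - A[0][1] * A[1][0]]

# The product det(1 - tA) * sum_k Tr(S^k A) t^k, truncated at order N.
prod = [Fraction(0)] * (N + 1)
for i, c in enumerate(charpoly):
    for k, hk in enumerate(h):
        if i + k <= N:
            prod[i + k] += c * hk

print(prod == [Fraction(1)] + [Fraction(0)] * N)  # True: the product is 1
```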

The advantage of this proof is that it can be generalized to matrices with non-commuting entries: Manin matrices, quantum matrices, super matrices, etc.
The standard proof via diagonalization does not work in such situations.

Probably the main thing about the exterior algebra is that it is intimately related to determinants of matrices, and it can be used efficiently to obtain statements involving determinants, such as $\det(AB)=\det(A)\det(B)$, Cramer's formula for $A^{-1}$, the Plücker relations, the Jacobi ratio theorem, etc. (Example on MO).

On the other hand, there is a certain number of papers from the combinatorial community
(names include G.-C. Rota, D. Zeilberger, D. Foata) proving similar relations by combinatorial means.

I hope this might be considered an application, or at least a relation.

Before giving some concrete examples, let me mention that both techniques (combinatorial and exterior-algebraic) allow one to handle matrices with non-commuting entries, which is of certain interest in representation theory and quantum integrable systems.

Example 2. Matjaž Konvalinka (at that time a student of Igor Pak at MIT) obtained combinatorial proofs of the so-called Jacobi and Sylvester theorems
for certain matrices with not-necessarily-commuting entries (Manin matrices).

(These papers are motivated by another paper of Zeilberger and Konvalinka; I have already discussed this in another answer to this question.)