Mathematics for the interested outsider

We continue working over the field of real numbers. Again, let $T:V\to V$ be a linear transformation from a real vector space $V$ of finite dimension $d$ to itself. We want to find the characteristic polynomial of this linear transformation.

When we had an algebraically closed field, this was easy. We took an upper-triangular matrix, and then the determinant was just the product down the diagonal. This gave one factor of the form $(\lambda-\lambda_i)$ for each diagonal entry $\lambda_i$, which established that the diagonal entries of an upper-triangular matrix were exactly the eigenvalues of the linear transformation.
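We can check this fact symbolically. Here is a small sketch using SymPy (not part of the original discussion; the matrix entries are arbitrary illustrative values):

```python
import sympy as sp

lam = sp.symbols('lam')

# An arbitrary 3x3 upper-triangular matrix; the entries are illustrative.
T = sp.Matrix([[2, 5, -1],
               [0, 3,  7],
               [0, 0, -4]])

# The characteristic polynomial det(lam*I - T) ...
char_poly = sp.expand((lam * sp.eye(3) - T).det())

# ... equals the product of (lam - t_ii) over the diagonal entries.
product_form = sp.expand(sp.prod([lam - T[i, i] for i in range(3)]))

print(char_poly)  # lam**3 - lam**2 - 14*lam + 24
assert char_poly == product_form
```

The roots of the product form are visibly $2$, $3$, and $-4$: the diagonal entries are the eigenvalues.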

Now we don’t always have an upper-triangular matrix, but we can always find a matrix that’s almost upper-triangular. That is, one that looks like

$\begin{pmatrix}A_1&&\ast\\&\ddots&\\0&&A_m\end{pmatrix}$

where the blocks $A_i$ are all either $1\times1$ matrices

$\begin{pmatrix}\lambda_i\end{pmatrix}$

or $2\times2$ matrices

$\begin{pmatrix}a&b\\c&d\end{pmatrix}$
In this latter case, we define $\tau$ to be the trace $a+d$, and $\delta$ to be the determinant $ad-bc$. We must find that $\tau^2-4\delta<0$, otherwise we can find another basis which breaks this block up into two $1\times1$ blocks. Let’s go a step further and insist that all the $2\times2$ blocks show up first, followed by all the $1\times1$ blocks.
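This almost upper-triangular shape is what numerical libraries compute as the real Schur form. A sketch using SciPy (the sample matrix is arbitrary, and SciPy makes no promise about the order in which the blocks appear):

```python
import numpy as np
from scipy.linalg import schur

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))  # a random real matrix, illustrative only

# output='real' requests the quasi-triangular form: upper-triangular except
# for 2x2 blocks on the diagonal, one for each conjugate pair of complex
# eigenvalues of A.
T, Z = schur(A, output='real')

# Everything below the first subdiagonal is (numerically) zero, so the only
# possible bumps below the diagonal come from the 2x2 blocks.
assert np.allclose(np.tril(T, k=-2), 0)

# Z is orthogonal and A = Z T Z^T, so T represents the same transformation
# in a new basis.
assert np.allclose(Z @ T @ Z.T, A)
```

Here the basis change is even orthogonal, which is more than the argument above needs.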

Now we can start calculating the determinant of $\lambda I-T$, summing over permutations. Just like we saw with an upper-triangular matrix, if we have a $1\times1$ block in the lower-right we have to choose the rightmost entry in the bottom row, or the whole term will be zero. So we start racking up factors $(\lambda-\lambda_i)$ just like before. Each $1\times1$ block, then, gives us a root of the characteristic polynomial, which is an eigenvalue. So far everything is the same as in the upper-triangular case.

Once we get to the $2\times2$ blocks we have to be a bit more careful. We have two choices of a nonzero entry in the lowest row: $\lambda-d$ or $-c$. But if we choose $\lambda-d$ then we can only choose $\lambda-a$ on the next row up to have a chance of a nonzero term. On the other hand, if we choose $-c$ on the lowest row we are forced to choose $-b$ next. The choice between these two is independent of any other choices we might make in calculating the determinant. The first always gives a factor of $(\lambda-a)(\lambda-d)$ to the term corresponding to that permutation, while the second always gives a factor of $(-b)(-c)=bc$ to its term. These permutations (no matter what other choices we might make) differ by exactly one swap, and so they enter the determinant with opposite signs.

Now we can collect together all the permutations where we make one choice in block $i$, and all the permutations where we make the other choice. From the first collection we can factor out $(\lambda-a)(\lambda-d)$, and from the second we can factor out $-bc$, the sign coming from the extra swap. What remains after we pull these factors out is the same in either case, so the upshot is that the block contributes a factor of $(\lambda-a)(\lambda-d)-bc$ to the determinant. Some calculation simplifies this:

$(\lambda-a)(\lambda-d)-bc=\lambda^2-(a+d)\lambda+(ad-bc)=\lambda^2-\tau\lambda+\delta$

which is a quadratic factor with no real roots (since we assumed that $\tau^2-4\delta<0$).
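The simplification can be checked symbolically; a quick sketch in SymPy:

```python
import sympy as sp

lam, a, b, c, d = sp.symbols('lam a b c d')

block = sp.Matrix([[a, b],
                   [c, d]])

tau = block.trace()    # a + d
delta = block.det()    # a*d - b*c

# The 2x2 block's total contribution to det(lam*I - T):
contribution = sp.expand((lam - a) * (lam - d) - b * c)

# It is exactly lam**2 - tau*lam + delta.
assert contribution == sp.expand(lam**2 - tau * lam + delta)
```

Nothing here depends on the off-diagonal entries individually: only the trace and determinant of the block survive into the characteristic polynomial.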

But a factor of the characteristic polynomial of this form is exactly what we defined to be an eigenpair. That is, just as eigenvalues (roots of the characteristic polynomial) correspond to one-dimensional invariant subspaces, so too do eigenpairs (irreducible quadratic factors of the characteristic polynomial) correspond to two-dimensional invariant subspaces. The $2\times2$ blocks that show up along the diagonal of the almost upper-triangular matrix give rise to the eigenpairs of $T$.
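To make the correspondence concrete, here is a numerical sketch (the block values are illustrative, not from the post) checking that a $2\times2$ block with $\tau^2-4\delta<0$ has a complex-conjugate pair of eigenvalues, namely the roots of $\lambda^2-\tau\lambda+\delta$:

```python
import numpy as np

# An illustrative 2x2 block: tau = 2, delta = 5, so tau^2 - 4*delta = -16 < 0,
# and the block cannot be split into two real 1x1 blocks.
A = np.array([[1.0, -2.0],
              [2.0,  1.0]])

tau = np.trace(A)
delta = np.linalg.det(A)
assert tau**2 - 4 * delta < 0

# The eigenvalues of the block ...
eigs = np.linalg.eigvals(A)

# ... are exactly the (complex conjugate) roots of lam^2 - tau*lam + delta.
roots = np.roots([1.0, -tau, delta])
assert np.allclose(sorted(eigs, key=lambda z: z.imag),
                   sorted(roots, key=lambda z: z.imag))
```

Over the complex numbers this block would diagonalize; over the reals the best we can do is keep the plane it acts on as a two-dimensional invariant subspace.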
