Mathematics for the interested outsider

Diagonal Matrices

Each basis vector is an eigenvector, with the eigenvalues listed down the diagonal. It’s straightforward to show that the sum and product of two diagonal matrices are themselves diagonal. Thus, diagonal matrices form a further subalgebra inside the algebra of upper-triangular matrices. This algebra is just a direct sum of copies of the base field, with multiplication defined component-by-component.
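A quick numerical sketch of this (not from the original post, using NumPy with some illustrative eigenvalues): the sum and product of two diagonal matrices are again diagonal, and the product works component-by-component on the diagonals, just as in a direct sum of copies of the base field.

```python
import numpy as np

# Two diagonal matrices, built from their lists of eigenvalues
# (illustrative values chosen for this sketch).
D1 = np.diag([2.0, 3.0, 5.0])
D2 = np.diag([7.0, 11.0, 13.0])

def is_diagonal(M):
    """True when every off-diagonal entry of M is (numerically) zero."""
    return np.allclose(M, np.diag(np.diag(M)))

# The sum and product are themselves diagonal...
S = D1 + D2
P = D1 @ D2
assert is_diagonal(S) and is_diagonal(P)

# ...and the product just multiplies corresponding diagonal entries,
# i.e. multiplication is defined component-by-component.
assert np.allclose(np.diag(P), np.diag(D1) * np.diag(D2))
```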

Diagonal matrices are especially nice because it’s really easy to see how they act on vectors. Given a diagonal matrix $D$ with diagonal entries $\lambda_i$, break a vector $v$ into its components $v_i$. Multiply each component by the corresponding eigenvalue: $(Dv)_i = \lambda_i v_i$. And you’re done! Composing a diagonal matrix with another matrix $A$ is also easy. To find $DA$, just multiply each row of $A$ by the corresponding eigenvalue. To find $AD$, multiply each column of $A$ by the corresponding eigenvalue.
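These three facts are easy to check numerically. A short sketch (my example, not the post’s) with a diagonal matrix built from some assumed eigenvalues:

```python
import numpy as np

rng = np.random.default_rng(0)
lam = np.array([2.0, -1.0, 0.5])   # assumed diagonal entries (eigenvalues)
D = np.diag(lam)

v = rng.standard_normal(3)         # an arbitrary vector
A = rng.standard_normal((3, 3))    # an arbitrary matrix

# D acting on a vector scales each component by its eigenvalue.
assert np.allclose(D @ v, lam * v)

# DA scales each *row* of A by the corresponding eigenvalue...
assert np.allclose(D @ A, lam[:, None] * A)

# ...while AD scales each *column* of A.
assert np.allclose(A @ D, lam[None, :] * A)
```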

So, if we can find a basis for our vector space consisting only of eigenvectors for the transformation , then with respect to that basis the matrix of is diagonal. This is as good as we can hope for, and a lot of linear algebra comes down to determining when we can do this.
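To see what this looks like concretely, here is a small numerical sketch (my own example, not from the post): a diagonalizable transformation $T$, written in the standard basis, becomes diagonal once we rewrite it in a basis of its eigenvectors.

```python
import numpy as np

# A transformation written in the standard basis (eigenvalues 5 and 2).
T = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# Find a basis of eigenvectors: they form the columns of P.
eigenvalues, P = np.linalg.eig(T)

# With respect to that basis, the matrix of T is diagonal,
# with the eigenvalues listed down the diagonal.
D = np.linalg.inv(P) @ T @ P
assert np.allclose(D, np.diag(eigenvalues))
```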

[…] We’re now ready to characterize those transformations on complex vector spaces which have a diagonal matrix with respect to some basis. First of all, such a transformation must be normal. If we have a […]

About this weblog

This is mainly an expository blath, with occasional high-level excursions, humorous observations, rants, and musings. The main-line exposition should be accessible to the “Generally Interested Lay Audience”, as long as you trace the links back towards the basics. Check the sidebar for specific topics (under “Categories”).

I’m in the process of tweaking some aspects of the site to make it easier to refer back to older topics, so try to make the best of it for now.