Since \(\mu\) and \(\lambda\) were assumed to be distinct eigenvalues, \(\lambda-\mu\) is non-zero, and so \(x\cdot y=0\). We have proved the following theorem.

Theorem

Eigenvectors of a symmetric matrix with distinct eigenvalues are orthogonal.

Example 129

The matrix \(M=\begin{pmatrix}2&1\\1&2\end{pmatrix}\) has eigenvalues determined by

\[\det(M-\lambda I)=(2-\lambda)^{2}-1=0.\]

So the eigenvalues of \(M\) are \(3\) and \(1\), and the associated eigenvectors turn out to be \(\begin{pmatrix}1\\1\end{pmatrix}\) and \(\begin{pmatrix}1\\-1\end{pmatrix}\). It is easily seen that these eigenvectors are orthogonal:

\[\begin{pmatrix}1\\1\end{pmatrix}\cdot\begin{pmatrix}1\\-1\end{pmatrix}=1\cdot 1+1\cdot(-1)=0.\]
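The computation in this example can also be checked numerically. The sketch below (an illustration, not part of the text) uses NumPy's `eigh` routine for symmetric matrices:

```python
import numpy as np

# The matrix from Example 129.
M = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# eigh is NumPy's eigensolver for symmetric (Hermitian) matrices;
# it returns the eigenvalues in ascending order.
eigenvalues, eigenvectors = np.linalg.eigh(M)
print(eigenvalues)   # [1. 3.]

# The eigenvectors (1,1) and (1,-1) have zero dot product.
v1 = np.array([1.0, 1.0])
v2 = np.array([1.0, -1.0])
print(v1 @ v2)       # 0.0
```

Note that `eigh` returns unit eigenvectors, so its output differs from \(\begin{pmatrix}1\\1\end{pmatrix}\) and \(\begin{pmatrix}1\\-1\end{pmatrix}\) only by normalization.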

In chapter 14 we saw that the matrix \(P\) built from any orthonormal basis \((v_{1},\ldots, v_{n} )\) for \(\mathbb{R}^{n}\) as its columns,

\[P=\begin{pmatrix}v_{1} & \cdots & v_{n}\end{pmatrix}\, ,\]

was an orthogonal matrix:

\[P^{-1}=P^{T}, \text{ or } PP^{T}=I=P^{T}P.\]

Moreover, given any (unit) vector \(x_{1}\), one can always find vectors \(x_{2}, \ldots, x_{n}\) such that \((x_{1},\ldots, x_{n})\) is an orthonormal basis. (Such a basis can be obtained using the Gram-Schmidt procedure.)
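The Gram-Schmidt procedure mentioned above can be sketched in a few lines. The function below (a hypothetical helper, written for illustration) extends a given unit vector \(x_{1}\) to an orthonormal basis by running Gram-Schmidt on \(x_{1}\) followed by the standard basis vectors, discarding any vector that projects to (numerically) zero:

```python
import numpy as np

def extend_to_orthonormal_basis(x1):
    """Extend the unit vector x1 to an orthonormal basis of R^n
    by Gram-Schmidt on (x1, e1, ..., en), dropping dependent vectors."""
    n = len(x1)
    basis = [x1]
    for e in np.eye(n):
        v = e.copy()
        for b in basis:
            v = v - (b @ v) * b      # subtract the projection onto b
        norm = np.linalg.norm(v)
        if norm > 1e-12:             # skip vectors that became zero
            basis.append(v / norm)
    return np.array(basis).T         # columns form the basis

x1 = np.array([1.0, 1.0, 0.0]) / np.sqrt(2)
P = extend_to_orthonormal_basis(x1)
print(np.allclose(P.T @ P, np.eye(3)))   # True: P is orthogonal
```

Since exactly one of the standard basis vectors is discarded (the one already spanned by the vectors collected so far), the result is a square orthogonal matrix whose first column is \(x_{1}\).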

Now suppose \(M\) is a symmetric \(n\times n\) matrix and \(\lambda_{1}\) is an eigenvalue with eigenvector \(x_{1}\) (this is always the case because every matrix has at least one eigenvalue--see review problem 3). Let the square matrix of column vectors \(P\) be the following:

\[P=\begin{pmatrix}x_{1} & x_{2} & \cdots & x_{n}\end{pmatrix},\]

where \(x_{1}\) through \(x_{n}\) are orthonormal, and \(x_{1}\) is an eigenvector for \(M\), but the others are not necessarily eigenvectors for \(M\). Then

\[MP=\begin{pmatrix}\lambda_{1}x_{1} & Mx_{2} & \cdots & Mx_{n}\end{pmatrix},\]

so, since the rows of \(P^{T}\) are \(x_{1}^{T},\ldots,x_{n}^{T}\) and the \(x_{i}\) are orthonormal,

\[P^{T}MP=\begin{pmatrix}\lambda_{1} & * & \cdots & *\\ 0 & * & \cdots & *\\ \vdots & \vdots & & \vdots\\ 0 & * & \cdots & *\end{pmatrix}=\begin{pmatrix}\lambda_{1} & 0 & \cdots & 0\\ 0 & & &\\ \vdots & & \hat{M} &\\ 0 & & &\end{pmatrix}.\]

The last equality follows since \(P^{T}MP\) is symmetric. The asterisks in the matrix are where “stuff” happens; this extra information is denoted by \(\hat{M}\) in the final expression. We know nothing about \(\hat{M}\) except that it is a symmetric \((n-1)\times(n-1)\) matrix. But then, by finding a unit eigenvector for \(\hat{M}\), we could repeat this procedure successively. The end result would be a diagonal matrix with the eigenvalues of \(M\) on the diagonal. Again, we have proved a theorem:

Theorem

Every symmetric matrix is similar to a diagonal matrix of its eigenvalues. In other words,

\[M=M^{T} \Leftrightarrow M=PDP^{T}\]

where \(P\) is an orthogonal matrix and \(D\) is a diagonal matrix whose entries are the eigenvalues of \(M\).
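The theorem can be illustrated numerically for a randomly generated symmetric matrix. In the sketch below, `np.linalg.eigh` plays the role of the repeated procedure above: it returns an orthogonal \(P\) of eigenvectors and the diagonal entries of \(D\):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((5, 5))
M = A + A.T                      # symmetric, so the theorem applies

# eigh returns the eigenvalues and an orthogonal matrix P whose
# columns are orthonormal eigenvectors of M.
eigenvalues, P = np.linalg.eigh(M)
D = np.diag(eigenvalues)

print(np.allclose(P @ P.T, np.eye(5)))   # True: P is orthogonal
print(np.allclose(M, P @ D @ P.T))       # True: M = P D P^T
```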

To diagonalize a real symmetric matrix, begin by building an orthogonal matrix from an orthonormal basis of eigenvectors:

Example 130

The symmetric matrix

\[M=\begin{pmatrix}2&1\\1&2\end{pmatrix}\, ,\]

has eigenvalues \(3\) and \(1\) with eigenvectors \(\begin{pmatrix}1\\1\end{pmatrix}\) and \(\begin{pmatrix}1\\-1\end{pmatrix}\) respectively. After normalizing these eigenvectors, we build the orthogonal matrix: