
Section 5-3 : Review : Eigenvalues & Eigenvectors

If you get nothing out of this quick review of linear algebra you must get this section. Without this section you will not be able to do any of the differential equations work that is in this chapter.

So, let’s start with the following. If we multiply an \(n \times n\) matrix by an \(n \times 1\) vector we will get a new \(n \times 1\) vector back. In other words,

\[A\,\vec \eta = \vec y\]

What we want to know is if it is possible for the following to happen. Instead of just getting a brand new vector out of the multiplication is it possible instead to get the following,

\[A\,\vec \eta = \lambda \vec \eta \tag{1} \label{eq:eq1}\]

In other words, is it possible, at least for certain \(\lambda \) and \(\vec \eta \), to have matrix multiplication be the same as just multiplying the vector by a constant? Of course, we probably wouldn’t be talking about this if the answer was no. So, it is possible for this to happen, however, it won’t happen for just any value of \(\lambda \) or \(\vec \eta \). If we do happen to have a \(\lambda \) and \(\vec \eta \) for which this works (and they will always come in pairs) then we call \(\lambda\) an eigenvalue of \(A\) and \(\vec \eta \) an eigenvector of \(A\).
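This defining relation is easy to check numerically for a concrete pair. Here is a minimal sketch in Python; the matrix and eigenpair below are made up purely for illustration (a diagonal matrix, whose eigenvalues are just its diagonal entries), not taken from the examples in this section.

```python
import numpy as np

# A made-up matrix whose eigenpairs are easy to read off: for a diagonal
# matrix the eigenvalues are the diagonal entries.
A = np.array([[2.0, 0.0],
              [0.0, 3.0]])
eta = np.array([1.0, 0.0])   # an eigenvector of A
lam = 2.0                    # the matching eigenvalue

# Matrix multiplication gives the same result as scaling eta by lam.
assert np.allclose(A @ eta, lam * eta)
```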

So, how do we go about finding the eigenvalues and eigenvectors for a matrix? Well first notice that if \(\vec \eta = \vec 0\) then \(\eqref{eq:eq1}\) is going to be true for any value of \(\lambda \) and so we are going to make the assumption that \(\vec \eta \ne \vec 0\). With that out of the way let’s rewrite \(\eqref{eq:eq1}\) a little.

\[\begin{align*}A\,\vec \eta - \lambda \vec \eta & = \vec 0\\ A\,\vec \eta - \lambda I\vec \eta & = \vec 0\\ \left( {A - \lambda I} \right)\vec \eta & = \vec 0\end{align*}\]

Notice that before we factored out the \(\vec \eta \) we added in the appropriately sized identity matrix. This is equivalent to multiplying things by a one and so doesn’t change the value of anything. We needed to do this because without it we would have had the difference of a matrix, \(A\), and a constant, \(\lambda \), and this can’t be done. We now have the difference of two matrices of the same size which can be done.

So, the equation

\[\left( {A - \lambda I} \right)\vec \eta = \vec 0\]

is equivalent to \(\eqref{eq:eq1}\). In order to find the eigenvectors for a matrix we will need to solve a homogeneous system. Recall the fact from the previous section: we will either have exactly one solution (\(\vec \eta = \vec 0\)) or we will have infinitely many nonzero solutions. Since we’ve already said that we don’t want \(\vec \eta = \vec 0\) this means that we want the second case.

Knowing this will allow us to find the eigenvalues for a matrix. Recall from this fact that we will get the second case only if the matrix in the system is singular. Therefore, we will need to determine the values of \(\lambda \) for which we get,

\[\det \left( {A - \lambda I} \right) = 0\]

Once we have the eigenvalues we can then go back and determine the eigenvectors for each eigenvalue. Let’s take a look at a couple of quick facts about eigenvalues and eigenvectors.
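As a numerical cross-check, the roots of the characteristic polynomial must agree with the eigenvalues computed directly by a linear algebra routine. Here is a sketch with NumPy; the \(2 \times 2\) matrix below is a hypothetical one chosen for illustration (its eigenvalues happen to be \(-5\) and \(1\)).

```python
import numpy as np

# Hypothetical matrix; its eigenvalues turn out to be -5 and 1.
A = np.array([[ 2.0,  7.0],
              [-1.0, -6.0]])

# np.poly gives the coefficients of det(lambda*I - A), the characteristic
# polynomial, and np.roots finds its roots.
char_poly = np.poly(A)
roots = np.sort(np.roots(char_poly))

# The roots must match the eigenvalues computed directly.
eigenvalues = np.sort(np.linalg.eigvals(A))
assert np.allclose(roots, eigenvalues)
assert np.allclose(eigenvalues, [-5.0, 1.0])
```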

Fact

If \(A\) is an \(n \times n\) matrix then \(\det \left( {A - \lambda I} \right) = 0\) is an \(n^{\text{th}}\) degree polynomial equation in \(\lambda\). The polynomial \(\det \left( {A - \lambda I} \right)\) is called the characteristic polynomial. If we include repeated eigenvalues, \(A\) will have exactly \(n\) eigenvalues. Eigenvectors corresponding to distinct simple eigenvalues will be linearly independent, and an eigenvalue of multiplicity \(k\) will have anywhere from 1 to \(k\) linearly independent eigenvectors.

To find eigenvalues of a matrix all we need to do is solve a polynomial. That’s generally not too bad provided we keep \(n\) small. Likewise this fact also tells us that for an \(n \times n\) matrix, \(A\), we will have \(n\) eigenvalues if we include all repeated eigenvalues.

So, it looks like we will have two simple eigenvalues for this matrix, \({\lambda _{\,1}} = - 5\) and \({\lambda _{\,2}} = 1\). We will now need to find the eigenvectors for each of these. Also note that according to the fact above, the two eigenvectors should be linearly independent.

To find the eigenvectors we simply plug each eigenvalue into \(\left( {A - \lambda I} \right)\vec \eta = \vec 0\) and solve. So, let’s do that.

\({\lambda _{\,1}} = - 5\) :
In this case we need to solve the following system.

Solving this system will yield an infinite number of solutions. This is expected behavior. Recall that we picked the eigenvalues so that the matrix would be singular and so we would get infinitely many solutions.

Notice as well that we could have identified this from the original system. This won’t always be the case, but in the \(2 \times 2\) case we can see from the system that one row will be a multiple of the other and so we will get infinite solutions. From this point on we won’t be actually solving systems in these cases. We will just go straight to the equation and we can use either of the two rows for this equation.

Now, let’s get back to the eigenvector, since that is what we were after. In general then the eigenvector will be any vector that satisfies the following,

We really don’t want a general eigenvector however, so we will pick a value for \({\eta _{\,2}}\) to get a specific eigenvector. We can choose anything except \({\eta _{\,2}} = 0\), since that would give the zero vector and we’ve already assumed that the eigenvector is not zero. So, pick something that will make the eigenvector “nice”. Here’s the eigenvector for this eigenvalue.

Clearly both rows are multiples of each other and so we will get infinitely many solutions. We can choose to work with either row. We’ll run with the first row to avoid having too many minus signs floating around. Doing this gives us,

\[{\eta _1} + 7{\eta _2} = 0\hspace{0.25in}{\eta _1} = - 7{\eta _2}\]

Note that we can solve this for either of the two variables. However, with an eye towards working with these later on let’s try to avoid as many fractions as possible. The eigenvector is then,
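Any specific choice can always be verified by plugging back into \(A\vec \eta = \lambda \vec \eta \). Here is a quick sketch assuming a hypothetical matrix with eigenvalue \(1\) whose eigenvectors satisfy \({\eta _1} = - 7{\eta _2}\) (the matrix itself is an assumption, not reproduced from the text).

```python
import numpy as np

# Hypothetical matrix assumed to have eigenvalue 1 with eigenvectors
# satisfying eta1 = -7 * eta2.
A = np.array([[ 2.0,  7.0],
              [-1.0, -6.0]])
eta = np.array([-7.0, 1.0])   # pick eta2 = 1, so eta1 = -7

# Verify the eigenpair: A @ eta should equal 1 * eta.
assert np.allclose(A @ eta, 1.0 * eta)
```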

So, it looks like we’ve got an eigenvalue of multiplicity 2 here. Remember that the power on the factor in the characteristic polynomial gives the multiplicity.

Now, let’s find the eigenvector(s). This one is going to be a little different from the first example. There is only one eigenvalue so let’s do the work for that one. We will need to solve the following system,

Recall in the last example we decided that we wanted to make these as “nice” as possible and so should avoid fractions if we can. Sometimes, as in this case, we simply can’t so we’ll have to deal with it. In this case the eigenvector will be,

Note that by careful choice of the variable in this case we were able to get rid of the fraction that we had. In general it doesn’t much matter whether we do this or not. However, when we get back to differential equations it will be easier on us if we don’t have any fractions so we will usually try to eliminate them at this step.

Also, in this case we are only going to get a single (linearly independent) eigenvector. We can get other eigenvectors by choosing different values of \({\eta _{\,1}}\). However, each of these will be linearly dependent with the first eigenvector. If you’re not convinced of this, try it. Pick some values for \({\eta _{\,1}}\) to get a different vector and check to see if the two are linearly dependent.

Recall from the fact above that an eigenvalue of multiplicity \(k\) will have anywhere from 1 to \(k\) linearly independent eigenvectors. In this case we got one. For most of the \(2 \times 2\) matrices that we’ll be working with this will be the case, although it doesn’t have to be. We can, on occasion, get two.
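The difference between the two possibilities can be seen numerically: the number of linearly independent eigenvectors for \(\lambda \) is the dimension of the null space of \(A - \lambda I\). Here is a sketch using two made-up matrices, each having the single eigenvalue \(2\) with multiplicity 2.

```python
import numpy as np
from numpy.linalg import matrix_rank

def num_independent_eigenvectors(A, lam):
    # Geometric multiplicity: dimension of the null space of (A - lam*I),
    # computed via the rank-nullity theorem.
    n = A.shape[0]
    return n - matrix_rank(A - lam * np.eye(n))

# Both hypothetical matrices have the single eigenvalue 2 with multiplicity 2...
defective = np.array([[2.0, 1.0],
                      [0.0, 2.0]])
repeated = np.array([[2.0, 0.0],
                     [0.0, 2.0]])

# ...but they differ in how many linearly independent eigenvectors they have.
assert num_independent_eigenvectors(defective, 2.0) == 1
assert num_independent_eigenvectors(repeated, 2.0) == 2
```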

Now, it’s not super clear that the rows are multiples of each other, but they are. In this case we have,

\[{R_1} = - \frac{1}{2}\left( {3 + 5i} \right){R_2}\]

This is not something that you need to worry about; we just wanted to make the point. For the work that we’ll be doing later on with differential equations we will just assume that we’ve done everything correctly and we’ve got two rows that are multiples of each other. Therefore, all that we need to do here is pick one of the rows and work with it.

We’ll work with the second row this time.

\[2{\eta _1} + \left( {3 - 5i} \right){\eta _2} = 0\]

Now we can solve for either of the two variables. However, again looking forward to differential equations, we are going to need the “\(i\)” in the numerator so we will solve the equation in such a way that this will happen. Doing this gives,
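Whatever specific choice we end up with can be checked directly against the row equation above. A minimal sketch; the particular values \({\eta _2} = 2\) and \({\eta _1} = - 3 + 5i\) are an assumed choice for illustration, not necessarily the one made here.

```python
# Check that eta = (-3 + 5j, 2) satisfies 2*eta1 + (3 - 5j)*eta2 = 0.
# The specific choice eta2 = 2 (hence eta1 = -(3 - 5j)) is an assumption.
eta1 = -3 + 5j
eta2 = 2
residual = 2 * eta1 + (3 - 5j) * eta2
assert residual == 0
```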

There is a nice fact that we can use to simplify the work when we get complex eigenvalues. We need a bit of terminology first however.

If we start with a complex number,

\[z = a + bi\]

then the complex conjugate of \(z\) is

\[\overline{z} = a - bi\]

To compute the complex conjugate of a complex number we simply change the sign on the term that contains the “\(i\)”. The complex conjugate of a vector is just the conjugate of each of the vector’s components.
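In code this is a one-liner; a quick illustration (the numbers below are arbitrary):

```python
import numpy as np

z = 3 + 4j
assert np.conj(z) == 3 - 4j   # flip the sign on the imaginary part

# The conjugate of a vector is taken componentwise.
v = np.array([1 + 2j, 5 + 0j, -1j])
assert np.allclose(np.conj(v), np.array([1 - 2j, 5 + 0j, 1j]))
```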

We now have the following fact about complex eigenvalues and eigenvectors.

Fact

If \(A\) is an \(n \times n\) matrix with only real numbers and \({\lambda _{\,1}} = a + bi\) is an eigenvalue with eigenvector \({\vec \eta ^{\left( 1 \right)}}\), then \({\lambda _{\,2}} = \overline {{\lambda _{\,1}}} = a - bi\) is also an eigenvalue and its eigenvector is the conjugate of \({\vec \eta ^{\left( 1 \right)}}\).

This fact is something that you should feel free to use as you need to in our work.
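We can also see this fact numerically. In the sketch below the matrix is made up, chosen only because it is real but has complex eigenvalues (it rotates the plane by 90 degrees).

```python
import numpy as np

# A real matrix with complex eigenvalues (a 90-degree rotation).
A = np.array([[0.0, -1.0],
              [1.0,  0.0]])

vals, vecs = np.linalg.eig(A)
lam1, v1 = vals[0], vecs[:, 0]

# The conjugate of lam1 is also an eigenvalue...
assert np.any(np.isclose(vals, np.conj(lam1)))
# ...and the conjugate of v1 is an eigenvector for it.
assert np.allclose(A @ np.conj(v1), np.conj(lam1) * np.conj(v1))
```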

Now, we need to work one final eigenvalue/eigenvector problem. To this point we’ve only worked with \(2 \times 2\) matrices and we should work at least one that isn’t \(2 \times 2\). Also, we need to work one in which we get an eigenvalue of multiplicity greater than one that has more than one linearly independent eigenvector.

Example 4 Find the eigenvalues and eigenvectors of the following matrix.
\[A = \left( {\begin{array}{*{20}{c}}0&1&1\\1&0&1\\1&1&0\end{array}} \right)\]


Despite the fact that this is a \(3 \times 3\) matrix, it still works the same as the \(2 \times 2\) matrices that we’ve been working with. So, start with the eigenvalues.

So, we’ve got a simple eigenvalue and an eigenvalue of multiplicity 2. Note that we used the same method of computing the determinant of a \(3 \times 3\) matrix that we used in the previous section. We just didn’t show the work.

Let’s now get the eigenvectors. We’ll start with the simple eigenvalue.

Okay, in this case it is clear that all three rows are the same and so there isn’t any reason to actually solve the system since we can clear out the bottom two rows to all zeroes in one step. The equation that we get then is,

Notice the restriction this time. Recall that we only require that the eigenvector not be the zero vector. This means that we can allow one or the other of the two variables to be zero, we just can’t allow both of them to be zero at the same time!

What this means for us is that we are going to get two linearly independent eigenvectors this time. Here they are.

Now, when we talked about linearly independent vectors in the last section we only looked at \(n\) vectors each with \(n\) components. We can still talk about linear independence in this case however. Recall back when we did linear independence for functions: we saw at the time that if two functions were linearly dependent then they were multiples of each other. Well, the same thing holds true for vectors. Two vectors will be linearly dependent if they are multiples of each other.
In this case there is no way to get \({\vec \eta ^{\left( 2 \right)}}\) by multiplying \({\vec \eta ^{\left( 3 \right)}}\) by a constant. Therefore, these two vectors must be linearly independent.
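The multiple-of-each-other test can be automated with a rank computation. In this sketch the two specific vectors are an assumed pair of the kind described (not necessarily the ones from the example): stacked as rows, rank 1 would mean one is a multiple of the other (dependent), while rank 2 means linearly independent.

```python
import numpy as np
from numpy.linalg import matrix_rank

# Two assumed 3-component vectors; neither is a scalar multiple of the other.
eta2 = np.array([-1.0, 1.0, 0.0])
eta3 = np.array([-1.0, 0.0, 1.0])

# Rank 1 -> multiples of each other (dependent); rank 2 -> independent.
assert matrix_rank(np.vstack([eta2, eta3])) == 2
```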

So, summarizing up, here are the eigenvalues and eigenvectors for this matrix.
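As a closing sanity check, the matrix from Example 4 can be handed to NumPy. The eigenvalue values asserted below come from the numerical computation itself, not from the text: we should see one simple eigenvalue together with one of multiplicity 2.

```python
import numpy as np

# The matrix from Example 4.
A = np.array([[0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0],
              [1.0, 1.0, 0.0]])

# eigvalsh handles symmetric matrices and returns real eigenvalues
# in ascending order.
vals = np.linalg.eigvalsh(A)
assert np.allclose(vals, [-1.0, -1.0, 2.0])  # one simple, one double eigenvalue

# The vector (1, 1, 1) is an eigenvector for the simple eigenvalue 2.
eta = np.array([1.0, 1.0, 1.0])
assert np.allclose(A @ eta, 2.0 * eta)
```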