The problem is that I don't know what is meant by "invariant under the transformation." Another question asks for 1-dimensional subspaces of R^2 under the operation of the matrix [tex]\left[ {\begin{array}{*{20}c}
1 & 0 \\
2 & 0 \\
\end{array}} \right][/tex] .

The answer is span(0,1) and span(1,2). I don't know why those are the answers, but with a little bit of guesswork I decided to find the eigenspaces of the matrix and found that the two eigenspaces were exactly those answers. So is there some relationship between eigenspaces/eigenvectors/eigenvalues and invariance under transformations?

Hi, can someone help me get started on the following question?
Q. Show that there is no line in the real plane R^2 through the origin which is invariant under the transformation whose matrix is:
[tex]
A\left( \theta \right) = \left[ {\begin{array}{*{20}c}
{\cos \theta } & { - \sin \theta } \\
{\sin \theta } & {\cos \theta } \\
\end{array}} \right]
[/tex]
The problem is that I don't know what is meant by "invariant under the transformation."


"Invariant under the transformation" means that the transformation takes every vector in the subspace to a vector in that same subspace. I assume that [tex]\theta[/tex] here is not a multiple of [tex]\pi[/tex]: if it is a multiple of 2[tex]\pi[/tex] the transformation is the identity, and if it is an odd multiple of [tex]\pi[/tex] it is minus the identity; in either case every subspace is invariant. Every non-vertical line through the origin in the real plane can be written in the form y = mx or, in vector form,
(x, mx), for fixed m. Multiply that by the given matrix. To be in the same subspace, the result must be of the form (x, mx) with the same m.
That covers non-vertical lines. How are vectors in the vertical line through the origin written?
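A quick numeric check of that suggestion (a minimal sketch in Python; the angle 0.7 and the slope 3 are arbitrary choices of mine, not from the question): a line span(1, m) is invariant exactly when its image under the rotation is parallel to (1, m).

```python
import math

def rotate(theta, v):
    """Apply the rotation matrix A(theta) to a 2-vector v."""
    x, y = v
    return (math.cos(theta) * x - math.sin(theta) * y,
            math.sin(theta) * x + math.cos(theta) * y)

def is_parallel(u, v, tol=1e-12):
    """Two plane vectors are parallel iff u_x*v_y - u_y*v_x = 0."""
    return abs(u[0] * v[1] - u[1] * v[0]) < tol

# The line y = 3x is spanned by (1, 3); rotating by 0.7 rad moves it off itself.
v = (1.0, 3.0)
print(is_parallel(v, rotate(0.7, v)))       # False: the line is not invariant
print(is_parallel(v, rotate(math.pi, v)))   # True: rotation by pi maps the line to itself
```

The rotation-by-pi case illustrates why only multiples of [tex]\pi[/tex] need to be excluded: (1, 3) is sent to (-1, -3), a different vector but on the same line.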

Another question asks for 1-dimensional subspaces of R^2 under the operation of the matrix [tex]\left[ {\begin{array}{*{20}c}
1 & 0 \\
2 & 0 \\
\end{array}} \right][/tex] .
The answer is span(0,1) and span(1,2). I don't know why those are the answers, but with a little bit of guesswork I decided to find the eigenspaces of the matrix and found that the two eigenspaces were exactly those answers. So is there some relationship between eigenspaces/eigenvectors/eigenvalues and invariance under transformations?


Do you mean subspaces that are invariant under the transformation?
A one-dimensional subspace is spanned by a single vector, say v0, so every vector in it can be written kv0 for some scalar k. Saying that the subspace is invariant under the transformation A means A(kv0) = k'v0 (with k' not necessarily equal to k). But A(kv0) = kA(v0), so for k not 0 this is the same as kA(v0) = k'v0, or A(v0) = (k'/k)v0. Do you see the connection with eigenvectors?
You could answer this the same way as before. Write the vectors in some one-dimensional subspace as (x, mx) and apply the matrix:
(x + 0(mx), 2x + 0(mx)) = (x, 2x). For that subspace to be invariant, we must have m = 2; that is, vectors of the form x(1, 2), so that (1, 2) spans the space. Notice that the transformation then maps each vector (x, 2x) to itself: this is the eigenspace corresponding to eigenvalue 1.
Every non-vertical line is a subspace of vectors which can be written (x, mx). The vertical line is a subspace of vectors which can be written (0, y). Applying A to such a vector gives (0, 0). That's a single vector (the vertical line is the kernel of this linear transformation), but it still lies in the vertical line, so the set of all vectors of the form (0, y) = y(0, 1), spanned by (0, 1), is an invariant subspace. It is the eigenspace corresponding to eigenvalue 0.
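To tie this back to the guesswork in the question: the invariant lines can also be found numerically (a sketch using NumPy's eigendecomposition; the order of the eigenvalues it returns is not guaranteed, hence the sort).

```python
import numpy as np

# The matrix from the question: it sends (x, y) to (x, 2x).
A = np.array([[1.0, 0.0],
              [2.0, 0.0]])

vals, vecs = np.linalg.eig(A)   # columns of vecs are the eigenvectors
print(np.sort(vals))            # eigenvalues 0 and 1

# Each eigenvector spans an invariant line: A keeps it on its own span.
for lam, v in zip(vals, vecs.T):
    print(np.allclose(A @ v, lam * v))   # True for both
```

The eigenvector for eigenvalue 0 is parallel to (0, 1) and the one for eigenvalue 1 is parallel to (1, 2), matching the answers span(0,1) and span(1,2).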

Yeah, at the end of the question it asks for the case of integer multiples of 2 pi to be considered separately, and the given answers for that part are in accordance with your comment on that case. Also, the question with the 2 by 2 matrix was indeed asking about invariant subspaces, as you suspected; I forgot to include that part.

When I do the matrix multiplication with the rotation matrix and (x, mx) I get something which I don't really know how to interpret. From what you've said, I infer that since the question is talking about lines in the real plane through the origin, I can express every non-vertical line through the origin as <(1, m)> for some fixed m.

The product can't be expressed in the form (x, mx)^T with the same m, so that appears to take care of the case of non-vertical lines, but I can't think of a precise way of concluding this. Intuitively, the result seems to make sense: if I take a line, for example y = 3x, and rotate it through some angle (which isn't a multiple of 2 pi), the resulting line will no longer be 'parallel' to y = 3x.
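One precise way to finish the non-vertical case is to carry out the multiplication with the parametrization (1, m)^T and impose the invariance condition directly:

[tex]
A\left( \theta \right)\left[ {\begin{array}{*{20}c}
1 \\
m \\
\end{array}} \right] = \left[ {\begin{array}{*{20}c}
{\cos \theta - m\sin \theta } \\
{\sin \theta + m\cos \theta } \\
\end{array}} \right]
[/tex]

For the image to lie on y = mx, the second component must be m times the first:

[tex]
\sin \theta + m\cos \theta = m\left( {\cos \theta - m\sin \theta } \right)\; \Leftrightarrow \;\left( {1 + m^2 } \right)\sin \theta = 0
[/tex]

Since 1 + m^2 > 0, this forces [tex]\sin \theta = 0[/tex], i.e. [tex]\theta[/tex] a multiple of [tex]\pi[/tex]; for any other angle no non-vertical line is invariant.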

In vector form, vertical lines can be expressed as (0, y) for a real parameter y, as you stated somewhere in your post I think. Again, the matrix multiplication yields a 2 by 1 column vector which isn't a scalar multiple of (0, y) when sin θ ≠ 0. So that takes care of the case of vertical lines, I'd say.

As for the connection with eigenvectors (for the 2 by 2 matrix), I know that k is an eigenvalue of a transformation T if I have T(v) = kv for some nonzero vector v. So in that sense, if I am looking for one-dimensional subspaces invariant under a transformation, I can see the connection with eigenvectors: I am simply looking for the subspaces to which such a vector v belongs, and those are the eigenspaces. I think that's it.

I have just one other question. Two matrices are row equivalent if their rows span the same space. If I have two row-equivalent matrices, can I say that they have the same solution space? I'm asking because, if so, it would allow for shortcuts in calculating eigenspaces.