I am studying for a test and trying to answer some questions about $T_A$, but the answers I have come up with are very computational, and I was wondering if there is a faster way to attack this problem than the method I will describe.

Part 1) What are $\dim(\ker(T_A))$ and $\dim(\operatorname{Image}(T_A))$?

I was thinking there might be some slick way to use the rank–nullity theorem without actually having to do any computation, but so far my only answer involves calculating the matrix expressions:

This gives us the eigenvectors of $T_A$, with the eigenvalue $\lambda = 1$ repeated four times.

We see that the characteristic polynomial of $T_A$ is $(x-1)^4$.

Question 1) Is there a simple way to do any of part 1) or part 2) without resorting to all the matrix calculations? In particular, I also want to compute the minimal polynomial of $T_A$ without having to compute powers of $(AB-BA-I)$.

Question 2) Using the structure of the eigenvectors obtained in part 2), can we list a basis in which $T_A$ is diagonalizable?

Perhaps it's just little tricks; it seems like a computational question. You can look at basis elements for your space: there are 16 of them, each with fifteen 0s and one 1. The diagonal ones commute with $A$ (in fact, diagonal matrices always commute with one another). To see that the others don't: left multiplication by an invertible matrix is a row operation, and right multiplication is a column operation. This one's easy: $B\mapsto AB$ multiplies the first row by 1, the second by 2, etc. So the others won't commute (try it). For part 2, your answer looks like just about the only sensible thing to me. Sorry!
– Billy, Aug 1 '11 at 18:11


Oh, except you've only found the eigenvectors corresponding to the value 1. There are eigenvectors of eigenvalue 2, too...
– Billy, Aug 1 '11 at 18:14


Billy's basis is the one to use. If $E_{ij}$ is the matrix with 1 in the $(i,j)$ position, $T_A(E_{ij}) = (i-j) E_{ij}$. These are all the eigenvectors you need.
– Robert Israel, Aug 1 '11 at 18:37

Does this show that $T_A$ is diagonalizable?
– user7980, Aug 1 '11 at 20:03

1 Answer

Your answer to part 1 is fine, but in part 2, since $T_A$ is a linear mapping defined on $\mathbb{C}^{4\times 4}$, which is a vector space of dimension 16, the characteristic polynomial should have degree 16, not degree 4.

The eigenvectors for the other eigenvalues can be found in a similar manner. In summary, the characteristic polynomial of $T_A$ should be $p(x)=(x+3)(x+2)^2(x+1)^3x^4(x-1)^3(x-2)^2(x-3)$, or equivalently, $p(x)=x^4(x^2-1)^3(x^2-4)^2(x^2-9)$. And the minimal polynomial is $m(x)=x(x^2-1)(x^2-4)(x^2-9)$.
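If you want to double-check these multiplicities numerically, here is a short NumPy sketch (not part of the original argument; it assumes $A=\operatorname{diag}(1,2,3,4)$, as Billy's comment suggests). It builds the $16\times16$ matrix of $T_A$ column by column from the basis matrices $E_{ij}$:

```python
import numpy as np

# Assumption: A = diag(1, 2, 3, 4), as suggested by Billy's comment.
A = np.diag([1.0, 2.0, 3.0, 4.0])

# Build the 16x16 matrix of T_A(B) = AB - BA by applying it to the basis
# matrices E_ij (1 in position (i, j), 0 elsewhere), with columns ordered
# the same way vec() stacks them (column-major).
cols = []
for j in range(4):
    for i in range(4):
        E = np.zeros((4, 4))
        E[i, j] = 1.0
        cols.append((A @ E - E @ A).flatten(order='F'))
M = np.column_stack(cols)

# The eigenvalues should be i - j for 1 <= i, j <= 4:
# 0 four times, +/-1 three times each, +/-2 twice each, +/-3 once each.
eigvals = np.round(np.sort(np.linalg.eigvals(M).real)).astype(int)
print(eigvals.tolist())
# [-3, -2, -2, -1, -1, -1, 0, 0, 0, 0, 1, 1, 1, 2, 2, 3]
```

The multiplicities match the factors of $p(x)=x^4(x^2-1)^3(x^2-4)^2(x^2-9)$ above.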

One thing that may be confusing is that the "eigenvectors" in this problem are actually square matrices, unlike the usual case where the "vectors" we talk about are genuine column vectors. Still, your problem can be recast into the familiar form, but you need some knowledge of the Kronecker product (a.k.a. tensor product). You also need to turn the $4\times4$ matrix $B$ into a 16-vector ${\rm vec}(B)$. This is done by stacking up the columns of $B$, i.e. we represent $B$ by the following column vector:
$$
{\rm vec}(B)=(b_{11}, b_{21}, b_{31}, b_{41}, b_{12}, b_{22}, b_{32}, b_{42}, b_{13}, b_{23}, b_{33}, b_{43}, b_{14}, b_{24}, b_{34}, b_{44})^T.
$$
(Note the transpose at the end of the above, so ${\rm vec}(B)$ is a column vector, not a row vector.) In general, for a linear mapping of the form $T(B)=ABC$, we have ${\rm vec}(T(B)) = (C^T\otimes A){\rm vec}(B)$. In other words, if we represent both $B$ and $T(B)$ in vector forms (instead of in matrix forms), then the matrix representation of $T$ is given by $C^T\otimes A$.
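As a quick sanity check of the identity ${\rm vec}(ABC)=(C^T\otimes A){\rm vec}(B)$, here is a small NumPy sketch with random matrices (the variable names are mine, not part of the answer). Note that NumPy flattens row-major by default, so column stacking needs `order='F'`:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
B = rng.standard_normal((4, 4))
C = rng.standard_normal((4, 4))

def vec(X):
    # Stack the columns of X into one long column vector.
    return X.flatten(order='F')

# vec(ABC) = (C^T kron A) vec(B)
lhs = vec(A @ B @ C)
rhs = np.kron(C.T, A) @ vec(B)
print(np.allclose(lhs, rhs))  # True

# Consequently, vec(AB - BA) = (I kron A - A^T kron I) vec(B),
# so the matrix representation of T_A is I kron A - A^T kron I.
I = np.eye(4)
M = np.kron(I, A) - np.kron(A.T, I)
print(np.allclose(vec(A @ B - B @ A), M @ vec(B)))  # True
```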

I hope this is concrete enough. For a more "abstract" or "algebraic" method, see Robert Israel's comment above.

A final note: in the above characteristic polynomial and minimal polynomial, you may substitute for $x$ the matrix representation of $T_A$, i.e. $(I\otimes A - A^T\otimes I)$, to obtain a zero matrix of order 16.
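That substitution can also be sketched in NumPy (again assuming $A=\operatorname{diag}(1,2,3,4)$): since each distinct eigenvalue contributes one linear factor, $m(M)$ is the product of $(M-kI)$ for $k=-3,\dots,3$, and it does come out as the zero matrix of order 16:

```python
import numpy as np

# Assumption: A = diag(1, 2, 3, 4).
A = np.diag([1.0, 2.0, 3.0, 4.0])
I4 = np.eye(4)

# Matrix representation of T_A, as in the answer above.
M = np.kron(I4, A) - np.kron(A.T, I4)
I16 = np.eye(16)

# m(M) = M (M^2 - I)(M^2 - 4I)(M^2 - 9I), written as the product of
# (M - k I) over the distinct eigenvalues k = -3, ..., 3.
mM = I16.copy()
for k in range(-3, 4):
    mM = mM @ (M - k * I16)
print(np.allclose(mM, 0))  # True

# p(M) = 0 as well (Cayley-Hamilton); p just repeats these factors
# according to the multiplicity of each eigenvalue.
```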

Did you find the minimal polynomial without computing $m(T_A)$?
– user7980, Aug 1 '11 at 20:02


Yes. For an eigenvalue problem $Mv = \lambda v$, if $M$ is diagonalizable and $\lambda_1, \ldots, \lambda_k$ are the distinct eigenvalues of $M$, the minimal polynomial is simply $m(x)=(x-\lambda_1)\cdots(x-\lambda_k)$; that is, each distinct eigenvalue contributes only a linear factor to the minimal polynomial. This holds in general. When $M$ is not diagonalizable, things get nastier, but in our case $M=(I\otimes A - A^T\otimes I)$ is already a diagonal matrix.
– user1551, Aug 1 '11 at 23:06