Proof about linear transformations

Hello everyone. I've been stuck on this for a while... Let V be a linear space of finite dimension and T:V->V a linear transformation. Show that T = rI, where r is some real number and I is the identity transformation, if and only if ToS = SoT for every linear transformation S:V->V.

Re: Proof about linear transformations

Hi,
Can you prove that the only n by n matrices that commute with every n by n matrix are the scalar matrices? (A matrix is scalar iff all off-diagonal elements are 0 and all elements on the main diagonal are equal.) Do you see that this answers your problem?
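By linearity it's enough to check commutation against the matrix units E_ij. Here's a quick numeric sanity check of that claim (not a proof — a NumPy sketch, all names my own):

```python
import numpy as np

# Numeric sanity check: a scalar matrix commutes with every n x n matrix,
# while a non-scalar one already fails against some matrix unit E_ij.
n = 4

def commutes_with_all_Eij(T):
    # By linearity, T commutes with every n x n matrix iff it commutes
    # with each matrix unit E_ij (single 1 in row i, column j).
    for i in range(n):
        for j in range(n):
            E = np.zeros((n, n))
            E[i, j] = 1.0
            if not np.allclose(T @ E, E @ T):
                return False
    return True

scalar = 3.0 * np.eye(n)        # r I with r = 3
nonscalar = np.eye(n)
nonscalar[0, 1] = 1.0           # a single off-diagonal entry spoils it

print(commutes_with_all_Eij(scalar))     # True
print(commutes_with_all_Eij(nonscalar))  # False
```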

If you need help proving the matrix result, post again to this thread.

Re: Proof about linear transformations

I just saw this approach on a different forum: they take v != 0 in R^n and extend it to a basis B = {v, b2, b3, ..., bn}, then they define a matrix A by Av = v and Abk = 0 for k = 2, ..., n. My only question is: how can we know this matrix exists for any basis? If it exists, then TAv = ATv yields Tv = A(Tv); since the image of A is span{v}, this means Tv = rv, so every non-zero v in R^n is an eigenvector of T. Finally they proceed to prove that all these eigenvectors have the same eigenvalue.
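Such a matrix always exists because a linear map may be prescribed freely on a basis: if P has the basis vectors as columns, then A = P D P^(-1) with D = diag(1, 0, ..., 0) does the job. A NumPy sketch of this construction (the greedy basis extension and all names are my own):

```python
import numpy as np

# A linear map can be prescribed freely on a basis, so the matrix A with
# A v = v and A b_k = 0 always exists.  Concretely: if P has the basis
# vectors as columns, then A = P D P^{-1} with D = diag(1, 0, ..., 0).
n = 4
v = np.array([1.0, 2.0, 0.0, -1.0])

# Extend v to a basis of R^n (greedily append standard basis vectors
# whenever they keep the columns linearly independent).
cols = [v]
for e in np.eye(n):
    if np.linalg.matrix_rank(np.column_stack(cols + [e])) == len(cols) + 1:
        cols.append(e)
    if len(cols) == n:
        break
P = np.column_stack(cols)

D = np.zeros((n, n))
D[0, 0] = 1.0                       # eigenvalue 1 on v, 0 on b_2, ..., b_n
A = P @ D @ np.linalg.inv(P)

print(np.allclose(A @ v, v))        # True:  A v = v
print(np.allclose(A @ P[:, 1], 0))  # True:  A b_2 = 0
```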

Re: Proof about linear transformations

Let's try this a different way: we'll use induction on dim(V). Suppose we have chosen a basis for V and, for each n, identified HomF(V,V) with Matnxn(F) using this basis, so that we can talk about matrices instead of abstract linear transformations.

The 1-dimensional case (base case) is easy: EVERY 1x1 matrix is a scalar times the 1x1 identity matrix [1], matrix multiplication is just the field multiplication, and all 1x1 matrices commute with each other.

Now suppose, as our inductive hypothesis, that for nxn matrices we have: ST = TS for all nxn matrices S if and only if T = rIn.

Now it should be clear that if T = rI(n+1), then ST = TS for every (n+1)x(n+1) matrix S, so for our inductive step we only need to show the converse: if T DOES commute with every matrix S, then T = rI(n+1).

Write T in block form:

T = [ T1  T2 ]
    [ T3  t  ]

where T1 is nxn, T2 is nx1, T3 is 1xn, and t is a scalar, and write any S in the same block form with blocks S1, S2, S3, s. Suppose first that T2 is not all 0's, say tk(n+1) ≠ 0. Take S = E(n+1)k, the matrix whose only non-zero entry is a 1 in row n+1, column k; its only non-zero block is S3 = (ek)^T, the k-th standard basis vector written as a row.

We need to show these 2 matrices are not equal. Note that in ST, the 1x1 block in the lower right is S3T2, which is just the dot product of ek and T2, which returns the k-th coordinate of T2, which is tk(n+1) ≠ 0.

However, in TS, this block is T3S2 + ts = 0 (since S2 and s are both 0), so these two matrices CANNOT be equal (we don't even need to look at the other blocks).

So if T is to commute with EVERY S (including the particular matrix E(n+1)k), we MUST have that the T2 block is all zero.

Hopefully, you can guess what is coming next: if T3 is not all 0's, let t(n+1)k be the first non-zero entry. This time, we'll use the "S" matrix S = Ek(n+1), whose only non-zero block is S2 = ek, so that the lower-right 1x1 block of TS is T3S2 = t(n+1)k ≠ 0, while the corresponding block of ST is 0. As before, this forces T3 to be all zero, so

T = [ T1  0 ]
    [ 0   t ]

In particular, T commutes with every block-diagonal S = [ S1 0 ; 0 s ], which gives S1T1 = T1S1 for every nxn matrix S1; by our inductive hypothesis, T1 = rIn. Writing t = r', for a general S we now have

TS - ST = [ 0           (r - r')S2 ]
          [ (r' - r)S3   0         ]

Since we can choose S2 freely, we can choose it to be non-zero, which then forces r - r' = 0 (for any vector space V, with v in V (and kxm matrices do form a vector space for any k,m), if av = 0 with v non-zero, then a = 0). Hence t = r, and T = rI(n+1), completing the induction.
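The index chase with S = E(n+1)k can be pinned down with a quick numeric check (a NumPy sketch, 0-based indices in code, all names my own):

```python
import numpy as np

# Numeric check of the block computation: with S = E_{(n+1),k}, the
# lower-right entry of S T is t_{k,n+1}, while the same entry of T S is 0.
n1 = 5          # matrix size n + 1
k = 2           # 0-based column index of the single 1 in S
T = np.arange(1.0, n1 * n1 + 1).reshape(n1, n1)  # generic test matrix

S = np.zeros((n1, n1))
S[n1 - 1, k] = 1.0      # E_{(n+1),k}: a 1 in the last row, column k

print((S @ T)[n1 - 1, n1 - 1] == T[k, n1 - 1])   # True
print((T @ S)[n1 - 1, n1 - 1] == 0.0)            # True
```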

That said, there is nothing wrong with picking the S you prefer; that approach clearly works quite well too.

*******

(Project Crazy Project has a nice proof as well, but it's a little "subscript-heavy" if you know what I mean)