Thanks for the LaTeX. Yes, the identity or zero matrix satisfies the condition, but I want to find the general form.
–
user12290Jun 19 '11 at 5:37

You can define a 9-component vector $v$ composed of those $a$, $b$, $c$ coefficients, then write $AB-BA=0$ as a linear system $Xv=0$, and then $v$ is any vector in the null space of $X$. Also, you might need to specify that $ \omega $ is one of $ \exp(2\pi i / 3) $ or $ \exp(4\pi i /3) $ and not the other, as they are both solutions to the quadratic.
–
anonJun 19 '11 at 5:42
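
A minimal numpy sketch of this comment's approach. The matrix `B` below is a stand-in with three distinct eigenvalues, not necessarily the $B$ from the original question; the point is only the mechanics of turning $AB-BA=0$ into $Xv=0$ and reading the commuting matrices off the null space.

```python
import numpy as np

# Stand-in B with three distinct eigenvalues (the cube roots of unity);
# the actual B comes from the original question.
B = np.array([[0., 1., 0.],
              [0., 0., 1.],
              [1., 0., 0.]])
I = np.eye(3)

# With column-stacking vec(), vec(AB) = (B^T kron I) vec(A) and
# vec(BA) = (I kron B) vec(A), so AB - BA = 0 becomes X v = 0.
X = np.kron(B.T, I) - np.kron(I, B)

# Null space of X via SVD: rows of Vt whose singular values vanish.
_, s, Vt = np.linalg.svd(X)
kernel = Vt[np.abs(s) < 1e-10]

# For distinct eigenvalues the commutant is span{I, B, B^2}: dimension 3.
print(len(kernel))
for v in kernel:
    A = v.reshape(3, 3, order='F')   # undo column-stacking
    assert np.allclose(A @ B, B @ A)
```

Each row of `kernel` is a `vec(A)` for a matrix $A$ commuting with $B$, so the null-space basis is exactly the parametric family the comment describes.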

2 Answers

Yes, as the other answer notes, $B$ has distinct eigenvalues. Therefore, anything commuting with it must be a polynomial in $B$, and degree at most 2 suffices.

In some detail: let $T$ be a matrix so that $TBT^{-1}$ is diagonal, with distinct diagonal entries $b_1, b_2, b_3$. Since $AB=BA$, $(TAT^{-1})(TBT^{-1})=(TBT^{-1})(TAT^{-1})$. For any matrix $M$, the $ij$ entry of $M(TBT^{-1})$ is $m_{ij}b_j$, while the $ij$ entry of $(TBT^{-1})M$ is $b_im_{ij}$. Since the $b_i$ are distinct, $m_{ij}b_j=b_im_{ij}$ forces $m_{ij}=0$ for $i\neq j$, so a matrix commuting with $TBT^{-1}$ is diagonal. In particular $TAT^{-1}$ is diagonal, and Lagrange interpolation produces a quadratic polynomial $P$ with $P(TBT^{-1})=TAT^{-1}$. Happily, $P(TBT^{-1})=T\,P(B)\,T^{-1}$, so $A=P(B)$.
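
A numerical check of this argument, using a stand-in $B$ with three distinct eigenvalues (the actual $B$ comes from the question). We build a commuting $A$, diagonalize, read off the target values $\mu_i$, and evaluate the Lagrange interpolant directly at the matrix $B$:

```python
import numpy as np

# Stand-in B with three distinct eigenvalues; a hypothetical example only.
B = np.array([[0., 1., 0.],
              [0., 0., 1.],
              [1., 0., 0.]])
I = np.eye(3)

# Some matrix commuting with B (here built as a polynomial in B for the test).
A = 2*I + 3*B + 5*(B @ B)

# Diagonalize B: columns of T are eigenvectors, b holds the distinct b_i.
b, T = np.linalg.eig(B)
# In this basis A is diagonal too; its diagonal gives the values mu_i.
mu = np.diag(np.linalg.inv(T) @ A @ T)

# Lagrange interpolation evaluated at the matrix B:
#   P(B) = sum_i mu_i * prod_{j != i} (B - b_j I) / (b_i - b_j)
P_of_B = np.zeros((3, 3), dtype=complex)
for i in range(3):
    term = mu[i] * np.eye(3, dtype=complex)
    for j in range(3):
        if j != i:
            term = term @ (B - b[j]*I) / (b[i] - b[j])
    P_of_B += term

assert np.allclose(P_of_B, A)   # A is recovered as a quadratic polynomial in B
```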

It is clear that a polynomial in $B$ commutes with $B$. Will you kindly explain why the converse is true?
–
user12290Jun 20 '11 at 14:47

@user12290: Let the eigenvalues of $B$ be $\lambda_i, i=1,2,3$. Anything that commutes with $B$ must map an eigenspace of $B$ to itself: if $Bx=\lambda_i x$, then $BAx=ABx=\lambda_i Ax$. Here the eigenspaces are 1-dimensional, so $A$ must act on every one of them as a scalar. IOW $A$ is determined by the three scalars $\mu_i,i=1,2,3$. By Lagrange's interpolation formula there exists a quadratic polynomial $p(x)$ such that $p(\lambda_i)=\mu_i$ for all $i=1,2,3$. Thus $A=p(B)$.
–
Jyrki Lahtonen♦Jun 21 '11 at 13:05
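
A short sketch of the eigenspace argument in this comment, with a stand-in $B$ and $A$ (hypothetical, not the matrices from the question): if $AB=BA$ and $x$ is an eigenvector of $B$ for a simple eigenvalue, then $Ax$ lies in the same 1-dimensional eigenspace, so $Ax=\mu_i x$ for some scalar.

```python
import numpy as np

# Stand-in B with distinct eigenvalues, and some A commuting with it.
B = np.array([[0., 1., 0.],
              [0., 0., 1.],
              [1., 0., 0.]])
A = np.eye(3) + 4*B - 2*(B @ B)
assert np.allclose(A @ B, B @ A)

lams, T = np.linalg.eig(B)
for i in range(3):
    x = T[:, i]                      # eigenvector of B for eigenvalue lams[i]
    Ax = A @ x
    k = np.argmax(np.abs(x))         # recover the scalar from one component
    mu_i = Ax[k] / x[k]
    # A acts on the 1-dimensional eigenspace as the scalar mu_i.
    assert np.allclose(Ax, mu_i * x)
```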

It turns out that $B$ has three different eigenvalues, so in a suitable basis it assumes diagonal form $D$. If $\hat A$ is the correspondingly transformed matrix $A$ then $AB=BA$ iff $\hat A D=D\hat A$. Now you have a simpler problem.

As you can see in Paul Garrett's answer, you don't even have to compute the eigenvectors of $B$, and you get a parametric representation of all $A$'s commuting with $B$ for free.
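
A sketch of the simpler problem this answer reduces to, again with a hypothetical stand-in $B$: in a basis diagonalizing $B$, the matrices commuting with $D$ are exactly the diagonal ones, so every commuting $A$ is $T\,\mathrm{diag}(t_1,t_2,t_3)\,T^{-1}$ for three free parameters.

```python
import numpy as np

# Stand-in B with three distinct eigenvalues (hypothetical example).
B = np.array([[0., 1., 0.],
              [0., 0., 1.],
              [1., 0., 0.]])
_, T = np.linalg.eig(B)
Tinv = np.linalg.inv(T)

rng = np.random.default_rng(0)
t = rng.standard_normal(3) + 1j*rng.standard_normal(3)  # three free parameters
A = T @ np.diag(t) @ Tinv                               # a general commuting A
assert np.allclose(A @ B, B @ A)
```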