I have a very interesting question about orthogonal projection matrices. Intuitively it is quite easy to understand, but for me it is not easy to prove.

In the space $R^2$, let $a_i$, $i=1,\dots,n$ with $n>2$, be unit vectors located uniformly on the unit circle; that is, the angle between any two adjacent vectors is $2\pi/n$. It is known that each $A_i=a_ia_i^T$ is an orthogonal projection matrix, satisfying $A_i^T=A_i$ and $A_i^2=A_i$.

Prove: $A_1+\dots+A_n=a_1a_1^T+\dots+a_na_n^T=\frac{n}{2}I$, where $I$ is the identity matrix.

Remark:
1. Intuitively, consider three unit vectors: if their sum is zero, the vectors are distributed uniformly on the unit circle.
2. Guess: in $R^d$ instead of $R^2$, the $1/2$ in the equation should become $1/d$.
3. My approach: prove that $B=a_1a_1^T+\dots+a_na_n^T$ equals $\frac{n}{2}I$ if and only if $Bx=\frac{n}{2}x$ for arbitrary $x$ in $R^2$. Maybe a bad approach.
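As a quick numerical sanity check (a sketch using numpy; the variable names and the choice $n=5$ are mine), one can verify the claimed identity in $R^2$, and also guess 2 above for the $8$ cube vertices $(\pm1,\pm1,\pm1)/\sqrt{3}$ in $R^3$:

```python
import numpy as np
from itertools import product

# n = 5 unit vectors spaced uniformly on the unit circle in R^2
n = 5
angles = 2 * np.pi * np.arange(n) / n
vectors = np.column_stack([np.cos(angles), np.sin(angles)])  # shape (n, 2)
Q = sum(np.outer(a, a) for a in vectors)
print(np.allclose(Q, (n / 2) * np.eye(2)))  # True

# Guess 2: the 8 cube vertices (+-1, +-1, +-1)/sqrt(3) in R^3 give (8/3) I
cube = [np.array(v) / np.sqrt(3) for v in product([-1.0, 1.0], repeat=3)]
Q3 = sum(np.outer(a, a) for a in cube)
print(np.allclose(Q3, (8 / 3) * np.eye(3)))  # True
```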

Thank you very much!

Shiyu

PS: Consider the converse question:
if $a_1a_1^T+\dots+a_na_n^T=\frac{n}{2}I$, can we conclude that $a_1+\dots+a_n=0$?
Answer: using Nick's method, it can be shown that this converse does not hold. Please refer to Nick's solution.

Now I'd like to pose a more general question.

If $\sum k_ia_i=0$, where $k_i>0$ and $\sum k_i=1$, then $\sum k_ia_ia_i^T=\frac{1}{d}I$, where $d$ is the dimension of the space $R^d$.

The original problem is in fact a special case of the above statement with $k_i=1/n$. Of course, the statement as written is wrong, because I must add some hypothesis, such as requiring the unit vectors $a_i$ to be located uniformly on the unit circle. But I am not sure how to state it rigorously at present.
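A minimal numerical illustration of why some extra hypothesis is needed (a numpy sketch; the choice of two antipodal vectors, essentially the $n=2$ degenerate case, is my own): the weighted sum of the vectors vanishes, yet the weighted sum of projections is not $\frac{1}{2}I$.

```python
import numpy as np

# Two antipodal unit vectors in R^2 with equal weights: sum k_i a_i = 0 ...
a = [np.array([1.0, 0.0]), np.array([-1.0, 0.0])]
k = [0.5, 0.5]
print(sum(ki * ai for ki, ai in zip(k, a)))  # [0. 0.]

# ... but sum k_i a_i a_i^T = [[1, 0], [0, 0]], which is not (1/2) I
M = sum(ki * np.outer(ai, ai) for ki, ai in zip(k, a))
print(np.allclose(M, 0.5 * np.eye(2)))  # False
```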

4 Answers

Let $a_1$, ..., $a_n$ be unit vectors in $\mathbb{R}^d$. Let $G$ be a group acting linearly on $\mathbb{R}^d$, which permutes the $a_i$, and such that the representation $\mathbb{R}^d$ of $G$ is irreducible. For example, maybe $d=2$ and $G$ is the cyclic group of order $n$ acting by rotations. This seems to be the example you started out thinking of. Note that, when $n=2$, this representation is not irreducible, which gives you darij's counterexample.

Then I claim that $\sum a_i a_i^T = (n/d) \mathrm{Id}$.

Proof: $\sum a_i a_i^T$ is the matrix of a $G$-invariant symmetric bilinear form. Since $\mathbb{R}^d$ is irreducible, the space of such forms is one-dimensional, and $\sum a_i a_i^T = r \mathrm{Id}$ for some $r$. Taking traces of both sides, we get $n = rd$, so $r=n/d$.

Here is this answer written in the most elementary way I can: Let's first look at the case you started with, where the vectors are evenly spaced at angle $\theta$. Let $g(\theta)$ be the matrix $\left( \begin{smallmatrix} \cos \theta & \sin \theta \\ - \sin \theta & \cos \theta \end{smallmatrix} \right)$. Then we have
$$g(\theta) \left( \sum a_i a_i^T \right) g(-\theta) = \sum \left( g(\theta) a_i \right) \left( g(\theta) a_i \right)^T = \sum a_i a_i^T$$
where the second equality is because multiplying by $g(\theta)$ permutes the $a_i$'s. So the sum $Q:=\sum a_i a_i^T$, which is symmetric, obeys $g(\theta) Q g(-\theta) = Q$. This is a collection of four linear equations in the entries of $Q$. (Computationally, you may find it easier to work with the equivalent $g(\theta) Q = Q g (\theta)$.) As long as $\theta$ is not a multiple of $\pi$, the space of symmetric solutions is $1$-dimensional, spanned by the identity matrix. So $Q=r \mathrm{Id}$ for some $r$. Taking traces of both sides, $r=n/2$.

Now, what to do in higher dimensions? Are the vertices of a cube regularly spaced? What about a dodecahedron? The answer is yes, and group representation theory is the correct vocabulary to explain why. Let $g_1$, $g_2$, ..., $g_k$ be a finite collection of orthogonal matrices permuting the $a_i$ amongst themselves. So, the rotational symmetries of the cube or the dodecahedron or, back in your original example, the matrix $g(\theta)$.

Set $Q = \sum a_i a_i^T$. Just like before, we deduce that $g_1 Q = Q g_1$, $g_2 Q = Q g_2$, ..., $g_k Q = Q g_k$. This is a bunch of linear equations for $Q$. If the space of solutions to these equations is one-dimensional, we win!

We now come to a nontrivial theorem: The space of symmetric solutions to these equations is one-dimensional if and only if there is no linear subspace $V$ of $\mathbb{R}^d$ (other than $(0)$ and $\mathbb{R}^d$ itself) such that $g_i V = V$ for each $g_i$. For the cube, the dodecahedron, the original evenly spaced points, and many harder examples, this is easily checked. For the case darij raised, of two points evenly spaced at distance $\pi$, this condition fails: every line through the origin is taken to itself under $g(\pi)$.

Some vocabulary: It is usual to work not just with the $g_i$, but with all of their products. (For example, working with not just $g(\theta)$ but $g(2 \theta)$, $g(3 \theta)$, etcetera.) Let $G$ be the set of all of these products. Since they permute a finite set of vectors, $G$ cannot be infinite. (Experienced mathematicians will see that I am glossing over something here, please let me do so.) For example, in our two dimensional example, $\theta/\pi$ cannot be irrational. So $G$ is finite, and is what is called a finite group. Every element of $G$ is a symmetry of $\mathbb{R}^d$, so we say that $G$ acts on $\mathbb{R}^d$ and those actions are by linear maps so we say that $G$ acts linearly. In this setting of a group acting linearly, we say that $\mathbb{R}^d$ is a representation of $G$. The condition that there be no subspace of $\mathbb{R}^d$ which is taken to itself by $G$ is usually stated using the technical term: $\mathbb{R}^d$ is an irreducible representation.

A personal note: If you aren't comfortable switching between matrices and bilinear forms, you don't know the vocabulary of groups and group actions, and you have never seen any group representation theory, you might fit in better at math.stackexchange.com than here. That said, nice problem! I may save it for when I teach representation theory.

Looks like a good mind-reading job!
– darij grinberg, Feb 4 '11 at 19:11

For those who are interested in pursuing this further, a finite set $a_1, a_2, \dots, a_n$ of points on $S^{d-1}$ is called a $t$-design if, for every polynomial $f$ of degree $\leq t$, we have $(1/n) \sum f(a_i) = \int_{S^{d-1}} f \, dA$, where $dA$ is Haar measure normalized so that $\int_{S^{d-1}} dA=1$. The conditions that $\sum a_i=0$ and $\sum a_i a_i^T=(n/d) \mathrm{Id}$ are together equivalent to being a $2$-design. See neilsloane.com/sphdesigns for more.
– David Speyer, Feb 4 '11 at 19:40

Many thanks for your answer, David. That is what I meant to ask! But the math you used is quite tough for me; for example, I have never learned about groups, invariant bilinear forms, etc. Is it possible to prove the equation using only linear algebra?
– Shiyu, Feb 4 '11 at 19:55

Thank you David, and also many thanks to the other people. I'm very glad to receive so many good answers and comments. I think David's answer is the best: it not only presents the solution in both $R^2$ and higher dimensions, but also shows me how powerful group theory is. Anyway, I have learned a lot here. I appreciate your attention and thank you.
– Shiyu, Feb 6 '11 at 0:55

This is for the original question, not the "distributed uniformly" version, which is not very interesting. In $d=2$, if you use complex numbers, the OP is equivalent to asserting that if $z_1, \dotsc, z_n$ are unit complex numbers such that $\sum z_k = 0$, then $\sum z_k^2 = 0$. (Proof: write the projection matrices in terms of sines and cosines.) This is obviously false, since you can put $n/2$ of the $z_k$ near $1$ and let the other $n/2$ be the negations of the first $n/2$.
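This counterexample is easy to check numerically (a numpy sketch; the particular perturbation angles $0.1$ and $0.3$, giving $n=4$, are my own choice):

```python
import numpy as np

# n = 4: two unit complex numbers near 1, plus their negations
eps = np.array([0.1, 0.3])
z = np.concatenate([np.exp(1j * eps), -np.exp(1j * eps)])

print(abs(z.sum()) < 1e-12)       # True: the z_k sum to zero
print(abs((z ** 2).sum()) > 1.0)  # True: their squares do not
```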

For the OP, darij in fact gives a good counterexample. But if $n$ is odd, I think the OP is right, since $a_1+\dots+a_n=0$ is equivalent to the vectors being located uniformly when $n$ is odd, right?
– Shiyu, Feb 4 '11 at 20:03

Umm, no. There is an $n$-(real-)dimensional set of possible tuples $a_1, \dotsc, a_n$, and the vanishing of the sum is $2$ real conditions (on the real and imaginary parts). There is a $1$-dimensional symmetry group as well (which you can eliminate by putting $a_1=1$), so you have an $(n-3)$-dimensional family left over. Notice that this equals $0$ when $n=3$, so in that case you do only get the roots of unity. But when $n=5$, no.
– Igor Rivin, Feb 4 '11 at 20:27

Thank you Igor. But I don't understand the "$1$-dimensional symmetry group" which David also mentioned. It's a thing in group theory, right? Can the problem be explained using just linear algebra or matrix analysis?
– Shiyu, Feb 5 '11 at 3:38

It seems group theory can solve the problem easily even in higher-dimensional spaces, but it becomes complicated if we try to use elementary methods.
– Shiyu, Feb 5 '11 at 3:52

Many thanks for your effort. Using your method, we can show the converse statement does not hold. If we let $a_i=(\cos\theta_i,\sin\theta_i)$, then the equation is equivalent to $\sum \cos 2\theta_i=0$ and $\sum \sin 2\theta_i=0$. Consider $n=3$ with $\theta_1=0$, $\theta_2=\pi/3$, $\theta_3=2\pi/3$: then $\sum \cos 2\theta_i=0$ and $\sum \sin 2\theta_i=0$ hold, but obviously $\sum \cos\theta_i=0$ and $\sum \sin\theta_i=0$ do not. It is quite straightforward to use $a_i=(\cos\theta_i,\sin\theta_i)$ to prove the statement, but this approach cannot be applied generally to $R^d$ with $d>2$.
– Shiyu, Feb 5 '11 at 3:05
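Shiyu's angles $\theta_1=0$, $\theta_2=\pi/3$, $\theta_3=2\pi/3$ can be confirmed numerically (a numpy sketch): the projections sum to $\frac{3}{2}I$, yet the vectors themselves do not sum to zero.

```python
import numpy as np

theta = np.array([0.0, np.pi / 3, 2 * np.pi / 3])
vectors = np.column_stack([np.cos(theta), np.sin(theta)])

# sum a_i a_i^T = (3/2) I holds ...
Q = sum(np.outer(a, a) for a in vectors)
print(np.allclose(Q, 1.5 * np.eye(2)))  # True

# ... but sum a_i = (1, sqrt(3)) != 0, so the converse fails
print(np.allclose(vectors.sum(axis=0), 0))  # False
```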

@Nick Re matrix matters, you could write something like: thus $a_ia_i^T$ is the $2\times2$ matrix $[[x,y],[y,z]]$ with $x=\ldots$, $y=\ldots$ and $z=\ldots$. Or, better still, you could introduce at the outset a shorthand such as $\theta_i=\theta+2(i-1)\pi/n$ and use it everywhere afterwards.
– Did, Feb 5 '11 at 9:19