This is the problem: Let $A$ be a real symmetric $n \times n$ matrix with non-negative entries. Prove that $A$ has an eigenvector with non-negative entries.

I looked at the answer key and don't quite understand it. In the expression containing the max, why should it correspond to the eigenvalue $\lambda_0$? I thought this might be because if $Ax$ is parallel to $x$, then the dot product between $Ax$ and $x$ is maximised. But is it not possible that $\langle Ax,x\rangle$ still attains a large value when $A$ scales $x$ so much that $Ax$ becomes large, even though $Ax$ and $x$ are not parallel?

and the maximum is attained precisely when $x$ is an eigenvector of $A$ with
eigenvalue $\lambda_0$. Suppose $v$ is a unit vector for which the maximum is attained, and let $u$ be the vector whose coordinates are the absolute values of the coordinates of $v$. Since the entries of $A$ are non-negative, we have

$$\langle Au,u \rangle \ge \langle Av,v\rangle =\lambda_0,$$ implying that $\langle Au,u\rangle = \lambda_0$, so that $u$ is an eigenvector of $A$ for the eigenvalue $\lambda_0$.
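If it helps to see this conclusion concretely, here is a small NumPy sanity check (the size and seed are arbitrary choices, not part of the answer key): for a random symmetric matrix with non-negative entries, taking absolute values of the coordinates of a top unit eigenvector again yields an eigenvector for $\lambda_0$.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
B = rng.random((n, n))           # entries in [0, 1), hence non-negative
A = (B + B.T) / 2                # symmetric with non-negative entries

w, V = np.linalg.eigh(A)         # eigenvalues in ascending order
lam0, v = w[-1], V[:, -1]        # largest eigenvalue, unit eigenvector

u = np.abs(v)                    # coordinate-wise absolute values of v
# u is again a unit vector, and the argument says it is an eigenvector too
assert np.isclose(u @ (A @ u), lam0)
assert np.allclose(A @ u, lam0 * u)
```

For a matrix with strictly positive entries the top eigenvector already has coordinates of one sign (so $u = \pm v$); the check merely illustrates the statement, it is not a proof.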

4 Answers

$A$ is real symmetric, so it has $n$ linearly independent eigenvectors with real eigenvalues $\lambda_1\geq \cdots \geq\lambda_n$.

Let $v_1,v_2,\cdots, v_n$ be the corresponding independent eigenvectors; we can assume each has length $1$, so that they form an orthonormal basis.

In the max expression, you consider $x$ with $\|x\|=1$. Let
$$x=a_1v_1+a_2v_2+\cdots + a_nv_n.$$
Then $$Ax=\lambda_1a_1v_1+\cdots + \lambda_na_nv_n.$$
Therefore, with respect to the above orthonormal basis, noting that $\lambda_i\leq \lambda_1$, we get
$$\langle Ax,x\rangle=\lambda_1|a_1|^2+\cdots + \lambda_n|a_n|^2 \leq \lambda_1 (|a_1|^2+\cdots + |a_n|^2)=\lambda_1 \langle x,x\rangle=\lambda_1\|x\|^2=\lambda_1.$$
This implies that the maximum value of $\langle Ax,x\rangle$ over unit vectors $x$ is at most $\lambda_1$. This bound is in fact attained if you take
$$x=v_1= \mbox{ a unit eigenvector for the largest eigenvalue } \lambda_1. $$ You can check this with almost the same calculation as above.
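The two facts above (no unit vector beats $\lambda_1$, and $v_1$ attains it) can be checked numerically; here is a NumPy sketch (the matrix, seed, and sample count are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 6
B = rng.standard_normal((n, n))
A = (B + B.T) / 2                 # any real symmetric matrix works here

w, V = np.linalg.eigh(A)          # eigenvalues in ascending order
lam1, v1 = w[-1], V[:, -1]        # largest eigenvalue and its eigenvector

# Random unit vectors never beat the bound <Ax, x> <= lam1 ...
for _ in range(1000):
    x = rng.standard_normal(n)
    x /= np.linalg.norm(x)
    assert x @ (A @ x) <= lam1 + 1e-12

# ... and the bound is attained at x = v1
assert np.isclose(v1 @ (A @ v1), lam1)
```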

By the spectral theorem, a real symmetric matrix can be diagonalized by an orthogonal matrix. So after a (rigid) change of basis, $A$ is diagonal. (By rigid, we mean that the initial orthonormal basis is merely rotated to the basis that makes $A$ diagonal.) Consequently, the blow-up you describe (which is a valid concern for general matrices) cannot occur: in the eigenbasis, $A$ only scales each coordinate by the corresponding eigenvalue.
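A quick NumPy illustration of this diagonalization (the matrix and seed are arbitrary): `eigh` returns an orthogonal matrix $Q$ of eigenvectors, and conjugating by it makes $A$ diagonal.

```python
import numpy as np

rng = np.random.default_rng(2)
B = rng.standard_normal((4, 4))
A = (B + B.T) / 2                 # real symmetric matrix

w, Q = np.linalg.eigh(A)          # columns of Q are orthonormal eigenvectors
assert np.allclose(Q.T @ Q, np.eye(4))        # Q is orthogonal: a rigid change of basis
assert np.allclose(Q.T @ A @ Q, np.diag(w))   # A is diagonal in the eigenbasis
```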

If $\lambda_0=\max_{\|x\|=1}\langle Ax,x\rangle$, then, for all $x$,
$$
0 \le \langle(\lambda_0I-A)x,x\rangle.
$$
Because $A$ is symmetric and $\lambda_0 I+\epsilon I-A$ is positive definite (by the displayed inequality plus the $\epsilon I$ term), $\langle x,y\rangle_{\epsilon}=\langle(\lambda_0I+\epsilon I-A)x,y\rangle$ defines an inner product on your space, with associated norm $\|x\|_{\epsilon}=\langle(\lambda_0I+\epsilon I-A)x,x\rangle^{1/2}$. By the Cauchy-Schwarz inequality applied to this inner product,
you get $|\langle x,y\rangle_{\epsilon}|\le \|x\|_\epsilon\|y\|_\epsilon$.
Letting $\epsilon\downarrow 0$ gives
$$
|\langle(\lambda_0 I-A)x,y\rangle|\le \langle(\lambda_0 I-A)x,x\rangle^{1/2}\,\langle(\lambda_0 I-A)y,y\rangle^{1/2}.
$$
Now you can see: if $\langle(\lambda_0 I-A)x,x\rangle=0$, then $\langle(\lambda_0I-A)x,y\rangle=0$ for all $y$, which gives $(\lambda_0I-A)x=0$, i.e. $Ax=\lambda_0 x$.
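The two ingredients of this argument can also be checked numerically; here is a NumPy sketch (the matrix and seed are arbitrary): $\lambda_0 I - A$ is positive semidefinite, and a unit vector $v$ attaining the max satisfies $\langle(\lambda_0 I-A)v,v\rangle=0$, forcing $(\lambda_0 I-A)v=0$.

```python
import numpy as np

rng = np.random.default_rng(3)
B = rng.standard_normal((5, 5))
A = (B + B.T) / 2

w, V = np.linalg.eigh(A)
lam0, v = w[-1], V[:, -1]          # lam0 = max over unit x of <Ax, x>

M = lam0 * np.eye(5) - A
# M is positive semidefinite: all its eigenvalues are >= 0
assert np.all(np.linalg.eigvalsh(M) >= -1e-12)
# <Mv, v> = 0 at the maximizer, and indeed Mv = 0
assert np.isclose(v @ (M @ v), 0)
assert np.allclose(M @ v, 0)
```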