Assume that $A_{d\times d}$ is a symmetric positive semi-definite matrix and that $\{\mathbf{x}_1,\ldots,\mathbf{x}_d\}$ is an orthonormal basis of $\mathbb{R}^d$, i.e. $\mathbf{x}_i\bot\mathbf{x}_j,\forall i\neq j$ and $\|\mathbf{x}_i\|=1$.

My question is: given
$$
\mathbf{x}_i^\top A\mathbf{x}_i=a_i,\forall i
$$
how can one bound the eigenvalues of $A$?

Using the eigenvalue decomposition $A=U\Lambda U^\top$ and letting $\tilde{\mathbf{x}}_i=U^\top\mathbf{x}_i$, the question reduces to bounding the entries of the diagonal matrix $\Lambda$ given
$$
\tilde{\mathbf{x}}_i^\top\Lambda\tilde{\mathbf{x}}_i=a_i,\forall i.
$$
Since $U$ is orthogonal, we also have $\tilde{\mathbf{x}}_i\bot\tilde{\mathbf{x}}_j,\forall i\neq j$ and $\|\tilde{\mathbf{x}}_i\|=1$.

Writing $\Lambda=\mathrm{diag}(\lambda_1,\ldots,\lambda_d)$ and $\tilde{\mathbf{x}}_i=[\tilde{x}_{i1}\;\ldots\;\tilde{x}_{id}]^\top$, the equations above become
$$
\sum_j\lambda_j\tilde{x}_{ij}^2=a_i,\forall i
$$
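
As a quick sanity check of this reduction, here is a minimal NumPy sketch (the random matrix, the basis, and all variable names are my own choices, not part of the question):

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4

# A random symmetric positive semi-definite matrix A.
B = rng.standard_normal((d, d))
A = B @ B.T

# A random orthonormal basis {x_1, ..., x_d}: the columns of an orthogonal matrix X.
X, _ = np.linalg.qr(rng.standard_normal((d, d)))

# The given data: a_i = x_i^T A x_i.
a = np.array([X[:, i] @ A @ X[:, i] for i in range(d)])

# Eigendecomposition A = U diag(lam) U^T and x_tilde_i = U^T x_i.
lam, U = np.linalg.eigh(A)
X_tilde = U.T @ X

# Check that a_i = sum_j lam_j * x_tilde_{ij}^2 for every i.
a_check = (lam[:, None] * X_tilde**2).sum(axis=0)
print(np.allclose(a, a_check))  # True
```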

However, the problem still does not seem simple.

In the case of $d=2$, I found the following method to solve it.

Writing $\tilde{\mathbf{x}}_1=[\cos\theta\;\sin\theta]^\top$ and $\tilde{\mathbf{x}}_2=[-\sin\theta\;\cos\theta]^\top$, we have
\begin{eqnarray}
\lambda_1\cos^2\theta+\lambda_2\sin^2\theta&=&a_1\\
\lambda_1\sin^2\theta+\lambda_2\cos^2\theta&=&a_2
\end{eqnarray}
Solving the system gives
\begin{eqnarray}
\lambda_1&=&\frac{1}{\Delta}(a_1\cos^2\theta-a_2\sin^2\theta)
=\frac{a_1-a_2\tan^2\theta}{1-\tan^2\theta}=a_2+\frac{a_1-a_2}{1-\tan^2\theta}\\
\lambda_2&=&\frac{1}{\Delta}(-a_1\sin^2\theta+a_2\cos^2\theta)
=\frac{-a_1\tan^2\theta+a_2}{1-\tan^2\theta}=a_1-\frac{a_1-a_2}{1-\tan^2\theta}
\end{eqnarray}
where $\Delta=\cos^4\theta-\sin^4\theta=\cos^2\theta-\sin^2\theta$.
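
As a numerical check of these closed-form expressions, here is a small sketch (the test values $a_1$, $a_2$, $\theta$ are my own arbitrary choices):

```python
import numpy as np

a1, a2, theta = 3.0, 1.0, 0.4   # arbitrary test values (my own choice); tan^2(theta) < 1
c2, s2, t2 = np.cos(theta)**2, np.sin(theta)**2, np.tan(theta)**2

# Solve the 2x2 linear system for (lambda_1, lambda_2) directly ...
M = np.array([[c2, s2], [s2, c2]])
lam_direct = np.linalg.solve(M, np.array([a1, a2]))

# ... and compare with the closed-form expressions derived above.
lam1 = a2 + (a1 - a2) / (1 - t2)
lam2 = a1 - (a1 - a2) / (1 - t2)
print(np.allclose(lam_direct, [lam1, lam2]))  # True
```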

Without loss of generality, assume that $a_1\geq a_2$ and $\lambda_1>\lambda_2$. Subtracting the two equations gives $(\lambda_1-\lambda_2)(\cos^2\theta-\sin^2\theta)=a_1-a_2$, from which we can deduce that $1-\tan^2\theta>0$. Then, from $\lambda_i\geq 0$, we have
$$\tan^2\theta\leq\min\{\frac{a_2}{a_1},\frac{a_1}{a_2}\}=\frac{a_2}{a_1}$$
Therefore, $\lambda_1$ and $\lambda_2$ attain their maximum and minimum, respectively, at $\tan^2\theta=\frac{a_2}{a_1}$:
\begin{eqnarray}
\lambda_1&=&a_1+a_2\\
\lambda_2&=&0
\end{eqnarray}
This gives the bounds on the eigenvalues of $A$.
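
In the standard basis ($\mathbf{x}_i=\mathbf{e}_i$), this extremal case corresponds to a rank-one matrix, e.g. $A=\mathbf{v}\mathbf{v}^\top$ with $\mathbf{v}=[\sqrt{a_1}\;\sqrt{a_2}]^\top$. A quick numerical check (the values are my own):

```python
import numpy as np

a1, a2 = 3.0, 1.0                    # arbitrary diagonal entries (my own choice)
v = np.array([np.sqrt(a1), np.sqrt(a2)])
A = np.outer(v, v)                   # rank-one PSD matrix with diagonal (a1, a2)

print(np.diag(A))                    # [3. 1.]
print(np.linalg.eigvalsh(A))         # approximately [0, 4], i.e. lambda_min = 0, lambda_max = a1 + a2
```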

If I am not getting too confused by the notation, you may change basis and assume that $x_i$ are the columns of the identity matrix. Then your problem turns into "is there a way to bound the eigenvalues of a symmetric positive semidefinite $A$ by knowing only its diagonal?".
– Federico Poloni, Mar 1 '12 at 9:02

@Federico Poloni, I think you are right. However, I don't just want a way to bound the eigenvalues; I want to compute the actual bound values, i.e. the largest and smallest possible eigenvalues. Is there any known result on this question?
– ppyang, Mar 1 '12 at 9:26

Can you write it as a formula? I find it difficult to understand what you mean by "calculate the bound value of the eigenvalues".
– Federico Poloni, Mar 1 '12 at 9:48

@Federico Poloni, what I mean is that I want to know the largest and smallest possible eigenvalues of $A$ given its diagonal. With the answer given by Denis Serre, they should be $\sum_i a_i$ and $0$, respectively.
– ppyang, Mar 1 '12 at 13:52

1 Answer

You are just comparing the diagonal $\vec a$ and the spectrum $\vec\lambda$ of a symmetric matrix (positive semi-definiteness is not an issue here). The key result is the Schur-Horn theorem: $\vec\lambda$ and $\vec a$ are the spectrum and the diagonal of some symmetric matrix if and only if $\vec\lambda\succ\vec a$ ($\vec\lambda$ majorizes $\vec a$). This means that if you list both in non-decreasing order,
$$a_1\le\cdots\le a_n,\qquad\lambda_1\le\cdots\le\lambda_n,$$
then
$$\lambda_1+\cdots+\lambda_k\le a_1+\cdots+a_k,$$
for every $k=1,\ldots,n$, with equality for $k=n$.
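
A quick numerical illustration of these majorization inequalities, as a minimal NumPy sketch (the random symmetric matrix is my own example):

```python
import numpy as np

rng = np.random.default_rng(1)
d = 5
B = rng.standard_normal((d, d))
A = (B + B.T) / 2                      # any symmetric matrix; positive semi-definiteness is not needed

a = np.sort(np.diag(A))                # diagonal entries, non-decreasing
lam = np.sort(np.linalg.eigvalsh(A))   # eigenvalues, non-decreasing

# Partial sums of the k smallest eigenvalues are <= those of the k smallest
# diagonal entries, with equality for k = n (both sums equal the trace).
print(np.all(np.cumsum(lam) <= np.cumsum(a) + 1e-12))   # True
print(np.isclose(np.sum(lam), np.sum(a)))                # True
```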

Thank you very much! Could you provide the title of the paper containing this theorem?
– ppyang, Mar 1 '12 at 10:00

You can find it in several books, for instance in the compendium by Horn & Johnson, or in my book Matrices (Springer-Verlag, GTM 216).
– Denis Serre, Mar 1 '12 at 10:20

Given the terminology $\vec\lambda\succ\vec a$ and "$\vec\lambda$ majorizes $\vec a$", would your condition be more memorable written as $\lambda_k+\cdots+\lambda_n\ge a_k+\cdots+a_n$ for $k=1,\ldots,n$, with equality for $k=1$ ?
– John Bentin, Mar 1 '12 at 10:54

@John. Yes, that is why it is always difficult to remember which one is the majorizant.
– Denis Serre, Mar 1 '12 at 12:17

I will refer to the books you suggested for more detail. Thank you all!
– ppyang, Mar 1 '12 at 13:56