Given $Q = \left( \begin{array}{cc}
A + B & C \\
C^T & D\end{array} \right) $, where we know that $Q, A, B, D$ are all positive semi-definite square matrices (not necessarily all of the same size), we would like to decompose $Q$ into the sum of two positive semi-definite matrices of the form
$$Q = \left( \begin{array}{cc} A & X \\ X^T & E \end{array} \right) + \left( \begin{array}{cc} B & Y \\ Y^T & F \end{array} \right).$$

The context is quadratic optimization, where I want to solve $\min_x c^Tx + \frac{1}{2}x^TQx$. In the actual problem I have a diagonal $B$ and $A=U\Sigma U^T$, where $\Sigma$ is diagonal and small compared to the huge $A$. I want to use the standard trick of introducing a new variable $y$ with the constraint $y=U^Tx$, thus saving an enormous amount of computation by using $y^T\Sigma y$ instead of $x^TAx$. Without the transformation I am asking about, I cannot decompose the quadratic term into a positive (semi)definite form, which is necessary to keep the optimiser happy. But I would be completely satisfied if somebody can show me an alternative way to play the projection trick here.
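For concreteness, here is a small NumPy sketch of the projection trick described above (the dimensions $n$ and $k$ are made up for illustration): with $A=U\Sigma U^T$ and $y=U^Tx$, the tiny $k\times k$ quadratic $y^T\Sigma y$ reproduces the huge $n\times n$ quadratic $x^TAx$.

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 300, 5                        # hypothetical sizes: large x, small rank k

# Low-rank A = U Sigma U^T with orthonormal U (n x k) and diagonal Sigma (k x k)
U, _ = np.linalg.qr(rng.standard_normal((n, k)))
Sigma = np.diag(rng.uniform(1.0, 2.0, k))
A = U @ Sigma @ U.T

x = rng.standard_normal(n)
y = U.T @ x                          # the new variable with constraint y = U^T x

# x^T A x equals y^T Sigma y, so a k x k quadratic replaces the n x n one
assert np.isclose(x @ A @ x, y @ Sigma @ y)
```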

@user1551: many thanks for looking into this. Looks like it is not trivial as I hoped; extended the description with the context and added optimization as a tag. Do you have an actual example proving that such a split is not always possible?
– takbal Mar 20 '13 at 21:44

1 Answer
Here is a partial answer for the case where both $A$ and $B$ are positive definite. We have
\begin{equation}
Q
=\begin{pmatrix}A&X\\X^T&X^TA^{-1}X\end{pmatrix}
+\begin{pmatrix}B&Y\\Y^T&Y^TB^{-1}Y\end{pmatrix}
+\begin{pmatrix}0&0\\0&D-X^TA^{-1}X-Y^TB^{-1}Y\end{pmatrix},\tag{1}
\end{equation}
where $X=A(A+B)^{-1}C$ and $Y=B(A+B)^{-1}C$. By considering the Schur complements of $A$ and $B$ in the first two summands, we see that the first two terms in $(1)$ are positive semidefinite. Furthermore, as $Q$ is positive semidefinite, the Schur complement of $A+B$ in $Q$ must be positive semidefinite too. That is, $D\succeq C^T(A+B)^{-1}C=X^TA^{-1}X + Y^TB^{-1}Y$. Therefore the last term in $(1)$ is also positive semidefinite. Absorbing this last term into the first or second summand, we are done.
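As a numerical sanity check of identity $(1)$, here is a NumPy sketch on random positive definite $A$, $B$, $D$ (with $C$ scaled small so that $Q$ stays positive semidefinite): it verifies that the three summands add up to $Q$ and that each is positive semidefinite.

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 4, 3

def random_spd(k):
    M = rng.standard_normal((k, k))
    return M @ M.T + k * np.eye(k)        # safely positive definite

A, B = random_spd(n), random_spd(n)
C = 0.1 * rng.standard_normal((n, m))     # small C keeps Q PSD here
D = random_spd(m)

Q = np.block([[A + B, C], [C.T, D]])
assert np.all(np.linalg.eigvalsh(Q) >= -1e-10)   # Q is PSD

S = np.linalg.solve(A + B, C)             # (A+B)^{-1} C
X, Y = A @ S, B @ S                       # X = A(A+B)^{-1}C, Y = B(A+B)^{-1}C

Q1 = np.block([[A, X], [X.T, X.T @ np.linalg.solve(A, X)]])
Q2 = np.block([[B, Y], [Y.T, Y.T @ np.linalg.solve(B, Y)]])
R = np.zeros_like(Q)
R[n:, n:] = D - Q1[n:, n:] - Q2[n:, n:]   # residual bottom-right block

assert np.allclose(Q1 + Q2 + R, Q)        # identity (1) holds
for M in (Q1, Q2, R):
    assert np.all(np.linalg.eigvalsh(M) >= -1e-10)  # each summand is PSD
```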

Unfortunately, from your problem description, it seems that $A$ is rank deficient. I am not sure if $(1)$ can be modified to work in this case (using pseudoinverse?), but hopefully it can shed some light on the problem.

Many thanks, that's definitely a step forward! I will think about how to use this tomorrow. Any idea on the second version, with zero off-diagonal blocks? I will accept this as a solution if you have that as well (or a proof that it cannot be done).
– takbal Mar 21 '13 at 2:14

@takbal Forcing the antidiagonal blocks to zero can only make things worse. For instance, consider $A=B=C=1$ and $D=1/2$. There does not exist any $x$ such that both $\begin{pmatrix}1&0\\0&x\end{pmatrix}$ and $\begin{pmatrix}1&1\\1&\frac12-x\end{pmatrix}$ are positive semidefinite.
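A quick numeric check of this counterexample (a grid sample over $x$, not a proof, which the Schur-complement argument already gives: the first matrix needs $x\ge 0$ while the second needs $\frac12-x\ge 1$):

```python
import numpy as np

# A = B = C = 1, D = 1/2 gives Q = [[2, 1], [1, 1/2]], which is PSD (det = 0).
# The split requires some x with [[1, 0], [0, x]] PSD (x >= 0) and
# [[1, 1], [1, 1/2 - x]] PSD (Schur complement 1/2 - x - 1 >= 0, i.e.
# x <= -1/2) -- contradictory conditions.
found = False
for x in np.linspace(-2.0, 2.0, 401):     # grid sample of candidate x values
    M1 = np.array([[1.0, 0.0], [0.0, x]])
    M2 = np.array([[1.0, 1.0], [1.0, 0.5 - x]])
    if (np.linalg.eigvalsh(M1).min() >= -1e-12 and
            np.linalg.eigvalsh(M2).min() >= -1e-12):
        found = True
print(found)   # no sampled x makes both summands PSD
```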
– user1551 Mar 21 '13 at 2:42

Obvious in hindsight, thank you. As you said, your solution has problems with the rank-deficient $A$, but I am accepting it anyway, as you answered the question in the title.
– takbal Mar 21 '13 at 9:53