General question first: how can one bound a sum of Kronecker products, from above and below, in terms of its components? More specifically,
how is $$
\Vert\sum_{\alpha}S_{\alpha}\otimes B_{\alpha}\Vert$$
bounded by the operator norms of the $d_{S}\times d_{S}$ matrices
$\{ S_{\alpha}\}$ and the $d_{B}\times d_{B}$
matrices $\{ B_{\alpha}\}$? The operator norm $\Vert\cdot\Vert$
denotes the usual largest-singular-value norm. The symbol $\otimes$ denotes the Kronecker or tensor product of matrices. Is there a Cauchy-Schwarz-like inequality?

More specific question:

We have a "basis" set $\{ S_{\alpha} \}$
of $d_{S}^{2}-1$ Hermitian, traceless matrices: they
satisfy $$\text{Tr}(S_{\alpha}S_{\beta}^\dagger)=d_{S}\delta_{\alpha\beta},$$
where $\delta_{\alpha\beta}$ is the Kronecker symbol, and
are all normalized with respect to the operator norm: $\Vert S_{\alpha}\Vert=1$.
In short, they would form an orthonormal basis for the Hilbert-Schmidt
inner product if the identity matrix were included and they were normalized accordingly. For $d_S=2$ the $\{S_\alpha\}$ can be taken to be the Pauli matrices.
The matrices $B_{\alpha}$ are also Hermitian and traceless but otherwise
arbitrary. I would like to bound $\Vert\sum_{\alpha}S_{\alpha}\otimes B_{\alpha}\Vert$
from above and below in terms of $\max_{\alpha}\Vert B_{\alpha}\Vert$.
The best I have done so far is rather miserable: $$
\frac{1}{d_S}\max_{\alpha}{\left\Vert B_{\alpha}\right\Vert }\leq\Vert\sum_{\alpha}S_{\alpha}\otimes B_{\alpha}\Vert\leq(d_S^{2}-1)\max_{\alpha}{\left\Vert B_{\alpha}\right\Vert }$$
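As a quick numerical sanity check of these bounds (a sketch, assuming NumPy; the random $B_\alpha$ and the helper names are mine, not part of the question), one can take $d_S=2$ with the Pauli matrices:

```python
import numpy as np

# Pauli matrices: Hermitian, traceless, Tr(S_a S_b†) = 2*delta_ab, ||S_a|| = 1.
S = [np.array([[0, 1], [1, 0]], dtype=complex),
     np.array([[0, -1j], [1j, 0]], dtype=complex),
     np.array([[1, 0], [0, -1]], dtype=complex)]
d_S = 2

def op_norm(A):
    # Operator norm = largest singular value.
    return np.linalg.norm(A, ord=2)

def random_traceless_hermitian(d, rng):
    X = rng.standard_normal((d, d)) + 1j * rng.standard_normal((d, d))
    H = (X + X.conj().T) / 2
    return H - (np.trace(H).real / d) * np.eye(d)

rng = np.random.default_rng(0)
d_B = 4
B = [random_traceless_hermitian(d_B, rng) for _ in S]

total = sum(np.kron(s, b) for s, b in zip(S, B))
m = max(op_norm(b) for b in B)

# The bounds stated above: m/d_S <= ||sum_a S_a (x) B_a|| <= (d_S^2 - 1) m.
assert m / d_S <= op_norm(total) <= (d_S**2 - 1) * m
```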

The closest reference that I have found on this material is the work
by Chansangiam et al. [ScienceAsia 35, 106 (2009)], which I am still
going through. This question is related to a physics problem of bounding the
fidelity of evolutions in open quantum systems. In short, I want to
bound the maximum energy scale of a combined system by those of its
components.

1 Answer

I do not know if this helps, but here are some upper bounds, standard in operator space theory. The first inequality, attributed to Haagerup, is an analog of the Cauchy-Schwarz inequality in your setting:
$$\Vert\sum_{\alpha}S_{\alpha}\otimes B_{\alpha}\Vert\leq \Vert\sum_{\alpha}S_{\alpha}\otimes \overline{S_{\alpha}}\Vert^{1/2}\ \Vert\sum_{\alpha}B_{\alpha}\otimes \overline{B_{\alpha}}\Vert^{1/2}.$$

Here, for a matrix $A = (A_{i,j})$, $\overline{A}$ denotes the entrywise complex conjugate $(\overline{A_{i,j}})$. The expressions appearing on the right-hand side of this inequality are the norms of $(S_\alpha)$ and $(B_\alpha)$ in the operator Hilbert space OH. For a proof, see for example page 123 in Pisier's Introduction to Operator Space Theory.

Another inequality (no longer symmetric), which reduces to the usual Cauchy-Schwarz inequality when the matrices have size $1$, is the following (and similarly with the roles of $S$ and $B$ reversed):
$$\Vert\sum_{\alpha}S_{\alpha}\otimes B_{\alpha}\Vert\leq \Vert\sum_{\alpha}S_{\alpha}S_{\alpha}^*\Vert^{1/2} \Vert\sum_{\alpha}B_{\alpha}^* B_{\alpha}\Vert^{1/2}.$$

Now the terms appearing on the right are, in the language of operator spaces, the row (resp. column) norm of $(S_\alpha)$ (resp. $(B_\alpha)$). The row (resp. column) norm of $S=(S_\alpha)$ is just the operator norm of the block matrix $ROW(S)$ (resp. $COLUMN(S)$) obtained by putting the $S_\alpha$'s on the first row (resp. column) and $0$'s on the other rows (resp. columns).

This last inequality is very easy to prove, and more generally we have $\Vert\sum_i a_i b_i\Vert\leq \Vert\sum_i a_i a_i^*\Vert^{1/2} \Vert\sum_i b_i^* b_i\Vert^{1/2}$ for any matrices $a_i$ and $b_i$. Indeed, the LHS of this inequality is $\Vert ROW(a) COLUMN(b)\Vert$, and its RHS is $\Vert ROW(a)\Vert \Vert COLUMN(b)\Vert$. This inequality is thus just expressing that the operator norm is sub-multiplicative.
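To see the $ROW/COLUMN$ argument in action, here is a small numerical check (a sketch, assuming NumPy; the sample matrices are arbitrary real matrices, so $a^* = a^T$):

```python
import numpy as np

rng = np.random.default_rng(1)
a = [rng.standard_normal((3, 3)) for _ in range(4)]
b = [rng.standard_normal((3, 3)) for _ in range(4)]

op_norm = lambda M: np.linalg.norm(M, ord=2)

row_a = np.hstack(a)   # ROW(a): the a_i side by side in the first block row
col_b = np.vstack(b)   # COLUMN(b): the b_i stacked in the first block column

# Block multiplication: ROW(a) @ COLUMN(b) = sum_i a_i b_i.
lhs = sum(ai @ bi for ai, bi in zip(a, b))
assert np.allclose(row_a @ col_b, lhs)

# ||ROW(a)||^2 = ||ROW(a) ROW(a)^*|| = ||sum_i a_i a_i^*||, similarly for columns.
assert np.isclose(op_norm(row_a), op_norm(sum(ai @ ai.T for ai in a)) ** 0.5)
assert np.isclose(op_norm(col_b), op_norm(sum(bi.T @ bi for bi in b)) ** 0.5)

# Sub-multiplicativity of the operator norm gives the inequality.
assert op_norm(lhs) <= op_norm(row_a) * op_norm(col_b) + 1e-12
```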

Edit (for a lower bound, without the typo this time). In the case when the $S_\alpha$'s form an orthonormal family for the scalar product $\langle A,B\rangle = \mathrm{Tr}(B^* A)/d_S$, you get the following lower bound:
$$\Vert\sum_{\alpha}S_{\alpha}\otimes B_{\alpha}\Vert\geq \Vert\sum_{\alpha}B_{\alpha}^* B_{\alpha}\Vert^{1/2}.$$

This is because $\sum_{\alpha}B_{\alpha}^* B_\alpha$ is $\frac{1}{d_S}\mathrm{Tr} \otimes \mathrm{id}$ applied to $X^*X$, where $X=\sum_{\alpha}S_{\alpha}\otimes B_{\alpha}$. And since $\frac{1}{d_S}\mathrm{Tr}$ is a state, $\frac{1}{d_S}\mathrm{Tr} \otimes \mathrm{id}$ has norm $1$ as a map from $M_{d_S} \otimes M_{d_B}$ to $M_{d_B}$.
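This partial-trace identity is easy to verify numerically (a sketch, assuming NumPy; `partial_trace_first` is my own helper, and the Pauli matrices play the role of the $S_\alpha$):

```python
import numpy as np

# Paulis: orthonormal for <A,B> = Tr(B^* A)/d_S with d_S = 2.
S = [np.array([[0, 1], [1, 0]], dtype=complex),
     np.array([[0, -1j], [1j, 0]], dtype=complex),
     np.array([[1, 0], [0, -1]], dtype=complex)]
d_S, d_B = 2, 3

rng = np.random.default_rng(2)
B = [rng.standard_normal((d_B, d_B)) + 1j * rng.standard_normal((d_B, d_B))
     for _ in S]
X = sum(np.kron(s, b) for s, b in zip(S, B))

def partial_trace_first(M, dA, dB):
    # (Tr (x) id)(M): trace out the first, dA-dimensional tensor factor.
    return M.reshape(dA, dB, dA, dB).trace(axis1=0, axis2=2)

# (1/d_S) (Tr (x) id)(X^* X) = sum_a B_a^* B_a.
lhs = partial_trace_first(X.conj().T @ X, d_S, d_B) / d_S
rhs = sum(b.conj().T @ b for b in B)
assert np.allclose(lhs, rhs)

# Since the map has norm 1: ||X|| >= ||sum_a B_a^* B_a||^{1/2} >= max_a ||B_a||.
op_norm = lambda M: np.linalg.norm(M, ord=2)
assert op_norm(X) >= op_norm(rhs) ** 0.5 - 1e-12
assert op_norm(rhs) ** 0.5 >= max(op_norm(b) for b in B) - 1e-12
```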

In the specific situation of your problem, here are the bounds one actually gets.
Your assumptions on the $S_\alpha$'s imply that they are all unitary, and orthonormal for $\langle A,B\rangle = \mathrm{Tr}(B^* A)/d_S$. Therefore you have that
$$\Vert\sum_{\alpha}S_{\alpha}\otimes \overline{S_{\alpha}} \Vert = \Vert\sum_{\alpha}S_{\alpha}^*S_{\alpha}\Vert= \Vert\sum_{\alpha}S_{\alpha}S_{\alpha}^*\Vert = N,$$ where $N$ is the number of terms in the sum (in your case $N=d_S^2-1$).
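For the Pauli case ($d_S=2$, $N=3$) both equalities are easy to confirm numerically (a sketch, assuming NumPy):

```python
import numpy as np

S = [np.array([[0, 1], [1, 0]], dtype=complex),
     np.array([[0, -1j], [1j, 0]], dtype=complex),
     np.array([[1, 0], [0, -1]], dtype=complex)]
N = len(S)  # = d_S**2 - 1 = 3

op_norm = lambda M: np.linalg.norm(M, ord=2)

# Each S_a is unitary, so sum_a S_a S_a^* = N * I.
assert np.allclose(sum(s @ s.conj().T for s in S), N * np.eye(2))

# And the OH-norm term also equals N: ||sum_a S_a (x) conj(S_a)|| = N.
assert np.isclose(op_norm(sum(np.kron(s, s.conj()) for s in S)), N)
```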

Thanks. I have to check these out... but any ideas about a lower bound, especially for the special problem where the $S_\alpha$ are some sort of basis?
–
Kaveh Khodjasteh Feb 8 '11 at 15:35

A last comment: if you really want to compare with $\max_\alpha\Vert B_\alpha\Vert$, the best constants one gets are $1$ for the lower bound and $N = d_S^2-1$ for the upper bound, and both are tight.
–
Mikael de la Salle Feb 8 '11 at 16:21

I think the upper bound will actually scale with $\sqrt{d_S^2-1}$, as the $S_\alpha$ are well spread out in a sense, but I have no proof... How do you get the lower bound of $1$, by the way?
–
Kaveh Khodjasteh Feb 8 '11 at 17:46

Thanks Mikael for the update. Consider the case where the $B_\alpha$ are all the same matrix $B$. Then the upper bound must be at least $\Vert \sum_\alpha S_\alpha\Vert\,\Vert B\Vert$, which, despite the sum having $d_S^2-1$ terms, does not scale with $d_S^2-1$... Or I have no clue! Funny, the way I got my lower bound also used a similar method (we call it the partial trace in physics), but I arrived at a different bound. I still need to check things more carefully.
–
Kaveh Khodjasteh Feb 8 '11 at 23:18