Homework Help:
Application of the Schwarz Inequality

Here's yet another assigned problem that I'm having difficulty with. I think I'm close to the end but am "nervous" (for lack of a better word) about whether or not I have used summation notation properly throughout the problem. Here it is:

I will now describe my approach without writing out each step explicitly (I'm finding summation notation tedious to write for LaTeX and this is for a graded assignment so I'd rather not have the entire answer floating on the web). If anything I say is unclear or you require more details let me know and I can fill it out.

Also, please note that the sums in both the above inequalities are from j=1 to j=n. I was unsure how to code that properly in LaTeX.

I squared both sides of the given inequality and expanded the left-hand side. After some algebra I obtained the following expression:

The right-hand sides of this expression and the Schwarz Inequality are exactly equal. I argue that this inequality is true if its left-hand side is less than or equal to the left-hand side of the Schwarz Inequality, that is:

So I am done. But how can I extend this to the case when [itex]j \neq 1[/itex]?

If anything I've described above seems incorrect please let me know and I can write out the steps I took explicitly to see where I went wrong. Thanks a lot in advance for any assistance! :)

To be honest, I'm not sure how I should use that. I expect I would apply it to the right-hand side of my expression. Is it applicable even though the sum is inside the modulus? Could you give a hint of how to apply what you've recommended above? :s

It's not so much the formula that's important. It's the concept that a summation can be written as a sum or difference of two summations.
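For instance, writing ##c_{j}## for a generic summand (a sketch of the splitting idea, not part of the original problem), a sum from ##j=1## can always be peeled apart as:

$$\sum_{j=1}^{n} c_{j} = c_{1} + \sum_{j=2}^{n} c_{j}, \qquad \text{equivalently} \qquad \sum_{j=2}^{n} c_{j} = \sum_{j=1}^{n} c_{j} - c_{1}.$$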

If you have the solution for j=1, then you can prove something about a combination of sums with one of the sums being the sum with j=1. What can you prove that would make the j≠1 case follow from the j=1 case?

From here I argued that the left-hand side can be interpreted geometrically as the sum of the x-components of the products ##a_{j}\overline{b_{j}}##, and the right-hand side as the modulus, i.e. the full distance (in both the x- and y-directions). Then, by the triangle inequality, this inequality is true.

That's not what you want to do: you want to prove the triangle inequality, not use it. Define a dot product ##a \cdot b=\sum a_{j}\overline{b_{j}}## and define a norm ##|a|=\sqrt{a \cdot a}##. Then Cauchy-Schwarz tells you ##|a \cdot b| \le |a| |b|##. Try to use that to prove ##|a+b| \le |a|+|b|##, and yes, square both sides. Start with ##|a+b|^2=(a+b) \cdot (a+b)##.
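Carrying that squaring through (a sketch using the dot product and norm just defined):

$$|a+b|^{2} = (a+b)\cdot(a+b) = |a|^{2} + a \cdot b + b \cdot a + |b|^{2} \le |a|^{2} + 2|a||b| + |b|^{2} = \left(|a|+|b|\right)^{2},$$

where the middle inequality uses ##a \cdot b + b \cdot a = 2\,\mathrm{Re}(a \cdot b) \le 2|a \cdot b| \le 2|a||b|## by Cauchy-Schwarz; taking square roots gives ##|a+b| \le |a|+|b|##.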

This is true via the triangle inequality, as the left-hand side is the "x-component" of ##z_{j}## (i.e. the magnitude of the real part) and the right-hand side is the modulus of ##z_{j}##. Is this geometric argument sufficient?

If yes, then I have (successfully) argued that ##(\sum Re[z_{j}])^{2} \leq |\sum z_{j}|^{2}## and therefore the initial inequality is true.
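Spelled out in the thread's notation, that last step is just:

$$\left(\sum \mathrm{Re}[z_{j}]\right)^{2} = \left(\mathrm{Re}\left[\sum z_{j}\right]\right)^{2} \le \left|\sum z_{j}\right|^{2},$$

since taking real parts commutes with the sum and ##|\mathrm{Re}[w]| \le |w|## for any complex ##w##.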

This is really much easier to write if you use dot product notation. You don't need the 'Re' part of your argument, though it's certainly true. Just use CS on each of the two mixed sums separately ##|a \cdot b|=|b \cdot a| \le |a| |b|##.
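As a quick numerical sanity check of the inequalities discussed above (a sketch only, with `dot` and `norm` defined to match the thread's ##a \cdot b=\sum a_{j}\overline{b_{j}}## and ##|a|=\sqrt{a \cdot a}##; the helper names are my own):

```python
import math
import random

def dot(a, b):
    # Hermitian dot product: sum of a_j * conj(b_j), matching the thread's definition
    return sum(x * y.conjugate() for x, y in zip(a, b))

def norm(a):
    # |a| = sqrt(a . a); the dot product of a vector with itself is real and >= 0
    return math.sqrt(dot(a, a).real)

random.seed(0)
for _ in range(1000):
    n = random.randint(1, 10)
    a = [complex(random.uniform(-5, 5), random.uniform(-5, 5)) for _ in range(n)]
    b = [complex(random.uniform(-5, 5), random.uniform(-5, 5)) for _ in range(n)]
    # Cauchy-Schwarz: |a . b| <= |a| |b|
    assert abs(dot(a, b)) <= norm(a) * norm(b) + 1e-9
    # Triangle inequality: |a + b| <= |a| + |b|
    s = [x + y for x, y in zip(a, b)]
    assert norm(s) <= norm(a) + norm(b) + 1e-9
print("all checks passed")
```

This obviously proves nothing, but it is a cheap way to catch a sign or conjugation mistake before writing the proof out in full.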