How did you arrive at the RHS using the Hermitian adjoint? I understand using it for the LHS.

Well, the Hermitian adjoint is defined, at least in finite-dimensional Hilbert spaces, as the complex conjugate transpose. Scalars you can think of as 1-dimensional vectors. Transposing a scalar doesn't change it, but the complex conjugate part does (if it's complex, of course). Hence, the Hermitian adjoint of a scalar is just the complex conjugate. The fact that I'm right-multiplying by the scalar is unimportant, as scalars can pass through vectors no problem (that is, scalar multiplication of a vector is commutative).
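As a quick numerical sketch of this (the vector and scalar values here are just illustrative, not from the thread), numpy's conjugate transpose shows that the scalar only picks up a complex conjugate:

```python
import numpy as np

# Illustrative complex column vector and complex scalar.
v = np.array([[1 + 2j], [3 - 1j]])
c = 2 - 3j

# Hermitian adjoint = conjugate transpose. For c*v, the adjoint
# (c v)^dagger equals c^* v^dagger: the scalar just picks up a
# complex conjugate and passes through the vector.
lhs = (c * v).conj().T
rhs = np.conj(c) * v.conj().T

print(np.allclose(lhs, rhs))  # True
```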

I guess so; then we arrive at $\langle a'|A = a'^*\langle a'|$.

If we assume a' and a'' to be equal, then we determine that a' must be real. That much I understand.

I don't follow the case when a' is not equal to a'', i.e. the orthonormality...

Not sure how to continue...

Thanks

I think you're confusing two different proofs.

1. If you're trying to show that the eigenvalues of an Hermitian operator are real, then you play around with

$\langle a'|A|a'\rangle$

not

$\langle a'|A|a''\rangle.$

The reason is that you have to know that $\langle a'|a'\rangle \neq 0$, which is true because eigenvectors, by definition, are nonzero.

2. If you're trying to prove that the eigenvectors of differing eigenvalues are orthogonal, then you play around with

$\langle a'|A|a''\rangle,$

where the eigenvalues of the two eigenvectors there are different.

I would definitely prove that the eigenvalues are real before proving that the eigenvectors of differing eigenvalues are orthogonal.
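Both facts are easy to check numerically. Here is a small sketch using numpy; the random Hermitian matrix is my own construction for the example:

```python
import numpy as np

# Build a Hermitian matrix by symmetrizing a random complex one.
rng = np.random.default_rng(0)
M = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
A = (M + M.conj().T) / 2                  # Hermitian by construction

# 1. Eigenvalues are real: a general eigensolver returns them with
#    (numerically) vanishing imaginary parts.
print(np.allclose(np.linalg.eigvals(A).imag, 0, atol=1e-10))  # True

# 2. Eigenvectors of distinct eigenvalues are orthogonal: eigh returns
#    an orthonormal set, so the Gram matrix is the identity.
vals, vecs = np.linalg.eigh(A)
print(np.allclose(vecs.conj().T @ vecs, np.eye(4)))           # True
```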


Thanks, things are becoming clearer, but slowly :-) Here is my working from start to finish.

Starting from the eigenvalue equation $A|a'\rangle = a'|a'\rangle$, the Hermitian adjoint of this is $\langle a'|A^\dagger = a'^*\langle a'|$, but $A^\dagger = A$, so $\langle a'|A = a'^*\langle a'|$.

Now right-multiply by $|a''\rangle$, giving

$\langle a'|A|a''\rangle = a'^*\langle a'|a''\rangle$ (1)

Similarly, assuming that $a''$ satisfies the eigenvalue equation $A|a''\rangle = a''|a''\rangle$, left-multiply this by $\langle a'|$, giving

$\langle a'|A|a''\rangle = a''\langle a'|a''\rangle$ (2)

Subtracting (2) from (1) we get

$(a'^* - a'')\langle a'|a''\rangle = 0$

According to Sakurai, $a'$ and $a''$ can be the same or different. If we assume them to be the same we get $(a'^* - a')\langle a'|a'\rangle = 0$, and since $\langle a'|a'\rangle = 1$, it follows that $a'^* = a'$.

It is not true in general that $\langle a'|a'\rangle = 1$. For the purposes of the proof all we need is $\langle a'|a'\rangle \neq 0$.

-Dan

Edit: Note carefully that if $a' \neq a''$, we can't yet say anything about $\langle a'|a''\rangle$ being zero or not. This may be positive, negative, or zero. Your next step in the proof is to show that $\langle a'|a''\rangle = 0$ if they have different eigenvalues. (At least, in the non-degenerate case.)

What happens in the degenerate case? You can have Hermitian operators with degenerate eigenvalues (the identity, for example). In that case, you'd have more than one eigenvector per eigenvalue, thus rendering your implication ($a' = a'' \Rightarrow |a'\rangle = |a''\rangle$) false.
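A minimal numerical illustration of this, using the identity as the example:

```python
import numpy as np

# The identity is Hermitian with the single, fully degenerate
# eigenvalue 1, yet it has many independent eigenvectors -- so
# "same eigenvalue" does NOT imply "same eigenvector".
I = np.eye(2)
v1 = np.array([1.0, 0.0])
v2 = np.array([0.0, 1.0])

print(np.allclose(I @ v1, 1.0 * v1))  # True: v1 has eigenvalue 1
print(np.allclose(I @ v2, 1.0 * v2))  # True: so does v2
print(np.allclose(v1, v2))            # False: different eigenvectors
```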

Two things can happen:

1) $a' = a''$, and we established $a'^* = a'$, or

2) $a' \neq a''$, and $|a'\rangle$, $|a''\rangle$ are orthonormal.

Does this look right? :-)

I think you're ok with 2), but like I said, I think your 1) needs a bit more work.


Ok, perhaps what you are suggesting is beyond the scope of Sakurai's material as attached. Thanks for the great help once again! :-)

Ok, in looking at Sakurai's argument, I would phrase it a little differently. In proving the reality condition of the eigenvalues, I would say that you assume $a' = a''$ AND you assume $|a'\rangle = |a''\rangle$. Then the reality condition follows. The double assumption sidesteps the degenerate case nicely, and is probably what Sakurai had in mind anyway. Trust me, the degenerate case is not beyond Sakurai! By "degenerate case", I just mean that you have eigenvalues of algebraic multiplicity greater than one. They are multiple roots of the characteristic equation $\det(A - \lambda I) = 0$.

Then, when you prove the orthogonality condition, ALL you have to assume is that the eigenvalues are different. Because you just proved they are all real, the result follows.

So please ignore what I said earlier about the "implication going the wrong way". What you really need is to replace the implication with the double assumption that the eigenvalues are the same, AND the eigenvectors are the same. Then you get the desired result.

I think that is making good sense now, thanks! So multiple roots would be like repeated roots.
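As an illustration of repeated roots (the diagonal matrix here is hand-picked for the example, not from the thread):

```python
import numpy as np

# diag(2, 2, 5) is Hermitian with characteristic equation
# (2 - lam)^2 (5 - lam) = 0: the root 2 has algebraic multiplicity 2.
A = np.diag([2.0, 2.0, 5.0])

vals = np.linalg.eigvalsh(A)
print(vals.tolist())  # the eigenvalue 2 appears twice
```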

Originally Posted by Ackbeet

So please ignore what I said earlier about the "implication going the wrong way". What you really need is to replace the implication with the double assumption that the eigenvalues are the same, AND the eigenvectors are the same. Then you get the desired result.

So for orthonormality we also take the double assumption, but in the sense that the eigenvalues AND eigenkets are NOT equal, i.e. $a' \neq a''$ and $|a'\rangle \neq |a''\rangle$.
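A quick numerical check of the orthogonality conclusion (the 2x2 Hermitian matrix is illustrative; its eigenvalues work out to 1 and 4):

```python
import numpy as np

# An illustrative 2x2 Hermitian matrix with distinct eigenvalues.
A = np.array([[2.0, 1.0 - 1.0j],
              [1.0 + 1.0j, 3.0]])

vals, vecs = np.linalg.eigh(A)
ket1, ket2 = vecs[:, 0], vecs[:, 1]

print(abs(vals[0] - vals[1]) > 1e-9)    # True: a' != a''
print(abs(np.vdot(ket1, ket2)) < 1e-9)  # True: <a''|a'> = 0
```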