Eigenvectors and orthogonal basis

1. The problem statement, all variables and given/known data
I have a linear transformation ##\mathbb{R}^3 \rightarrow \mathbb{R}^3##. I've already solved the part that asks for a basis of eigenvectors. The eigenvectors are ##(1,-3,0)##, ##(1,0,3)##, ##(\frac{1}{2}, \frac{1}{2},1)##. Now the exercise wants me to show that there is no orthogonal basis of eigenvectors for this particular linear transformation.

How do I show it?

The exercise doesn't ask this, but what's the implication of eigenvectors forming an orthogonal basis?

2. Relevant equations

3. The attempt at a solution

With the Euclidean inner product I can clearly see that the eigenvectors are not orthogonal to each other, but I'm not sure whether calculating the pairwise dot products is the way to show it.
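If the three eigenvalues are distinct (the thread doesn't say, so this is an assumption), each eigenspace is one-dimensional, every eigenvector is a scalar multiple of one of the three listed, and rescaling cannot change whether two vectors are orthogonal; in that case checking the three pairwise dot products really does settle the question. A quick NumPy sketch of that check:

```python
import numpy as np

# The three eigenvectors from the problem statement.
v1 = np.array([1.0, -3.0, 0.0])
v2 = np.array([1.0, 0.0, 3.0])
v3 = np.array([0.5, 0.5, 1.0])

# Pairwise Euclidean dot products; any nonzero value means
# that pair is not orthogonal.
print(np.dot(v1, v2))  # 1*1 + (-3)*0 + 0*3 = 1.0
print(np.dot(v1, v3))  # 0.5 - 1.5 + 0   = -1.0
print(np.dot(v2, v3))  # 0.5 + 0 + 3     = 3.5
```

All three products are nonzero, so no pair of eigenvectors is orthogonal.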

I thought about Gram-Schmidt, but orthogonalizing the vectors would mean they are no longer eigenvectors.

Now the exercise wants me to show that there is no orthogonal basis of eigenvectors for this particular linear transformation.

How do I show it?

I suppose you could do a Schur decomposition, writing ##\mathbf{A} = \mathbf{UTU^*}##, and show that ##\mathbf{T}## has the eigenvalues along its diagonal but is only upper triangular (i.e. it is not both upper and lower triangular -- i.e. not diagonal). Out of curiosity, what are the eigenvalues here?
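The thread never gives the matrix itself, so here's a sketch (NumPy/SciPy) that builds a hypothetical ##\mathbf{A}## from the stated eigenvectors with *assumed* eigenvalues 1, 2, 3, then checks that the Schur factor ##\mathbf{T}## is upper triangular but not diagonal:

```python
import numpy as np
from scipy.linalg import schur

# Hypothetical reconstruction: columns of P are the eigenvectors
# from the thread; eigenvalues 1, 2, 3 are an assumption, since
# the thread doesn't state them. A = P D P^{-1}.
P = np.array([[ 1.0, 1.0, 0.5],
              [-3.0, 0.0, 0.5],
              [ 0.0, 3.0, 1.0]])
D = np.diag([1.0, 2.0, 3.0])
A = P @ D @ np.linalg.inv(P)

# Schur decomposition: A = U T U^T with U orthogonal,
# T (real Schur form) upper triangular here since all
# eigenvalues are real.
T, U = schur(A)

# The eigenvalues sit on the diagonal of T...
print(np.sort(np.diag(T)))
# ...but T has nonzero entries above the diagonal, so it is not
# diagonal: A is not normal and admits no orthogonal eigenbasis.
print(np.allclose(T, np.diag(np.diag(T))))  # False
```

The same conclusion follows from the spectral theorem: a real matrix has an orthonormal eigenbasis iff it is symmetric (more generally, normal), and a non-diagonal ##\mathbf{T}## certifies that it isn't.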

The exercise doesn't ask this, but what's the implication of eigenvectors forming an orthogonal basis?

Orthogonal (and, if complex, unitary) matrices are extremely pleasant to work with. From a geometric standpoint, they are length preserving: for a real vector ##\mathbf{x}## and an orthogonal matrix ##\mathbf{U}##, ##||\mathbf{Ux}||_2^2 = \mathbf{x^TU^TUx} = \mathbf{x^TIx} = \mathbf{x^Tx} = ||\mathbf{x}||_2^2##. Having mutually orthonormal eigenvectors is also immensely useful in manipulating things like quadratic forms -- e.g. maximizing or minimizing ##\mathbf{x^TAx}##, which comes up all the time (e.g. the Hessian matrix of second derivatives in calculus).
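A minimal numerical check of the length-preservation identity above, using a random orthogonal matrix obtained from a QR factorization:

```python
import numpy as np

rng = np.random.default_rng(0)

# QR factorization of a random matrix yields an orthogonal Q;
# call it U to match the notation above.
U, _ = np.linalg.qr(rng.standard_normal((3, 3)))

x = rng.standard_normal(3)

# U^T U = I, hence ||Ux||_2 = ||x||_2.
print(np.allclose(U.T @ U, np.eye(3)))                         # True
print(np.allclose(np.linalg.norm(U @ x), np.linalg.norm(x)))   # True
```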

On top of all this, orthogonal matrices are very nice for preserving numeric stability if you're doing serious computational work.
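One concrete way to see the stability claim: the 2-norm condition number of an orthogonal matrix is exactly 1, the best possible, so multiplying by it cannot amplify relative errors. A one-line check:

```python
import numpy as np

rng = np.random.default_rng(1)

# Random orthogonal matrix via QR.
U, _ = np.linalg.qr(rng.standard_normal((5, 5)))

# cond_2(U) = sigma_max / sigma_min = 1 for orthogonal U.
print(np.linalg.cond(U, 2))  # ~1.0
```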