
Your first question already has many answers, so I'll address your second question.

Edit: in view of Marc van Leeuwen's recommended answer and examples, I will make precise how I interpret your question. I read it as: is it possible that $A$ and $I+A$ have the same complex eigenvalues, repeated according to their algebraic multiplicities? I also assume you mean $n\geq 1$. With these assumptions, the answer to your second question is no.

Proof 1: we have
$$
\mbox{tr}(A+I)-\mbox{tr} (A) =\mbox{tr}(A)+n-\mbox{tr}(A)=n\geq 1.
$$
So $A+I$ and $A$ have distinct traces. In particular, they can't have the same eigenvalues.
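A quick numeric sanity check of this trace argument; the random $5\times 5$ matrix and the seed below are arbitrary illustrative choices, not part of the answer:

```python
# Sketch: tr(A + I) - tr(A) = n for any square matrix A.
import numpy as np

rng = np.random.default_rng(0)
n = 5
A = rng.standard_normal((n, n))

# The difference is exactly n, up to floating-point rounding.
diff = np.trace(A + np.eye(n)) - np.trace(A)
print(diff)  # prints n (= 5) up to rounding
```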

Proof 2: let $p_A(X)=\det (XI-A)$ be the characteristic polynomial of $A$. The characteristic polynomial of $I+A$ satisfies:
$$
p_{I+A}(X)=\det(XI-I-A)=\det((X-1)I-A)=p_A(X-1).
$$
If $A$ and $I+A$ have the same eigenvalues (repeated with algebraic multiplicities), then they have the same characteristic polynomial. The assumption is therefore equivalent to
$$
p_A(X)=p_A(X-1).
$$
If $p_A(\lambda)=0$, then $p_A(\lambda-1)=0$. By induction, $p_A(\lambda-k)=0$ for every integer $k\geq 0$. So $p_A$ is a degree $n\geq 1$ polynomial with infinitely many roots. That's impossible.
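The identity $p_{I+A}(X)=p_A(X-1)$ can be checked symbolically for a concrete matrix; the $2\times 2$ matrix below is an arbitrary illustrative choice:

```python
# Symbolic check (for one matrix, not a general proof) that
# p_{I+A}(X) = p_A(X - 1).
import sympy as sp

X = sp.symbols('X')
A = sp.Matrix([[1, 2], [3, 4]])

pA = A.charpoly(X).as_expr()             # det(X*I - A)
pIA = (sp.eye(2) + A).charpoly(X).as_expr()  # det(X*I - I - A)

diff = sp.expand(pIA - pA.subs(X, X - 1))
print(diff)  # 0
```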

Proof 3: much better than proofs 1 and 2, actually. It follows immediately from the definition of the spectrum that $\mbox{Spectrum}(A+I)=\mbox{Spectrum}(A)+1$. Taking the maximum of the real parts of these sets, we get

$$
\max\; \mbox{Re} \;\mbox{Spectrum}(A+I)=\max \;\mbox{Re} \;\mbox{Spectrum}(A)+1.
$$
So $A$ and $A+I$ can't have the same spectra, even without counting multiplicities.
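A numeric illustration of this shift, again for an arbitrary random matrix (my choice, not from the answer): the largest real part of the spectrum moves up by exactly 1.

```python
# Sketch: Spectrum(A + I) = Spectrum(A) + 1, so the maximum of the
# real parts of the eigenvalues also shifts by 1.
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4))

m_A = max(np.linalg.eigvals(A).real)
m_AI = max(np.linalg.eigvals(A + np.eye(4)).real)
print(m_AI - m_A)  # 1.0 up to rounding
```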

Note: the last argument also works for the real spectrum as soon as it is nonempty, which happens simultaneously for $A$ and $A+I$. It also works more generally in a Banach algebra. And as pointed out by Marc van Leeuwen, building on leonbloy's observation, it can be summarized simply as: there is no nonempty finite subset of $\mathbb{R}$ that is invariant under $\lambda\longmapsto \lambda+1$. Replace finite by compact, and $\mathbb{R}$ by $\mathbb{C}$, to get the general Banach algebra case.

Hmm... I'm not sure I'd say that $n=0$ is precisely absurd, but in any case these are solid proofs that the only $A$ which could possibly work is the trivial map on the $0$-dimensional vector space. (+1)
– Micah, Feb 13 '13 at 19:45

Nice! But can we alter the question a bit? Say that $n$ is a prime number, and the setting is a field of characteristic $n$. Then $A$ and $I+A$ can have the same trace. But what about eigenvalues? You used properties of the complex field in your answer... Maybe it can be done differently...
– Ludolila, Feb 13 '13 at 20:43

And using a diagonal $A$ it is easy to arrange both $A$ and $I+A$ to have determinant zero. Just make one of the diagonal entries equal to $-1$ and one equal to $0$.
– Jyrki Lahtonen♦, Feb 13 '13 at 17:39

The requirement is that if $A$ has eigenvalues $\lambda_k$, then $\det A = \prod_k \lambda_k = \prod_k (\lambda_k+1) = \det (A+I)$. This is easy to arrange if we choose a singular matrix with one eigenvalue at $-1$.

To get a non-singular example, suppose $\lambda_2=1$ to simplify; then the other eigenvalue must satisfy $1 \cdot \lambda_1 = (1+1)(\lambda_1 +1)$, which reduces to $\lambda_1 = -2$. Hence $A=\begin{bmatrix} 1 & x \\ 0 & -2 \end{bmatrix}$ will work (for any $x$).
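A quick numeric check of this $2\times 2$ family, with the arbitrary choice $x=7$:

```python
# Sketch: A = [[1, x], [0, -2]] satisfies det(A) = det(A + I) = -2
# for any x (x = 7 is an arbitrary choice).
import numpy as np

x = 7.0
A = np.array([[1.0, x], [0.0, -2.0]])
print(np.linalg.det(A), np.linalg.det(A + np.eye(2)))  # both -2.0 up to rounding
```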

This says that the eigenvalues of $A+I$ are the eigenvalues of $A$ incremented by 1, hence they cannot be the same. (It can happen, of course, that some eigenvalue of $A+I$ is also an eigenvalue of $A$.)

(Update) I assumed here that we are considering the "full" set of eigenvalues (in $\mathbb{C}$). If we restrict to real eigenvalues, the answer still applies; but now, as noted in other answers, the sets can coincide iff they are both empty.

Just to give the mathemapedantically correct answer to the second question: Yes the eigenvalues of $A$ can be the same as those of $A+I$, but only if there aren't any of them. Real square matrices correspond to operators on real vector spaces, and such operators may well have no eigenvectors at all.
A typical example arises for $$ A=\begin{pmatrix}0&-1\\1&0\end{pmatrix}.$$
If you view $A$ as a matrix over $\Bbb C$, then a corresponding complex-linear operator will have eigenvalues, namely $\mathbf i$ and $-\mathbf i$, which will differ from those, $1+\mathbf i$ and $1-\mathbf i$, of the linear operator associated to $A+I$. So with this interpretation $A$ is not an example.
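One can confirm the complex eigenvalues of this rotation matrix numerically:

```python
# Sketch: A = [[0, -1], [1, 0]] has eigenvalues ±i over C, and A + I
# has eigenvalues 1 ± i, so the two spectra differ; over R, A has no
# eigenvalues at all.
import numpy as np

A = np.array([[0.0, -1.0], [1.0, 0.0]])
ev_A = sorted(np.linalg.eigvals(A), key=lambda z: z.imag)
ev_AI = sorted(np.linalg.eigvals(A + np.eye(2)), key=lambda z: z.imag)
print(ev_A)   # [-1j, 1j] up to rounding
print(ev_AI)  # [1 - 1j, 1 + 1j] up to rounding
```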

However even over the complex numbers an operator might not have any eigenvalues. This happens if (and only if) the space it is defined on is of dimension $0$.
So you get a $0\times0$-matrix $A$ in this case, for which even $A=A+I$ holds (here $I$ is also the $0\times0$-matrix; it is both an identity matrix and a zero matrix).

The reason that $A$ and $A+I$ can only have the same eigenvalues if there aren't any is of course (as leonbloy indicates) that immediately from the definition, $\lambda$ is eigenvalue for $A$ if and only if $\lambda+1$ is eigenvalue for $A+I$, and the empty set is the only finite subset of $\Bbb R$ that is invariant under translation by $1$.

This is a necessary answer, I've just seen it. It made me make my implicit assumptions precise. Maybe we'll know some day what the OP actually had in mind?
– 1015, Apr 9 '13 at 2:34

The OP does not exclude complex eigenvalues; why are you doing so?
– Michael Grant, Apr 9 '13 at 2:57

@MichaelGrant: OP talks about real matrices, which most naturally are associated to a linear operator of a real vector space. Eigenvectors and eigenvalues are defined for operators on a vector space, and for matrices only by the intermediary of such an operator whose matrix it is. So I interpret the question as being about the eigenvalues of a $\Bbb R$-linear operator with the given matrix, and by definition eigenvalues of such an operator must be real.
– Marc van Leeuwen, Apr 9 '13 at 4:42

@julien: The formulation of my answer was not at all pointing specifically to your answer, and even less doing so condescendingly. The term "mathematically" means from a strictly mathematical point of view, as opposed to pedagogical or whatever; it does not imply disapproval of other points of view. What I was missing in the whole of the given answers at that point was a simple reasoning (a) eigenvalues of $A+I$ are by definition eigenvalues of $A$ shifted by $+1$ (b) the only shift-invariant finite subset of $\Bbb R$ is the empty set. I mentioned leonbloy who had (a), but not (b).
– Marc van Leeuwen, Apr 9 '13 at 7:01

I'm with julien on this. Not only has the assumption of a complex spectrum been my nearly universal experience, but the fact that the OP began with a question about the determinant and extended it from there suggests it as well (since the determinant is the product of all the eigenvalues, real and complex). Not to at least acknowledge the complex case is unhelpful.
– Michael Grant, Apr 9 '13 at 11:51

I found the same example as Jyrki (for $n=2$), as follows. The determinant is the product of eigenvalues, and eigenvalues of $I+A$ are $1+\lambda$ for $\lambda$ an eigenvalue of $A$. So to try and find an example, assume $A$ is $2\times 2$ with eigenvalues $a$ and $b$. We need $(1+a)(1+b)=ab$, or equivalently $1+a+b=0$. Taking $a=b=-\frac{1}{2}$ achieves this, and so:
$$A=\begin{pmatrix}-1/2&0\\0&-1/2\end{pmatrix}$$
provides an example.

In general, any matrix with eigenvalues $\lambda_i$ such that $\prod\lambda_i=\prod(\lambda_i+1)$ will do.
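Verifying the diagonal example numerically:

```python
# Sketch: A = -(1/2) * I has det(A) = det(A + I) = 1/4.
import numpy as np

A = -0.5 * np.eye(2)
print(np.linalg.det(A), np.linalg.det(A + np.eye(2)))  # 0.25 0.25
```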

@julien First, fix the eigenvalues. This is easy. In the 4x4 case, I use copper.hat's trick: let the eigenvalues appear in pairs of the form $(-n-1,n)$ with $n>0$. In the 3x3 case, just let a computer run through a for-loop to find a triple $(-m,-n,p)$ such that $(m-1)(n-1)(p+1)=mnp$. Once the diagonal is set, construct the strictly upper triangular part at random. Then you get a nonsingular triangular matrix. Finally, bring it to another random-looking matrix via similarity with a Pascal matrix.
– user1551, Feb 14 '13 at 8:27
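user1551's 4x4 recipe can be sketched as follows. Each eigenvalue pair $(-n-1,n)$ contributes $-n(n+1)$ to both $\det A$ and $\det(A+I)$; the strictly-upper-triangular fill and the random similarity transform (standing in for the Pascal matrix) are illustrative choices of mine, not part of the original comment:

```python
# Sketch: eigenvalue pairs (-2, 1) and (-3, 2), so
# det(A) = (-2)(1)(-3)(2) = 12 and det(A + I) = (-1)(2)(-2)(3) = 12.
import numpy as np

rng = np.random.default_rng(2)
D = np.diag([-2.0, 1.0, -3.0, 2.0])          # the chosen eigenvalues
T = np.triu(rng.standard_normal((4, 4)), 1)  # random strict upper part
U = D + T                                    # triangular, same eigenvalues

P = rng.standard_normal((4, 4))              # random similarity, standing in
A = P @ U @ np.linalg.inv(P)                 # for the Pascal matrix
print(np.linalg.det(A), np.linalg.det(A + np.eye(4)))  # both ~12.0
```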