This is from the proof of a theorem stating that similar matrices have the same eigenvalues.
Let $A$ and $B$ be similar. Then $B = P^{-1}AP$ for some nonsingular matrix $P$. We prove that $A$ and $B$ have the same characteristic polynomials, $p_A(\lambda)$ and $p_B(\lambda)$, respectively. We have:

$$p_B(\lambda) = \det(\lambda I_n - B) = \det(\lambda I_n - P^{-1}AP) = \det(P^{-1}\lambda I_n P - P^{-1}AP).$$
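(As a numerical sanity check, separate from the proof itself: a short NumPy sketch confirming that $A$ and $B = P^{-1}AP$ share eigenvalues. The matrices $A$ and $P$ here are arbitrary examples, not from the original question.)

```python
# Check that a matrix A and a similar matrix B = P^{-1} A P have the
# same eigenvalues. A and P are arbitrary example matrices.
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
P = np.array([[1.0, 2.0],
              [1.0, 1.0]])   # nonsingular: det(P) = -1

B = np.linalg.inv(P) @ A @ P

# Eigenvalues agree up to ordering and floating-point error.
eig_A = np.sort(np.linalg.eigvals(A))
eig_B = np.sort(np.linalg.eigvals(B))
print(np.allclose(eig_A, eig_B))  # True
```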

Could you provide some context? In general, you are correct that $P^{-1}AIP = P^{-1}AP$ would not equal $A$, but under some circumstances it may. (I'm also assuming $I$ is the identity, but perhaps this is not so?)
– Arturo Magidin, Jun 24 '12 at 1:00


See? Context makes all the difference. It's not true for an arbitrary matrix, but it is definitely true for a scalar multiple of the identity, since $\lambda I C = C\lambda I$ for all matrices $C$.
– Arturo Magidin, Jun 24 '12 at 1:38

Note that we don't really move $P^{-1}$ around; we apply the product property of determinants, $\det(AB) = \det A \cdot \det B$. Since each determinant is a real number, you can move the factors around freely, because real-number multiplication is associative and commutative.

Now, why does $xI = P^{-1}xIP$?

Matrices of the form $kI$, where $k$ is a scalar, are called scalar matrices (note that these matrices have zeroes everywhere except on the diagonal, whose entries all equal $k$). It turns out that scalar matrices commute with every matrix; that is, for any matrix $A$, $(kI)A = A(kI)$. (In fact, the scalar matrices are the only matrices that commute with every matrix.)

Let's prove this equality entrywise; $n$ stands for the appropriate dimension of the matrix. The important thing to note in both cases is that the matrix $kI$ has all zeroes except in the diagonal, so the sums reduce to only one term:
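Sketching that entrywise computation, using the usual product formula $(XY)_{ij} = \sum_{m=1}^{n} X_{im}Y_{mj}$ and the fact that $(kI)_{im} = 0$ unless $i = m$:

$$\bigl((kI)A\bigr)_{ij} = \sum_{m=1}^{n} (kI)_{im} A_{mj} = (kI)_{ii}\,A_{ij} = k\,A_{ij},$$

$$\bigl(A(kI)\bigr)_{ij} = \sum_{m=1}^{n} A_{im} (kI)_{mj} = A_{ij}\,(kI)_{jj} = A_{ij}\,k = k\,A_{ij}.$$

Both products have the same $(i,j)$ entry, so $(kI)A = A(kI)$.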

So scalar matrices (scalar multiples of the identity matrix) commute with all other matrices. Since $xI$ is a scalar matrix, it commutes with $P^{-1}$ and $P$; first inserting $P^{-1}P = I$ and then commuting $xI$ past $P^{-1}$ gives: $$xI = xIP^{-1}P = P^{-1}xIP.$$
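(Again as a numerical sanity check: a small NumPy sketch of this identity, with $x$ and $P$ chosen as arbitrary example values.)

```python
# Check that a scalar matrix xI passes through a similarity transform:
# P^{-1} (xI) P == xI. x and P are arbitrary example values.
import numpy as np

x = 5.0
P = np.array([[1.0, 2.0],
              [1.0, 1.0]])   # nonsingular

xI = x * np.eye(2)
print(np.allclose(np.linalg.inv(P) @ xI @ P, xI))  # True
```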
– talmid, Jun 24 '12 at 1:35