Let $A:H\longrightarrow H$ be a self-adjoint operator, where $H$ is a Hilbert space.
Let $(E_{\lambda})_{\lambda}$ be the spectral decomposition of $A$, and let $\lambda_0$
be an eigenvalue of $A$ with finite multiplicity that is isolated in the
spectrum of $A$. Choosing $\varepsilon$ small enough, we have
$$A=\int_{\mathbb{R}}\lambda\,dE_{\lambda}=\lambda_0\Pi_{\lambda_0}+\int_{|\lambda-\lambda_0|>\varepsilon}\lambda\,dE_{\lambda}.$$
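For context, here is why I believe this splitting is valid (my assumption: $\Pi_{\lambda_0}$ denotes the spectral projection $E(\{\lambda_0\})$ onto the eigenspace of $\lambda_0$). Since $\lambda_0$ is isolated, for $\varepsilon$ small enough the punctured interval $0<|\lambda-\lambda_0|\le\varepsilon$ lies in the resolvent set, so the spectral measure vanishes there, and
$$\int_{\mathbb{R}}\lambda\,dE_{\lambda}=\int_{\{\lambda_0\}}\lambda\,dE_{\lambda}+\int_{|\lambda-\lambda_0|>\varepsilon}\lambda\,dE_{\lambda}=\lambda_0\,E(\{\lambda_0\})+\int_{|\lambda-\lambda_0|>\varepsilon}\lambda\,dE_{\lambda}.$$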
Why is it that
$$(A-z)^{-1}=\frac{1}{\lambda_0-z}\Pi_{\lambda_0}+\int_{|\lambda-\lambda_0|>\varepsilon}\frac{1}{\lambda-z}\,dE_{\lambda}\,?$$
(Here $z$ is not in the spectrum of $A$.) I know a theorem stating that if $A=\int\lambda\,dE_{\lambda}$, then $(A-z)^{-1}=\int\frac{1}{\lambda-z}\,dE_{\lambda}$,
but I don't see how to apply it in the case above.
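Here is my attempt (again assuming $\Pi_{\lambda_0}=E(\{\lambda_0\})$): since $z\notin\sigma(A)$, the function $\lambda\mapsto\frac{1}{\lambda-z}$ is bounded on the spectrum, and splitting the integral over $\{\lambda_0\}$ and $\{|\lambda-\lambda_0|>\varepsilon\}$ as before would give
$$(A-z)^{-1}=\int_{\mathbb{R}}\frac{1}{\lambda-z}\,dE_{\lambda}=\frac{1}{\lambda_0-z}\,E(\{\lambda_0\})+\int_{|\lambda-\lambda_0|>\varepsilon}\frac{1}{\lambda-z}\,dE_{\lambda}.$$
Is this the right way to justify the formula, or am I missing something?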