I ran into a "well-known identity" on page 345 of Shepp and Lloyd's On ordered cycle lengths in a random permutation:
$$\int_x^{\infty} \frac{e^{-y}}{y}\,dy = \int_0^x \frac{1-e^{-y}}{y}\,dy - \log x - \gamma, $$
where $\gamma$ is the Euler constant. I am clueless as to how it is derived. Any reference to the derivation of such formulae would suffice, but an explicit solution will also be appreciated.
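
For a quick numerical sanity check (a sketch using scipy, not part of any derivation; `exp1` is scipy's name for $E_1(x) = \int_x^\infty e^{-y}/y\,dy$), both sides agree to high precision:

```python
import math
from scipy.integrate import quad
from scipy.special import exp1  # exp1(x) = \int_x^\infty e^{-y}/y dy

GAMMA = 0.5772156649015329  # Euler–Mascheroni constant

def rhs(x):
    # \int_0^x (1 - e^{-y})/y dy - log(x) - gamma
    integral, _ = quad(lambda y: (1 - math.exp(-y)) / y, 0, x)
    return integral - math.log(x) - GAMMA

for x in (0.1, 1.0, 3.0, 10.0):
    assert abs(exp1(x) - rhs(x)) < 1e-7
```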

3 Answers

You can apply WZ theory to such identities. In particular, both sides satisfy
$$x\,z''(x) + (x+1)\,z'(x) = 0.$$
Picking $x=1$ as the initial condition (the DE is regular there, which helps), we see that both sides evaluate to $\mathrm{Ei}(1,1)$ and their derivatives both evaluate to $-1/e$, so they are equal.
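
As a sketch of this check (using sympy): both sides of the identity have the same derivative $z'(x) = -e^{-x}/x$, the LHS by the fundamental theorem of calculus and the RHS by direct differentiation, and any function with that derivative satisfies the ODE above:

```python
import sympy as sp

x = sp.symbols('x', positive=True)

# Both sides of the identity have the same derivative
# z'(x) = -exp(-x)/x  (LHS by the fundamental theorem of calculus;
# RHS because (1 - e^{-x})/x - 1/x = -e^{-x}/x).
zp = -sp.exp(-x) / x

# Any z with this derivative satisfies x*z'' + (x+1)*z' = 0:
assert sp.simplify(x * sp.diff(zp, x) + (x + 1) * zp) == 0

# Initial data at x = 1: z'(1) = -1/e, matching the value quoted above.
assert sp.simplify(zp.subs(x, 1) + sp.exp(-1)) == 0
```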

I got that differential equation using Maple's PDEtools[dpolyform] function, which uses Groebner bases over differential polynomials to 'solve' this problem. All the rest is classical analysis (as in A course of modern analysis by Whittaker and Watson, 1926), which is unfortunately not material that is taught much anymore; I certainly had to learn a lot of it on my own.

[Edit: fixed an error in the evaluation of the derivative; I pasted in the wrong line]

Thanks for the nice reference on the wiki! I was looking in the wrong place.
– John Jiang, Mar 27 '10 at 19:46


You can prove the identity up to an additive constant by differentiating with respect to $x$, so it only remains to prove it for $x = 1$. This should be a little easier.
– Qiaochu Yuan, Mar 27 '10 at 19:55

Indeed, that makes it a lot easier. Using integration by parts, one can show that $\int_1^{\infty} \frac{e^{-y}}{y}\,dy - \int_0^1 \frac{1-e^{-y}}{y}\,dy = \int_0^{\infty} e^{-y} \log y\, dy$, which is listed as equal to $-\gamma$ in the following wiki page: en.wikipedia.org/wiki/… I have yet to figure out why that formula in the wiki page is true.
– John Jiang, Mar 27 '10 at 20:51
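
The integration-by-parts identity in this comment can be checked numerically (a sketch using scipy.integrate.quad):

```python
import math
from scipy.integrate import quad

GAMMA = 0.5772156649015329  # Euler–Mascheroni constant

# int_1^inf e^{-y}/y dy
a, _ = quad(lambda y: math.exp(-y) / y, 1, math.inf)
# int_0^1 (1 - e^{-y})/y dy
b, _ = quad(lambda y: (1 - math.exp(-y)) / y, 0, 1)
# int_0^inf e^{-y} log(y) dy, split at 1 to isolate the log singularity
c1, _ = quad(lambda y: math.exp(-y) * math.log(y), 0, 1)
c2, _ = quad(lambda y: math.exp(-y) * math.log(y), 1, math.inf)
c = c1 + c2

assert abs((a - b) - c) < 1e-7  # the integration-by-parts identity
assert abs(c + GAMMA) < 1e-7    # the value from the wiki page: c = -gamma
```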


The integral $\int_0^\infty e^{-t}\log t\,dt$ equals $\Gamma'(1)$. This can be evaluated as $-\gamma$ using the infinite product for the gamma function.
– Robin Chapman, Mar 28 '10 at 7:58
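
Numerically, $\Gamma'(1) = \Gamma(1)\,\Psi'(1) = \Psi'(1)$, which is scipy's `digamma(1)`; as a sketch, one can compare it against the independent approximation $\gamma \approx \sum_{k\le n} 1/k - \log n$:

```python
import math
from scipy.special import digamma

# Gamma'(1) = digamma(1) * Gamma(1) = digamma(1), and digamma(1) = -gamma.
n = 10**6
harmonic = sum(1.0 / k for k in range(1, n + 1))
gamma_approx = harmonic - math.log(n)  # -> gamma as n -> infinity

assert abs(digamma(1.0) + gamma_approx) < 1e-5
```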

Thanks Robin. I will write a short summary of the proof combining all the ingredients given so far.
– John Jiang, Mar 28 '10 at 18:02

So one of the approaches to proving the equality in the question is via the following three steps:
First, differentiate both sides of the equation to see that they agree up to an additive constant. This reduces the problem to the case $x = 1$, for which $\log x = 0$.

Next, integrate by parts to show that $\int_1^{\infty} \frac{e^{-y}}{y}\,dy - \int_0^1 \frac{1-e^{-y}}{y}\,dy = \int_0^{\infty} e^{-y} \log y\, dy$.

Finally, observe that $\Gamma'(1)$ equals the integral on the right, by differentiating under the integral sign, valid because things are decaying fast enough at infinity.

So it remains to show $\Gamma'(1) = -\gamma$. I saw a soft argument (i.e., one avoiding the infinite product) in the link scipp.ucsc.edu/~haber/ph116A/psifun_10.pdf
This is reproduced below:

First we establish that for $\Psi(x) = \log \Gamma(x)$,
$$
\Psi'(x+1) = \Psi'(x) + 1/x
$$
This is easy enough, since we have the functional equation $\Gamma(x+1) = x\Gamma(x)$: taking logarithms gives $\Psi(x+1) = \Psi(x) + \log x$, and differentiating gives the recurrence.
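
The recurrence can be sanity-checked with scipy's `digamma`, which is exactly $\Psi'$ in the notation above:

```python
from scipy.special import digamma  # digamma = (log Gamma)' = Psi'

# Psi'(x+1) = Psi'(x) + 1/x at a few sample points
for x in (0.5, 1.0, 2.5, 7.0):
    assert abs(digamma(x + 1) - digamma(x) - 1 / x) < 1e-12
```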
Next, using Stirling's approximation, we get

$$
\Psi(x+1) = (x+1/2)\log x -x + 1/2 \log 2 \pi + O(1/x)
$$
and then they differentiate this and claim that $O(1/x)' = O(1/x^2)$, which is clearly false in general (take $f(x) = \frac{\cos(e^x)}{x}$). But I found on Wikipedia another formula that gives the precise error term as an integral of the monotone function $\arctan(1/x)$, which is enough to establish $O(1/x^2)$ for the error term in the derivative of $\Psi$. So we get the asymptotics $\Psi'(x+1) = \log x + O(1/x)$. Combining this with the recurrence, $\Psi'(n+1) = \Psi'(1) + \sum_{k=1}^n 1/k$, and since $\sum_{k=1}^n 1/k - \log n \to \gamma$, we get $\Psi'(1) = -\gamma$. Now notice $\Psi'(x) = \Gamma'(x)/\Gamma(x)$ and $\Gamma(1) = 1$, so $\Gamma'(1) = -\gamma$ as well.
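
This final asymptotic step can likewise be checked numerically (a sketch; `digamma` is $\Psi'$, and note the sign: $\Psi'(1) = -\gamma$):

```python
import math
from scipy.special import digamma

# Psi'(x+1) - log(x) -> 0; in fact it equals 1/(2x) + O(1/x^2)
for x in (10.0, 100.0, 1000.0):
    assert abs(digamma(x + 1) - math.log(x)) < 1.0 / x

# and Psi'(1) = Gamma'(1) = -gamma
assert abs(digamma(1.0) + 0.5772156649015329) < 1e-12
```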