Of course, it is easy to see that the antiderivative (indefinite integral) of $f(x) = 1/x$ is $\log|x|$, and that for $\alpha\neq -1$ the antiderivative of $f(x) = x^\alpha$ is $x^{\alpha+1}/(\alpha+1)$.
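Both formulas are easy to check numerically; here is a small sketch in Python (the finite-difference helper `num_deriv` and the sample points are my own choices, not part of the question):

```python
import math

def num_deriv(f, x, h=1e-6):
    """Central finite-difference approximation of f'(x)."""
    return (f(x + h) - f(x - h)) / (2 * h)

x = 2.0
# d/dx [x**(a+1)/(a+1)] == x**a for a != -1
for a in (0.5, 2.0, -3.0):
    F = lambda t, a=a: t ** (a + 1) / (a + 1)
    assert abs(num_deriv(F, x) - x ** a) < 1e-6

# The exceptional case a = -1: the antiderivative is log|x|
assert abs(num_deriv(math.log, x) - 1 / x) < 1e-6
```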

I was wondering if there is an intuitive (probably geometric) explanation of why the case $\alpha=-1$ is so different and why the logarithm appears?

Some answers I have thought of, but which I do not find convincing:

Taking the limit $\alpha\to-1$, either from above or from below, leads to diverging functions.

One peculiarity of the case $\alpha=-1$ is that both asymptotes are non-integrable. However, the antiderivative is a local notion and hence should not care about the behavior at infinity.

The limit works; only the implicit $+C$ diverges as $\alpha\to-1$ and needs accounting. Remedy this with definite integration: take $$\lim_{\alpha\to-1}\int_1^x u^\alpha\,du=\lim_{\beta\to0}\frac{x^{\beta}-1}{\beta},$$ then write $x^\beta$ as $\exp(\beta\log x)$ and use the Taylor series of the exponential function. / Honestly, though, I don't see any geometric intuition behind why $\alpha=-1$ is so special, and I wish I did, but it reminds me of the fact that $\zeta(s)$ can be analytically continued to all of $\mathbb{C}$ except $s=1$ ... maybe there's a connection?
–
anon Sep 14 '11 at 11:01
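The limit computation in the comment above is easy to check numerically; a quick sketch (the sample point $x=5$ is an arbitrary choice of mine):

```python
import math

x = 5.0
L = math.log(x)
for b in (1e-2, 1e-3, 1e-4):
    approx = (x ** b - 1) / b
    # writing x**b = exp(b*L) and expanding: approx = L + b*L**2/2 + O(b**2)
    assert abs(approx - L) < b * L ** 2                     # converges to log(x)
    assert abs(approx - L - b * L ** 2 / 2) < 10 * b ** 2   # with the predicted first-order error
```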


It may also be worth asking why the antiderivative defined by @Didier in his answer is the inverse of the exponential. This is strange, since for the other power functions the inverses of the antiderivatives are also power functions (up to a scaling parameter).
–
Ilya Sep 14 '11 at 19:57

5 Answers

Assume you know that for every $\beta$ the derivative of the function $x\mapsto x^\beta$ is the function $x\mapsto\beta x^{\beta-1}$, and that you want to choose $\beta$ such that this derivative is a multiple of the function $x\mapsto x^{\alpha}$. You are led to the equation $\beta-1=\alpha$, which yields $\beta=\alpha+1$.

If $\alpha=-1$, this gives $\beta=0$, but then the derivative you obtain is the function $x\mapsto 0\cdot x^{-1}=0$, which is not a nonzero multiple of $x\mapsto x^{-1}$. For every other $\alpha$ this procedure produces an antiderivative, but for $\alpha=-1$ it fails. Or rather, it proves that no power function is an antiderivative of $x\mapsto x^{-1}$.

Your next step might be (as mathematicians often do when they want to transform one of their failures into a success) to introduce a new function, defined as the antiderivative of $x\mapsto x^{-1}$ which is zero at $x=1$, maybe to give it a cute name like logarithm, and then, who knows, to start studying its properties...
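The failure described above can also be seen numerically: for $\alpha=-1$ the candidate exponent is $\beta=0$, and differentiating the constant function $x\mapsto x^0$ gives zero, not a multiple of $1/x$. A small sketch (the finite-difference helper is my own):

```python
def num_deriv(f, x, h=1e-6):
    """Central finite-difference approximation of f'(x)."""
    return (f(x + h) - f(x - h)) / (2 * h)

for x in (0.5, 1.0, 2.0, 10.0):
    d = num_deriv(lambda t: t ** 0, x)  # derivative of the constant function 1
    assert d == 0.0                     # identically zero...
    assert 1 / x > 0.0                  # ...while x -> 1/x never vanishes
```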

Edit (Second version, maybe more geometric.)

Fix $s>t>0$ and $c>1$ and consider the area under the curve $x\mapsto x^\alpha$ between the abscissæ $x=t$ and $x=s$ on the one hand and between the abscissæ $x=ct$ and $x=cs$ on the other hand. Replacing $x$ by $cx$ multiplies the function by a factor $c^\alpha$. The length of the interval of integration is multiplied by $c$ hence the integral itself is multiplied by $c^{\alpha+1}$.

On the other hand, if an antiderivative $F$ of $x\mapsto x^\alpha$ is a multiple of $x\mapsto x^\beta$ for a given $\beta$, then $F(ct)=c^\beta F(t)$ and $F(cs)=c^\beta F(s)$ hence $F(ct)-F(cs)=c^\beta (F(t)-F(s))$. Note that this last relation holds even if one assumes only that $F$ is the sum of a constant and a multiple of $x\mapsto x^\beta$.

Putting the two parts together yields $c^{\alpha+1}=c^\beta$. Once again, if $\alpha=-1$, this would yield $\beta=0$, hence $F$ would be constant and the area $F(t)-F(s)$ under the curve $x\mapsto x^\alpha$ from $s$ to $t\ge s$ would be zero for every such $s$ and $t$, which is impossible since the function $x\mapsto x^\alpha$ is not zero. (And for every $\alpha\ne-1$, this scaling argument yields the correct exponent $\beta$.)
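The scaling relation in this argument can be verified numerically; a sketch with a naive midpoint-rule integrator (the endpoints and scaling factor are arbitrary choices of mine):

```python
def integrate(f, a, b, n=10_000):
    """Composite midpoint rule for the integral of f from a to b."""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

t, s, c = 1.0, 3.0, 2.0
for alpha in (2.0, 0.5, -0.5, -1.0, -2.0):
    base = integrate(lambda x: x ** alpha, t, s)
    scaled = integrate(lambda x: x ** alpha, c * t, c * s)
    # the area from ct to cs is c**(alpha+1) times the area from t to s;
    # for alpha = -1 the factor is c**0 = 1, i.e. the two areas are equal
    assert abs(scaled - c ** (alpha + 1) * base) < 1e-6 * abs(base)
```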

The problem is to explain the choice of $x^{a+1}/(a+1)$ as "the" antiderivative (out of the whole universe of solutions of $f'(x) = x^a$, with its freedom to choose a different constant of integration for each value of $a$). If the choice is made by requiring the limit of $f$ at $x=0$ to be zero, then it fails for all $a \leq -1$, and $a = -1$ is not a unique exception. If the choice is made via $f(1) = 1/(a + 1)$, this is no better justified than, say, $f(7) = \cot(a+1)$, which breaks down at many other values, or than $f(1)=0$, which works for all $a$ (i.e., uses $\int_1^{x}$ to select the antiderivative).
–
zyx Sep 14 '11 at 21:16


Also, the scaling between integrals on $(s,t)$ and $(cs,ct)$, described in the second argument, works perfectly (and most simply of all) in the logarithmic case. It is only the requirement for a notationally convenient, but otherwise arbitrary, form of the anti-derivative that is in conflict with this scaling.
–
zyx Sep 14 '11 at 21:32


By the way, in English at least, it is common to use quotation marks in phrases like "the" X to gently mock the idea that X is unique (and also to suggest that only "an X" or "a X" is the correct description). It does not necessarily imply or suppose that the expression "the X" was actually utilized, and when discussing a written text it does not (necessarily) imply that such an expression is a direct quotation from the text.
–
zyx Sep 14 '11 at 22:34


Didier: Although your answer is well formulated and gives several good explanations that $\alpha=-1$ is special, it somehow does not answer why it is special. E.g.: why does your geometric argument fail (beyond the mere observation that it does not work)? Is there a geometric way to see that for $\alpha=-1$ the antiderivative should scale like $\log$?
–
Dirk Sep 20 '11 at 6:58


Dirk: The geometric argument does not fail; rather, it shows rigorously that no function of the form constant-plus-a-power-of-$x$ can be an antiderivative of $x\mapsto x^{-1}$. // Re a geometric way to see that $\log$ solves the case $\alpha=-1$, here is a partly geometric one. By the scaling argument in my post, $F(ct)-F(cs)=F(t)-F(s)$ for every positive $c$, $t$ and $s$. The rest is analysis: taking $s=1$ and renaming gives $F(xy)=F(x)+F(y)-F(1)$, hence $G=F(\exp(\cdot))-F(1)$ satisfies $G(t+s)=G(t)+G(s)$. Adding the continuity of $G$, one gets $G(t)=tG(1)$, hence $F$ is an affine function of the inverse of $\exp$, that is, of the logarithm.
–
Did Sep 20 '11 at 10:47
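The functional equation in this last comment is easy to check numerically. A sketch: build $F$ by integrating $1/u$ from $1$ (midpoint rule) and adding an arbitrary constant $C$; the relation $F(xy)=F(x)+F(y)-F(1)$ then holds regardless of $C$ (the constant and sample points below are my own choices):

```python
import math

C = 7.3  # arbitrary constant of integration

def F(x, n=20_000):
    """C plus the integral of 1/u from 1 to x (midpoint rule)."""
    h = (x - 1) / n
    return C + sum(1 / (1 + (i + 0.5) * h) for i in range(n)) * h

for x, y in ((2.0, 3.0), (0.5, 4.0), (1.5, 1.5)):
    assert abs(F(x * y) - (F(x) + F(y) - F(1))) < 1e-6

# and F is indeed C + log:
assert abs(F(2.0) - C - math.log(2.0)) < 1e-8
```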

The algebra of all polynomials is closed under differentiation and integration; however, as soon as one wants to include negative powers of $x$, it is no longer closed under integration. As this paper discusses,

In the expression $\dfrac{x^{\alpha+1}}{\alpha+1}$, when $\alpha=-1$ you are dividing by $0$. If you understand why you can divide by any other number but not by $0$, then that immediately gives you a reason to expect the answer to be quite different in that case.
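Numerically, the division by zero shows up as a blow-up of the formula's pointwise values, while the actual area stays finite. A sketch (the endpoints are arbitrary choices of mine):

```python
import math

s, b = 1.0, 4.0
for eps in (1e-2, 1e-4, 1e-6):
    alpha = -1 + eps
    F = lambda x: x ** (alpha + 1) / (alpha + 1)
    # F(b) itself is huge, on the order of 1/eps ...
    assert F(b) > 0.9 / eps
    # ... but the area F(b) - F(s) stays finite and converges to log(b/s)
    assert abs(F(b) - F(s) - math.log(b / s)) < 2 * eps
```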