
There is something that the notation does not make explicit and which you need to keep straight (this may be the source of your confusion). $R_n(x)$ is the "remainder" at $x$: the error between the actual value of the function and the value you get from evaluating the Taylor polynomial of degree $n$ instead. That is, the $n$th remainder is equal to
$$f(x) - \sum_{k=0}^n \frac{f^{(k)}(0)}{k!}x^k$$
(I'm assuming the polynomial is being expanded around $a=0$).

So the remainder really needs three pieces of information to make it precise: the value of $n$, the value of $x$, and the function $f$ in question. It would be best to write it as $R(n,x,f)$, but since the function $f$ is usually clear from context, we ignore it. But I will use $R(n,x,f)$ in what follows. So notice that in your question, in (1) what you have is $R(n,x,xe^x)$, while in (2) what you have is $xR(n-1,x,e^x)$. And so the question is what is the relation between the two expressions.
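To make the three-argument view concrete, here is a small numerical sketch for the case $f(x)=e^x$ (the function names `taylor_poly_exp` and `R` are my own, not standard):

```python
from math import exp, factorial

def taylor_poly_exp(n, x):
    """Degree-n Taylor polynomial of e^x about a = 0: sum of x^k/k! for k = 0..n."""
    return sum(x**k / factorial(k) for k in range(n + 1))

def R(n, x):
    """The remainder R(n, x, e^x): exact value minus the degree-n polynomial."""
    return exp(x) - taylor_poly_exp(n, x)

# For a fixed x, the remainder shrinks as n grows.
for n in (2, 5, 10):
    print(n, R(n, 1.0))
```

Note that changing any of the three inputs (the degree $n$, the point $x$, or the function) changes the remainder; the notation $R_n(x)$ hides the third one.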

Before addressing this in more detail, let me take a slight detour to Taylor series.

Say we have the Taylor series for $f$,
$$\sum_{k=0}^{\infty}\frac{f^{(k)}(0)}{k!}x^k.$$
We know this power series converges inside the radius of convergence (which can be found using, say, the Ratio Test); for $f(x)=e^x$, the radius of convergence is infinite, so the series converges for all values of $x$.

However, there is still the question of whether the value of the series at a given $x$ equals the value of the function at that point. This is where the remainders come in. The Taylor series for $f$ converges to $f(x)$ at the point $x_0$ if and only if $R(n,x_0,f)\to 0$ as $n\to\infty$.

In the case of $f(x)=e^x$, one can show that the Taylor series not only converges everywhere, but also that it converges to $e^x$. For example, this can be done using the Cauchy estimate for the remainder: given $r\gt 0$ and $x$ in $(-r,r)$, if $M_n$ is a number such that $|f^{(n+1)}(t)|\leq M_n$ for all $t$ in $(-r,r)$, then
$$|R(n,x,f)|\leq \frac{M_nr^{n+1}}{(n+1)!}.$$
For $f(x) = e^x$, you can take $M_n=e^r$ (or even $3^r$), so you get an exponential divided by a factorial, and that goes to $0$ as $n\to\infty$. This holds for any $x$ (by enlarging $r$ as needed), so that $R(n,x,e^x)\to 0$ as $n\to\infty$ for all $x$. So the Taylor series for $e^x$ converges to $e^x$ at every $x$. That is, for each $x$,
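As a sanity check of the "exponential divided by a factorial" claim, one can tabulate the bound $e^r r^{n+1}/(n+1)!$ numerically (a quick sketch; the name `bound` is mine):

```python
from math import exp, factorial

def bound(n, r):
    # Bound on |R(n, x, e^x)| for x in (-r, r), taking M_n = e^r.
    return exp(r) * r**(n + 1) / factorial(n + 1)

# Even with a large radius r = 10, the factorial eventually wins.
for n in (5, 20, 50):
    print(n, bound(n, 10.0))
```

The bound grows at first (while $r^{n+1}$ dominates) but collapses once the factorial takes over, which is why it goes to $0$ for every fixed $r$.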
$$e^x = \sum_{k=0}^{\infty}\frac{x^k}{k!} = 1 + x + \frac{x^2}{2!} + \cdots + \frac{x^n}{n!}+\cdots$$
in the sense that the value of the limit on the right hand side is exactly equal to $e^x$.

The "Taylor series" for $x$ is also very easy: it's just $x$ itself (not even a real series). It also has infinite radius of convergence, and $R(n,x,x) = 0$ for any $n\gt 1$.

It is a theorem that if the Taylor series for $f$ converges to $f$ in $(-r,r)$, and the Taylor series for $g$ converges to $g$ in $(-R,R)$, then the product of the series will converge to $fg$ in $(-\min(r,R),\min(r,R))$ (that is, on the interval where they both converge). For $e^x$ and $x$, this tells you that indeed you have that
$$xe^x = x\left(\sum_{k=0}^{\infty}\frac{x^k}{k!}\right) = \sum_{k=0}^{\infty}\frac{x^{k+1}}{k!} = \sum_{k=1}^{\infty}\frac{x^k}{(k-1)!}.$$
Because the series converges, and converges to $xe^x$, that means that if you write
$$xe^x = \sum_{k=1}^n \frac{x^k}{(k-1)!} + R(n,x,xe^x)$$
then you have that
$$R(n,x,xe^x) = \sum_{k=n+1}^{\infty} \frac{x^k}{(k-1)!}$$
(in the sense that the value of $R(n,x,xe^x)$ is the limit of the partial sums of that series) and moreover that
$$\sum_{k=n+1}^{\infty}\frac{x^k}{(k-1)!} \to 0\text{ as }n\to\infty.$$
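One can watch this tail vanish numerically; here is a minimal sketch (the name `partial_xexp` is my own):

```python
from math import exp, factorial

def partial_xexp(n, x):
    """Partial sum of the series for x*e^x: sum of x^k/(k-1)! for k = 1..n."""
    return sum(x**k / factorial(k - 1) for k in range(1, n + 1))

# The tail R(n, x, x*e^x) = x*e^x minus the partial sum goes to 0 as n grows.
x = 2.0
for n in (5, 10, 20):
    print(n, x * exp(x) - partial_xexp(n, x))
```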

What you are doing in (2) is essentially working with the Taylor polynomials rather than with the series (which is what I did above). We have that the Taylor series for $e^x$ converges to $e^x$ at every $x$, so that if we write
$$e^x = \sum_{k=0}^{n}\frac{x^k}{k!} + R(n,x,e^x)$$
then
$$R(n,x,e^x) = \sum_{k=n+1}^{\infty}\frac{x^k}{k!}$$
and $R(n,x,e^x)\to 0$ as $n\to \infty$.

Multiplying the expression for $e^x$ by $x$, we get
$$xe^x = x\left(\sum_{k=0}^n\frac{x^k}{k!} + R(n,x,e^x)\right) = \sum_{k=0}^n\frac{x^{k+1}}{k!}+xR(n,x,e^x).$$
Note, however, that this time $xR(n,x,e^x)$ is giving the $(n+1)$st remainder, since the polynomial we have is of degree $n+1$. So, equating this with the expression we had before, this suggests that we should have
$$R(n+1,x,xe^x) = xR(n,x,e^x).$$
And indeed, this is what we find when we express them as series and limits:
\begin{align}
xR(n,x,e^x) &= x\left(\sum_{k=n+1}^{\infty}\frac{x^k}{k!}\right)\\
&=x\left(\lim_{m\to\infty}\sum_{k=n+1}^m\frac{x^k}{k!}\right)\\
&= \lim_{m\to\infty}\left(x\sum_{k=n+1}^m \frac{x^k}{k!}\right)\\
&= \lim_{m\to\infty}\sum_{k=n+1}^m \frac{x^{k+1}}{k!}\\
&=\lim_{m\to\infty}\sum_{k=n+2}^{m+1}\frac{x^k}{(k-1)!}\\
&= \lim_{m\to\infty}\sum_{k=n+2}^{m}\frac{x^k}{(k-1)!}\\
&= R(n+1,x,xe^x).
\end{align}
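The identity can also be checked numerically; here is a minimal sketch (the helper names are my own):

```python
from math import exp, factorial

def R_exp(n, x):
    """R(n, x, e^x): e^x minus its degree-n Taylor polynomial."""
    return exp(x) - sum(x**k / factorial(k) for k in range(n + 1))

def R_xexp(n, x):
    """R(n, x, x*e^x): x*e^x minus the degree-n partial sum of its series."""
    return x * exp(x) - sum(x**k / factorial(k - 1) for k in range(1, n + 1))

# Check R(n+1, x, x*e^x) = x * R(n, x, e^x) at a few sample points.
for n in (3, 7):
    for x in (0.5, 2.0):
        print(n, x, R_xexp(n + 1, x), x * R_exp(n, x))
```

The two columns agree up to floating-point rounding, exactly as the series manipulation predicts.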

What this tells you is that the $(n+1)$st remainder at $x$ for the function $xe^x$ equals $x$ times the $n$th remainder at $x$ for the function $e^x$. So both of your equations make sense and tell you correct things, once you put back the necessary information into the remainder.

Thanks. I'm wondering though, why is $R(n,x,xe^x) = \sum_{k=n+1}^{\infty}\frac{x^k}{(k-1)!}$? Doesn't the series converge if $R(n,x,xe^x)\to 0$? Then we can also say that $xe^x=\sum_{k=n+1}^{\infty}\frac{x^k}{(k-1)!}$
–
daniel.jackson Dec 19 '10 at 20:49

@daniel.jackson: Yes, the series converges if and only if $R(n,x,xe^x)\to 0$ as $n\to\infty$. And indeed, $\sum_{k=n+1}^{\infty}\frac{x^k}{(k-1)!}\to 0$ as $n\to\infty$. You can say that $xe^x$ "equals" its Taylor series in exactly the same way that we can say $e^x$ equals $1 + x + \frac{x^2}{2!} + \frac{x^3}{3!}+\cdots+\frac{x^n}{n!}+\cdots$; both series have infinite radius of convergence.
–
Arturo Magidin Dec 19 '10 at 21:04

@daniel.jackson: It's not $\sum_{k=1}^n\frac{x^k}{(k-1)!}$ that goes to $0$; it's the "rest" of the series, $\sum_{k=n+1}^{\infty}\frac{x^k}{(k-1)!}\to 0$ as $n\to\infty$. This goes to zero because we can show that the series converges (say, by the Ratio Test), so that the "tails" (sums from $n+1$ to $\infty$) have to go to $0$ as $n\to\infty$ (since the partial sum from $1$ to $n$ approaches the limit, the difference between the limit and the partial sum, which is the sum from $n+1$ to $\infty$, approaches $0$).
–
Arturo Magidin Dec 19 '10 at 21:09

@Arturo: this is a very careful and complete answer. But if I may: it might be easier to read for a wide audience (including the OP) if you switched the first and second halves. In particular, in most American calculus classes the fact that the basic functions like $e^x$ are actually equal to their Taylor series expansions at $0$ is used -- often with relatively little comment -- in order to construct "new Taylor series from old". Treating things in terms of the remainder function is nice, but I think it may be a level above what many students are required to do.
–
Pete L. Clark Dec 20 '10 at 2:01

@Pete L. Clark: I'm not sure I am clear on what you are suggesting (though I'm sure it's a great suggestion I will agree with once I figure it out...) Do you mean, first the stuff about $R$ really being determined by 3 arguments, then the stuff about $e^x$ being equal to the Taylor series and why you get that $xe^x$ is equal to its Taylor series as well, and finally why the two computations actually give the same remainder?
–
Arturo Magidin Dec 20 '10 at 2:04