\title{The Sixth Lesson on Analysis \\
Null Functions in the Theory of Limits of Functions}
\date{Last updated 1jul13, previously 19apr11. Added material from 19apr16.}
\begin{document}
\maketitle
\section{Introduction}
We do not have time in this edition of the course to delve very deeply
into Chapter 14 ``Continuous Functions'', and hardly at all into Chapter 15
``Differentiable Functions''. Still, it is worthwhile to spend a couple of
lessons on the generalization of null sequences to functions, and to show how
this concept simplifies the proofs of some significant theorems you will
meet again in a proper analysis course later.
\section{Null Functions}
Recall that a sequence was a function $\mathbb{N} \rightarrow \mathbb{R}$.
Now we are concerned with functions $\mathbb{R} \rightarrow \mathbb{R}$.
\textbf{ Definition of a Null Function} \\
\[ \lim_{h\rightarrow 0} \alpha(h) = 0 \mbox{ means the same as }
(\forall \epsilon >0)(\exists \delta > 0)(\forall \, 0 < |h| < \delta) \; |\alpha(h)| < \epsilon . \]
Instead of saying there is a $\delta >0$ such that something $P$ is true for all
$0 < |x-a| < \delta$, we just say the
proposition $P$ is true in a
\textit{punctured neighborhood of} $x=a$.
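For a concrete instance, $\alpha(h) = h^2$ is a null function: given
$\epsilon > 0$, take $\delta = \sqrt{\epsilon}$; then for all
$0 < |h| < \delta$ we have $|\alpha(h)| = |h|^2 < \delta^2 = \epsilon$.
A more interesting instance is $\alpha(h) = h\sin(1/h)$, which is null even
though $\sin(1/h)$ itself has no limit at $0$; here $\delta = \epsilon$
works, since $|h\sin(1/h)| \le |h|$.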
\section{Derivatives and Differentiable Functions}
\textbf{ Definition of Derivative } \\
We define a function $f$, which is defined in an open interval containing
the point $x$, to have a \textit{ derivative} at $x$, if there exists a
real number $m$ (think of "slope" in $y=mx+b$) and null function $\theta$,
so that \[f(x+h) = f(x) + mh + \theta(h)h .\]
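To see the definition in action, take $f(x) = x^2$:
\[ f(x+h) = (x+h)^2 = x^2 + 2xh + h \cdot h , \]
which has the required form with $m = 2x$ and the null function
$\theta(h) = h$. So $f'(x) = 2x$, with no limit taken explicitly.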

Show that this definition is no different from the classical definition
you learned in the calculus. Hint: Divide through by $h$, rearrange, and
justify that it says $\lim_{h\rightarrow 0}\frac{f(x+h)-f(x)}{h} = m .$

Show that if $f$ has a derivative at $x$ then $f$ is continuous at $x$.
Hint: The difference $f(x+h)- f(x)$ in the definition is the null function
$ \alpha(h) = (m + \theta(h))h$.
From the calculus you remember that the converse of this is not necessarily
true. Continuous functions need not have derivatives. The absolute value
function $y=|x|$ is the usual counterexample for this.

Show why there is no $m$ that satisfies the definition of the
derivative of $f(x)=|x|$ at $x=0.$ Hint: Show that $m$ would have
to have two different values depending on whether $h < 0$ or
$0 < h$.
We also write $f'(x)$ for the derivative of $f$ at $x$ and dispense with an
extra letter, like $m$, unless we want to emphasize the definition. This
notation also suggests that if $f$ is differentiable at every $x$ in an
open interval, then $f'$ is a function defined on it. Since a function can
have only one value for each argument, we need to realize that \textbf{ if }
a derivative exists, it is unique. We show how versatile null functions are
by proving it this way. If we assume there are two derivatives we have
\begin{eqnarray*}
f(x+h) & = & f(x) + mh + \theta(h)h \mbox{ for some null } \theta \\
f(x+h) & = & f(x) + nh + \psi(h)h \mbox{ for some null } \psi \\
0 &=& (m-n)h + (\theta(h) - \psi(h)) h \\
\mbox{ divide by } h: \, m-n &=& \psi(h) - \theta(h) \\
\end{eqnarray*}
Note that here we really use the provision that $h\ne 0$.
Now stare at the last equation. If $m \ne n$, then the LHS is the nonzero
constant $m-n$. But for small enough $h$ the RHS becomes smaller in absolute
value than $\epsilon = |m-n|/2$, and we have a contradiction. Done.
\section{Properties of Null Functions.}
As with the properties of null sequences, somewhere one has to use ``epsilonics'',
the technique of arguing from the definition. For instance, above we
argued that if $\theta$ is a null function of $h$, so is the function
$ \alpha(h) := (m + \theta(h))h. $ This is a special case of the functional
analogue of the theorem that the product of a bounded sequence by a null
sequence is again a null sequence.
\subsection{The product rule for null functions}
Let $k(h)$ be a bounded function of $h$. That is, there is a $K>0$ so that
$ |k(h)| < K $ for all $h$ in a punctured neighborhood of zero. Then
for any null function $\theta$, their product is null.
\textbf{Proof: } For $\epsilon /K$ there is a $\delta$ so that for all
$0 < |h| < \delta$ we have $|\theta(h)| < \epsilon /K$. Hence
$|k(h)\theta(h)| < K \cdot \epsilon /K = \epsilon$ there. Done.
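For example, $k(h) = \sin(1/h)$ is bounded by $K = 1$ in a punctured
neighborhood of zero, and $\theta(h) = h$ is null, so the product rule
shows that $h\sin(1/h)$ is a null function, with no epsilonics needed
beyond those already done once in the proof.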

Question 4.

Prove that the sum of two null functions is again a null function.
\subsection{The composition rule for null functions.}
To show that $\gamma = \beta \circ \alpha$
is still null we need to prove that
\[(\forall \epsilon >0)(\exists \delta > 0)(\forall \, 0 < |h| < \delta) \; |\beta(\alpha(h))| < \epsilon . \]
What we are given is that $\beta$ and $\alpha$ are null:
\[(\forall \epsilon >0)(\exists \delta > 0)(\forall \, 0 < |k| < \delta) \; |\beta(k)| < \epsilon , \]
\[(\forall \epsilon >0)(\exists \delta > 0)(\forall \, 0 < |h| < \delta) \; |\alpha(h)| < \epsilon . \]
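For a concrete check of the bookkeeping, take $\alpha(h) = 2h$ and
$\beta(k) = k^2$. Then
\[ (\beta \circ \alpha)(h) = (2h)^2 = 4h^2 , \]
and given $\epsilon > 0$ the choice $\delta = \sqrt{\epsilon}/2$ satisfies
the required statement, since $0 < |h| < \delta$ gives $4h^2 < \epsilon$.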

Question 5.

Why didn't we just use $h$ again, instead of introducing yet another
variable $k$?
Now suppose $f$ is continuous at $x$ and $g$ is continuous at $y = f(x)$,
so that $\phi(h) := f(x+h)-f(x)$ and $\gamma(k) := g(y+k)-g(y)$ are null
functions. If we set $k=\phi(h)$, then the rule that the composition of
null functions is again null means we are soon done. We expand:
\begin{eqnarray*}
g\circ f(x+h) & = &g(f(x+h))=g(f(x) + \phi(h))\\
& = &g(y+k) \\
& = & g(y)+ \gamma(k) \\
& = & g(f(x)) + \gamma(\phi(h)) \\
& = & g\circ f(x) + \gamma \circ \phi(h) \\
\end{eqnarray*}
\section{Chain Rule}
In calculus you were taught how to use the chain rule without any indication
of why it should be true. Or better said, with nothing more enlightening
than what Leibniz's ingenious notation for the derivative of $y=f(x)$ as
$\frac{dy}{dx}$ would suggest. You were told to remember that when
$z = g(y)$ and $y = f(x)$, then the derivative of $z=F(x)$, where
$F(x)=g(f(x))$, was simply
$\frac{dz}{dx} = \frac{dz}{dy} \frac{dy}{dx} $. This would be obvious if
the $dz, dy, dx$ were numbers. But they aren't. They're infinitesimals.
And if your instructor fumbled with a more rigorous proof, (s)he probably
fudged it.
With the definition of the derivative above, due to Fr\'echet, in terms of
null functions, you can \textbf{calculate} the derivative of a composition
without even knowing what it is to begin with.
Now study each step below, where $\phi$ and $\gamma$ are the null functions
known to exist for $f$ and $g$ respectively (hypothesis).
\begin{eqnarray*}
\mbox{ Let } F &=& g\circ f \mbox{ be the composition of two differentiable functions } \\
F(x + h) &=& g(f(x+h)) = g(f(x)+ f'(x)h + \phi(h)h) \\
\mbox{ Write } k(h) &=& (f'(x) + \phi(h))h \mbox{ which is null } \\
\mbox{So } F(x+h) &=& g(f(x)) + g'(f(x))k + \gamma (k)k \\
&=& g(f(x))+ g'(f(x))(f'(x) + \phi(h))h + \gamma(k)(f'(x) + \phi(h))h \\
&=& g(f(x))+ g'(f(x))f'(x)h + g'(f(x)) \phi(h)h + \gamma(k)(f'(x) + \phi(h))h \\
&=& g(f(x))+ g'(f(x))f'(x)h + \eta h \\
&=& F(x)+ F'(x)h + \eta h \mbox{ which identifies } F'(x) = g'(f(x))f'(x) \\
\end{eqnarray*}
Once we prove the lemma that $\eta$ is a null function of $h$ we're done.
Collecting terms, we have
\begin{eqnarray*}
\eta &=& g'(f(x))\phi(h) + \gamma(k(h))f'(x) + \gamma(k(h))\phi(h) \\
&=& \mbox{const}\cdot \mbox{null} + \mbox{null}\circ \mbox{null} \cdot \mbox{const} + \mbox{null}\circ \mbox{null} \cdot \mbox{null} \\
&=& \mbox{null} + \mbox{null}\cdot \mbox{const} + \mbox{null} \cdot \mbox{null} \\
&=& \mbox{null} + \mbox{null} + \mbox{null}\\
&=& \mbox{null}
\end{eqnarray*}
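To test the result on a familiar case, take $f(x) = x^2$ and $g(y) = y^3$,
so that $g\circ f(x) = x^6$. The formula gives
\[ (g\circ f)'(x) = g'(f(x))f'(x) = 3(x^2)^2 \cdot 2x = 6x^5 , \]
which agrees with differentiating $x^6$ directly.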
Now you know why we avoid proving the chain rule in the calculus.
\subsection{Fudge}
Actually, the above is also fudged a bit. In the original proof you might
have seen in the calculus, in the step
$\frac{\Delta z}{\Delta x}= \frac{\Delta z}{\Delta y} \frac{\Delta y}{\Delta x}$
when the deltas were still real numbers and you couldn't divide by zero, your
instructor had to take two cases. First, for $f'(x) \ne 0$ (Why?), leaving
the case $f'(x) = 0$ as homework, no doubt.
Well, in Fr\'echet's proof, nothing ever got divided. But there is a subtlety
we sort of skipped over. Can you find it? Here's a chance to win a very fat
bonus in this course!
\subsection{Payoff}
The beauty of Fr\'echet's definition of derivative is in its generalization
to the multivariate calculus (MA241). Because there aren't any denominators,
every product in the formula becomes a scalar, vector, dot, or matrix product
depending on the kinds of functions we're dealing with. All that crazy
notation you learned about gradients, curls, and divergences is now unified
into a single concept from linear algebra.
Recall that all of those crazy derivatives involved
partial derivatives. So, packaging all the partials of $f(x)$ into the matrix
$f'(x)$ of the appropriate rectangular shape fits into Fr\'echet's formula.
This yields the multivariate interpretation of $dy = f'(x)dx$: a ``small''
displacement of the vector $y$ is the matrix of partials
$\frac{\partial y}{\partial x} $ applied to the ``small'' displacement vector
of $x$. This makes mechanics a whole lot easier to work with. But it may
take another century for this improvement to be accepted by the engineering
schools of America.
All you have to put up with is the various kinds of multivariate null functions.
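In a form we only sketch here (the details, and the norm bookkeeping, belong
to MA241): for $f: \mathbb{R}^n \rightarrow \mathbb{R}^m$, Fr\'echet's
formula keeps its shape,
\[ f(x+h) = f(x) + f'(x)h + \theta(h)\,|h| , \]
where now $x$ and $h$ are vectors, $f'(x)h$ is the $m \times n$ matrix of
partials applied to $h$, and $\theta$ is a null function of the vector $h$,
meaning $\theta(h) \rightarrow 0$ as $|h| \rightarrow 0$.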
\end{document}