Friday, December 27, 2013

Let us first recall the proposition mentioned (without proof) in yesterday's post: Proposition: Let $f$ be a real function with domain $D_{f}$, continuous on an interval $\Delta \subseteq D_{f} \subseteq \mathbb{R}$, and let $a \in \Delta$ be a fixed point of $\Delta$. Then the function $F(x)=\int_{a}^{x}f(t)dt$ is an antiderivative function of $f$. In other words:
$$
F'(x) = \big( \int_{a}^{x}f(t)dt \big)' = f(x)
$$
for all $x \in \Delta$.
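The proposition can be illustrated numerically. The following Python sketch (the choices $f(t)=\cos t$, $a=0$ and the trapezoidal integrator are my own, for illustration only) compares a central difference quotient of $F(x)=\int_{a}^{x}f(t)dt$ with $f(x)$:

```python
import math

def integral(f, a, b, n=20000):
    """Trapezoidal approximation of the definite integral of f over [a, b]."""
    h = (b - a) / n
    s = 0.5 * (f(a) + f(b))
    for k in range(1, n):
        s += f(a + k * h)
    return s * h

f = math.cos                      # a continuous integrand, chosen for illustration
a, x, h = 0.0, 1.0, 1e-5

F = lambda x: integral(f, a, x)   # F(x) = integral of f from a to x

# central difference estimate of F'(x); it should be close to f(x)
numeric = (F(x + h) - F(x - h)) / (2 * h)
print(numeric, f(x))              # both values are close to cos(1) = 0.5403...
```

Of course this only checks the proposition at a single point for a single $f$; it is not a proof.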
If $x \in \Delta$ is a variable point,
then the value of the definite integral $F(x)= \int_{a}^{x}f(t)dt$ represents the area of the shaded region shown in the figure.

Remark: Notice that the function $F(x)$ is defined on any interval $\Delta \subseteq D_{f}$ for which:

$a \in \Delta$

$f$ is continuous on $\Delta$

and (according to the preceding proposition) $F$ is differentiable on that $\Delta$.

Considering $h$ to be infinitesimal, we can now compute
$$
\begin{array}{c}
\Delta F(x) = F(x+h)-F(x) = \int_{a}^{x+h}f(t)dt - \int_{a}^{x}f(t)dt = \\
\\
= \int_{x}^{x+h}f(t)dt = E(\Omega) \approx h \cdot f(x)
\end{array}
$$
In the above, the change $\Delta F(x)$ in the value of the function $F(x)=\int_{a}^{x}f(t)dt$ when $x$ changes to $x+h$ is denoted by $E(\Omega)$ and equals the area of the shaded strip $\Omega$, displayed in the next figure:

Consequently, $f(x) \approx \frac{E(\Omega)}{h}$ and since $h$ is considered to be infinitesimal, we can write:
$$
f(x)=\lim_{h \rightarrow 0}\frac{F(x+h)-F(x)}{h}=F'(x) = \frac{d \big(\int_{a}^{x}f(t)dt \big)}{dx}
$$

Remarks: (1). The above should not be considered a rigorous proof. It should rather be taken as an intuitive line of thinking, aiming to shed some light on "what is really going on" inside the function $\int_{a}^{x}f(t)dt$. (2). Let $f$ be a real function, continuous on an interval $\Delta \subseteq \mathbb{R}$, and let $a \in \Delta$ be a fixed point of $\Delta$. Let $g$ be another real function with domain $D_{g}$, differentiable (and thus continuous) on an interval $\Delta_{1} \subseteq D_{g}$. We can then consider the composite function $G=F \circ g$:
$$
G(x) = \int_{a}^{g(x)}f(t)dt
$$
The domain $D_{G}$ of $G$ will be:
$$
D_{G} = \{ x \in D_{g} \ : \ g(x) \in \Delta \}
$$
The composite function $G=F \circ g$ is differentiable on $\Delta_{2} = \Delta_{1} \cap D_{G}$. Its derivative can be computed by combining the above differentiation rule with the chain rule for differentiating composite functions. We readily get the following formula:
$$
\Big( \int_{a}^{g(x)}f(t)dt \Big)' = f\big( g(x) \big) \cdot g'(x)
$$
for all $x \in \Delta_{2}$.
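This formula can also be sanity-checked numerically. A minimal Python sketch, assuming the illustrative choices $f(t)=\cos t$, $g(x)=x^{2}$ and $a=0$ (so that $g'(x)=2x$), which compares a central difference quotient of $G$ with $f(g(x)) \cdot g'(x)$:

```python
import math

def integral(f, a, b, n=20000):
    """Trapezoidal approximation of the definite integral of f over [a, b]."""
    h = (b - a) / n
    s = 0.5 * (f(a) + f(b))
    for k in range(1, n):
        s += f(a + k * h)
    return s * h

f = math.cos
g = lambda x: x * x                 # inner function g(x) = x^2, so g'(x) = 2x

def G(x, a=0.0):
    """G(x) = integral of f from a to g(x)."""
    return integral(f, a, g(x))

x, h = 0.7, 1e-5
numeric = (G(x + h) - G(x - h)) / (2 * h)   # central difference estimate of G'(x)
exact = f(g(x)) * 2 * x                     # f(g(x)) * g'(x) from the formula
print(numeric, exact)                       # the two values agree closely
```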

Thursday, December 26, 2013

Suppose we are given a continuous, real function $f(x)$ defined on an interval $\Delta \subseteq \mathbb{R}$ and let $a \in \Delta$ be a fixed point.
A function $F(x)$, with domain $D_{F}= \Delta$, will be called an antiderivative function of $f$ if
$$
F'(x)=f(x)
$$
(notice that $F$ is, by definition, differentiable (and thus continuous) on $\Delta$.)
The above definition implies that the antiderivative function is not uniquely determined; rather, there is a whole family of functions satisfying the above relation. Actually, any other function $G(x)$, with $D_{G}= \Delta$, having the property
\begin{equation} \notag
G'(x)=F'(x)=f(x)
\end{equation}
will also be an antiderivative function. In such a case $F(x)$ and $G(x)$ will differ by a constant:
\begin{equation} \notag
G(x)=F(x)+c
\end{equation}
for some $c \in \mathbb{R}$ (this comes from a well-known theorem of elementary calculus). We can thus now lay down the following Definition: We will call antiderivative or indefinite integral of $f$, denoted by $\int f(x)dx$, the set of all functions satisfying the above property; thus:
\begin{equation} \notag
\begin{array}{r}
\int f(x)dx = \{F \ | \ F'(x)=f(x), \ x \in \Delta \} = \\
\\
= \{G(x)+c \ | \ c \in \mathbb{R} \} \ \ \ \ \ \ \ \
\end{array}
\end{equation}
where in the last equality $G$ is an antiderivative function (actually any antiderivative function) of $f$.

Now it can be proved (the proof can be found in standard calculus texts and I will, hopefully, post it here later) that: Proposition: one of the functions belonging to the above set (thus, one of the antiderivative functions of $f$, or equivalently one of the indefinite integrals of $f$) is the function
\begin{equation} \notag
\int_{a}^{x} f(t)dt
\end{equation}
In other words: $(\int_{a}^{x} f(t)dt )'=f(x)$ for all $x \in \Delta$. Remarks: (1). Notice that the above proposition readily implies that $\int_{a}^{x} f(t)dt$ is differentiable (and thus continuous) for any $x \in \Delta$. Of course the $\ ' \ $ symbol indicates differentiation with respect to the variable $x$. (2). Thus: the definite integral $\int_{a}^{x} f(t)dt$, with variable upper limit of integration, is an antiderivative function of $f$. Consequently, we can write
\begin{equation} \notag
\int f(x)dx = \int_{a}^{x} f(t)dt + c
\end{equation}
for all $c \in \mathbb{R}$. (3). What the above proposition actually tells us is that any function $f$ which is continuous on an interval $\Delta \subseteq \mathbb{R}$ has an antiderivative function, given by $\int_{a}^{x} f(t)dt$ for $a,x \in \Delta$. (4). It is worth noticing the meaning of the number $a \in \Delta$: varying the value of $a$ produces different antiderivative functions (because the variation of $a$ simply alters the value of the constant of integration $c$). However, we cannot hope that varying the value of $a$ "covers" all possible antiderivatives of a given continuous function $f$. In other words: although the above proposition tells us that $\int_{a}^{x} f(t)dt$ is an antiderivative function of $f$, not all antiderivative functions of $f$ can necessarily be expressed as $\int_{a}^{x} f(t)dt$ for some $a \in \Delta$. This can be clearly seen in the following example:
If $f(x)=2x$ and $\Delta = \mathbb{R}$, then for any real $a$ we have
\begin{equation} \notag
\int_{a}^{x} 2t \, dt = [t^{2}]_{a}^{x}=x^{2}-a^{2}
\end{equation}
But the family of functions $\{F(x)=x^{2}-a^{2}, \ x \in \mathbb{R} \ | \ a \in \mathbb{R} \}$ does not include, for example, the function $G(x)=x^{2}+1$, which is an obvious antiderivative function of $f(x)=2x$.
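The point of the example can be made concrete in a few lines of Python: every member of the family differs from $x^{2}$ by the nonpositive constant $-a^{2}$, so the constant $+1$ of $G(x)=x^{2}+1$ is never attained (the sample values of $a$ below are my own choice):

```python
# Each antiderivative of f(x) = 2x obtained as an integral from a to x
# is F_a(x) = x**2 - a**2, i.e. x**2 plus the constant c = -a**2.
constants = [-(a * a) for a in [-3.0, -1.0, 0.0, 0.5, 2.0]]
print(constants)                         # every entry is <= 0

# G(x) = x**2 + 1 corresponds to c = +1, which -a**2 can never equal:
assert all(c <= 0 for c in constants)
```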

Tuesday, December 24, 2013

This week's posting will have to do with integral calculus, and more specifically with integral equations and antiderivatives of continuous functions (recall that the antiderivative is another name for the indefinite integral). Let $f$ be a continuous real function satisfying $f(x)=e^{\int_{0}^{x}f(t)dt}$ for all $x<1$. Find the formula of the function $f$.
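For readers who want to test a conjectured solution numerically before (or after) working the equation out by hand, here is a minimal Python sketch; the trapezoidal integrator and the particular candidate $f(x)=\frac{1}{1-x}$ are my own choices, and the snippet only compares the two sides of the equation at sample points with $x<1$, so it is a check rather than a derivation:

```python
import math

def integral(f, a, b, n=20000):
    """Trapezoidal approximation of the definite integral of f over [a, b]."""
    h = (b - a) / n
    s = 0.5 * (f(a) + f(b))
    for k in range(1, n):
        s += f(a + k * h)
    return s * h

def candidate(x):
    # candidate solution to test against f(x) = exp(integral from 0 to x of f)
    return 1.0 / (1.0 - x)

for x in [-2.0, -0.5, 0.0, 0.3, 0.9]:
    lhs = candidate(x)
    rhs = math.exp(integral(candidate, 0.0, x))
    print(x, lhs, rhs)      # the two columns agree at each sample point
```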

Monday, December 16, 2013

This week's Question comes from complex numbers: Given a complex number $z$, determine its locus, given that $w=\frac{i}{z^{2}+1}$ lies on the real axis (i.e. $w$ is a real number).
You can check out the pdf version here
Waiting till next week for your ideas and thoughts.

Sunday, December 15, 2013

Last week's question dealt with two different standard forms of the hyperbola equation.
The answer to the question is that these two different forms are equivalent descriptions, and this can be shown by a counterclockwise rotation $(x,y)\rightarrow(x',y')$ of the planar coordinate system through an angle $\varphi=\pi/4 \ (rad)$.
Here are some more details (in the example that follows $a > 0$):
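Since the details were given in an attachment, here is a brief sketch of the key computation (reconstructed with the standard axis-rotation formulas). A counterclockwise rotation of the axes through $\varphi = \pi/4$ expresses the old coordinates in terms of the new ones as
$$
x = x'\cos\varphi - y'\sin\varphi = \frac{x'-y'}{\sqrt{2}} \ , \qquad y = x'\sin\varphi + y'\cos\varphi = \frac{x'+y'}{\sqrt{2}}
$$
Substituting into $xy=a$ (which is just $y=\frac{a}{x}$ rewritten) gives
$$
xy = \frac{(x'-y')(x'+y')}{2} = \frac{x'^{2}-y'^{2}}{2} = a
$$
that is, $x'^{2}-y'^{2}=2a$.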

Sunday, December 8, 2013

This week's question comes from $2d$ analytic geometry, and deals more specifically with the coordinate equations of the hyperbola:

"In a given coordinate system $(x,y)$ the equation $y=\frac{a}{x}$, $a \in \mathbb{R}$, represents a hyperbola. Show that under a suitable change of coordinates, i.e. under a suitable transformation $(x,y)\rightarrow(x',y')$, the same hyperbola becomes $x'^{2} - y'^{2}=2a$."
Check out the pdf version here.

OK, so here we have the answer to last week's question on the continuity of the derivative function at a given point... The answer is, in general, negative! A function may be differentiable at a point of its domain with the derivative being discontinuous at that point! Here is a counterexample. You can provide a proof on your own by directly applying the definition of the derivative at $x=0$, but if you find the computation cumbersome, here are some more details ;)
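Since the counterexample itself was given via a link, here is what I believe is the standard one: $f(x)=x^{2}\sin(1/x)$ for $x \neq 0$, with $f(0)=0$. A small Python sketch illustrating both halves of the claim (the sample points are my own choices):

```python
import math

def f(x):
    """Differentiable everywhere, but the derivative is discontinuous at 0."""
    return x * x * math.sin(1.0 / x) if x != 0 else 0.0

# Difference quotients at 0 tend to f'(0) = 0, since |f(h)/h| <= |h| ...
for h in [1e-2, 1e-4, 1e-6]:
    print(h, (f(h) - f(0)) / h)

# ... yet f'(x) = 2x*sin(1/x) - cos(1/x) keeps oscillating near 0:
def fprime(x):
    return 2 * x * math.sin(1.0 / x) - math.cos(1.0 / x)

# values of f' arbitrarily close to 0 come close to both -1 and +1
print(fprime(1 / (200 * math.pi)), fprime(1 / (201 * math.pi)))
```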

Sunday, December 1, 2013

Those of you using smartphones or tablets running Android can download some free e-notes with material relevant to the A-levels (which is also useful for the IB's math courses).
Take a look at A-level math, #1 and also at A-level math, #2

They cover only a portion of the necessary material but are still particularly useful, although far from complete! They also have the advantage that they can be carried on your mobile device (tablet or smartphone) and be readily available anytime you need them!

Beware: the purpose of distributing such material is strictly educational! Please do not use it for cheating in tests or exams!