
I'm not sure whether there is anything to prove here, or whether the "Then" in what you post is just another way of expressing Cauchy's integral theorem. I didn't look at your problem in detail, but just from the structure of your post this seems to be related to one of Cauchy's theorems.
–
lomppi Dec 8 '13 at 6:42

@sebastian Thanks! It may be a problem about ODEs or dynamical systems.
–
Eden Harder Dec 8 '13 at 7:51

I remember doing something vaguely similar a couple of years back. At least in that problem I integrated the equations and used $\delta$ and $\xi$ as the limits of integration. What it came down to in that case was basically applying the fundamental theorem of calculus, in its Cauchy version. This is just a hunch though; I don't want to mislead you!
–
lomppi Dec 8 '13 at 9:21

Since the first derivatives of $x$ and $y$ are given, we can write Taylor series expansions of $x(t)$ and $y(t)$. This gives a result of the form $x(t)-y(t)= x_0^2 h(t,x_0)$ for some function $h$. If one can show that $h$ is bounded on a rectangle $[0,T]\times[-a,a]$ by some constant $M$, then the theorem follows: $|x(t)-y(t)|\leq Mx_0^2<\delta\epsilon$ if we take $|x_0|<\delta$ with $\delta<\epsilon/M$.
–
user10001 Dec 8 '13 at 12:29

Would math.se be a better home for this question?
–
E.P. Dec 8 '13 at 19:09

2 Answers

The difference between $x$ and $y$ satisfies the equation
$$\frac{\text d}{\text dt}(x(t)-y(t))=A(x-y)+R_2(x).$$

This can be solved, formally, by variation of parameters, to give
$$x(t)-y(t)=\int_0^t e^{A(t-\tau)}R_2(x(\tau))\text d\tau.$$
Their separation is then bounded by
$$\begin{align}|x(t)-y(t)|
\leq\int_0^t | e^{A(t-\tau)}R_2(x(\tau))|\text d\tau
\leq\int_0^T e^{|A|(t-\tau)}|R_2(x(\tau))|\text d\tau.\end{align}$$
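This representation can be sanity-checked numerically. The sketch below uses a hypothetical one-dimensional example (not from the question): $A=-1$ and $R_2(x)=x^2$, so the full equation is $x'=-x+x^2$ and its linearization is $y'=-y$. It integrates both with RK4 and compares $x(T)-y(T)$ against the Duhamel integral evaluated by the trapezoid rule.

```python
import math

# Hypothetical 1-D example (chosen only for illustration): A = -1 and
# R_2(x) = x^2, so x' = -x + x^2 while the linearization is y' = -y.
A = -1.0

def R2(x):
    return x * x

def rk4(f, x0, t_end, n):
    """Classical fixed-step RK4; returns the sampled trajectory."""
    h = t_end / n
    t, x = 0.0, x0
    traj = [x]
    for _ in range(n):
        k1 = f(t, x)
        k2 = f(t + h / 2, x + h / 2 * k1)
        k3 = f(t + h / 2, x + h / 2 * k2)
        k4 = f(t + h, x + h * k3)
        x += h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
        t += h
        traj.append(x)
    return traj

x0, T, n = 0.1, 2.0, 2000
xs = rk4(lambda t, x: A * x + R2(x), x0, T, n)
ys = rk4(lambda t, x: A * x, x0, T, n)          # y(t) = x0 * exp(A t)

# Right-hand side: integral_0^T e^{A(T-tau)} R_2(x(tau)) dtau, by trapezoid rule.
h = T / n
vals = [math.exp(A * (T - i * h)) * R2(x) for i, x in enumerate(xs)]
integral = h * (vals[0] / 2 + sum(vals[1:-1]) + vals[-1] / 2)

diff = xs[-1] - ys[-1]
print(diff, integral)   # the two sides agree closely
```

That the two numbers match is exactly the variation-of-parameters identity: the integral term collects the nonlinear remainder $R_2$ propagated forward by the linear flow $e^{A(t-\tau)}$.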

Now, $R_2(x)=O(x^2)$, which means that there exists a constant $M$ and a distance $X$ such that $|R_2(x)|\leq Mx^2$ whenever $|x|\leq X$. To be able to use this, we need to ensure that $|x(t)|$ will be no greater than some $\delta$-controllable constant for all times $t\in[0,T]$, and our tool to achieve this is making the initial condition $x(0)=x_0$ small enough. This is essentially the statement that the solution is continuous with respect to the initial condition (i.e. around the solution $x(t)\equiv0$ for $x(0)=0$).

This continuity is a standard fact in differential equations, but I'll sketch a proof here. Any solution $x(t)$ must be differentiable, hence continuous, and hence bounded on the compact interval $[0,T]$. This means the condition $|R_2(x(t))|\leq M_{x_0}x(t)^2$ will hold for all $t\in[0,T]$, for some suitable constant $M_{x_0}$. The right-hand side of the differential equation for $x(t)$ is therefore Lipschitz continuous:
$$|Ax(t)+R_2(x(t))|\leq L|x(t)|\tag1$$
for some $L>0$. (Alternatively, you might impose this to begin with.) Setting $u(t)=|x(t)|^2$, you have
$$
\frac{\text du}{\text dt}
\leq\left|\frac{\text du}{\text dt}\right|
= \left|2x(t)\frac{\text dx}{\text dt}\right|
\leq 2L |x(t)|^2=2L u(t).
$$
By Gronwall's inequality this implies that $|u(t)|\leq|u(0)|e^{2L t}$, which means that $|x(t)|\leq |x_0| e^{LT}$ for all $t\in[0,T]$.
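The Gronwall envelope can be checked numerically on the same hypothetical example $x'=-x+x^2$: for $|x|\leq X$ one has $|Ax+R_2(x)|\leq(1+X)|x|$, so $L=1+X$ works; the sketch takes $X=0.5$.

```python
import math

# Check the Gronwall envelope |x(t)| <= |x0| e^{L t} on the hypothetical
# example x' = -x + x^2. For |x| <= X, |Ax + R_2(x)| <= (1 + X)|x|,
# so L = 1 + X is a valid Lipschitz constant; take X = 0.5, L = 1.5.
L = 1.5
x0, T, n = 0.1, 2.0, 2000
h = T / n

x, ok = x0, True
for i in range(n + 1):
    if abs(x) > abs(x0) * math.exp(L * i * h) + 1e-12:
        ok = False
    x += h * (-x + x * x)   # forward Euler step

print(ok)   # True: the trajectory never leaves the envelope
```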

So, in a nutshell: if you can assume or prove that (1) holds with $L$ independent of $x_0$, then your solution $|x(t)|$ will be bounded by $|x_0| e^{LT}$ for all $t\in[0,T]$. Additionally, this means that if you choose $|x_0|<\delta\leq Xe^{-LT}$, you can ensure $|x(t)|<X$ for all $t\in[0,T]$.

Thus, if you're given $\xi>0$ and $T>0$, then choosing $\delta=\min\{Xe^{-LT},\xi e^{-(2M+|A|)T}/MT\}$ you can ensure that
$$|x(t)-y(t)|<\xi\delta$$
for all $t\in[0,T]$.
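The quadratic smallness behind this bound can be seen empirically. On the hypothetical example $x'=-x+x^2$ versus $y'=-y$, the sketch below checks that $\sup_t|x(t)-y(t)|$ scales like $x_0^2$, which is what lets the separation be pushed below $\xi\delta$ once $|x_0|<\delta$ is small.

```python
# Empirical check, on the hypothetical example x' = -x + x^2 vs y' = -y,
# that sup_t |x(t) - y(t)| scales like x0^2 as x0 shrinks.
def worst_separation(x0, T=2.0, n=4000):
    h = T / n
    x = y = x0
    worst = 0.0
    for _ in range(n):
        x += h * (-x + x * x)   # forward Euler on the full equation
        y += h * (-y)           # forward Euler on the linearization
        worst = max(worst, abs(x - y))
    return worst

ratios = [worst_separation(x0) / x0 ** 2 for x0 in (0.1, 0.05, 0.025)]
print(ratios)   # roughly constant, so the separation is O(x0^2)
```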

Finally, note that while this is phrased in one-dimensional language, it holds with trivial extensions (i.e. simply with appropriately defined norms) for the multidimensional case, and therefore for higher-than-first-order ODEs.

Hmm... but I can't see how the derivative of the second equation is the same as that of the first?
–
user10001 Dec 8 '13 at 21:58

That's just plain differentiation. You need to be careful to differentiate the exponential inside the integral, which gives $A(x-y)$, and to differentiate with respect to the upper limit of integration, which gives $R_2(x(t))$.
–
E.P. Dec 8 '13 at 22:18

Can you include this calculation in your answer? I am not sure the method of variation of parameters applies to nonlinear equations.
–
user10001 Dec 8 '13 at 22:31

Then we have (by taking derivatives with respect to $x_0$ on both sides of the two differential equations, using equation (3), and the fact that $R_2(x)=O(x^2)$):

$\displaystyle \frac {dh_1 (t)}{dt}=Ah_1(t)$, with $h_1 (0)=1$

$\displaystyle \frac {dh_2 (t)}{dt}=Ah_2(t)$, with $h_2 (0)=1$

From these two equations we conclude that $h_1 (t)=h_2(t)$ for all $t\in [0,T]$, which is equation (2).

From equations (1) and (2) above we conclude that $X(x_0,t)-Y(x_0,t)$ is of the form

$$X(x_0,t)-Y(x_0,t) = x_0^2 h(x_0,t)\tag 4$$ for every $t\in[0,T]$

Assuming that for any fixed $t\in[0,T]$ the Taylor series of $X(x_0,t)$ and $Y(x_0,t)$ exist (with respect to $x_0$) in some neighborhood $V_t$ of $x_0=0$, we can say that $h(x_0,t)$ is smooth and bounded on $V_t$. However, in order to prove the theorem one needs to show that there exists an $a>0$ (depending on $T$) such that $|h(x_0,t)|$ is bounded on the rectangle $[-a,a]\times[0,T]$ by some constant $M$. I am not sure how to prove this part and would appreciate a hint.

Once the boundedness of $h$ is proved, one can take $\delta<\xi/M$ and the statement of the theorem follows.
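While not a proof, the boundedness of $h$ can at least be probed numerically on a hypothetical example, $x'=-x+x^2$ versus $y'=-y$: sample $h(x_0,t)=(X(x_0,t)-Y(x_0,t))/x_0^2$ over a grid of the rectangle and record the largest value seen.

```python
# Numerical evidence (not a proof) on the hypothetical example
# x' = -x + x^2, y' = -y that h(x0, t) = (X(x0,t) - Y(x0,t)) / x0^2
# stays bounded on a rectangle [-a, a] x [0, T]; here a = 0.1, T = 2.
def h_samples(x0, T=2.0, n=2000):
    h = T / n
    x = y = x0
    out = []
    for _ in range(n):
        x += h * (-x + x * x)   # forward Euler on the full equation
        y += h * (-y)           # forward Euler on the linearization
        out.append((x - y) / x0 ** 2)
    return out

M = max(abs(v) for x0 in (-0.1, -0.05, 0.05, 0.1) for v in h_samples(x0))
print(M)   # a finite bound for |h| on the sampled grid
```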