Proving a theorem: Changing the Order of Differentiation (large question)

This question is right out of Taylor & Mann Advanced Calculus, Third Edition, and is a REALLY tricky one. Due to how context-heavy this is, I can't write it word-for-word.

We're to prove the following theorem:
Let $f$ and its first partial derivatives $f_x$ and $f_y$ be defined in a neighborhood of the point $(a,b)$, and suppose that $f_x$ and $f_y$ are differentiable at that point. Then $f_{xy}(a,b) = f_{yx}(a,b)$.

This theorem uses part of another theorem's proof to start things off, before getting to the parts that I can't figure out.

Let $h$ be a number different from zero such that the point $(a+h,\,b+h)$ is inside a square having its center at $(a,b)$. We then consider the following expression:
$$D = f(a+h,\,b+h) - f(a+h,\,b) - f(a,\,b+h) + f(a,b).$$
If we introduce the function
$$\phi(x) = f(x,\,b+h) - f(x,\,b),$$
we can express $D$ in the form
$$D = \phi(a+h) - \phi(a). \qquad (*)$$
Now $\phi$ has the derivative
$$\phi'(x) = f_x(x,\,b+h) - f_x(x,\,b).$$
Hence $\phi$ is continuous, and we may apply the mean-value theorem for derivatives to (*), obtaining the following:
$$D = h\,\phi'(a+\theta h) = h\bigl[f_x(a+\theta h,\,b+h) - f_x(a+\theta h,\,b)\bigr], \quad \text{where } 0 < \theta < 1.$$

That ends the part taken from the other proof; now for the parts I'm in the dark about (these are from the actual question).

From the fact that $f_x$ is differentiable at $(a,b)$, one can write
$$f_x(x,y) = f_x(a,b) + f_{xx}(a,b)(x-a) + f_{xy}(a,b)(y-b) + \epsilon\sqrt{(x-a)^2+(y-b)^2},$$
where $\epsilon \to 0$ as $(x,y) \to (a,b)$.
Explain why this is so.

Next, go on to explain how to obtain the following expression:
$$D = h^2\bigl[f_{xy}(a,b) + \epsilon_1\bigr], \quad \text{where } \epsilon_1 \to 0 \text{ as } h \to 0.$$

Explain the derivation of the similar expression
$$D = h^2\bigl[f_{yx}(a,b) + \epsilon_2\bigr], \quad \text{where } \epsilon_2 \to 0 \text{ as } h \to 0,$$
using the fact that $f_y$ is differentiable at $(a,b)$.

With all this, complete the proof of the theorem.
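For intuition (not part of the proof), the quantity the whole argument revolves around can be checked numerically: once the theorem holds, $D/h^2 \to f_{xy}(a,b)$ as $h \to 0$. A minimal Python sketch with a sample function of my own choosing (the names `f`, `f_xy`, and `D` are mine, not the book's):

```python
import math

# Hypothetical sample function, chosen only for illustration; its mixed
# partials are continuous, so the theorem applies to it.
def f(x, y):
    return x**3 * y**2 + math.sin(x * y)

def f_xy(x, y):
    # Hand-computed: f_x = 3x^2 y^2 + y cos(xy); then differentiate in y.
    return 6 * x**2 * y + math.cos(x * y) - x * y * math.sin(x * y)

def D(a, b, h):
    # The second difference used throughout the proof.
    return f(a + h, b + h) - f(a + h, b) - f(a, b + h) + f(a, b)

a, b = 0.7, -0.3
for h in (1e-2, 1e-3, 1e-4):
    print(h, D(a, b, h) / h**2)  # should settle toward f_xy(a, b)
```

This only checks the statement of the theorem on one example, of course, not the proof.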

----------

This is a lot of info to work with, I know. But I can't summarize it any better than this because the question has such a high context requirement.


Quoting the question: "From the fact that $f_x$ is differentiable at $(a,b)$, one can write
$$f_x(x,y) = f_x(a,b) + f_{xx}(a,b)(x-a) + f_{xy}(a,b)(y-b) + \epsilon\sqrt{(x-a)^2+(y-b)^2},$$
where $\epsilon \to 0$ as $(x,y) \to (a,b)$. Explain why this is so."

It is exactly Taylor's theorem for a once-differentiable function: informally, for small increments the function is approximately linear, and the error goes to zero faster than the increment does as the increment goes to zero.
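That informal description can be checked numerically: if a function is differentiable at a point, the error of its linear approximation divided by the length of the increment must go to zero. A sketch under an assumed sample function standing in for $f_x$ (all names here are hypothetical):

```python
import math

# Hypothetical smooth function, with hand-computed partial derivatives
# used for the linearization at (a, b).
def g(x, y):
    return math.exp(x) * math.cos(y)

def g_x(x, y):
    return math.exp(x) * math.cos(y)

def g_y(x, y):
    return -math.exp(x) * math.sin(y)

a, b = 0.5, 1.0

def relative_error(dx, dy):
    # |g(a+dx, b+dy) - linearization| divided by the increment length.
    linear = g(a, b) + g_x(a, b) * dx + g_y(a, b) * dy
    return abs(g(a + dx, b + dy) - linear) / math.hypot(dx, dy)

for t in (1e-1, 1e-2, 1e-3):
    print(t, relative_error(t, -2 * t))  # shrinks roughly linearly in t
```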

Okay, I got through the first hurdle; I had to show that it followed from a theorem in a prior chapter (and I'm hoping that's a valid answer), but now comes a more difficult part.

I'm not sure how I'm supposed to get from here,
$$D = h\bigl[f_x(a+\theta h,\,b+h) - f_x(a+\theta h,\,b)\bigr], \quad 0 < \theta < 1,$$
to here,
$$D = h^2\bigl[f_{xy}(a,b) + \epsilon_1\bigr], \quad \text{where } \epsilon_1 \to 0 \text{ as } h \to 0.$$

The big problem is that epsilon statement at the end of it. I don't see how it gets there.

EDIT: I should mention this; this question is more along the lines of what is found in the following link. Click here

Unfortunately, the proof of what's in that link is NOT the correct answer for what I'm supposed to answer. It's very close, but not quite there (check the wording CAREFULLY on the theorem).

Whereas the theorem in the link has the second partial derivatives defined and continuous at a point, the theorem I'm supposed to prove does not have that hypothesis. The theorem I'm to prove says that $f_x$ and $f_y$ are differentiable, but does not say that $f_{xy}$ and $f_{yx}$ are defined and continuous (even if their existence is implied).

Last edited by Runty; November 5th 2010 at 03:03 PM.
Reason: Added a link

Sorry for the triple post, but I'm still stuck on the part I listed earlier. I don't know if this will help, but I'm out of ideas. (Note that there's a lot to read.)

This is a theorem we're meant to use to try and answer this.

Let the function $f$ be defined in some neighborhood of the point $(a,b)$. Let the partial derivatives $f_x$, $f_y$, $f_{xy}$, $f_{yx}$ also be defined in this neighborhood, and suppose that $f_{xy}$ and $f_{yx}$ are continuous at $(a,b)$. Then $f_{xy}(a,b) = f_{yx}(a,b)$.

Note the differences between this theorem and the theorem I'm meant to prove (in the first post). The theorem above has $f_{xy}$ and $f_{yx}$ defined near $(a,b)$ and continuous at $(a,b)$, but the theorem I'm meant to prove only says that $f_x$ and $f_y$ are differentiable at $(a,b)$. This might sound redundant, but you can clearly see there's a difference.

(much of the following is excerpted from my textbook; this proof DOES NOT answer the question)
To prove the theorem above, I work entirely inside a square having its center at $(a,b)$ and lying inside the neighborhood described earlier. Let $h$ be a number different from zero such that $(a+h,\,b+h)$ is inside the square. Then consider the expression
$$D = f(a+h,\,b+h) - f(a+h,\,b) - f(a,\,b+h) + f(a,b).$$
If we introduce the function
$$\phi(x) = f(x,\,b+h) - f(x,\,b),$$
we can express $D$ in the form
$$D = \phi(a+h) - \phi(a). \qquad (*)$$
Now $\phi$ has the derivative
$$\phi'(x) = f_x(x,\,b+h) - f_x(x,\,b).$$
Hence $\phi$ is continuous, and we may apply the mean value theorem to (*), with the result
$$D = h\bigl[f_x(a+\theta_1 h,\,b+h) - f_x(a+\theta_1 h,\,b)\bigr], \quad \text{where } 0 < \theta_1 < 1. \qquad (**)$$
Next, let
$$g(y) = f_x(a+\theta_1 h,\,y).$$
The function $g$ has the derivative
$$g'(y) = f_{xy}(a+\theta_1 h,\,y).$$
Now we can write (**) in the form
$$D = h\bigl[g(b+h) - g(b)\bigr]$$
and apply the mean value theorem. The result is
$$D = h^2 g'(b+\theta_2 h) = h^2 f_{xy}(a+\theta_1 h,\,b+\theta_2 h), \quad \text{where } 0 < \theta_2 < 1.$$
Alternatively, we could have started by expressing $D$ in the form
$$D = \psi(b+h) - \psi(b), \quad \text{where } \psi(y) = f(a+h,\,y) - f(a,\,y).$$
This procedure would have led to the following:
$$D = h^2 f_{yx}(a+\theta_3 h,\,b+\theta_4 h), \quad \text{where } 0 < \theta_3 < 1,\ 0 < \theta_4 < 1.$$
On comparing the two expressions for $D$, we see that
$$f_{xy}(a+\theta_1 h,\,b+\theta_2 h) = f_{yx}(a+\theta_3 h,\,b+\theta_4 h). \qquad (***)$$
If we now make $h \to 0$, the points at which the derivatives in (***) are evaluated both approach $(a,b)$. Hence, by the assumed continuity of $f_{xy}$ and $f_{yx}$, we conclude that $f_{xy}(a,b) = f_{yx}(a,b)$.
(Remember, this is NOT the correct answer to the theorem in the starting post.)

This is the theorem and proof that correlate most closely to what I'm meant to solve, but unfortunately using this proof would be an incorrect answer. Though the proofs start off similarly, they deviate at (**).
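As a numeric sanity check of the double mean-value step in that textbook proof: since $D = h^2 f_{xy}(\xi,\eta)$ for some point $(\xi,\eta)$ in the square, $D/h^2$ has to land inside the range of $f_{xy}$ over the square, as (***) predicts. A minimal Python sketch with an assumed example function (names are mine):

```python
# Hypothetical example: f(x, y) = x^2 * y^3, so f_xy = f_yx = 6*x*y^2.
def f(x, y):
    return x**2 * y**3

def f_xy(x, y):
    return 6 * x * y**2

a, b, h = 1.0, 1.0, 0.1
D = f(a + h, b + h) - f(a + h, b) - f(a, b + h) + f(a, b)

# Sample f_xy over the square [a, a+h] x [b, b+h]; D/h^2 should fall
# inside the sampled range.
samples = [f_xy(a + i * h / 10, b + j * h / 10)
           for i in range(11) for j in range(11)]
print(min(samples) <= D / h**2 <= max(samples))  # prints True
```

With these values $D/h^2 = 6.951$, which indeed lies between $f_{xy}(1,1) = 6$ and $f_{xy}(1.1,1.1) = 7.986$.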

As to this question, this is the part I'm stuck at.
We have
$$f_x(x,y) = f_x(a,b) + f_{xx}(a,b)(x-a) + f_{xy}(a,b)(y-b) + \epsilon\sqrt{(x-a)^2+(y-b)^2},$$
where $\epsilon \to 0$ as $(x,y) \to (a,b)$.
I now have to explain how to obtain the following expression:
$$D = h^2\bigl[f_{xy}(a,b) + \epsilon_1\bigr], \quad \text{where } \epsilon_1 \to 0 \text{ as } h \to 0.$$

If I could just get that, I could do the same thing for $f_y$, and hopefully complete the proof afterwards. As such, I'd greatly appreciate any help that could be provided.

Again, bad interpretation has made me miss out on answering this question correctly. I have my Prof.'s answers to this, and I doubt I'll get a good mark on it. Although in all honesty, this question was ****ing ridiculous.

Here is my Prof.'s answer:

Since $f_x$ is differentiable, we have:
$$f_x(x,y) = f_x(a,b) + f_{xx}(a,b)(x-a) + f_{xy}(a,b)(y-b) + \epsilon\sqrt{(x-a)^2+(y-b)^2}, \qquad (*)$$
where $\epsilon \to 0$ as $(x,y) \to (a,b)$.
If we replace $x$ by $a+\theta h$ and $y$ by $b+h$, we get
$$f_x(a+\theta h,\,b+h) = f_x(a,b) + f_{xx}(a,b)\,\theta h + f_{xy}(a,b)\,h + \epsilon'\,|h|\sqrt{\theta^2+1},$$
so $\sqrt{(x-a)^2+(y-b)^2} = |h|\sqrt{\theta^2+1} \le \sqrt{2}\,|h|$ and $\epsilon' \to 0$ as $h \to 0$ (as $(a+\theta h,\,b+h) \to (a,b)$), so the last term goes to zero faster than $h$.
Alternatively, replace $x$ by $a+\theta h$ and take $y = b$ in (*) to get
$$f_x(a+\theta h,\,b) = f_x(a,b) + f_{xx}(a,b)\,\theta h + \epsilon''\,\theta\,|h|,$$
so $\epsilon'' \to 0$ as $h \to 0$, as before.
Substitute into
$$D = h\bigl[f_x(a+\theta h,\,b+h) - f_x(a+\theta h,\,b)\bigr]$$
to get
$$D = h^2\bigl[f_{xy}(a,b) + \epsilon_1\bigr], \quad \text{where } \epsilon_1 = \frac{|h|}{h}\bigl(\epsilon'\sqrt{\theta^2+1} - \epsilon''\,\theta\bigr),$$
and so $|\epsilon_1| \le \sqrt{2}\,|\epsilon'| + |\epsilon''| \to 0$ as $h \to 0$.
Everything is symmetric in $x$ and $y$, so reversing their roles in the above gives
$$D = h^2\bigl[f_{yx}(a,b) + \epsilon_2\bigr], \quad \text{where } \epsilon_2 \to 0 \text{ as } h \to 0.$$
Then
$$f_{xy}(a,b) - f_{yx}(a,b) = \epsilon_2 - \epsilon_1,$$
and $\epsilon_2 - \epsilon_1 \to 0$ as $h \to 0$; since the left side does not depend on $h$, we get $f_{xy}(a,b) = f_{yx}(a,b)$, as required.
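The final expression can be illustrated numerically: for a concrete smooth function, $\epsilon_1 = D/h^2 - f_{xy}(a,b)$ should shrink roughly linearly with $h$. A sketch with an assumed sample function (names mine, not the Prof.'s):

```python
import math

# Hypothetical sample function with a hand-computed mixed partial:
# f = x*e^{xy}, f_x = e^{xy}(1 + xy), f_xy = x*e^{xy}(2 + xy).
def f(x, y):
    return x * math.exp(x * y)

def f_xy(x, y):
    return x * math.exp(x * y) * (2 + x * y)

def eps1(a, b, h):
    # The epsilon_1 of the final expression: D/h^2 - f_xy(a, b).
    D = f(a + h, b + h) - f(a + h, b) - f(a, b + h) + f(a, b)
    return D / h**2 - f_xy(a, b)

a, b = 0.4, 0.8
for h in (1e-1, 1e-2, 1e-3):
    print(h, eps1(a, b, h))  # tends to 0 as h -> 0
```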

This question was seriously a pain in the neck, and I'd rather my Prof. gave us a little more to work with next time, or at least be HELPFUL when I ask him for it.