Divergence Theorem

Real Analysis

Let V be a region in ℝ³ complying with the hypotheses of the divergence theorem,
and denote by S its boundary surface. Let also φ: ℝ³ → ℝ be a scalar function, and let c be an arbitrary constant vector.

By applying the divergence theorem to the vector field &#966;c
(1) show that:

(∫∫∫_V ∇φ dV − ∫∫_S φn dS) · c = 0
with the understanding that the integral of a vector is the vector of the integrals of the components.
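
For reference, one standard route to (1) runs as follows (a product-rule sketch, not the posted solution itself): since c is a constant vector, the divergence of φc reduces to ∇φ · c, and applying the divergence theorem to φc yields the stated identity.

```latex
% Product rule with a constant vector c (so \nabla\cdot\mathbf{c} = 0):
\nabla \cdot (\varphi \mathbf{c}) = \nabla\varphi \cdot \mathbf{c}

% Divergence theorem applied to the field \varphi\mathbf{c}:
\iiint_V \nabla\varphi \cdot \mathbf{c} \, dV
  = \iint_S \varphi\, \mathbf{c} \cdot \mathbf{n} \, dS

% Since c is constant, it factors out of both integrals:
\left( \iiint_V \nabla\varphi \, dV - \iint_S \varphi\, \mathbf{n} \, dS \right) \cdot \mathbf{c} = 0
```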

(2) Use the above result to deduce carefully that:

∫∫∫_V ∇φ dV = ∫∫_S φn dS.
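
(For step (2), the standard argument is to take c equal to each standard basis vector in turn, which forces every component of the bracketed vector to vanish.) As a quick sanity check of the resulting identity, the two sides can be compared symbolically with sympy on the unit cube, using a hypothetical test function φ chosen for illustration:

```python
import sympy as sp

x, y, z = sp.symbols('x y z')
phi = x**2 * y + sp.sin(z)  # hypothetical test function, not from the problem

# Left side: volume integral of each component of grad(phi) over [0,1]^3
vol = [sp.integrate(sp.diff(phi, v), (x, 0, 1), (y, 0, 1), (z, 0, 1))
       for v in (x, y, z)]

# Right side: surface integral of phi*n over the six faces of the cube.
# On the face var = value the outward normal is sign * (unit vector of var),
# so that component picks up sign * integral of phi restricted to the face.
def face(var, value, sign, others):
    return sign * sp.integrate(phi.subs(var, value),
                               (others[0], 0, 1), (others[1], 0, 1))

surf = [
    face(x, 1, 1, (y, z)) + face(x, 0, -1, (y, z)),
    face(y, 1, 1, (x, z)) + face(y, 0, -1, (x, z)),
    face(z, 1, 1, (x, y)) + face(z, 0, -1, (x, y)),
]

print(vol, surf)  # the two vectors agree componentwise
```

Both lists come out equal componentwise (here [1/2, 1/3, sin(1)]), matching the identity ∫∫∫_V ∇φ dV = ∫∫_S φn dS.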


Solution Preview

Real Analysis
Divergence Theorem

Let V be a region in ℝ³ complying with the hypotheses of the divergence theorem,
and denote by S its boundary ...

Solution Summary

This solution comprises a detailed explanation of the Divergence Theorem.
It contains a step-by-step explanation for the following problem:
Let V be a region in ℝ³ complying with the hypotheses of the divergence theorem,
and denote by S its boundary surface. Let also φ: ℝ³ → ℝ be a scalar function, and let c be an arbitrary constant vector.
By applying the divergence theorem to the vector field &#966;c
(1) show that:

(∫∫∫_V ∇φ dV − ∫∫_S φn dS) · c = 0
with the understanding that the integral of a vector is the vector of the integrals of the components.