The arithmetic mean-geometric mean (AM-GM) inequality states that for positive real numbers $x_1, \ldots, x_n$,
$$\frac{x_1+ \ldots + x_n}{n} \geq \sqrt[n]{x_1 \cdots x_n}$$
I'm looking for some original proofs of this inequality. I can find the usual proofs on the internet, but I was wondering if someone knew a proof that is unexpected in some way. For example: can you link the theorem to some famous theorem? Can you find a non-trivial geometric proof (I can find some of those)? A proof that uses theory that doesn't seem linked to this inequality at first sight (e.g. differential equations …)?

Induction, backward induction, Jensen's inequality, swapping terms, Lagrange multipliers, a proof using thermodynamics (yeah, I know, it's more a physical argument that this theorem might be true than a real proof), convexity, … are some of the proofs I know.

As requested by dani_s, I will give the thermodynamic proof of the AM-GM inequality. This is certainly an example of an original proof, although you might argue about whether or not it's rigorous.

Let's start with the list of positive numbers $x_i$ for which we want to prove the inequality. Take $n$ identical heat reservoirs with the same heat capacity $c$, where reservoir $i$ has initial temperature $x_i$. Bring those reservoirs into contact with each other so that the system evolves to a common equilibrium temperature $A$.

The first law of thermodynamics (conservation of energy) implies that $A$ equals the arithmetic mean of the $x_i$, which we denote AM.

The second law of thermodynamics states that the entropy increases until equilibrium is reached, where the entropy attains its maximum. For a single reservoir with constant heat capacity, the entropy change is:
$$\Delta S=c \ln{\frac{T}{T_0}}$$
Here $c$ is the heat capacity, $T_0$ the initial temperature and $T$ the final temperature.

In our case $T_i=A$ for all $i$ and $T_{0,i}=x_i$. Since the total entropy cannot decrease,
$$\sum_{i=1}^n c \ln\frac{A}{x_i} \geq 0$$

By writing the sum of logarithms as the logarithm of a product, this becomes $c\ln\frac{A^n}{x_1\cdots x_n}\geq 0$, i.e. $A^n \geq x_1\cdots x_n$. Since $A = \mathrm{AM}$ and $\sqrt[n]{x_1\cdots x_n} = \mathrm{GM}$, we conclude
$$\left(\frac{\mathrm{AM}}{\mathrm{GM}}\right)^n \geq 1$$
This proves the AM-GM inequality.
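The argument above is easy to sanity-check numerically. Here is a minimal Python sketch (the function name and the sample temperatures are mine): it computes the equilibrium temperature from the first law and verifies that the total entropy change is nonnegative, which is exactly AM $\geq$ GM.

```python
import math

def entropy_check(xs, c=1.0):
    """Simulate the reservoir argument: n reservoirs with heat capacity c
    and initial temperatures xs equilibrate at the arithmetic mean A.
    Returns (A, geometric mean, total entropy change)."""
    n = len(xs)
    A = sum(xs) / n                      # first law: equilibrium temperature
    gm = math.prod(xs) ** (1 / n)        # geometric mean of the x_i
    dS = sum(c * math.log(A / x) for x in xs)  # second law says dS >= 0
    return A, gm, dS

A, gm, dS = entropy_check([1.0, 2.0, 4.0, 8.0])
assert dS >= 0   # entropy did not decrease
assert A >= gm   # which is exactly AM >= GM
```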

Bernoulli's Inequality says that for $u\ge-1$ and $0\le r\le1$,
$$
(1+u)^r\le1+ru\tag{1}
$$
Setting $u=\frac xy-1$ in $(1)$ says that for $x,y\gt0$,
$$
\left(\frac xy\right)^r\le(1-r)+r\frac xy\tag{2}
$$
If we multiply $(2)$ by $y$, we get
$$
x^ry^{1-r}\le rx+(1-r)y\tag{3}
$$
Now $(3)$ can be used inductively to get
$$
x_1^{r_1}x_2^{r_2}x_3^{r_3}\dots x_n^{r_n}\le r_1x_1+r_2x_2+r_3x_3+\dots+r_nx_n\tag{4}
$$
where $r_1,r_2,r_3,\dots,r_n\ge0$ and $r_1+r_2+r_3+\dots+r_n=1$.

Inductive step:

Suppose that $(4)$ holds. Applying $(3)$ with $x=x_{n+1}$, $y=x_1^{r_1}x_2^{r_2}\cdots x_n^{r_n}$, and $r=r_{n+1}$, and then bounding $y$ using $(4)$, we get
$$
\begin{align}
&\left(x_1^{r_1}x_2^{r_2}x_3^{r_3}\dots x_n^{r_n}\right)^{1-r_{n+1}}x_{n+1}^{r_{n+1}}\\
&\le(1-r_{n+1})\left(r_1x_1+r_2x_2+r_3x_3+\dots+r_nx_n\right)+r_{n+1}x_{n+1}\tag{5}
\end{align}
$$
where $(1-r_{n+1})(r_1+r_2+r_3+\dots+r_n)+r_{n+1}=1$, so the weights appearing in $(5)$ are again non-negative and sum to $1$.
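Inequality $(4)$ (the weighted AM-GM this induction produces) can be spot-checked numerically. A small Python sketch, with names and test data of my own choosing:

```python
import math
import random

def weighted_amgm_gap(xs, rs):
    """Gap between the two sides of (4): sum(r_i x_i) - prod(x_i^{r_i}).
    Nonnegative whenever xs > 0, rs >= 0 and the rs sum to 1."""
    wgm = math.prod(x ** r for x, r in zip(xs, rs))  # weighted geometric mean
    wam = sum(r * x for x, r in zip(xs, rs))         # weighted arithmetic mean
    return wam - wgm

random.seed(0)
for _ in range(1000):
    n = random.randint(2, 6)
    xs = [random.uniform(0.01, 10.0) for _ in range(n)]
    ws = [random.random() for _ in range(n)]
    s = sum(ws)
    rs = [w / s for w in ws]          # normalize so the weights sum to 1
    assert weighted_amgm_gap(xs, rs) >= -1e-12
```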

I shall provide a simple geometric proof of the inequality in the case of two variables, which I have not been able to find anywhere else (a proof involving a triangle in a circle seems to be popular).

Consider the square of side $a + b$ in the figure below.

The area of the square is $(a + b)^2$. But as it completely contains the four blue rectangles, each of area $ab$, it follows that

$$(a + b)^2 \ge 4ab \;\Rightarrow\; \frac{a + b}{2} \ge \sqrt{ab}$$

Further, note that there is a square in the middle, of side $b - a$ (taking $b \ge a$), and hence area $(b - a)^2$. Therefore the inequality is strict except when $a = b$.

This proves the two-variable case. The same can be extended to the $n$-variable case. I have tried extending it to three variables, but it is difficult to argue why exactly $27$ rectangular parallelepipeds (of sides $a, b, c$) fit in the cube (of side $a + b + c$), though I can see it is so. Any suggestions?
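A quick numerical check of the picture, plus a counting observation relevant to the last paragraph (a Python sketch; tolerances are mine): the square of side $a+b$ splits exactly into the four rectangles plus the middle square, and of the $3^3 = 27$ boxes obtained by cutting the cube of side $a+b+c$ along each axis into segments $a, b, c$, only $6$ have dimensions $a \times b \times c$; that may be part of why the direct containment argument is harder in three variables.

```python
import random
from itertools import product

random.seed(1)
for _ in range(1000):
    a, b, c = (random.uniform(0.0, 10.0) for _ in range(3))
    # area identity behind the picture: big square = 4 rectangles + gap
    assert abs((a + b) ** 2 - (4 * a * b + (b - a) ** 2)) < 1e-9
    # the three-variable inequality the last paragraph aims at
    assert (a + b + c) ** 3 >= 27 * a * b * c - 1e-6

# cutting the cube along each axis into segments a, b, c gives 27 boxes,
# but only the 6 permutations of (a, b, c) are a-by-b-by-c boxes
boxes = list(product("abc", repeat=3))
assert len(boxes) == 27
assert sum(1 for t in boxes if sorted(t) == ["a", "b", "c"]) == 6
```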

We can assume that $0<a_1\leq \ldots \leq a_n$. Set $A=\frac{a_1+ \ldots +a_n}{n}$.
First notice that $(a_n-A)(A-a_1)\geq 0$; since $A>0$, this can be rewritten as $\frac{a_1a_n}{A}\leq a_1+a_n-A$.
Now assume that the property holds for $n-1$ numbers and apply it to $a_2,\ldots , a_{n-1},\,a_1+a_n-A$. This gives: $$\left(\frac{\prod_{k=1}^na_k}{A}\right)^{1/(n-1)}\leq \left[\left(\prod _{k=2}^{n-1}a_k\right)(a_1+a_n-A)\right]^{1/(n-1)}\leq \frac{a_1+\ldots +a_n-A}{n-1}= A,$$ where the first inequality uses $\frac{a_1a_n}{A}\leq a_1+a_n-A$. Raising both ends to the power $n-1$ yields $\prod_{k=1}^na_k\leq A^n$, which is the AM-GM inequality.
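The two ingredients of this induction step can be verified numerically. A Python sketch (variable names mine): the key lemma, and the fact that replacing $a_1, a_n$ by the single number $a_1 + a_n - A$ removes one copy of $A$ from the sum while not decreasing the product divided by $A$.

```python
import math
import random

random.seed(2)
for _ in range(1000):
    n = random.randint(2, 8)
    a = sorted(random.uniform(0.1, 10.0) for _ in range(n))
    A = sum(a) / n
    # key lemma: (a_n - A)(A - a_1) >= 0, i.e. a_1*a_n/A <= a_1 + a_n - A
    assert a[0] * a[-1] / A <= a[0] + a[-1] - A + 1e-9
    # the n-1 numbers handed to the inductive hypothesis
    b = a[1:-1] + [a[0] + a[-1] - A]
    assert abs(sum(b) - (sum(a) - A)) < 1e-9            # mean of b is still A
    assert math.prod(b) >= (math.prod(a) / A) * (1 - 1e-9)  # product only grows
```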

This is a proof using the Buffalo Way (see this link for what the Buffalo Way is: http://www.artofproblemsolving.com/Forum/viewtopic.php?f=55&t=522084). I couldn't find it anywhere; I just heard it's possible, so I tried it. Hopefully there are no mistakes, but even if there are, I think they will be easy to correct; the main idea should still be right.

We proceed by strong induction. The base case $n = 2$ follows from the inequality $(x_1 - x_2)^2 \geq 0$; assume that AM-GM holds for all $k$ such that $1 < k < n$. We wish to prove that $$x_1^n + x_2^n + \cdots + x_n^n - nx_1x_2\cdots x_n \geq 0$$ This inequality is symmetric, hence we may WLOG suppose $x_1 = \min\{x_1,x_2,\cdots,x_n\} = x$; therefore there exist $y_1,y_2,\cdots,y_{n - 1} \geq 0$ such that $x_1 = x$ and $x_i = x + y_{i - 1}$ for $i \in \{2,\cdots,n\}$. The inequality can then be rewritten as $$x^n + (x + y_1)^n + (x + y_2)^n + \cdots + (x + y_{n - 1})^n - nx(x + y_1)(x + y_2) \cdots (x + y_{n - 1}) \geq 0$$

The left-hand side is a polynomial in $x$, and expanding it we find that the coefficient of $x^{n - k}$, where $1 < k < n$, is $$\binom{n}{k}p_k(y_1,y_2,\cdots,y_{n - 1}) - ne_k(y_1,y_2,\cdots,y_{n - 1})$$ where $p_k$ is the $k$-th power sum and $e_k$ is the $k$-th elementary symmetric polynomial. So it suffices to prove that $$\binom{n}{k}p_k(y_1,y_2,\cdots,y_{n - 1}) - ne_k(y_1,y_2,\cdots,y_{n - 1}) \geq 0$$

By the inductive hypothesis (AM-GM for the $k$ numbers $y_{i_1}^k, y_{i_2}^k, \cdots, y_{i_k}^k$) we get $$\frac{n}{k} \cdot \left(y^k_{i_1} + y^k_{i_2} + \cdots + y^k_{i_k}\right) - ny_{i_1}y_{i_2} \cdots y_{i_k} \geq 0,$$ where $i_1,i_2,\cdots,i_k \in \{1,2,3,\cdots,n - 1\}$ are pairwise distinct. Summing this over all possible combinations of indices $i_1,i_2, \cdots, i_k$ (each $y_j^k$ appears in $\binom{n - 2}{k - 1}$ of them) we in fact get the stronger inequality $$\frac{n}{k} \cdot \binom{n - 2}{k - 1} p_k(y_1,y_2,\cdots,y_{n - 1}) - ne_k(y_1,y_2,\cdots,y_{n - 1}) \geq 0.$$

Finally, the coefficients of $x^n$ and $x^{n - 1}$ are clearly $0$, and the coefficient of $x^0$ is just $p_n(y_1,y_2,\cdots,y_{n - 1}) \geq 0$. Hence all the coefficients of our polynomial are non-negative, therefore the polynomial is non-negative for $x \geq 0$; thus the inductive step is proved and the whole proof is finished.
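The coefficient inequality at the heart of this argument is easy to test numerically. A small Python sketch (helper names mine) checks $\binom{n}{k} p_k(y) - n\, e_k(y) \geq 0$ for random nonnegative $y$ and all $1 < k < n$:

```python
import math
import random
from itertools import combinations

def power_sum(k, ys):
    """k-th power sum p_k(ys)."""
    return sum(y ** k for y in ys)

def elem_sym(k, ys):
    """k-th elementary symmetric polynomial e_k(ys)."""
    return sum(math.prod(c) for c in combinations(ys, k))

random.seed(3)
for _ in range(300):
    n = random.randint(3, 7)
    ys = [random.uniform(0.0, 5.0) for _ in range(n - 1)]
    for k in range(2, n):
        # coefficient of x^{n-k}: C(n,k) p_k - n e_k must be nonnegative
        assert math.comb(n, k) * power_sum(k, ys) >= n * elem_sym(k, ys) - 1e-6
```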