I tried to reject the edit by Peter Mortenson on the grounds that it was better before the edit. But someone had already approved it. One shouldn't make the question look as if it may have been simply copied from a textbook exercise, when the original poster didn't make it look that way.
– Michael Hardy, Apr 25 '12 at 17:44

A completely elementary proof: try proving that $n! > 2^n$ for $n$ sufficiently large, then $n! > 3^n$ for $n$ even larger, and so on. Eventually you will have shown that for each $k$ there is some $N$ such that $n! > k^n$ for all $n \ge N$, i.e. $\sqrt[n]{n!} > k$ for all $n \ge N$, which is enough to prove what you asked.

Intuitively it's obvious that $n!$ eventually beats $k^n$: going from $k^n$ to $k^{n+1}$ multiplies by $k$, but going from $n!$ to $(n+1)!$ multiplies by $n+1$, which is eventually far larger than $k$. In fact, by the time we reach $n = k^2$, each step of the factorial multiplies by more than $k^2$, i.e. at least $k$ times the exponential's factor of $k$, which gives a crude bound for when the factorial must pass it: $(k^2 + l)!$ is a product involving $l$ factors greater than $k^2$, so it is certainly bigger than $(k^2)^l = k^{2l}$; setting $l = k^2$ gives $(2k^2)! > k^{2k^2}$.

Of course, if you actually check the numbers, the factorial beats the exponentiation way sooner than that, but that's irrelevant to the proof.
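As a quick numeric sketch of that last remark (the helper name `crossover` is mine, not from the answer), the following finds the first $n$ with $n! > k^n$ and compares it with the crude bound $n = 2k^2$ derived above:

```python
from math import factorial

def crossover(k):
    """Smallest n with n! > k**n; exists for every integer k >= 1."""
    n = 1
    while factorial(n) <= k ** n:
        n += 1
    return n

for k in (2, 3, 5, 10):
    # The argument above guarantees n! > k^n by n = 2*k*k at the latest;
    # the actual crossover is far smaller.
    print(k, crossover(k), 2 * k * k)
```

For example, $n!$ first beats $2^n$ at $n=4$, long before the guaranteed $n = 8$.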

We wish to show that for a fixed real number $\alpha$, we have
$$
(n!)^{\frac{1}{n}}>\alpha
$$
for sufficiently large $n$. Clearly it suffices to show that the logarithm of this quantity exceeds $\log\alpha$ (for sufficiently large $n$); since $\alpha$ is arbitrary, this amounts to showing that the logarithm exceeds any fixed real number eventually.

Using elementary log rules, we have
\begin{align}
\log(n!^{\frac{1}{n}}) &= \frac{\log(n!)}{n}\\
&= \frac{1}{n}\sum_{i=1}^{n} \log(i).
\end{align}
We may start the sum at $i = 2$ since $\log(1) = 0$. Now
$$
\frac{1}{n}\sum_{i = 2}^{n} \log(i)
$$
is a right-endpoint Riemann sum (with step size $\Delta x = 1$), which, since $\log$ is increasing, overestimates the integral
$$
\int_1^n\log(x)dx.
$$
But this integral is just
$$
n\log(n) - n + 1.
$$
Thus we have shown that
$$
\log(n!^{\frac{1}{n}}) \geq \log(n) - 1 + \frac{1}{n}
$$
and the right-hand side goes to $\infty$ as $n$ goes to $\infty$, which is what we want.
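As a sanity check (a sketch, not part of the proof), one can confirm the displayed inequality in floating point using `math.lgamma`, since $\mathrm{lgamma}(n+1)=\log(n!)$; the helper name below is mine:

```python
from math import lgamma, log

def log_root_factorial(n):
    """log((n!)^(1/n)), computed stably via lgamma(n+1) = log(n!)."""
    return lgamma(n + 1) / n

for n in (10, 100, 1000, 10**6):
    lower = log(n) - 1 + 1 / n  # the Riemann-sum lower bound
    assert log_root_factorial(n) >= lower
    print(n, round(log_root_factorial(n), 4), round(lower, 4))
```

Using `lgamma` avoids computing the astronomically large integer $n!$ directly.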

Perhaps my wording in the first lines is bad, but the proof is fine. If the logarithm of the formula eventually exceeds any fixed real number, which is what I've shown, then the formula also eventually exceeds any fixed real number, i.e. has an infinite limit. This isn't circular.
– user29743, Apr 25 '12 at 7:30

Write $S_n = \sqrt[n]{n!}$, so that $\log S_n = a_n/b_n$ with $a_n = \log(n!)$ and $b_n = n$. Since $b_n$ is increasing and tends to $\infty$, and we have
$$ \frac{a_{n+1}-a_n}{b_{n+1}-b_n}=\log(n+1) \to \infty$$
we can apply the Stolz–Cesàro theorem and conclude that the limit $L=\lim_{n \to \infty} \log S_n$ exists and $L=\infty$. This implies that $S_n \to \infty$.
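A quick numeric illustration (my own sketch, with $a_n=\log(n!)$ and $b_n=n$ as above) that the Stolz–Cesàro difference quotient really is $\log(n+1)$, and that $\log S_n$ grows without bound:

```python
from math import lgamma, log, isclose

def a(n):
    return lgamma(n + 1)  # a_n = log(n!)

def b(n):
    return n              # b_n = n

# The difference quotient collapses to log(n+1):
for n in (1, 10, 1000):
    q = (a(n + 1) - a(n)) / (b(n + 1) - b(n))
    assert isclose(q, log(n + 1))

# log S_n = a_n / b_n grows without bound:
print([round(a(n) / b(n), 3) for n in (10, 10**3, 10**6)])
```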

In light of sdcvvc's answer, this answer may be a bit much; but you can generalize the following argument to show that if $(a_n)$ is a sequence of positive numbers and if $\lim\limits_{n\rightarrow\infty}{a_{n+1}\over a_n}=\infty$, then $\lim\limits_{n\rightarrow\infty}{\root n\of{a_n}}=\infty$. (More generally, one can show that
$$
\liminf\limits_{n\rightarrow\infty}{a_{n+1}\over a_n} \le
\liminf\limits_{n\rightarrow\infty}{\root n\of{a_n}} \le
\limsup\limits_{n\rightarrow\infty}{\root n\of{a_n}} \le
\limsup\limits_{n\rightarrow\infty}{a_{n+1}\over a_n}.
$$
)

Let $a_n=n!$. One can show by induction that $a_{n+k}\ge n^k a_n$ for all positive integers $n$ and $k$.

Now fix a positive integer $N$ and let $n$ be a positive integer with $n\ge N$. Then
$$\tag{1}
a_n =a_{N+(n-N)} \ge N^{n-N} a_N=N^n\cdot {a_N\over N^N},\qquad\qquad(n\ge N).
$$
Taking the $n^{\rm th}$ roots of the left and right hand sides of $(1)$ gives
$$\tag{2}
\root n\of{a_n}\ge N\cdot{\root n\of {a_N}\over (N^N)^{1/n}}, \qquad\qquad(n\ge N).
$$
Now, as $n\rightarrow\infty$, the right-hand side of $(2)$ tends to $N$. From this it follows that $\liminf\limits_{n\rightarrow\infty} \root n\of{a_n}\ge N$. But, as $N$ was arbitrary, we must then have $\lim\limits_{n\rightarrow\infty} \root n\of{a_n}=\infty$.
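A numeric check of inequality $(2)$ for $a_n=n!$ (a sketch only; the helper name is mine), done in log space to avoid overflow:

```python
from math import lgamma, log, exp

def log_nth_root(n):
    """log of the n-th root of n!, i.e. lgamma(n+1)/n."""
    return lgamma(n + 1) / n

# Inequality (2): (a_n)^(1/n) >= N * (a_N / N^N)^(1/n) for n >= N,
# checked here on the logarithmic scale.
for N in (2, 5, 10):
    for n in (N, 2 * N, 100 * N):
        rhs = log(N) + (lgamma(N + 1) - N * log(N)) / n
        assert log_nth_root(n) >= rhs - 1e-9  # tolerance: n = N gives equality

# The n-th root of n! does clear any fixed bound eventually:
print([round(exp(log_nth_root(n)), 2) for n in (10, 100, 1000)])
```

The tolerance accounts for floating-point rounding at $n = N$, where $(1)$ holds with equality.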