Several of the algorithms in the SHA-3 competition produce digests of 512 bits or more.

That doesn't sound like much, does it? But it's a tremendous amount, really.

The smallest unit of time that a modern physicist is concerned with is called the Planck time -- 5.3124×10^-44 seconds. This is a time scale at which, if you zoomed in close enough, something wildly different from anything familiar must be happening. (We know that something weird must be happening because we have two completely different theories -- general relativity, which we apply at the scale of galaxies, and the standard model, which we apply at the scale of the atomic nucleus. They're totally different, and as far as we can tell, completely incompatible with each other; but we can use both of them, because they don't overlap much in their respective domains. They don't overlap anywhere in the middle, either: at the scale of you or me, we neither have to calculate spacetime curvature nor solve the Schrödinger equation. But they do overlap at this minuscule time scale -- and at a similarly small length scale.)

We might express this number as 2^-143.75547 seconds or so. If you were going to count once every Planck time, you would need only 144 bits to count out one full second.
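As a sanity check, that bit count is easy to reproduce in a couple of lines of Python, using the Planck-time value quoted above:

```python
import math

PLANCK_TIME = 5.3124e-44  # seconds, as quoted above

# Bits needed for a counter that ticks once per Planck time
# and runs for one full second:
bits_per_second = math.log2(1 / PLANCK_TIME)
print(bits_per_second)             # ≈ 143.756
print(math.ceil(bits_per_second))  # 144 bits suffice
```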

At the other end of the scale, the universe is 13.7 billion years old, give or take 0.1-0.2 billion. This we might express as 2^58.6 seconds or so. A modern 64-bit computer could count out the age of the universe in seconds quite handily in a single register.
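The same kind of check works for the age of the universe; the year length used here (a Julian year) is my assumption:

```python
import math

SECONDS_PER_YEAR = 365.25 * 24 * 3600    # Julian year ≈ 3.156e7 s (assumed)
age_seconds = 13.7e9 * SECONDS_PER_YEAR  # ≈ 4.3e17 seconds

print(math.log2(age_seconds))  # ≈ 58.6 -- fits easily in a 64-bit register
```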

Divide the second duration by the first, and you get the total number of Planck times that have happened in the life of the universe. By exponent rules, you just add the numbers of bits: 58.6 + 143.8 ≈ 202.4 bits.
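Dividing directly gives the same answer as adding the exponents (again assuming a Julian year):

```python
import math

PLANCK_TIME = 5.3124e-44                   # seconds, as above
AGE_SECONDS = 13.7e9 * 365.25 * 24 * 3600  # Julian years assumed

# Total Planck times elapsed since the Big Bang, expressed in bits:
bits = math.log2(AGE_SECONDS / PLANCK_TIME)
print(bits)  # ≈ 202.3 (the 202.4 above comes from rounding each term first)
```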

An ideal 512-bit hash function is so secure that even if someone had one quadrillion of the fastest computers that a modern physicist can imagine, and worked on finding a single collision for the entire lifetime of the universe, their chance of success would only be about 8%. We would have to make a significant advance in physics before we could conjecture what sorts of sub-Planck computing could be possible. (We're not even completely sure what we can and can't do with quantum computing, and we're relatively solid on the fundamental rules of quantum mechanics.)
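For the curious, here is one way a figure in that ballpark falls out. Treating the attack as n hash evaluations measured against the 2^256 birthday bound of a 512-bit hash is my assumption about the estimate being used:

```python
import math

HASH_BITS = 512
birthday_bound_bits = HASH_BITS / 2  # 256: the work needed for a collision

computers_bits = math.log2(1e15)     # one quadrillion ≈ 2^49.8
planck_ticks_bits = 202.4            # Planck times in the universe's lifetime

total_hash_bits = computers_bits + planck_ticks_bits  # ≈ 252.2
p = 2.0 ** (total_hash_bits - birthday_bound_bits)    # rough odds: n / 2^256
print(p)  # somewhere around 0.07-0.08, i.e. roughly the 8% quoted above
```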

Luboš Motl has an interesting post explaining how even the standard attempt to quantize GR (using canonical quantization) reveals that there must be fundamentally new physics at the Planck scale. It's a bit technical, but if you look past some of the details, it's pretty clear how the argument goes. Fun stuff.