Our ability to store data on or in physical media continues to grow, with the maximum amount of data you can store in a given volume increasing exponentially from year to year. Storage devices keep getting smaller while their capacities keep getting bigger.

This can't continue forever, though, I would imagine. "Things" can only get so small; but what about information? How small can a single bit of information be?

Put another way: given a limited physical space -- say 1 cubic centimeter -- and without assuming more dimensions than we currently have access to, what is the maximum amount of information that can be stored in that space? At what point does the exponential growth of storage density come to such a conclusive and final halt that we have no reason to even attempt to increase it further?

@MarkEichenlaub But surely the higher and higher energy eigenstates fill up more and more space: IIRC there is no bound on the eigenstate "size" as you go higher in energy.
–
WetSavannaAnimal aka Rod VanceSep 30 '13 at 7:54

3 Answers
3

The answer is given by the covariant entropy bound (CEB) also referred to as the Bousso bound after Raphael Bousso who first suggested it. The CEB sounds very similar to the Holographic principle (HP) in that both relate the dynamics of a system to what happens on its boundary, but the similarity ends there.

The HP suggests that the physics (specifically Supergravity or SUGRA) in a d-dimensional spacetime can be mapped to the physics of a conformal field theory living on its (d-1)-dimensional boundary.

The CEB is more along the lines of the Bekenstein bound which says that the entropy of a black hole is proportional to the area of its horizon:

$$ S = \frac{k_B A}{4 l_{pl}^2} $$

where $l_{pl}$ is the Planck length.

To cut a long story short, the maximum information that you can store in $1\ \mathrm{cc} = 10^{-6}\ \mathrm{m}^3$ of space is proportional to the area of its boundary. For a spherical volume, that area is of order:

$$ A \sim V^{2/3} = 10^{-4}\ \mathrm{m}^2 $$

Therefore the maximum information (number of bits) you can store is approximately given by the boundary area measured in units of the Planck area:

$$ I \sim \frac{A}{l_{pl}^2} \sim \frac{10^{-4}\ \mathrm{m}^2}{10^{-70}\ \mathrm{m}^2} \sim 10^{66}\ \textrm{bits} $$
Of course, this is a rough order-of-magnitude estimate, but it is in the right ballpark and gives you an idea of the limit you are asking about. As you can see, we still have decades, if not centuries, before our technology can saturate this bound!
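As a sanity check, here is a short Python sketch of this order-of-magnitude estimate. The numbers are illustrative only; the exact prefactor (including the $4 \ln 2$ that comes from converting the Bekenstein-Hawking entropy into bits) shifts the result by about an order of magnitude:

```python
import math

# Rough holographic bound for 1 cubic centimeter, following the
# estimate above. Order-of-magnitude only.
V = 1e-6                      # volume in m^3 (1 cc)
A = V ** (2.0 / 3.0)          # boundary area ~ V^(2/3), in m^2
l_pl = 1.616e-35              # Planck length in m
A_pl = l_pl ** 2              # Planck area in m^2

# Bekenstein-Hawking: S/k_B = A / (4 l_pl^2); bits = S / (k_B ln 2)
bits = A / (4 * A_pl * math.log(2))
print(f"{bits:.1e}")          # on the order of 10^65 bits
```

The prefactor-free estimate $A / l_{pl}^2$ gives $\sim 10^{66}$; either way, the conclusion that we are nowhere near saturating the bound is unchanged.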

Cheers,

Edit: Thanks to @mark for pointing out that $1 cc = 10^{-6} m^3$ and not $10^{-9} m^3$. Changes final result by three orders of magnitude.

On Entropy and Planck Area

In response to @david's observations in the comments let me elaborate on two issues.

Planck Area: From LQG (loop quantum gravity), and also from string theory, we know that geometric observables such as area and volume are quantized in any theory of quantum gravity. This result holds at the kinematical level and is independent of the actual dynamics. The quantum of area, as one would expect, is of order $\sim l_{pl}^2$, where $l_{pl}$ is the Planck length. In quantum gravity the dynamical entities are precisely these area elements, to each of which one associates a spin variable $j$, generally $j = 1/2$ (the lowest non-trivial rep of SU(2)). Each spin can carry a single qubit of information. Thus it is natural to associate a Planck area with a single unit of information.

Entropy as a measure of Information: There is a great misunderstanding in the physics community regarding the relationship between entropy $S$ - usually described as a measure of disorder - and useful information $I$, such as that stored on a chip, an abacus, or any other device. In fact, they are one and the same. I remember being laughed out of a physics chat room once for saying this, so I don't expect anyone to take it at face value.

But think about this for a second (or two). What is entropy?

$$ S = k_B \ln(N) $$

where $k_B$ is Boltzmann's constant and $N$ is the number of microstates of the system. For a gas in a box, for example, $N$ corresponds to the number of different ways to distribute the molecules in the given volume. If we were able to actually use a gas chamber as an information-storage device, then each one of these configurations would correspond to a unit of memory. Or consider a spin chain with $m$ spins. Each spin can take two (classical) values, $\pm 1/2$. Using a spin to represent a bit, we see that a spin chain of length $m$ can encode $2^m$ different numbers. The corresponding entropy is:

$$ S = k_B \ln(2^m) = m\, k_B \ln(2) \propto \textrm{number of bits} $$

since we have identified each spin with a bit (more precisely qubit). Therefore we can safely say that the entropy of a system is proportional to the number of bits required to describe the system and hence to its storage capacity.
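The spin-chain relation above is easy to check numerically: starting from $S = k_B \ln(2^m)$ and dividing out $k_B \ln 2$ recovers the number of spins, i.e. the number of bits. A minimal sketch:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def spin_chain_entropy(m):
    """Entropy of a chain of m two-state spins: S = k_B ln(2^m)."""
    return k_B * math.log(2 ** m)

def bits_from_entropy(S):
    """Invert the relation: number of bits = S / (k_B ln 2)."""
    return S / (k_B * math.log(2))

S = spin_chain_entropy(10)
print(bits_from_entropy(S))  # recovers ~10 bits for a 10-spin chain
```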

I've heard this a few times, might as well ask now. What if you take a volume $V_2$ that lies inside your volume $V_1$ such that $A_2 > A_1$. Which one would be able to hold more information?
–
MalabarbaDec 27 '10 at 4:24

1

@space_cadet: This has the makings of a great answer; my one (hopefully constructive) criticism is that you don't really explain why the proportionality constant $S/A$ is related to $A_{pl}$. Of course a full proof would be overkill, but I think it'd help to include a few words on the significance of the Planck area in this argument, for people who aren't familiar with it. Also I'd rather see a different symbol used instead of $S$ in your last equation, since entropy doesn't quite measure the number of bits of information. (I know it's just a constant factor difference, it just looks weird)
–
David Z♦Dec 27 '10 at 4:42

2

@Bruce: $V_2$ obviously; the whole point of holography is that volume doesn't matter at all, only area does :-) Of course, I am not sure to what degree this has been proved (as in calculated microscopically) for generic surfaces (not smooth even?) rather than for horizons of quite generic BH.
–
MarekDec 27 '10 at 9:11

2

In The Pleasure of Finding Things Out, Richard Feynman also speculates on the limits of information density. He doesn't go as far as the Planck length, as far as I can recall, presumably because that is very far off technologically, even nowadays. Interesting lecture.
–
RaskolnikovDec 28 '10 at 14:52

OK then: suppose we have a given volume and nano-molecular data-retrieval technology. Assuming you want the data safe, retrievable, and made of long-term-stable atoms, what is the maximum amount of data that can usefully be stored?

So first we need half of the total volume for a single molecular layer of your chosen molecule; this will be the "platter" for our "hard drive".

Onto this you place the atoms that represent bits, so the total number of bits is your volume divided by the volume of your chosen molecule/element, divided by 2.

But with molecular storage, you could use different molecules and have, for example:

No molecule = 0
Gold = 1
Platinum = 2
Silver = 3

Then you have four states (two bits) per site without much loss in size; throw in some carbon-12 and carbon-13 and you're up to six states, find a couple more stable elements and you're up to eight states (three bits) per site, and so on.

Of course, data retrieval would be terribly slow, but for long-term, small-size storage you're talking quadrillions of bits per cm³.
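The "quadrillions of bits" figure can be sketched with some assumed numbers. The atomic spacing below (~0.3 nm) and the four-species alphabet are illustrative assumptions, not values from any specific technology:

```python
import math

# Hypothetical numbers: a single-atom monolayer "platter" inside a
# 1 cm^3 volume, with ~0.3 nm spacing between storage sites (assumed).
site_spacing = 0.3e-9            # m, assumed atomic spacing
cube_side = 1e-2                 # m (1 cm)

sites_per_layer = (cube_side / site_spacing) ** 2
# 4 distinguishable species per site (none/Au/Pt/Ag) -> log2(4) = 2 bits
bits_per_site = math.log2(4)
total_bits = sites_per_layer * bits_per_site
print(f"{total_bits:.1e}")       # on the order of 10^15: quadrillions
```

A single monolayer already lands in the quadrillion range; stacking multiple layers in the remaining volume would multiply this further.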

I'm not a physicist, but I do know computer science, and keep up with the basics of physics, so let me give another answer for this:

We don't know yet. As long as there are smaller things that can be found, changed, and observed, we can use them to store information.

For example, if a new quantum property is found which can be in state A or state B, that's a new bit. If that property exists in every one of a billion atoms of something, that's a billion more bits of data. If we then learn to manipulate that property into two additional states (say, right-side-out and inside-out), we've added another bit per atom, squaring the number of states each atom can represent.

So, the problem is that we're still learning what matter and spacetime are made of. Until we come up with a provably correct, unified theory, we don't know how many varying things there are within any material. Given that each additional bit per site squares the number of representable states, it's fairly useless to give "ballpark" figures until we know more. So it's probably better to offer something like Moore's Law - a prediction that we'll double the storage every so often, until we run out of new discoveries/technologies.
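The bookkeeping in the argument above is simple: each independently addressable property with $s$ distinguishable states contributes $\log_2 s$ bits per site, so states multiply while bits add. A minimal sketch:

```python
import math

def bits_per_site(states_per_property):
    """Total bits per site for a list of independent properties,
    each with the given number of distinguishable states."""
    return sum(math.log2(s) for s in states_per_property)

print(bits_per_site([2]))     # one two-state property: 1 bit
print(bits_per_site([2, 2]))  # add a second two-state property: 2 bits
print(bits_per_site([2, 4]))  # upgrade it to four states: 3 bits
```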