Suppose you have a memory of n bits, or a memory based on a larger r-letter alphabet, and suppose that a “random noise” hits the memory in such a way that the probability of each bit being affected is t.

What will be the typical behavior of such a random digital noise? Part of the question is to define “random noise” in the best way possible, and then answer it for this definition.

In particular, will the “random noise” typically behave in a way which is close to being independent across the different bits? Or will it be highly correlated? Or perhaps the answer depends on the size of the alphabet and the value of t?

The source of this question is an easy fact about quantum memory: if you consider a random noise operation acting on a quantum memory with n qubits, and if the probability that each qubit is damaged is a tiny real number t, then typically the noise has the following form: with large probability nothing happens, and with tiny probability (a constant times t) a large fraction of the qubits are harmed.

I made one try for the digital (binary) question but I am not sure at all that it is the “correct” definition for what “random noise” is.

3 Responses to Test Your Intuition (10): How Does “Random Noise” Look

Can $t$ depend on $n$? Suppose it does not. Then we can define the Bernoulli noise: each bit is flipped independently with probability $t$. So typically $n(1-t)$ bits are not affected.
Larger numbers of affected bits occur with rapidly decreasing probability.
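A minimal sketch of this Bernoulli noise model (the function name and parameters are my own choices, not from the post): flip each bit independently with probability $t$, and check empirically that the average number of affected bits is close to $nt$, as the binomial distribution predicts.

```python
import random

def bernoulli_noise(bits, t):
    """Flip each bit independently with probability t."""
    return [b ^ (random.random() < t) for b in bits]

random.seed(0)
n, t, trials = 1000, 0.01, 2000

# Count how many bits each noise sample affects.
flip_counts = [sum(bernoulli_noise([0] * n, t)) for _ in range(trials)]
mean_flips = sum(flip_counts) / trials

# The number of affected bits is Binomial(n, t), so the mean should be near n*t = 10.
print(mean_flips)
```

Under this model the number of affected bits concentrates around $nt$, which is exactly the “independent on the different bits” behavior the post asks about; the quantum fact quoted above describes a very different, highly correlated shape.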

This “Math Overflow” site is great! The design is close to perfect. I like the votes and views statistics, the tag and user rankings, etc.

As a sporadic blog commenter I miss this kind of stuff in blogs. You have no information about your audience, nor whether your comments are abhorred or loved (although you can imagine).

I’m surprised by how low digitally flavoured subjects (computer-science-related topics) such as computational complexity are ranked. The hottest topic up to now seems to be algebraic geometry. We must wait and see how the ranking “evolves”.