The OP featured a link to a Science Blog article that simplifies a scientific subject and has the number 101 in the title. I decided to post another link to a Science Blog article that simplifies a scientific subject and has the number 101 in the title. Here's another one, on Creation Science 101:

Yep, also proof that even at its simplest level, I still can't wrap my head around quantum physics.

Let's try it on an even simpler level:

1) There is a minimum energy/mass that things can have, and everything can be measured as a multiple of this minimum.

2) Objects at this size, or close to it, don't have an exact position or velocity, so they look like waves in most experiments.

3) If you try to measure the location, they act more like particles, just to fuck with you, but the velocity gets more uncertain, also just to fuck with you.

Conclusion: God hates physicists.
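Points 2 and 3 can actually be checked with a few lines of code. Here's a rough numerical sketch (my own toy setup, units with hbar = 1, nothing official): a Gaussian wave packet, squeezed narrower in position, gets wider in momentum when you Fourier-transform it, and the product of the two spreads stays pinned near 1/2.

```python
# Toy demo of the position/momentum tradeoff: squeeze a Gaussian wave
# packet's position spread and its momentum (wavenumber) spread widens.
# Discretized on a grid; units with hbar = 1.
import numpy as np

def spreads(sigma, n=4096, length=200.0):
    """Return (std dev in x, std dev in k) for a Gaussian packet of width sigma."""
    x = np.linspace(-length / 2, length / 2, n, endpoint=False)
    psi = np.exp(-x**2 / (4 * sigma**2))      # Gaussian amplitude, centered at 0
    prob_x = np.abs(psi)**2
    prob_x /= prob_x.sum()
    dx = np.sqrt(np.sum(prob_x * x**2))       # spread in position

    k = 2 * np.pi * np.fft.fftfreq(n, d=length / n)
    phi = np.fft.fft(psi)                     # momentum-space amplitude
    prob_k = np.abs(phi)**2
    prob_k /= prob_k.sum()
    dk = np.sqrt(np.sum(prob_k * k**2))       # spread in wavenumber
    return dx, dk

for sigma in (0.5, 1.0, 2.0):
    dx, dk = spreads(sigma)
    print(f"sigma={sigma}: dx={dx:.3f}, dk={dk:.3f}, dx*dk={dx*dk:.3f}")
```

Smaller sigma means smaller dx but bigger dk, and dx*dk hovers around 0.5 every time. Nail down the position, lose the momentum. Just to fuck with you.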

It's only natural that solipsism gets a bad rep, but the logical, albeit mindfucking, conclusion ought to be, for the time being, that we are not autonomous from what we observe. I'm not talking strict solipsism, 'cause that's just silly. But some sort of all-inclusive cosmic parody.

Alternate conclusion: God's not up to the challenge of completely obliterating a self-endowed sense of wonder.

When things get really small, we have to rely on machines to tell us what's happening.

If you set the machine to detect a wave, it will.

If you set the machine to detect a particle, it will.

You can't set it to detect both.
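A toy version of "you can't set it to detect both" (not the wave/particle setup itself, just the simplest two-state stand-in I could think of): prepare a state with a definite answer to one question, and the incompatible question comes out a coin flip.

```python
# Toy complementarity demo with a two-state system: a definite answer to
# the sigma_z question gives a 50/50 answer to the sigma_x question,
# because the two measurements don't commute.
import numpy as np

sz = np.diag([1.0, -1.0])                 # "question 1" observable
sx = np.array([[0.0, 1.0], [1.0, 0.0]])   # "question 2" observable

up = np.array([1.0, 0.0])                 # definite sigma_z = +1 state

# Probabilities of the sigma_x outcomes: project onto sigma_x eigenvectors.
vals, vecs = np.linalg.eigh(sx)
probs = np.abs(vecs.conj().T @ up)**2

print(dict(zip(vals, probs)))             # each outcome: probability 0.5
```

Set the apparatus for one question and you get a sharp answer; the other question's answer is maximally uncertain. Same spirit as the wave/particle settings above.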

Here's more detail:

Quote from: John Marburger III, "Beneath Reality"

The Schrödinger field pattern in position space determines where a detection event is likely to be found, and its pattern in wavelength space determines the momentum we associate with the object causing the event.

If the events are localized in a small region, the wave pattern will be localized but consequently it will contain many elementary waves – its momentum will not be well-defined.

Conversely, if the momentum detector clicks only for a narrow range of momentum values, the wavelength is well-defined, and the wave pattern must extend over many cycles – its location in space is not well-defined.

You can have waves with well-defined position or well-defined momentum, but not both at once. This is the true meaning of the uncertainty relation first enunciated in 1927 by Heisenberg.

The "Heisenberg uncertainty relation" emerged in an atmosphere of confusion from which it has never quite escaped. Much of the fault lies with Heisenberg himself, who was not content with setting forth the bare theory, more or less along the lines I have described above (but in mathematical language); he also tried to make the result more comprehensible with suggestive physical arguments.

For example, he implied that the uncertainty has its origin in the inevitable disturbance caused by the measurement process (which is not inherently a quantum concept). Bohr objected to these explanatory efforts, convinced that the matter was deeper than Heisenberg made it out to be.

As I see it, most problems of interpretation are resolved by the simple fact that the microscopic theory does not refer to any physical waves or particles. It refers to well-defined detectors and unambiguous events of detection.

Accounts that ascribe position to particles and momentum to waves apply macroscopic language inappropriately to microscopic nature. You can set a detector to register an event with well-defined momentum, or you can set it to record an event with well-defined position. That does not entitle you to say that the event is caused by a "wave" or by a "particle."
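For anyone who wants the math behind "well-defined position or well-defined momentum, but not both": the standard textbook case is the Gaussian wave packet, which saturates the Heisenberg bound exactly (this is the usual derivation, not something from Marburger's book specifically):

```latex
% Gaussian wave packet of width \sigma, normalized in position space:
\psi(x) = \left(2\pi\sigma^2\right)^{-1/4} e^{-x^2/(4\sigma^2)},
\qquad \Delta x = \sigma .
% Its Fourier transform is again a Gaussian, with momentum spread
\Delta p = \frac{\hbar}{2\sigma},
% so the product of the two spreads is
\Delta x \,\Delta p = \frac{\hbar}{2},
% which saturates the general Heisenberg bound
\Delta x \,\Delta p \;\ge\; \frac{\hbar}{2}.
```

Shrink sigma and the position spread drops while the momentum spread grows in exact proportion; no choice of state gets the product below hbar/2.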