Eureka! 2012’s Biggest Moments in Science

An infographic lists a few of what its author considers to be “2012’s biggest moments in science.” These are:

1. Higgs boson discovered.

2. Curiosity lands on Mars.

3. Fetal genome sequencing.

4. Quantum teleportation distance record broken.

5. The discovery of an Earth-sized exoplanet “orbiting Alpha Centauri B, one of the stars in the stellar system nearest to our own.” (This is listed despite the fact that, as the infographic itself informs us, “Because the planet orbits much closer to its star than Earth, it likely does not host life.”)

6. Superstorm Sandy as a consequence of climate change.

Notice number 3 in particular. The text states,

Researchers in June announced the successful sequencing of a fetus’ genome using snippets of DNA in the mother’s blood. They indicated a test might be widely available in about five years, which brings up potentially monumental consequences. If such tests became as routine as sonograms, what would expectant parents do with such information — which diseases their child-to-be would be more prone to or knowledge about personality traits or physical appearance?

Despite the innocent-sounding language, this is bound to raise all kinds of moral and larger metaphysical questions. Thoughts?

11 Responses to Eureka! 2012’s Biggest Moments in Science

A little side story to the quantum teleportation item is that Zeilinger’s ultimate aim is to teleport particles to space instantaneously:

A team of physicists in Vienna has devised experiments that may answer one of the enduring riddles of science: Do we create the world just by looking at it? – 2008
Excerpt page 3: Leggett doesn’t believe quantum mechanics is correct, and there are few places for a person of such disbelief to now turn. But Leggett decided to find out what believing in quantum mechanics might require. He worked out what would happen if one took the idea of nonlocality in quantum mechanics seriously, by allowing for just about any possible outside influences on a detector set to register polarizations of light. Any unknown event might change what is measured. The only assumption Leggett made was that a natural form of realism hold true; photons should have measurable polarizations that exist before they are measured. With this he laboriously derived a new set of hidden variables theorems and inequalities as Bell once had. But whereas Bell’s work could not distinguish between realism and locality, Leggett’s did. The two could be tested.
When Aspelmeyer returned to Vienna, he grabbed the nearest theorist he could find, Tomasz Paterek, whom everyone calls “Tomek.” Tomek was at the IQOQI on fellowship from his native Poland and together, they enlisted Simon Gröblacher, Aspelmeyer’s student. With Leggett’s assistance, the three spent six months painfully checking his calculations. They even found a small error. Then they set about recasting the idea, with a few of the other resident theorists, into a form they could test. When they were done, they went to visit Anton Zeilinger. The experiment wouldn’t be too difficult, but understanding it would. It took them months to reach their tentative conclusion: If quantum mechanics described the data, then the lights’ polarizations didn’t exist before being measured. Realism in quantum mechanics would be untenable.…
Leggett’s theory was more powerful than Bell’s because it required that light’s polarization be measured not just like the second hand on a clock face, but over an entire sphere. In essence, there were an infinite number of clock faces on which the second hand could point. For the experimenters this meant that they had to account for an infinite number of possible measurement settings. So Zeilinger’s group rederived Leggett’s theory for a finite number of measurements. There were certain directions the polarization would more likely face in quantum mechanics. This test was more stringent. In mid-2007 Fedrizzi found that the new realism model was violated by 80 orders of magnitude; the group was even more assured that quantum mechanics was correct.
Leggett agrees with Zeilinger that realism is wrong in quantum mechanics, but when I asked him whether he now believes in the theory, he answered only “no” before demurring, “I’m in a small minority with that point of view and I wouldn’t stake my life on it.” For Leggett there are still enough loopholes to disbelieve. I asked him what could finally change his mind about quantum mechanics. Without hesitation, he said sending humans into space as detectors to test the theory. In space there is enough distance to exclude communication between the detectors (humans), and the lack of other particles should allow most entangled photons to reach the detectors unimpeded. Plus, each person can decide independently which photon polarizations to measure. If Leggett’s model were contradicted in space, he might believe. When I mentioned this to Prof. Zeilinger he said, “That will happen someday. There is no doubt in my mind. It is just a question of technology.” Alessandro Fedrizzi had already shown me a prototype of a realism experiment he is hoping to send up in a satellite. It’s a heavy, metallic slab the size of a dinner plate. http://seedmagazine.com/conten....._tests/P1/

further notes:

Quantum physics says goodbye to reality – Apr 20, 2007
Excerpt: Many realizations of the thought experiment have indeed verified the violation of Bell’s inequality. These have ruled out all hidden-variables theories based on joint assumptions of realism, meaning that reality exists when we are not observing it; and locality, meaning that separated events cannot influence one another instantaneously. But a violation of Bell’s inequality does not tell specifically which assumption – realism, locality or both – is discordant with quantum mechanics.
Markus Aspelmeyer, Anton Zeilinger and colleagues from the University of Vienna, however, have now shown that realism is more of a problem than locality in the quantum world. They devised an experiment that violates a different inequality proposed by physicist Anthony Leggett in 2003 that relies only on realism, and relaxes the reliance on locality. To do this, rather than taking measurements along just one plane of polarization, the Austrian team took measurements in additional, perpendicular planes to check for elliptical polarization.
They found that, just as in the realizations of Bell’s thought experiment, Leggett’s inequality is violated – thus stressing the quantum-mechanical assertion that reality does not exist when we’re not observing it. “Our study shows that ‘just’ giving up the concept of locality would not be enough to obtain a more complete description of quantum mechanics,” Aspelmeyer told Physics Web. “You would also have to give up certain intuitive features of realism.” http://physicsworld.com/cws/article/news/27640
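The kind of inequality violation described in the excerpts above can be illustrated numerically. The sketch below is not the Leggett test itself; it uses the simpler CHSH form of Bell’s inequality, with the standard quantum prediction E(a, b) = −cos(a − b) for a spin-singlet pair (for polarized photons the analogous correlation is cos 2(a − b)). Any local-realistic hidden-variable theory requires |S| ≤ 2; quantum mechanics predicts 2√2:

```python
import math

def E(a, b):
    # Quantum-mechanical correlation for a spin-singlet pair
    # measured along directions a and b (angles in radians).
    return -math.cos(a - b)

# Standard CHSH measurement settings for the two observers.
a, a2 = 0.0, math.pi / 2
b, b2 = math.pi / 4, 3 * math.pi / 4

# CHSH combination: local realism bounds |S| at 2.
S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)

print(abs(S))  # 2*sqrt(2) ~ 2.828, exceeding the local-realistic bound of 2
```

The point of the sketch is only that the quantum prediction sits strictly outside what any local hidden-variable account allows; the Vienna experiments confirmed the quantum value.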

“I’m going to talk about the Bell inequality, and more importantly a new inequality that you might not have heard of called the Leggett inequality, that was recently measured. It was actually formulated almost 30 years ago by Professor Leggett, who is a Nobel Prize winner, but it wasn’t tested until about a year and a half ago (in 2007), when an article appeared in Nature, that the measurement was made by this prominent quantum group in Vienna led by Anton Zeilinger, which they measured the Leggett inequality, which actually goes a step deeper than the Bell inequality and rules out any possible interpretation other than consciousness creates reality when the measurement is made.” – Bernard Haisch, Ph.D., Calphysics Institute, is an astrophysicist and author of over 130 scientific publications.

Preceding quote taken from this following video;

Quantum Mechanics and Consciousness – A New Measurement – Bernard Haisch, Ph.D (Shortened version of entire video with notes in description of video) http://vimeo.com/37517080

Violation of Leggett inequalities in orbital angular momentum subspaces – 2010
Main results. We extend the violation of Leggett inequalities to the orbital angular momentum (OAM) state space of photons, which is associated with their helical wavefronts. We define our measurements in a Bloch sphere for OAM and measure the Leggett parameter LN (where N is the number of settings for the signal photon) as we change the angle (see figure). We observe excellent agreement with quantum mechanical predictions (red line), and show a violation of five and six standard deviations for N = 3 and N = 4, respectively. http://iopscience.iop.org/1367-2630/12/12/123007

Looking Beyond Space and Time to Cope With Quantum Theory – (Oct. 28, 2012)
Excerpt: To derive their inequality, which sets up a measurement of entanglement between four particles, the researchers considered what behaviours are possible for four particles that are connected by influences that stay hidden and that travel at some arbitrary finite speed.
Mathematically (and mind-bogglingly), these constraints define an 80-dimensional object. The testable hidden influence inequality is the boundary of the shadow this 80-dimensional shape casts in 44 dimensions. The researchers showed that quantum predictions can lie outside this boundary, which means they are going against one of the assumptions. Outside the boundary, either the influences can’t stay hidden, or they must have infinite speed.…
The remaining option is to accept that (quantum) influences must be infinitely fast.…
“Our result gives weight to the idea that quantum correlations somehow arise from outside spacetime, in the sense that no story in space and time can describe them,” says Nicolas Gisin, Professor at the University of Geneva, Switzerland. http://www.sciencedaily.com/re.....142217.htm

As to #3, it has some potentially ominous implications, but I think we have to temper concerns with an important practicality:

– Most genotyping can only provide probabilities by comparing with existing populations. For example, say you have your genotype done and it indicates you are at greater risk for, say, Alzheimer’s. What the genotype does not tell you is that you will get Alzheimer’s. In most cases it doesn’t even give you something like a “likelihood” or a “probably.” Rather, it might say that you have “2x the average risk.” In this particular case, for example, the genotype would essentially be saying: “You have type X, and people with type X are known to have a 14% risk of developing the disease, whereas most people have only a 7% risk.” Either way, you are still unlikely to develop Alzheimer’s; it is just that in one case your risk is increased by a few percentage points.
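The arithmetic here is worth making explicit: a “2x the average risk” result doubles a small baseline, and the absolute chance of staying disease-free barely moves. A minimal sketch using the illustrative 7%/14% figures from the comment (example numbers, not clinical data):

```python
# Illustrative figures from the comment above -- not clinical data.
baseline_risk = 0.07   # average lifetime risk in the general population
relative_risk = 2.0    # "2x the average risk" from a genotype report

carrier_risk = baseline_risk * relative_risk
print(f"Carrier risk:         {carrier_risk:.0%}")      # 14%
print(f"Chance of NO disease: {1 - carrier_risk:.0%}")  # 86%

# Even with doubled relative risk, the carrier is still far more
# likely never to develop the disease than to develop it.
```

This is why a relative-risk figure on its own can sound far more alarming than the underlying absolute probabilities warrant.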

The reality is that in most cases there really aren’t “disease genes,” meaning genes that cause a particular disease. Rather, there are a hundred ways something can go wrong in the body and lead to a particular physical result, and some broken genes can increase the risk or, in some cases, definitively break the system.

So far it appears that most diseases involve many factors beyond a particular genetic sequence. As a result, I don’t see genetic sequencing as the definitive tool the media sometimes portrays it to be.

To be sure, however, there are a number of specific problems that could be identified early on (Down syndrome, for example). And in those cases we will need to exercise some measure of moral judgment or restraint.

Of note, believe it or not, a Darwinist, just yesterday, right here on UD, tried to deny that ‘Junk’ DNA was ever a serious prediction of Darwinian theory. He was apparently oblivious to the fact that the junk DNA ‘prediction’ has a long history, going back several decades, and arose directly from mathematical considerations in the modern synthesis of neo-Darwinism through population genetics (not to mention that it also plays into Darwinism’s theological core, i.e., God would not have made creatures with so much junk!):

Of note, believe it or not, a Darwinist, just yesterday, right here on UD, tried to deny that ‘Junk’ DNA was ever a serious prediction of Darwinian theory.

Oh, I believe it! 🙂

The current tactic is multi-pronged:

(i) deny that junk DNA was ever predicted or that it was ever used as evidence for evolution and against design (this is sometimes based on a stray quote or two from a couple of lone researchers years ago crying in the wilderness that we shouldn’t be so hasty to throw away huge portions of DNA as junk, in contrast to the vastly more numerous examples of evolutionists routinely citing junk DNA as confirmation of evolution and as a refutation of design);

(ii) claim that there really is still lots of junk and that (A) ENCODE and similar results don’t really matter, and (B) most of the stuff for which we don’t yet know a function is functionless (because, hey, we don’t know a function yet).

We’ve had whole threads in the past few months on this issue, with the more formidable evolutionary apologists taking these two stances.

OT: Sad news today for those who wish there were a quasi-infinite number of themselves in parallel universes:

You don’t exist in an infinite number of places, say scientists – January 25, 2013
Excerpt: But the scientists’ biggest criticism of the idea of infinite repetition in both proposals is the assumption that the universe is infinite. Whether the universe is infinite or finite is a big open-ended question in cosmology that scientists may never answer. Soler Gil and Alfonseca note that, looking back at the history of physics, situations emerged where infinities seemed impossible to avoid, yet improved theories eliminated the infinities. Currently the two basic theories in physics, general relativity and quantum theory, both predict infinities. In relativity, it’s gravity singularities in black holes and the big bang. In quantum theory, it’s vacuum energy and certain parts of quantum field theory. Perhaps both theories are simple approximations of a third more general theory without infinities. Soler Gil and Alfonseca also note that Paul Dirac once stated that the most important challenge in physics was “to get rid of infinity.”
While Soler Gil and Alfonseca can’t disprove the proposals of infinite repetition, they emphasize that the point of their critique is to show that the idea remains in the realm of philosophy, mythology, and sci-fi tales, not modern cosmology. They call the speculation “ironic science,” a term used by science journalist John Horgan to describe options that do not converge on truth but are at best “interesting.” http://phys.org/news/2013-01-d.....tists.html