Archive for October, 2011

Only a few decades back, there were serious scientists who thought that planets might be miraculous. Not miracles like a burning bush or a docile teenager, but highly improbable objects. These researchers figured that the conditions necessary for making small, cold worlds could be rare—perhaps extremely rare. Most stars were believed to live their luminous lives alone, bereft of planetary accompaniment.

Well, those thoughts have been banished. In the last 15 years, hard-working astronomers have found many hundreds of so-called exoplanets around nearby stars, and NASA’s Kepler telescope is set to uncover thousands more. (If you don’t know this already, you’ve probably reached this site by mistake. But you’ve come this far already, so keep reading.) Kepler’s principal task is to find habitable exoplanets—worlds with solid surfaces at the right distance from their host star to sport temperatures amenable to the presence of watery oceans and protective atmospheres—planets that might be very much like Earth (depending on some other factors that are harder to measure from light-years away, like geology and chemistry).

Kepler has already found about five dozen candidate objects that, while somewhat larger than our own planet, seem to meet these criteria. As this space-based telescope continues to peer into the heavens, more such planets will emerge from the data. Indeed, it seems a good bet that at least a few percent of all stars are blessed with “habitable” worlds. That would tally to billions of life-friendly sites, just in our galaxy. This has already prompted SETI scientists to swing their antennas in the directions of Kepler’s most promising candidate planets, hoping to pick up the ABCs and MTVs of alien worlds. After all, these systems are arguably the best targets that SETI (the Search for Extraterrestrial Intelligence) has ever had. It’s like discovering a prolific fishing hole.
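That “billions” figure follows from simple arithmetic. A quick sketch, with the caveat that the star count and the exact percentage below are illustrative round numbers, not figures from the article:

```python
# Rough estimate: how many habitable worlds would "a few percent" imply?
# Assumptions (illustrative, not from the article): the Milky Way holds
# roughly 300 billion stars, and 2 percent of them host a habitable planet.
stars_in_milky_way = 300e9
habitable_fraction = 0.02

habitable_worlds = stars_in_milky_way * habitable_fraction
print(f"{habitable_worlds:.0e}")  # on the order of billions of worlds
```

Even at the low end of “a few percent,” the tally lands comfortably in the billions.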

But there’s a fly in the ointment: while eavesdropping on a small bunch of star systems known to have terrestrial-style worlds is better than taking your chances with random targets, it’s not actually that much better. The reason is simple. The oldest confirmed fossils on Earth are about 3.5 billion years old, and there’s indirect, if sketchy, evidence for life going back 4 billion years. That’s roughly 90 percent of the age of the Earth, which is to say that biology bedecked our planet very early. Life seems to have been an easy chemistry experiment, which hints that many of those habitable worlds will actually be inhabited. There could be life on billions of planets in the Milky Way. And if life-friendly worlds are that common, the short list of Kepler candidates offers only a modest edge over pointing an antenna at almost any star.

Gholson Lyon is on a crusade. It started last November, when he found out that a woman in a research study he was conducting was pregnant. Lyon’s study had revealed that the woman carried a gene that causes a fatal disease. Yet, because of the rules governing the study, he couldn’t tell the mother-to-be that she might be carrying a sick child. The mother did give birth to a boy with the disease; he died in the same week that Lyon published his paper on the study, as I reported recently in Nature. Lyon was so disturbed by the situation that he is now trying to find a way for researchers to work within the rules so that they don’t face these same ethical dilemmas. And he is speaking and writing about the issue everywhere he can.

The issue of what to tell patients about their DNA is difficult enough for doctors who are treating patients rather than studying them. But it has become urgent for researchers as well, because genetic sequencing technologies are now cheap and fast enough that scientists are planning to sequence 5,000 patients’ genomes this year, and as many as 30,000 next year. The US National Human Genome Research Institute will soon begin a program that will spend tens of millions of dollars to sequence the genomes of patients, like Lyon’s study subjects, who have rare genetic diseases. And researchers are also sequencing thousands of otherwise healthy people across the lifespan, from newborns to old folks.

Inevitably, researchers will find stuff in these thousands of genomes. Most of it will be difficult to understand. Some of it will clearly be linked to disease. Some of it will be newly linked to disease through these studies; after all, the whole point of the studies is to link genes and disease. So it would seem like a good idea to tell the gracious volunteers who have donated their time and blood for these studies that they have certain genetic disease risks, right?

Two people are dancing a waltz, and it is not going well. One is tall and the other short; one is graceful, the other flat-footed; and both are stepping to completely different rhythms. The result is chaos, and the dance falls apart. Their situation mirrors a problem faced by all complex life on Earth. Whether we’re animal or plant, fungus or alga, we all need two very different partners to dance in step with one another. A mismatch can be disastrous.

Virtually all organisms with complex cells—better known as eukaryotes—have at least two separate genomes. The main one sits in the central nucleus. There’s also a smaller one in tiny bean-shaped structures called mitochondria, little batteries that provide the cell with energy. Both sets of genes must work together. Neither functions properly without the other.

Mitochondria came from a free-living bacterium that was engulfed by a larger cell a few billion years ago. The two eventually became one. Their fateful partnership revolutionised life on this planet, giving it a surge of power that allowed it to become complex and big (see here for the full story). But the alliance between mitochondria and their host cells is a delicate one.

The two genomes evolve in very different ways. Mitochondrial genes are passed down only from mother to child, whereas the nuclear genome is a fusion of both mum’s and dad’s genes. Mitochondrial genes also evolve much faster than nuclear ones—around 10 to 30 times faster in animals and up to a hundred thousand times faster in some fungi. These dance partners are naturally drawn to different rhythms.

This is a big and underappreciated problem because the nuclear and mitochondrial genomes cannot afford to clash. In a new paper, Nick Lane, a biochemist at University College London, argues that some of the most fundamental aspects of eukaryotic life are driven by the need to keep these two genomes dancing in time. The pressure to maintain this “mitonuclear match” influences why species stay separate, why we typically have two sexes, how many offspring we produce, and how we age.

Today it is fashionable to contend that ethnic identity is a social construction. That fashion obviously has some genuine basis in reality. Univision host Jorge Ramos, a blue-eyed Mexican American, is considered a “person of color.” If his name were “George Romans,” he would be coded as a white American simply on account of his physical appearance. This is due to the social construction of a Hispanic American identity, which has roots in decisions about ethnic classification made by the United States government in the 1960s. But this model of social construction allowing for plasticity is not universal. As outlined in The Cleanest Race, the North Korean national identity is strongly essentialist, to the point where even genetically close populations such as the Japanese could never be part of the nation. Similarly, in Japan itself the native-born ethnic Koreans are still viewed as fundamentally guests in the Japanese nation. Both cases illustrate how social construction can impede rather than enable fluidity. Yet social construction as a total explanatory model has limits. Canada has the term “visible minority” to denote those populations which are distinct in origin from Anglophone and Francophone whites by virtue of their appearance. This is in contrast to groups like Ukrainian Canadians, who are minorities due to their chosen cultural distinctiveness.

When it comes to ethnic difference and conflict, we can ascribe the divisions to both social and biological distinctions to varying degrees. In the mid-1990s there was a genocide in Rwanda. That genocide had an ethnic dimension, with conflict between the Tutsis and the Hutus being one cause. The Hutu regime which implemented the genocide against the Tutsis co-opted theories of biological difference and foreign origin pioneered by European scholars in the 19th century. Whereas these distinctions once justified Tutsi domination of the Hutu, they now served to mark off the Tutsi as an alien infestation. After the takeover of Rwanda by a Tutsi-dominated rebel movement in the wake of the genocide, there was an attempt to elide these deadly distinctions. The rationale is clear: remove the ostensible basis for genocide, and you remove the risk of genocide. The argument that the Tutsi-Hutu distinction is a purely socially constructed European invention has now crept into mainstream discourse, such as in the film Hotel Rwanda.

Fermilab’s Tevatron, the largest particle accelerator in the United States, was shut down on September 30 after a celebrated 28-year career that provided us with some of the greatest discoveries in particle physics. This leaves the European lab CERN to lead the way into future discoveries with its Large Hadron Collider. This landmark in experimental physics is an opportunity to reexamine the theoretical model physicists have constructed and relied on in their search to understand the workings of the universe: the standard model of particle physics. The standard model is a comprehensive theory of nature’s elementary particles and the forces that control their behavior, constructed over a half-century of intensive work by many theoretical physicists as well as experimentalists. The model has worked amazingly well, harmoniously combining theory and experiment and producing extremely accurate predictions about the behavior of particles and forces. But could the model now be beginning to show some cracks?

It all started on a wintry evening in 1928. While staring at the flames in the fireplace at St. John’s College, Cambridge, Paul Dirac made one of the most important discoveries in the history of science when he saw how to combine the Schrödinger equation of quantum mechanics with Einstein’s special (but not general) theory of relativity. This achievement launched relativistic quantum field theory—which forms the theoretical basis for the standard model—and produced two immediate consequences: an explanation of the spin of the electron, and Dirac’s stunning prediction of the existence of antimatter (confirmed a few years later with the discovery of the positron).

In the late 1940s, Richard Feynman, Julian Schwinger, and Sin-Itiro Tomonaga, all working independently, completed the first successful quantum field theory, quantum electrodynamics, which explains the electromagnetic interactions of electrons and photons. It forms the part of the standard model that handles interactions governed by the electromagnetic field. The theory’s success inspired other theoretical physicists to construct similar quantum field theories for the weak and strong nuclear forces—together accounting for everything in particle physics except gravity, the subject of Einstein’s general theory of relativity. By the 1970s the result, the standard model, was ready: a quantum field theory of all elementary particles—the leptons and quarks—and their interactions, mediated by particles called bosons (such as the photon).

The Future has stalled. Sure, some snazzy new gadgets came out this year, but all the Next Big Things are still just over the horizon. Neal Stephenson and Peter Thiel both have depressing articles trying to pin down the culprit for our technological stagnation. They both take some shots at government, at education, and at how technological progress has become self-defeating. One passage from Stephenson crystallized the argument:

Innovation can’t happen without accepting the risk that it might fail. The vast and radical innovations of the mid-20th century took place in a world that, in retrospect, looks insanely dangerous and unstable. Possible outcomes that the modern mind identifies as serious risks might not have been taken seriously—supposing they were noticed at all—by people habituated to the Depression, the World Wars, and the Cold War, in times when seat belts, antibiotics, and many vaccines did not exist. Competition between the Western democracies and the communist powers obliged the former to push their scientists and engineers to the limits of what they could imagine and supplied a sort of safety net in the event that their initial efforts did not pay off. A grizzled NASA veteran once told me that the Apollo moon landings were communism’s greatest achievement.

We’ve stopped innovating in big, world-changing ways. Clean energy, health care, and space travel are all roughly where they were in the ’70s. So what’s missing? Why has not just the West, but seemingly the entire world, suddenly slowed to a crawl in terms of technological progress?

Oddly enough, the beginnings of an answer to our innovation woes can be found in another cultural throwback: the massive protests that are Occupy Wall Street.

Once more we are going through the annual ritual of the Nobel Prize announcements. The early morning phone calls, the expressions of shock, the gnashing of teeth in the betting pools. In the midst of the hoopla, I got an annoyed email on Tuesday from an acquaintance of mine, an immunology grad student named Kevin Bonham. Bonham thought there was something wrong with this year’s Prize for Medicine or Physiology. It should have gone to someone else.

Kevin lays out the story in a new post on his blog, We Beasties. The prize, he writes, “was given to a scientist that many feel is undeserving of the honor, while at the same time sullying the legacy of my scientific great-grandfather.” Read the rest of the post to see why he feels this way.

Kevin emailed me while he was writing up the blog post. He wondered if I would be interested in writing about this controversy myself, to give it more prominence. I passed. Even if I weren’t trying to carry several deadlines on my head at once, I would still pass. As I explained to Kevin, I tend to steer clear of Nobel controversies, because I think the prize is, by definition, a lousy way to recognize important science. All the rules about having to be alive to win it, about how there can be no more than three winners—along with the lack of prizes for huge swaths of important scientific disciplines—make these kinds of disputes both inevitable and tedious.

In 1917, a year after his general theory of relativity was published, Einstein tried to extend his field equation of gravitation to the universe as a whole. The universe as known at the time was simply our galaxy—the neighboring Andromeda, visible to the naked eye from very dark locations, was thought to be a nebula within our own Milky Way home. Einstein’s equation told him that the universe should be expanding, but astronomers assured him otherwise (even today, no expansion is evident within the 2-million-light-year range to Andromeda; in fact, that galaxy is moving toward us). So Einstein inserted into his equation a constant now known as “lambda,” for the Greek letter that denotes it. Lambda, also called “the cosmological constant,” supplied a kind of force to keep the universe from expanding and hold it stable. Then in 1929, Hubble, Humason, and Slipher, using the 100-inch telescope at Mount Wilson in California, made their monumental discovery that very distant galaxies are receding from us—implying that the universe was indeed expanding, just as Einstein’s original equation had indicated. When Einstein visited California some time later, Hubble showed him his findings, and Einstein famously exclaimed, “Then away with the cosmological constant!” and never mentioned it again, considering lambda his greatest “blunder”—it had, after all, prevented him from theoretically predicting the expansion of the universe.
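Einstein’s modification can be written compactly. In modern notation (a standard textbook form, not taken from the article), the field equation with the cosmological constant reads:

```latex
G_{\mu\nu} + \Lambda g_{\mu\nu} = \frac{8\pi G}{c^4} T_{\mu\nu}
```

With \(\Lambda = 0\), the equation demands a universe that expands or contracts; a suitably chosen positive \(\Lambda\) counteracts gravitational attraction and permits the static universe Einstein thought astronomy required.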

Fast forward six decades to the 1990s. Saul Perlmutter, a young astrophysicist at the Lawrence Berkeley Laboratory in California, had a brilliant idea. He knew that Hubble’s results were derived using the Doppler shift in light. Light from a galaxy that is receding from us is shifted toward the red end of the visible spectrum, while light from a galaxy that is approaching us is shifted toward the blue end. The degree of the shift is measured by a quantity astronomers call Z, which is then used to determine a galaxy’s speed of recession from us (when Z is positive, the shift is to the red).
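The Z-to-velocity conversion can be sketched in a few lines. This is a minimal illustration of the standard definition (Z as the fractional wavelength shift, with the low-Z approximation v ≈ cZ); the spectral-line numbers are made up for the example, not drawn from the article:

```python
# Sketch of the redshift measurement described above.
# Z = (observed wavelength - emitted wavelength) / emitted wavelength,
# and for small Z the recession velocity is approximately v = c * Z.
C = 299_792.458  # speed of light, km/s

def redshift(observed_nm, emitted_nm):
    """Fractional wavelength shift Z; positive means shifted to the red."""
    return (observed_nm - emitted_nm) / emitted_nm

def recession_velocity(z):
    """Low-Z (non-relativistic) approximation: v ~ c * Z, in km/s."""
    return C * z

# Illustrative numbers: a hydrogen-alpha line emitted at 656.3 nm,
# observed at 660.0 nm in a galaxy's spectrum.
z = redshift(observed_nm=660.0, emitted_nm=656.3)
v = recession_velocity(z)
print(f"Z = {z:.4f}, v = {v:.0f} km/s")
```

For redshifts approaching 1 and beyond, the simple v = cZ approximation breaks down and the full relativistic formula is needed.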

“My name is Legion, for we are many”: according to the Gospel of Mark, that’s how a man possessed by demons answered when Jesus asked his name. Demonic possession is supernatural claptrap, but science has nevertheless revealed no end of proofs that we are not alone in our own skins. Bacteria swarm through our mouths and intestines; fungi and yeasts take hold where they can in our moist, warm places; parasitic animals and microbes homestead in our blood and tissues.

The sources of those chemical signals don’t have to stop at our skins, however, and they needn’t have anything to do with symbiosis, as an intriguing paper from September underscored. In Cell Research, scientists at Nanjing University in China reported that some of the regulatory molecules called microRNAs found in foods can survive digestion and change gene expression in the creatures that eat them—including humans.

By now you’ve probably heard the widely reported news about the possible discovery of neutrinos that allegedly travel faster than light. The OPERA (Oscillation Project with Emulsion tRacking Apparatus) collaboration of almost 200 scientists working at the Gran Sasso underground laboratory in central Italy has discovered a phenomenon the physicists simply could not explain. For over three years, the scientists have been collecting data on the flight of neutrinos—those mysterious, nearly massless particles that can travel through anything at immense speed—originating in the SPS accelerator at CERN, near Geneva, and traveling underground all the way to Gran Sasso, 731 kilometers (about 450 miles) away. The experiment showed that the 16,000 neutrinos measured at Gran Sasso had traveled there through Earth’s crust at faster-than-light speed.
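The size of the claimed effect is easy to work out from the geometry. Using the 731-kilometer baseline from the article, and taking roughly 60 nanoseconds as the early-arrival time (that figure comes from OPERA’s announcement, not from this excerpt, so treat it as illustrative):

```python
# How a ~60 ns early arrival over 731 km translates into a speed excess.
# The 60 ns value is OPERA's widely reported figure, recalled here from
# the announcement rather than the text above; illustrative only.
C = 299_792_458.0      # speed of light, m/s
baseline = 731_000.0   # CERN -> Gran Sasso distance, m (from the article)
early_arrival = 60e-9  # neutrinos arrive ~60 ns before light would, s

t_light = baseline / C                    # light travel time: ~2.44 ms
t_neutrino = t_light - early_arrival      # measured neutrino flight time
excess = baseline / t_neutrino / C - 1.0  # fractional speed above c
print(f"(v - c)/c ~ {excess:.1e}")        # a few parts in 100,000
```

A few parts in 100,000 sounds tiny, yet it is enormous compared with the nanosecond-level timing precision the collaboration claims, which is why the result drew such scrutiny.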

Facing a crowded lecture hall at CERN last Friday, Dario Autiero of the OPERA group explained how the researchers went to great lengths to remove any sources of error in their measurements: they measured distances using an extremely high-precision GPS called PolarX, measured time at the two locations to an accuracy of one nanosecond using cesium clocks, and accounted for the tides, Earth’s rotation, variations between day and night and spring and fall, etc. The statistical significance of the finding was six-sigma—meaning that the probability that the experimental result was a random fluke was only one in a billion. For a full hour after the presentation, Dr. Autiero was grilled by a roomful of physicists, and seemed to be able to account for all of the many potential errors brought up by the audience.

But physicists remain very skeptical. They want to see a confirmation of the findings from another experiment in a separate laboratory before they accept such a bizarre finding. After all, this result, if true, would appear to run against the spirit of Einstein’s special theory of relativity. When I showed the Gran Sasso paper to Nobel Laureate Steven Weinberg, he told me: “It looks pretty impressive, but I still think that this will go away.” The sentiment was echoed by almost every physicist I have spoken with since. The results seem mind-boggling. After all, nothing can go faster than light, right?