A panoramic shot of the Advanced Cold Molecule Electron EDM, a device in the laboratory of Silsbee professor of physics John Doyle that is designed to make measurements of the quantum physical behavior of electrons so precise that the results could change understanding of the Standard Model of particle physics. Photograph courtesy of John Doyle/Harvard Research Center for Quantum Optics


Science’s “Third Branch”

Why are doctors from Boston’s Brigham and Women’s Hospital working with Harvard astrophysicists? And why is Professor Jeff Lichtman of Harvard’s Center for Brain Science working with the associate director of Mitsubishi Electric’s industrial research lab? University provost Steven E. Hyman provided the answer on March 21 at the inaugural symposium of Harvard’s new Initiative in Innovative Computing (IIC). “Scientific computing is at the heart of progress for the coming generation,” Hyman said. Since the summer of 2006, the IIC has facilitated numerous collaborations between experts in imaging and software engineering and scientists working in traditional fields in order to accelerate the pace of research. From star surveys to neural wiring, from pedagogy to cardiovascular medicine, the IIC has already begun to demonstrate the promise of computer-enabled scientific advances.

At least as important are the people whom the initiative is pairing with Harvard researchers to solve difficult computing problems in science. Thus Lichtman, who is trying to understand how the human brain is wired, is working with visiting scholar Hanspeter Pfister of Mitsubishi Electric, the chief architect of the company’s real-time volume-rendering hardware, VolumePro. Understanding how the brain is wired requires being able to render it in three-dimensional space, explains Lichtman; using fluorescent proteins of different colors, he and Pfister have been able to trace individual neural connections or “wires,” creating a multicolored, two-dimensional wiring diagram Lichtman lightly calls a “brainbow.” But the aims of the research are serious. “Many neurological diseases are diseases of wiring,” he explains: understanding the anatomy will help in the search for what goes wrong in a diseased brain.

Neuroscientists capture anatomical data by mounting a specimen brain on a rotating spindle. They then peel off ultra-thin layers with a sharp, diamond-edged blade in much the same way that a machine peels an apple. But translating that to three dimensions on a computer will be a tremendous challenge. To begin with, the quantity of information to be captured is massive. The human “connectome,” or wiring diagram of the brain, says Lichtman, is a data set “larger than all the written material in all the libraries of the world”—at least one million petabytes. (One petabyte, or a million gigabytes, is the equivalent of all the written information held in all U.S. academic research libraries.)
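The scale Lichtman describes can be sanity-checked with back-of-envelope arithmetic using the article's own round figures (these are the quoted estimates, not measurements):

```python
# Back-of-envelope check of the connectome data-set size,
# using the round figures quoted in the article.
GIGABYTE = 10**9                    # bytes
PETABYTE = 10**6 * GIGABYTE         # "a million gigabytes"
connectome = 10**6 * PETABYTE       # "at least one million petabytes"
print(f"{connectome:.0e} bytes")    # 1e+21 bytes -- a full zettabyte
```

At roughly 10²¹ bytes, even storing the raw data, let alone rendering it, sits well beyond a single laboratory's capacity.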

Image courtesy of the Initiative in Innovative Computing

Harvard astronomers have adapted medical imaging software to the study of interstellar regions of gas and dust, as shown here. These are places where stars are born.

Rendering that data, in turn, so that scientists can follow a neural-trunk connection down to a synapse in a “Google Earth”-like application is an equally daunting task. (Google Earth, which provides aerial views of the entire surface of the globe, requires 10¹⁹ pixels to render the earth’s surface at a one-foot resolution. The human brain “connectome” would require 10²² voxels—a volume element analogous to a pixel—in order to provide a relatively equivalent level of detail!) Finally, neural connections change over time. That, says Pfister, means that scientists will have to visualize this data set “in four dimensions, the fourth dimension being time.”
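The two exponents quoted above make the comparison concrete (a quick sketch using only the article's numbers):

```python
# How much bigger is the brain-rendering problem than Google Earth's?
# Both exponents come from the article's comparison.
google_earth_pixels = 10**19   # earth's surface at one-foot resolution
connectome_voxels = 10**22     # comparable anatomical detail, in 3-D
ratio = connectome_voxels // google_earth_pixels
print(ratio)                   # 1000 -- three orders of magnitude larger
```

And that factor of a thousand covers only one snapshot; the fourth dimension, time, multiplies the problem again.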

Astrophysicist Alyssa Goodman, professor of astronomy and director of the IIC, faces similar challenges in her research on understanding how interstellar gas arranges itself into new stars. Goodman’s group has been collaborating with scientists at Harvard Medical School (HMS) and the affiliated teaching hospitals to adapt existing three-dimensional medical-imaging software—the kind physicians use to look for tumors—to look for jets, outflows, and “cavities” in interstellar gas. Dubbed “Astromed,” the project has already identified several previously unknown large bubbles and shells of gas, as well as more than a dozen new jets of material shooting from newborn stars.

Goodman says that the astronomers’ innovations in extending and improving the medical-imaging software will eventually benefit physicians as well: “The algorithms that we develop…might be used to identify” medically important tumors, she says. Eventually, the revised software is likely to prove useful in other areas, such as weather modeling and geophysics.

Image courtesy of the Initiative in Innovative Computing

IIC promotional artwork illustrates the premise that advanced computation will accelerate the pace of scientific research.

Professor of astronomy and of physics Christopher Stubbs studies the accelerating expansion of the universe. He described the problem faced by astronomers collecting data from telescopes around the world: the volume of data is expanding at an astronomical rate, too. (The telescope in the Pan-STARRS project, an optical all-sky survey, for example, captures 1.4 billion pixels per frame.) Sometimes, the data stream in faster than they can be easily stored. One telescope that generates 20 terabytes (one terabyte is a thousand gigabytes) a night makes its data accessible to scientists worldwide via Federal Express, because that’s the fastest way to move such large stores of information—the Internet can’t handle it. But an IIC-sponsored collaboration is now tackling new ways to analyze stored and streaming data.
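Why shipping disks beats the network is simple arithmetic. The sketch below assumes an illustrative sustained link of 100 megabits per second; that link speed is my assumption for the example, not a figure from the article:

```python
# Time to move one night's worth of telescope data over a network link.
# The 100 Mbit/s sustained-link speed is an illustrative assumption.
TERABYTE = 10**12                      # bytes
nightly_data = 20 * TERABYTE           # "20 terabytes a night"
link_bits_per_s = 100 * 10**6          # assumed 100 Mbit/s sustained
seconds = nightly_data * 8 / link_bits_per_s
print(f"{seconds / 3600:.0f} hours")   # ~444 hours, i.e. over 18 days
```

At that rate a single night's observations would take more than two weeks to transmit, while each new night adds another 20 terabytes to the backlog; an overnight courier with a crate of disks wins easily.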

Robert A. Lue, director of undergraduate life-sciences education, demonstrated the power of computer-generated animation as an aid to teaching. His Biovisions project, supported by the Howard Hughes Medical Institute, depicts complex biological processes in action and will release a teaching tool this spring, The Inner Life of the Cell. Lue’s testing has shown that such animations of otherwise abstract concepts enhance student understanding beyond what a textbook alone can communicate.

These are just a few of the collaborative, computation-intensive projects taking place under the umbrella of the IIC, which employs about 20 people at its 60 Oxford Street headquarters. By the end of this year, that number is expected to grow to 30, with an eventual goal of 80 to 100 employees. “Computation has become a ‘third branch’ of science, alongside theory and experiment,” asserts the IIC’s promotional literature. Tim Clark, an instructor in neurology at HMS and the director of IIC’s research programs, goes even further: “Advanced computation,” he predicts, “will enable a second scientific revolution.”