Barack Obama’s first press conference after the 2008 election focused largely on a crucial appointment: the type of dog the president would deliver, as promised, to his two daughters. Obama said that his choice for First Puppy would have to take into account the allergies of his older daughter, Malia. But are some breeds really more hypoallergenic than others?

Breeders say short-haired dogs, such as schnauzers, and dogs that shed less, such as poodles, are safer for sensitive pet owners than long-haired breeds. However, no credible studies support this claim. The idea of a “hypoallergenic breed” of dog is wishful thinking at best.

People have allergic reactions to dogs because they have developed antibodies to specific molecules, called allergens, that the animal produces. When the allergen is inhaled or rubbed on the skin, it binds to antibodies located on the surface of mast cells, a type of immune cell distributed throughout the body. The mast cell reacts by releasing its store of histamine compounds, which provoke itchiness, sneezing, and a host of irritating symptoms.

The most common canine allergens are found primarily in a dog’s saliva and skin. Studies have consistently found that hair length does not influence the amount of allergen detected. Instead, scientists have discovered that allergens are most concentrated in dander, or dead skin cells. These flakes of skin slough off and, because they are tiny, can hang in the air for up to two weeks. Dander, moreover, is common to all breeds: dogs with short hair, or no hair at all, still shed flakes of skin. Instead of debating between goldendoodles and Portuguese water dogs, the president would be best advised simply to invest in a moisturizing shampoo for the First Pup. Vets agree this is the only known way to reduce dander.

A week before his inauguration, Obama joked with reporter George Stephanopoulos about the puppy-vetting process. “This has been tougher than finding a commerce secretary,” he said. No word yet on whether confirmation hearings for Senator Judd Gregg, the designate for that position, included the dandruff question on the agenda.

Play the pipes slowly for Atlantis, Endeavour, and Discovery. In 2010, NASA will retire its old workhorses for good.

The shuttering of the shuttle program leaves American space exploration in an awkward situation. The next generation of spacecraft, dubbed the Constellation Project, will not be operable until at least 2015. For the intervening five years, the U.S. space program must rely on Russia to ferry its astronauts to the International Space Station.

The International Space Station (image: NASA.gov)

NASA has used Soyuz capsules to transport astronauts and parts since the ISS first sailed into orbit in 1998. Despite the ease of scientific cooperation, the deal has always been a political tightrope walk. The Iran-North Korea-Syria Non-Proliferation Act, a 2000 resolution, would have prohibited NASA from dealing with Russia, which often sells military technology to Iran. A little bit of legislative acrobatics resulted in a waiver that allowed NASA to buy seats on Russian spacecraft.

A renewal of the INKSNA waiver, scheduled for this year, seemed to be a given. Without access to Soyuz capsules, the U.S. would have no means of carrying its own crew, or rescuing them from the space station if something should go awry. Should the waiver fail to be renewed in time, NASA would most likely withdraw its presence from the ISS.

Then came the August 2008 invasion of the former Soviet republic of Georgia. Furious hawks in the U.S. government, including Republican presidential candidate John McCain, called for sanctions and penalties against Russia. The move to renew the INKSNA waiver stalled.

Ironically, it was Democratic Senator Barack Obama, considered unenthusiastic about funding human space exploration, who stepped in. He wrote to his colleagues, urging them to make preserving U.S. access to the International Space Station a priority. On September 24, the waiver renewal passed easily, garnering Obama a letter of thanks from NASA administrator Michael Griffin, who concluded by saying that “without your leadership, this would not have happened.” Griffin, a Republican appointee, would usually have little cause to praise a Democratic candidate for president. But science is often a sphere in which cooperation trumps political grudges.

It began on July 17th, 1975. On that date, 200 kilometers above the city of Metz, France, an American astronaut and a Russian cosmonaut shook hands.

Leonov and Stafford bridge the divide. (Image source: nasa.gov)

Thomas P. Stafford and Alexei Leonov were the commanders of the American and Russian spacecraft in the Apollo-Soyuz Test Project, the first joint spaceflight undertaken by the two countries. During the nine-day mission, the two modules performed several scientific tasks, including experiments in photographing solar eclipses. But Apollo-Soyuz had another goal in mind: détente. The meeting between American and Soviet astronauts signaled that Cold War tensions were thawing.

Space hadn’t always been this friendly.

A decade earlier, President Kennedy had remarked that America’s competition with the Soviet Union for supremacy in space was a challenge “that we are willing to accept, one we are unwilling to postpone, and one which we intend to win.”

“It was like boys in a locker room bragging,” said Igor Lukes, a professor of International Relations at Boston University. “Just another substitute for international competition: to see who could put more and more people up there.”

The launch of the Soviet satellite Sputnik in 1957 sparked an effort by the two superpowers to outdo each other in displaying technological supremacy, culminating in the American moon landing of 1969. While both powers staked claims of “winning” the space race, by the mid-seventies the zeal for competition had cooled, on earth as well as above it.

After Apollo-Soyuz, other collaborations followed. The Shuttle-Mir missions, conducted from 1994 to 1998, flew American astronauts to the Mir space station to perform experiments and engage in cultural exchange. NASA also used Shuttle-Mir to gather data on the physiological and psychological effects of long-term spaceflight. During this period, both agencies talked of undertaking another, more ambitious joint venture: the construction of an international space station.

On November 20, 1998, the first element of the station, the Zarya Functional Cargo Block, blasted into space on the back of a Russian Proton rocket. The latest addition is a Japanese experimental laboratory nicknamed Kibo, attached in May of 2008. The ISS is the size of a football field and can be seen with the naked eye from Earth. Fifteen countries from across the world have a stake in seeing it stay afloat.

There is still potential for politics to disrupt the peace. Alexandros Petersen, an analyst at the Center for Strategic Intelligence Studies, warns against relying too much on the volatile superpower that Russia has become.

“Even though there is little that Moscow would gain from shifting away from space cooperation, it would serve the function of showing the Russian people that Moscow can deny Washington’s access to space.”

He advocates exploring other options, including collaborating with European or Japanese space agencies to quickly develop alternative means of transporting astronauts to the ISS, to ensure the space station’s future does not hinge upon the US maintaining a friendly relationship with Russia.

Petersen’s words of caution will not go unheard. President Bush signed the NASA Reauthorization Act on October 16, which stipulates that the American space agency determine “the impact of a Space Shuttle flight program extension on the United States’ dependence on Russia for International Space Station crew rescue services”. The U.S. government, while maintaining relatively cordial relations with Russia, still hedges its bets.

Still, the two countries find common ground in space. Despite the occasional bump in the road, it seems that the heavenly honeymoon will continue, still strong after ten years and two billion kilometers traveled.

Growing coffee in the shade may be the smartest way for small farmers to cope with global warming, according to research published in this month’s issue of BioScience. University of Michigan scientists concluded that the industrial method of coffee cultivation, where swaths of forest are cut down to create large plantations, leaves the crop vulnerable to harmful fluctuations in temperature as a result of climate change.

Brenda Lin, the leader of the Michigan team, planted coffee in shaded and sun-exposed plots at three farms in the Soconusco region of the Mexican state of Chiapas. Her data showed that sites with low shade experience higher temperatures and lower humidity throughout an average day than sites with high shade coverage, and that the higher-shade sites had more consistent conditions overall.

Consistency, for coffee, is the highest of virtues. The plants, said Lin, are “very sensitive to changes within the climate, especially in South America, where there’s no irrigation; the farms are rain-fed.” Quality beans are produced within a very narrow range of conditions: above a temperature threshold of 23 degrees Celsius, the rate of photosynthesis declines and the fruit ripens more quickly, leading to inferior coffee. Lower temperatures produce stunted plants. Where there is the potential for large temperature and rainfall fluctuations, there is also the potential for massive crop failure.

Lin’s paper concludes that a return to the traditional practice of coffee farming, where coffee is grown underneath a canopy of shade trees, is the best way to mitigate the looming effects of global warming. The shade-coffee movement is not a radically new idea, but Lin’s justification is. Prior to her research, the literature on coffee growing focused on the effects of the industrial plantation system and its attendant clear-cutting on the birds of Central and South America.

“They noticed that migration patterns were changing… because the forest habitat was changing, due to the loss of the shade trees,” said Lin. These studies noted the negative effect that coffee plantations had on biodiversity in the surrounding area. Environmental groups began to carry the banner for replanting of shade trees, and a niche market was born soon afterwards. The Audubon Society enthusiastically promotes its own brand of shade-grown coffee through its website, touting it as “bird-friendly”.

Some farmers, more concerned with the bottom line, were not easily swayed by pleas for more stable bird migration patterns. But Lin knew that the potential conclusions of her research could make the case that an environmentally-conscious plantation was a profitable one as well.

“I chose to look at climate change because farmers in that region had noticed the change in climate, the longer dry seasons and the El Niño years, seriously affecting their ability to produce more coffee.” By highlighting the link between global warming and crop yields, Lin felt it would be easier to convince farmers to adopt the shade-growing method “even if they didn’t care about biodiversity.”

The farmers originally practiced deforestation because it removed competition for natural resources. “The idea was that less shade would equal greater yield, because you had more sun going to the plants,” says Lin. Coupled with fears that fungal diseases, such as coffee leaf rust, would breed in the untamed lowlight, “sun coffee” caught on in the late 1970s, but really gained traction in the 1990s, when government programs in Central and South America incentivized its planting. Genetic engineering has produced strains of coffee that are more tolerant of direct sunlight and that produce, on average, more beans than their unaltered cousins, with a little help from manufactured fertilizers. The big plantation system can produce far more coffee this way, but the smaller farmer is increasingly unable to imitate the sun-coffee method.

“The rural growers on the hillsides can’t afford all the chemicals that come with implementation of intensified sun coffee, they can’t afford the fertilizers and pesticides.” Shade-growing is easier to implement, says Lin, and better suited to local farms. There is less coffee produced, but it brings a higher price. And, according to Lin’s study on shade and climate change, it may just be the soundest strategy in the long term.

Coffee, which grows as a small shrub, thrives naturally in the shade of a native canopy. Shade-growing easily translates to similar plants, like tea and cacao. But what about other crops like corn and wheat? Lin is not advocating shade trees as a one-size-fits-all solution; she stresses the need for investigating a variety of sustainable agricultural practices.

One potential approach that Lin speculated upon was a return to the traditional method of crop rotation, which replenishes nutrients in the soil without the use of fertilizers, as well as “mixed plantations,” where a variety of food crops are grown in the same space. A farmer is better able to hedge his bets by growing corn, squash, and oranges together. In mixed plantations, Lin says, “your food security isn’t necessarily tied to one crop.”

“Listen,” she says, “climate change is occurring, and there is nothing at this point that we can do to stop it. It’s imperative to start thinking about adaptation and coping strategies.” The best advice for coffee growers, it seems, is to stay out of the sun.

Crosspost: I have a new article up on Science Metropolis, a wonderful site run by fellow BU Science Journalism student Joseph Caputo. It’s about pigeons (my article, not the site). Click here to toss a few breadcrumb-sized hits our way.

A tree is growing at Cornell University. Unlike the English elms, Japanese maples, and swamp oaks lining the Ithaca campus, this specimen is about seven centimeters tall and made of plastic. It is a synthetic tree, the brainchild of two biomolecular engineers, and the first man-made system able to mimic the powerful pumping capability of plants. It could also be the seed of an engineering revolution.

Most plants transport water through internal channels called xylem. This system operates on the principle of negative pressure. Water evaporating from the surface of a leaf creates a tensile force akin to pulling on a rope. This force draws water from the ground, through the roots, and up the entire plant. Tobias Wheeler, the leader of the project, explained that the mechanism of water transport in plants was deduced in the 1890s. Efforts to produce a replica of this model, however, have progressed little since then.
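The tensions involved are easy to estimate from basic hydrostatics: pulling water up to a height h against gravity requires a pressure difference of at least ρgh. A minimal sketch (the 100-meter tree height is an illustrative figure, not from the study):

```python
# Hydrostatic pressure needed to pull water up a tall tree: dP = rho * g * h
RHO_WATER = 1000.0   # density of water, kg/m^3
G = 9.81             # gravitational acceleration, m/s^2
ATM = 101325.0       # pascals per standard atmosphere

def tension_atm(height_m: float) -> float:
    """Pressure difference (in atmospheres) needed to lift water to height_m."""
    return RHO_WATER * G * height_m / ATM

# A ~100 m redwood must sustain roughly 10 atmospheres of tension,
# before even counting friction losses in the xylem.
print(f"{tension_atm(100):.1f} atm")  # → 9.7 atm
```

This is why the 10 atmospheres achieved by the synthetic tree, described below, is the interesting threshold: it is the scale nature itself operates at.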

The first and greatest hurdle to overcome was the selection of the material used to construct the tree. Wheeler and his mentor, Abraham Stroock, knew they needed something that would form tiny pores to hold water within the artificial xylem. They eventually hit upon the idea of using a hydrogel, the material used in soft contact lenses. In hydrogels, “the polymer network and the water are… mixed, such that the pores are effectively molecular in scale,” said Stroock.

The finished product is a relatively simple apparatus: two circular regions on the surface of the hydrogel, etched with 80 parallel channels and connected by a central “trunk” groove. Water moved through the synthetic tree under tensions of up to 10 atmospheres. The previous record in any liquid pumping system was 0.7 atmospheres.

Wheeler said that scaling up their tiny tree to the size needed for civil engineering projects was “not a trivial matter.” Nevertheless, he and Stroock enthusiastically speculated about a slew of possible applications for their discovery. Biologists could use synthetic trees to study the finer aspects of plant physiology. Viticulturalists could use them to closely monitor water levels in the vines of wine grapes. Synthetic trees could be used to construct deeper wells and larger heat pipes, cooling homes for a fraction of the cost of air conditioning; the heat-transfer systems used today to cool electronics exist only on the centimeter scale.

“We could potentially build the sequoia of heat pipes,” said Stroock.

The original article: Wheeler, T. D. and Stroock, A. D. “The transpiration of water at negative pressures in a synthetic tree.” Nature 455: 208–212 (11 September 2008).

Deep in the heart of the Milky Way lurks an ever-hungry force, consuming planets, stars, and nebulae with a voracious appetite. The monster in question is Sagittarius A*, thought to be the location of a massive black hole. Its distance of 25,000 light years from Earth and its relatively tiny size make SgrA* difficult to observe. A recent study led by MIT astronomer Sheperd Doeleman, published in the Sept. 4 issue of Nature, offers the closest look yet.

Doeleman’s project made use of Very Long Baseline Interferometry, or VLBI. This technique links telescopes distributed across the planet so that they act as a single instrument with an aperture equal to the distance between them. The researchers sought to maximize resolving power by linking telescopes in California, Arizona, and Hawaii and observing at a wavelength of 1.3 mm, the shortest ever attempted. This allowed the scientists to measure the size of SgrA* at 37 microarcseconds. To give some idea of the scale, Doeleman said, “it’s like seeing a baseball on the moon with the naked eye.”
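The resolving power of such an array follows the familiar diffraction rule θ ≈ λ/D, where D is the longest baseline. A rough sketch of the arithmetic (the ~4,500 km Hawaii-to-mainland baseline is an illustrative figure, not a value from the paper):

```python
# Diffraction-limited resolution of an interferometer: theta ≈ lambda / D
MICROARCSEC_PER_RAD = 180 / 3.141592653589793 * 3600 * 1e6  # ≈ 2.06e11

def resolution_uas(wavelength_m: float, baseline_m: float) -> float:
    """Angular resolution in microarcseconds for a given baseline."""
    return wavelength_m / baseline_m * MICROARCSEC_PER_RAD

# A 1.3 mm wavelength over a ~4,500 km baseline resolves a few tens of
# microarcseconds -- exactly the regime of the 37-microarcsecond measurement.
print(f"{resolution_uas(1.3e-3, 4.5e6):.0f} uas")  # → 60 uas
```

The same formula shows why both levers matter: halving the wavelength or doubling the baseline each doubles the resolving power.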

But how were the telescopes able to see something from which even light cannot escape? Scientists measured radio, infrared, and x-ray emissions from SgrA*, but these do not originate from the black hole itself. As Alan Marscher, a professor of astronomy at Boston University, describes it, matter drawn toward the event horizon, the outer limit of the black hole, doesn’t just fall straight in; it spins around like water circling a drain. As it rotates, it loses energy in the form of radiation, which can be picked up by the telescope array.

Mapping SgrA* has bolstered the case for identifying the object at the galactic center as a black hole, given its extremely small size and its enormous mass of four million suns. No other known phenomenon has such a high density. There is, Doeleman said, a slight chance that SgrA* could be a more exotic construct known as a boson star, composed of unusual particles that can exist much closer to one another than the matter in normal stars. The boson star, however, is only a theoretical possibility, and has never been observed.
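To see how tight that size constraint is, compare the measured 37 microarcseconds with the apparent size of the hole’s own Schwarzschild radius, Rs = 2GM/c². A back-of-the-envelope sketch using the mass and distance quoted above (standard physical constants; not calculations from the paper itself):

```python
# Apparent angular size of SgrA*'s Schwarzschild radius as seen from Earth
G = 6.674e-11             # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8               # speed of light, m/s
M_SUN = 1.989e30          # solar mass, kg
LY = 9.461e15             # light year, m
MICROARCSEC_PER_RAD = 2.0626e11

m = 4e6 * M_SUN                        # mass of SgrA*
r_s = 2 * G * m / C**2                 # Schwarzschild radius, ~1.2e10 m
distance = 25_000 * LY
theta_uas = r_s / distance * MICROARCSEC_PER_RAD

# The measured source is only a handful of Schwarzschild radii across --
# far too compact for any known object other than a black hole.
print(f"Rs appears ~{theta_uas:.0f} uas; 37 uas is ~{37 / theta_uas:.1f} Rs")
```

Nothing made of ordinary matter can be confined to a few Schwarzschild radii without collapsing, which is the crux of the density argument that follows.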

Furthermore, scientific principles rule out the possibility that SgrA* is an aggregate of matter. According to the laws of physics, “any object that was originally at the center of the galaxy, whether it was a planet, some gravel, or a collection of Hello Kitty dolls, would have collapsed into a black hole after at least 500 years,” said Doeleman.

Future studies will attempt to push the resolving power even further, with telescopes placed farther apart and measuring even shorter wavelengths. Doeleman hopes eventually to make use of the newly constructed ALMA array, a collection of 50 dishes located in Chile. The addition of more telescopes will ultimately allow scientists to construct a real image of a black hole, rather than a mathematical representation.

Still, Marscher mused, there is a point beyond which even the most powerful telescope cannot see. Little is known about the character of what lies beyond the event horizon, he said, and at this point we lack the capability to describe it. “The universe,” he said, “conceals a lot from us.”

Sparkwatch is my personal foray into science blogging. Here I’ll post the various articles and multimedia that I make for class, as well as commentary and links to science news. To get the ball rolling, though, here is a humorous bit I did for Professor Imbriglio’s science writing class back at Brown:

The Ten Scientific Commandments

I. Thou shalt have no other interests before the pursuit of truth.

II. Thou shalt not take the name of Einstein in vain.

III. Remember to feed the graduate students, occasionally.

IV. Honor the work of those who came before while you improve upon it.