The “Pillars of Creation,” a photograph of part of the Eagle Nebula, is one of the most iconic images ever taken by the Hubble telescope. Yesterday, astronomers released a bigger, better, sharper version of the pillars, taken almost two decades after the first.

But an ironic twist – and what we didn’t know twenty years ago – is that the Pillars might have been long ago torn apart by a distant explosion. The photos we snap of them today are high-tech and modern but their subject is clouded by thousands of light-years of remove. Like the post-mortem photography of the Victorian era, the resulting images are lifelike, and beautiful, and sad.

Food labels seem to provide all the information a thoughtful consumer needs, so counting calories should be simple. But things get tricky because food labels tell only half the story.

A calorie is a measure of usable energy. Food labels say how many calories a food contains. What they don’t say is that the number of calories you actually get out of a food depends on how highly processed it is.

Processed Food Makes You Fatter

Food-processing includes cooking, blending and mashing, or using refined instead of unrefined flour. It can be done by the food industry before you buy, or in your home when you prepare a meal. Its effects can be big. If you eat your food raw, you will tend to lose weight. If you eat the same food cooked, you will tend to gain weight. Same calories, different outcome.

For our ancestors, it could have meant the difference between life and death. Hundreds of thousands of years ago, when early humans learned to cook they were able to access more energy in whatever they ate. The extra energy allowed them to develop big brains, have babies faster and travel more efficiently. Without cooking, we would not be human.

When we fall ill we visit a clinic or a pharmacy. Our ancestors, however, didn’t have that luxury. Instead, early humans likely observed and learned from sick animals that healed themselves by eating certain plants. Yet, only in the past two decades have biologists and chemists begun to recognize that animals do self-medicate – select and use substances specifically to cure themselves of parasites and ailments.

Early accounts of animal self-medication came in the late 1980s from Michael Huffman, a primatologist at Kyoto University. His decades-long research on chimpanzees, which revealed that they use plant compounds to rid themselves of parasites, helped establish self-medication as a fundamental animal behavior.

“Any animal species alive today is alive in part because of its ability to adapt and to fight off diseases,” Huffman says. Self-medication does not require high intelligence; it likely began as a simple reaction to relieve a distressing symptom, which then evolved into strategies for expelling parasites. “Self-medication is a very basic behavior that’s important to the survival of so many species,” he says.

And animal self-medication points to a treasure larger than mere fascination. By following the animals’ lead, we tap into a medicine vault furnished by millions of years of natural selection. The world’s best bio-prospectors – the animals themselves – may very well show us new pharmaceuticals to improve the health of our livestock and ourselves.

While developing drugs to cure Ebola is crucial to end the current epidemic, a vaccine that prevents the infection altogether is the end-game for viral outbreaks – a way to protect healthcare workers on the front lines and to prevent future outbreaks.

It typically takes 10 or 20 years to develop and test a vaccine and get it to market. But in Ebola’s case, this time frame has been compressed into a matter of months, bringing pharmaceutical companies, scientists and regulators into uncharted territory, striving for a vaccine to curb the still-escalating epidemic without compromising safety.

“Never before has there been a push to develop a vaccine for an emerging public health threat in this short a time frame,” said Dr. Mark Feinberg, vice president and chief public health and science officer of the drug company Merck’s vaccine division.

Dr. Ripley Ballou, head of Ebola vaccine research for GlaxoSmithKline, concurs. “I’ve been doing this kind of work for 30 years, and this is the first time I’ve encountered anything with the compressed timeline and sense of urgency,” he said.

In the mid-1800s, English chemist William Henry Perkin serendipitously synthesized the first non-natural dye: starting with coal tar, he was hoping to produce the malaria drug quinine but instead created mauve. His discovery revolutionized the textile industry and launched the petrochemical industry. Natural dyes just didn’t have the staying power and vivid colors of the dye Perkin created. Never before had such a steadfast dye been found.

Soon after, August Hofmann (Perkin’s chemistry professor) noticed that a dye he had derived from coal tar developed color when exposed to air. The molecule responsible was para-phenylenediamine, or PPD, the foundation of most permanent hair dyes today.

Although hair is a protein fiber, like wool, the dyeing process for textiles cannot be duplicated on the head. To get wool to take a dye, you must boil the wool in an acidic solution for an hour. The equivalent for hair is to bathe it in the chemical ammonia. Ammonia separates the protective protein layers, allowing dye compounds to penetrate the hair shaft and access the underlying pigment, melanin.

A defining feature of this Ebola epidemic has been the significant resistance of some of the affected communities to treatment and prevention measures by foreign aid workers and their own governments. Many local people, suspicious and fearful, have refused to go to treatment centers or turn over bodies for safe burial, and whole communities have prohibited the entry of doctors and health teams.

As the months have gone by that resistance has been less reported upon, and there are signs that it may be lessening. In the Forest Region of Guinea, where the Ebola epidemic started, foreign staff previously faced roadblocks, stone-throwing and violent attacks. But in the last few weeks, as the New York Times has reported, locals have opened up the literal and figurative barricades around their villages and sought outside help.

Still, the friction continues to shape the spread of the disease. Doctors Without Borders’ December briefing paper [pdf] calls the situation in Guinea “alarming,” with 25 percent more cases reported in November than October and many areas where there is “still a great deal of resistance towards Ebola response” and their teams are “not welcome.”

The solution, some say, is to reevaluate treatment and prevention tactics with the benefit of an anthropological perspective. That was the call delivered last week at a meeting of the American Anthropological Association in Washington, D.C. If international staff had approached the epidemic from day one with more understanding of the cultural, historical and political context, attendees said, local traditions and community leaders could have become assets rather than obstacles in the fight against Ebola.

The American Anthropological Association is calling on anthropologists to become more involved in the global Ebola response. It has started the Ebola Emergency Response Initiative to connect anthropologists who are already working in or experienced with West Africa, and to build structures and programs that help more anthropologists become directly involved in the Ebola response on the ground.

“We’ve worked in these places and we’re watching our friends die,” said University of Florida professor Sharon Abramowitz, one of the founders of the initiative.

Abramowitz points out that the anthropologists involved in the initiative have a total of 300 years of ethnographic experience in the affected West African nations – experience which could help medical scientists both understand and respond to the epidemic.

What would it take for an animal to be considered a person? In a landmark court case that reached its conclusion in a New York State appellate court yesterday, a five-judge panel refused to grant legal personhood to a chimpanzee named Tommy. Their unanimous decision: He’s not a person, in spite of the best arguments put forward by a group called the Nonhuman Rights Project (NhRP).

Tommy’s owner keeps the chimp in a wire-mesh cage, inside a nondescript warehouse, in upstate New York. That’s not illegal, because it’s not illegal to own a chimpanzee in New York State. In the eyes of the law, Tommy isn’t a person – he’s property.

Tommy, the court ruled, “is not a ‘person’ entitled to the rights and protections afforded by the writ of habeas corpus” – the legal term for a petition urging a court to halt the unlawful detention of a prisoner.

The court decision ends this particular battle, but the legal wrangling, and the larger philosophical questions that swirl around human-animal relations, are sure to continue.

Nearly a century ago, Edwin Hubble’s discovery of red-shifting of light from galaxies in all directions from our own suggested that space itself was getting bigger. Combined with insights from a handful of proposed non-Euclidean geometries, Hubble’s discovery implied that the cosmos exists in more than the three dimensions we’re familiar with in everyday life.

That’s because parts of the cosmos were moving farther apart, yet with no physical center, no origin point in three-dimensional space. Just think of an inflating balloon seen only from the perspective of its growing two-dimensional surface, and extrapolate to four-dimensional inflation perceived in the three-dimensional space that we can see. That perspective suggests that three-dimensional space could be curved, folded, or warped into a fourth dimension the way that the two-dimensional surface of a balloon is warped into a third dimension.
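The balloon analogy can be made quantitative. In uniform expansion, every distance is stretched by the same scale factor a(t). As a sketch of the standard reasoning (not a derivation from this article), it follows that each observer sees every other point receding at a speed proportional to its distance:

```latex
x(t) = a(t)\,x_0
\quad\Rightarrow\quad
v = \dot{a}(t)\,x_0 = \frac{\dot{a}}{a}\,x = H(t)\,x
```

Because this proportionality holds no matter which point you choose as your origin, every observer sees the same pattern of recession; that is why the expansion has no center anywhere in the three dimensions we inhabit.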

We don’t see or feel more dimensions; nevertheless, theoretical physics predicts that they should exist. Interesting, but are there any practical implications? Can they become part of applied physics?

When considering extreme environments it is easy to make assumptions about personality, which on closer examination do not stand up to scrutiny. Take, for example, one of the best-researched personality dimensions: introversion-extraversion. Extraversion as a trait appears in all established psychological models of personality, and there is considerable evidence that it has a biological basis. The concepts of introversion and extraversion long ago escaped the confines of academic psychology and are widely used in everyday conversation, albeit in ways that do not always reflect the psychological definitions.

Broadly speaking, individuals who score highly on measures of extraversion tend to seek stimulation, whereas those who score low tend to avoid it. When asked to describe a typical extravert, most people tend to think of the lively ‘party animal,’ equating extraversion with a preference for social interactions. However, individuals who score highly for extraversion seek more than just social stimulation: they also tend to gravitate toward other stimulating situations, including active leisure and work pursuits, travel, sex, and even celebrity. Introverts, on the other hand, have a generally lower affinity for stimulation.

They find too much stimulation, of whatever type, draining rather than energizing. Contrary to popular belief, introverts are not necessarily shy or fearful about social situations, unless they also score highly on measures of social anxiety and neuroticism.

On this basis, one might assume that extraverts would be drawn to extreme environments, where they could satisfy their desire for stimulating situations, whereas introverts would find them unattractive. And yet, extreme environments may also expose people to monotony and solitude — experiences that extraverts would find aversive, but which are tolerated or even enjoyed by well-balanced introverts. The point here is that simple assumptions about broad personality traits are unlikely to provide good explanations of why people engage in extreme activities.

Why does it take so long for human children to grow up? A male chimp and a male human, for example, end up at about the same body weight, but they grow very differently: at year one the human weighs twice as much as the chimp, yet by age eight the chimp weighs twice as much as the human. The chimp then reaches its adult weight by 12 – six years before the human. A male gorilla is also a faster-growing primate – a 330-pound male gorilla weighs 110 pounds by its fifth birthday and 265 pounds by its tenth.

Clues to the answer can be found in the young human brain’s need for energy. Radioactive tracers allow scientists to measure the glucose used in different areas of the brain, but this procedure is used only rarely, when it is justified by the investigation of neurological problems. However, the few cases we do have reveal how radically different the childhood brain is from that of adults or infants.

From about the age of four to puberty, the young brain guzzles glucose – the cerebral cortex, its largest part, uses nearly double (or even more than double) the glucose it uses earlier or later in life. This creates a problem. A child’s body is a third of the size of an adult’s, but its brain is nearly adult-sized. Calculated as a share, a child’s brain takes up half of all the energy the child uses.
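A crude back-of-envelope sketch shows how those two facts combine into such a large share. The numbers below are illustrative assumptions (the ~20 percent adult brain share is a commonly cited rough figure, not from this article); only the doubling of cortical glucose use and the one-third body size come from the text:

```python
# Back-of-envelope: why a child's brain can claim about half of the
# body's total energy budget. All figures are rough, for illustration.

adult_brain_share = 0.20          # adult brain ~20% of resting energy use (assumed)
cortex_glucose_multiplier = 2.0   # child's cortex burns ~double the glucose (from text)
child_body_fraction = 1 / 3       # child's body ~1/3 of adult size (from text)

# The nearly adult-sized brain doubles its glucose use, while the rest
# of the body's energy use is scaled down with its smaller size.
child_brain_energy = adult_brain_share * cortex_glucose_multiplier      # relative units
child_body_energy = (1 - adult_brain_share) * child_body_fraction       # relative units

brain_share = child_brain_energy / (child_brain_energy + child_body_energy)
print(f"Brain's share of a child's energy budget: {brain_share:.0%}")
```

With these assumed inputs the brain's slice of the child's total budget lands around half or a bit more, consistent with the "half of all the energy" figure in the text.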