May 2012

05/30/2012

About 42 percent of the way through Erika Hayasaki’s Kindle Single, Dead or Alive, a book about Near Death Experiences (NDEs), I experienced a freaky coincidence. I was in the field with a team of scientists when a member of the crew started telling me about a horrific accident he had been in a year before. I asked him if he still had nightmares. Not as often, he answered, before telling me about the out-of-body experience he had in the operating room, when he suddenly found himself above the action, looking down on his body as doctors and nurses struggled to save his life.

“I don’t know why I’m telling you this,” he said. “I’ve told almost no one and I don’t like to talk about it.” The subtext: he keeps it to himself to avoid one of two reactions, disbelief or way too much belief.

Hayasaki explores this treacherous territory in Dead or Alive, investigating the science behind NDEs. The story is perfect for the length of a Kindle Single: the study of NDEs is in its infancy and so there's little solid scientific evidence. A longer book would be repetitive, recounting endless anecdotes and relying too heavily on speculation.

Hayasaki opens Dead or Alive with the NDE experienced by her uncle, Richard K. Harris, a lawyer turned writer. It sounds like the typical NDE description familiar to anyone who reads, watches movies, television, or roams the Internet, complete with tunnels, brilliant lights, and the presence of already departed loved ones.

It’s a brave place to start. If not for the fact that Hayasaki is a former reporter for the Los Angeles Times (where she wrote several articles that inspired this book), I might have rolled my eyes and left Dead or Alive to languish in the “Books” collection of my Kindle. But Hayasaki understands something fundamental about NDEs — the universality of NDE descriptions can make them less credible, since anyone can describe an NDE whether they’ve had one or not. To give the experience some specificity, she focuses on Harris, recounting his experiences before his own brush with death.

By introducing both Harris and his NDE, Hayasaki hooks the reader. It becomes paramount to find out whether NDEs can be scientifically explained. Are they just the product of a brain trying to make sense of dying? Or is it possible that a meta-consciousness awaits us all at the end?

Hayasaki delves into past writings about NDEs. In the mid-1970s the psychologist Raymond Moody interviewed 150 people who had been declared dead and were then revived. From the interviews he drew a universal description of NDEs, which, it turns out, have been reported throughout history. Research has accelerated since Moody’s study. A number of studies suggest that a lack of oxygen to the brain may be the cause of NDEs. Hayasaki interviews NDE researchers, even finding a neurosurgeon who experienced an NDE himself.

This is compelling reading. Who does not want to know if science can determine if death is final? (Less compelling is Harris’s story, which Hayasaki weaves into her narrative. Hayasaki never knew her uncle well. He had distanced himself from his family, and he died of cancer soon after they met.) And when it comes to the subject of NDEs, Hayasaki’s timing is impeccable. Baby boomers are reaching the age when their family and friends are starting to die. The cynic in me says the latest research on NDEs is driven by a dominant generation accustomed to questioning the status quo. Boomers, after all, made 40 the new 30 and 50 the new 40. They can’t cheat death, so they’re questioning it through science.

The non-cynic in me says technology is the true driver of this research. It’s easier than ever to study NDEs. Better brain maps courtesy of medical imaging equipment have allowed scientists to stimulate specific parts of the brain with electrodes to induce out-of-body experiences in test subjects, for example. With MRIs scientists can study test subjects’ brains as they recall NDEs. And scientific papers on the topic have turned up in journals like The Lancet and Trends in Cognitive Sciences.

Although studies often focus on the out-of-body experiences associated with NDEs, I didn’t know until reading Dead or Alive that people near death frequently say they feel nothing as they gaze at their corporeal forms, however broken and distressed those forms might look. When my new acquaintance, the accident survivor, expressed the same sentiment, I hadn’t read far enough into the book to know that detail, and I doubt he had scoured the Internet for such lesser-known particulars. His volunteered confession, his hasty retreat from the conversation, and my reading of the book together made me think the experience — brain-based or not — is more real and profound to people than I had previously accepted.

It’s hard to tell if Hayasaki believes NDEs reside only in the brain or that consciousness lives on despite the body’s death. The balance of probabilities tips toward the brain studies and their conclusions so far. But ultimately no one knows what happens after death and in Dead or Alive I get the sense that Hayasaki is asking us all to keep our minds open.

Jude Isabella writes about science for kids and grown-ups. She has written for The Walrus, New Scientist, Archaeology Magazine, Canadian Geographic, and other publications. Her book Chit Chat: A Celebration of the World’s Languages will be published in fall 2013 by Kids Can Press, and Salmon: A Scientific Memoir in spring 2013 by RMB. Follow her on Twitter.

05/24/2012

The Chemical History of a Candle, by Michael Faraday (Griffin, Bohn and Co., London, 1861), available free from Project Gutenberg in multiple e-reader formats and also from LibriVox as a free audiobook.

reviewed by Deborah Blum

"There is no better, there is no more open door by which you can enter into the study of natural philosophy than by considering the physical phenomena of a candle."

It was the above line that first caught my attention. The recognition that we often best appreciate our extraordinary natural world by seeing it through the lens of the ordinary: crystalline structure as revealed by the stitchery of winter frost, the chemical dance of light and life found in the changing colors of leaves, the hot whisper of oxygen as it sends the flame higher.

That recognition has driven much of my own science writing – the idea that we can often illuminate science through tales of the everyday. I wish I could tell you that I'd thought of it first, that it was somehow primordially my own. But, at best, I think I can claim to be carrying on a time-honored tradition. Because it's very clear that the 19th century scientist Michael Faraday was doing that and doing it exceptionally well some 150 years ago.

Here at Download the Universe, we reviewers are mostly looking toward the future -- what we hope is the promise of e-books, their potential to transform the reading experience. Possibly transcend it. But I want to take this opportunity to explore another aspect of the electronic publishing world: the ability to revisit our past through the free archives offered by publishers like Project Gutenberg.

Founded in 1971 by the late Michael Hart, Project Gutenberg began as a labor of love, the painstaking transfer of books in the public domain -- many of them once forgotten -- into digital life. The Gutenberg website now makes 39,000 free e-books available. It also links with digital partners to provide access to another 60,000 e-manuscripts. Like Faraday's candle -- to stretch that analogy a little -- it offers an open door, brightly lit, to the words, and even the wisdom, of our past. Like no other generation, we can explore this virtual library, stumble across old chemical histories of candles, and learn to think differently about our own work.

And stumble is exactly what I did.

"I see you are not tired of the candle yet, or I am sure you would not be interested in the subject in the way you are."

Not that it was much of a fall. More of a sidestep. I spend a lot of my time writing about and researching the history of science, for books like The Poisoner's Handbook, my recent story of poison, murder and the invention of forensics in the early 20th century. I do so because I believe--no, really, I know--that we cannot understand who we are unless we understand how we got here. And so I was doing some research into the history of chemistry and Faraday's book almost immediately appeared in my browser.

This, I think, is the other magic wrought by online publishers like Project Gutenberg. You can be happily rambling through the history of chemistry (a phrase, I know, that only a geek could write) and suddenly discover that a scientist born at the close of the 18th century (1790) understood perfectly the very principles of science communication that you'd been preaching in the 21st.

05/22/2012

The Atavist is no stranger to this site. In fact, we've set up a category for the ebooks that come from this innovative publisher. Yesterday, The New York Times's David Carr broke the news that the Atavist has received $1.5 million in seed money from some of the biggest names in technology, such as Eric Schmidt of Google. So this afternoon I Skyped Evan Ratliff, the chief executive of the Atavist, to talk about how they do what they do, why they end up publishing so much science, and what lies in the future for their operation. I recorded our conversation on a MacBook that's really only good these days as a walkway tile. But for some reason the video file turned out to be fairly viewable, and the audio very audible (I think an office dog chimes in late in the conversation). So I've uploaded it to YouTube and embedded it below. I've posted the audio below, too.

If you're fonder of the written word (which would make eminent sense for people who come to this site), I can give you the lowdown. Ratliff comes to the Atavist as a seasoned journalist, writing mostly about technology and science. Like many journalists, he loved writing long pieces but struggled to find many opportunities to write them. He then had something of an epiphany while working on a story for Wired for which he vanished and dared readers to find him. He took a lot of video while on the run, which he wished he could have used somehow. And he also did a lot of promotion for the story on television, which got him thinking about what it would be like to get a royalty every time someone read his article.

As Ratliff describes it, he groused about it until his friend Nicholas Thompson, then at Wired and now at newyorker.com, suggested they do something about it. So they co-founded a company to publish long-form nonfiction augmented with video, audio, maps, timelines, and other features.

The Atavist and a few other publishers have recognized the value of stories that used to fall between the cracks. Magazines may put a ceiling on stories at 5,000 words, while book publishers may set a floor at 50,000 words. But that doesn't mean that a 20,000-word story is, by definition, a bad story. In fact, it can be quite compelling. Making a place like the Atavist work also requires good taste and an ability to see the potential for a story where other editors might see a wall of boredom. Some of the Atavist's most successful stories started out in life as magazine stories that were rejected for what, in hindsight, can only be called stupid reasons.

I asked Ratliff to take me through the production of a piece. The pace feels more like a newspaper office than a book publisher. To hit those frantic deadlines, the Atavist depends on its software. Ratliff & Co. can put their text and other elements into the software, and out come files ready for the many venues where they sell their pieces, from Amazon's Kindle Store (straight text only) to their app on the iPhone or iPad, where all the bells and whistles can play at maximum volume.
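The pipeline described above -- one set of source files, many output formats -- can be sketched in a few lines. This is a hypothetical illustration of the single-source idea only, not the Atavist's actual software; every name and field here is invented.

```python
# Hypothetical single-source export: one story structure, multiple targets.
# A text-only target (like the Kindle Store) strips multimedia blocks;
# a rich target (like a native app) keeps every element.

def export_text_only(story):
    """Text-only rendition: title plus text blocks, multimedia stripped."""
    parts = [story["title"], ""]
    for block in story["blocks"]:
        if block["type"] == "text":
            parts.append(block["content"])
    return "\n".join(parts)

def export_rich(story):
    """Rich rendition: keep every block so video and audio can play."""
    return {"title": story["title"], "blocks": list(story["blocks"])}

story = {
    "title": "Example Story",
    "blocks": [
        {"type": "text", "content": "Opening paragraph."},
        {"type": "video", "content": "intro.mp4"},
        {"type": "text", "content": "Closing paragraph."},
    ],
}

print(export_text_only(story))           # video block dropped
print(len(export_rich(story)["blocks"]))  # all blocks retained
```

The design point is that the story is authored once as structured data; each venue gets its own export function rather than its own manuscript.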

Science is heavily represented at the Atavist, and it's not just due to the journalistic background of its founders. Science often benefits from great illustrations, and video--when used judiciously--is the best illustration of all. Science also does well at length--there's room to tell a great narrative and weave in the concepts that the scientists in the story are exploring.

As Ratliff explained in our talk, the software has shown great value of its own. The Atavist has licensed it out to conventional publishers and other companies, and this summer they're going to roll out a public version anyone can use to self-publish their own books. The Atavist is also going to offer a marketplace that may resemble a kind of literary Etsy. As I mentioned in our talk, Etsy doesn't market its own clothes. Ratliff admitted that was a tricky canyon for the Atavist to navigate. But he feels it's worth the trip, because he's become a strong believer in people getting hold of tools to make interesting stories.

I expect some of those stories will make their way over here.

Audio:

Video:

Carl Zimmer writes frequently about science for The New York Times and is the author of 13 books, including A Planet of Viruses.

If you're looking to sound like less of a dunce at astronomer cocktail parties, you might want to check out The Open University's free 70-page ebook on the basics of Moon geology. It's well, if plainly, written and provides links to the original research that underlies our understanding, though on several important counts it falls short of fulfilling its promise as an interactive textbook.

I learned some interesting tidbits, especially in the first chapters, that made me look at the Moon differently. For instance, we're still not really sure how it formed; each of the leading theories explains some, but not all, of what we've observed about it. One of the most widely accepted models proposes that the Earth was struck by an object the size of Mars about 4.6 billion years ago, and the debris flung off by the cataclysm coalesced into the Moon. Alongside this rather dry description was a truly alarming figure showing sequential shots of our planet in the 23 hours after the impact, lurching on its axis and spraying out matter like a soaked tennis ball shedding water.

But the breathtaking violence wasn’t over yet. As it turns out, after the planets of our solar system themselves coalesced from surrounding debris millions of years earlier, there had been quite a lot of stuff left over. This floating junk was cleared from the inner part of the solar system by the planets sweeping around their orbits. Once the giant planets—Jupiter and Saturn—started to creep outwards, things got nasty fast. “Resonance effects caused orbital eccentricities that destabilised the entire planetary system,” the text relates. “Rapid and dramatic movement of the giant planets then occurred, causing 99% of the mass of the primordial disc to be ejected from the solar system and for much of the remainder to be thrown inwards to cause an influx of asteroids and thus a surge of impacts on the inner planets.”

To translate: as the planets shifted into a new alignment, they pulled on each other gravitationally such that their neat, concentric orbits went all to hell, and they careened around in a way that sets my teeth on edge just thinking about it, in the process flinging a punishing rain of giant boulders onto the inner planets, which is how our Moon got so bunged up.

The next chapter fast-forwards several billion years. Things are much quieter. It's 1971, and the Apollo 15 astronauts are collecting pieces of the Moon when one of them, David Scott, gasps and cries, “Guess what we just found! Guess what we just found.” As related in a transcript of the mission audio, he's found a crystalline rock, which is beautiful proof that when the Moon was young, it formed a crust like Earth's.

I was excited to see that the book contained an embedded video of the Apollo 15 astronauts making their discovery. I love these fuzzy old recordings, these time capsules of men with gentle mid-twentieth century American accents exclaiming “Oh boy!” when they come across a chunk of lunar feldspar. But whenever I clicked on the video (I tried several times), it froze up, and all I got was the audio.

Audio from the Moon is better than no audio from the Moon, but I was much more disappointed when I got to the book's interactive activities. The book makes several mentions of petrology—the geology equivalent of pathology. Basically, you take fine slices of rock and look at them through a microscope in various kinds of light and identify the minerals within. With each mention of petrology I looked forward to trying it out for myself with the book's “virtual microscope.”

When I reached it, I skimmed the first activity's description, which laid out how the different kinds of light revealed the identifying properties of minerals, and jumped right into the interactive element. Once I was in, though, I saw that there was no interpretive text. There were zoomable views of the sample in various lights, but all of the explanations of what I was looking at were stranded back out in the main text. To learn anything about what to look for, I had to close the interactive element, read from the text, then open the element and try to click back to exactly the place where I had been before.

This was roughly as frustrating as being told you can look at either a guidebook or a city map, but never both at the same time.

As a result, I got very little from the seven virtual microscope activities, aside from some aesthetic enjoyment. I would have learned more if I'd had a paper textbook and a companion app that could be used together at the same time. I’m surprised that the Open University, a 41-year-old UK institution whose focus is distance learning, would have bungled this point.

The disappointment of the virtual microscope aside, the book succeeds fairly well as a teaching text for curious amateurs. I know quite a bit more about the Moon now than I did before. I can say with certainty that it's not made of green cheese but of things that--to a Moon n00b like me--can seem just as fanciful: substances like the mineral olivine, which is a remarkable canary yellow under cross-polarized light; and moondust, which, for reasons that are still mysterious, smells just like gunpowder.

Throughout the history of neuroscience, we have gained an enormous amount of knowledge by studying people with severe brain damage, and watching how they manage to live. HM’s surgically altered brain revealed secrets about how memories are formed – after his death, he was identified as an American man called Henry Molaison. KC, a Canadian man whose real name is still unknown, also taught us much about how memory works, following brain damage sustained during a motorcycle accident. SM, a woman with an inherited brain disease, reportedly feels no fear.

These patients are known by abbreviations that preserve their anonymity, but also shroud their contributions. Their hopes, struggles and lives are condensed into patterns of injury and aberrant behaviours, and distilled into pairs of letters. But sometimes, very rarely, we get a privileged opportunity – a chance to unpack the people behind the letters, and to learn not just how they became a part of science, but how science became a part of them.

Jessica Benko’s new story, The Electric Mind, provides just such an insight. It is the latest in an increasingly strong portfolio of stories from The Atavist, a digital publisher that produces stories “longer than typical magazine articles but shorter than books”.

The Electric Mind is the story of Cathy Hutchinson, a woman known in the scientific literature as S3. She’s a mother-of-two who was “always goofing around and singing and dancing”, until a stroke disconnected her brain from her spinal column and left her with an active mind imprisoned in a frozen frame.

For several years, Cathy has been taking part in a groundbreaking experiment called BrainGate – not a sordid cerebral scandal, but a bold project that aims to give paralysed people control over mechanical limbs. The scientists behind the project fitted Cathy with microscopic electrodes that read the neural buzz within her motor cortex – the area of her brain that controls movements. The implant acts like an electronic spine that links Cathy’s brain to a computer or robot, bypassing her own immobilised flesh.

At first, she used the electrodes to control the movements of an on-screen cursor. More recently, she commandeered a robotic arm. As she thought about grabbing a bottle, the electrodes deciphered her mental commands and the arm carried them out. “For the first time in 14 years—indeed, for the first time for any quadriplegic—Cathy was able to reach out into the world.”

The project’s crowning results are published today in the journal Nature, concurrently with the launch of Benko’s story. The paper itself preserves Cathy’s anonymity, and describes her in the starkest of terms. She’s “a 58-year-old woman with tetraplegia caused by brainstem stroke… She is unable to speak (anarthria) and has no functional use of her limbs. She has occasional bilateral or asymmetric flexor spasm movements of the arms that are intermittently initiated by any imagined or actual attempt to move. S3’s sensory pathways remain intact.”

The reality behind these cold, precise words comes through in Benko’s skilful narration. Right from the start, she plunges us into Cathy’s world, as she wakes from a coma to hear the sound of the ventilator beside her bed.

We get to know Cathy through Benko’s eyes, as she tracks down the woman via her son, and meets her for the first time. First-person accounts can break the fourth wall to a distracting extent, and many journalists would balk at inserting themselves so prominently into a story. But Benko earns her place as a protagonist in her own tale, in a way that reminds me of Rebecca Skloot’s The Immortal Life of Henrietta Lacks. The author’s quest becomes an inextricable part of the story itself. Through Benko’s expectations of meeting Cathy, her descriptions of their first meetings, and her difficulties in interviewing a woman who can only communicate via eye-flickers, we learn the extent of Cathy’s disability, and the frustrating complexity of seemingly simple tasks.

Writing about extreme disability (and attempts to overcome it) is not easy. You’re always an adjective away from being mawkish, and an adverb away from being ghoulish. Benko deftly negotiates the tightrope. She cleverly uses essays from other locked-in patients to describe hardships that would sound overwritten from her own hand. And she’s a master of keenly observed but simply delivered prose. When Cathy laughs, for example, it’s “a short burst of air that vibrated across vocal cords she can’t voluntarily control.” No embellishments required. These scenes throw their own punches. Benko just puts you in the ring.

Benko’s eye for detail also elevates her descriptions of experiments that have been reported again and again in the press. We see what Cathy’s nursing home room is like. We learn that the electrodes were fired onto her brain with “a pneumatic device like a tiny air hammer”. We discover that the bottle that Cathy lifted via robot was a thermos full of coffee (she loves coffee), “emblazoned with the initials and insignias of the research team and sponsors”. She finds drama in minutiae. While other reporters rush straight for a snare-drum crash of incredible implications, Benko takes her time with scenes that build to a steady crescendo.

Using Cathy’s story as an anchor, The Electric Mind stretches back in time to look at the historical events that preceded BrainGate (including a horse accident and suspected psychic powers). The story also pulls outward to examine other means of reaching the same ends, such as functional electrical stimulation, in which electrodes stimulate a patient’s own muscles instead of a robotic limb.

These sections, where we leave Cathy and focus on the field at large, are arguably the weakest elements of the story. Around the two-thirds mark, the tale threatens to veer off course. From rich details about a woman steering a robot arm with difficulty, we’re suddenly plunged into hand-waving speculation about infrared vision, Avatar-like… well… avatars, and telepathic soldiers (and the irony of reading a journalist’s words about electronic telepathy on a handheld device was not lost on me).

But then, in a rather daring move, it becomes clear that this was exactly the point (keep an eye out for the start of Chapter Seven). All the other characters not involved in BrainGate, from the neuroscientist Miguel Nicolelis to a ridiculously breathless DARPA spokesperson, serve as foils for Cathy. Their visions are too far removed from the reality of her condition. They remind us of what The Electric Mind could easily have been – a story of technological triumph and glorious futurism. Instead, Benko has treated us to something far better – a story of extreme limitations and what happens when people (and science) run up against them.

*****

Ed Yong is a British science writer who writes the award-winning blog Not Exactly Rocket Science. His work has appeared in Nature, New Scientist, the BBC, the Guardian, the Times, Wired UK, Discover, CNN, Slate, the Daily Telegraph, the Economist and more. He lives in London with his wife. He has never been impregnated by a botfly but he does rather like ants.

05/14/2012

Blazing My Trail: Living and Thriving with Autism, by Rachel Cohen-Rottenberg. Published by the author. Kindle, $7.99.

Reviewed by Steve Silberman

It's fashionable to say that autism has become a fashion.

If you think overweening psychologists are hastily applying labels like Asperger syndrome to quirky nerds who should be perfectly capable of making their way in the world with no special help or accommodations, you have plenty of company. This past January, for example, the New York Times ran two op-eds in one day making that claim, including one by a young novelist named Benjamin Nugent, who declared, "Under the rules in place today, any nerd, any withdrawn, bookish kid, can have Asperger syndrome."

The source of his authority on the subject, apparently, was that Nugent himself once received such a diagnosis as a teenager -- at the urging of his mother, a psychology professor -- and appeared in an educational video called Understanding Asperger's in a "wannabe hipster polo shirt." Now, however, Nugent has come to believe that the behavior his mother took for the telltale signs of a developmental disorder was merely his geeky teenage lifestyle, which included spending "a lot of time by myself in my room reading novels and listening to music." He went on to say that the cure for his misdiagnosis was moving to New York City, where he was finally able to meet other formerly bookish kids and schmooze with them in cafés. Having left his dreary foray on the spectrum behind him -- followed by a "long time" of sulking in his mother's presence for having put him through the ordeal -- he's now a professor of creative writing in New Hampshire.

Nugent's glib report surely provided a kind of comfort to some readers, who could return to their lives secure in the knowledge that many of these "Aspies" whom one keeps hearing about are simply "withdrawn, bookish kids" unnecessarily labeled by their histrionic parents with the help of psychologists eager to vault aboard the latest diagnostic bandwagon. After spending the past couple of years interviewing and spending time with autistic people and their families for a book, however, I can tell you that Nugent's experience is the exception, not the rule.

Everyone I've met who has been diagnosed with Asperger's syndrome or other form of autism faces profound challenges in day-to-day life. Even the most "high-functioning" autistic people (a term I now avoid using, because it renders certain forms of cognitive disability harder to see, while obscuring the gifts and competence of those branded as "low-functioning") work tremendously hard to find and sustain friendships; to manage the jarring changes that intrude into the most carefully planned-out schedules; to maintain their composure in noisy sensory environments; to get hired for jobs worthy of their intelligence and skills; and to navigate their way daily through a minefield of unspoken social rules and cues designed by and for people whose brains are wired differently from their own.

That's one reason the revision of the criteria for autism in the upcoming fifth edition of the Diagnostic and Statistical Manual of Mental Disorders -- the bible of psychiatry used to determine diagnosis, access to services, and reimbursement from insurers -- has become so controversial. Even psychiatrist Allen Frances, who led the task force that developed the criteria in the DSM-IV, has gotten into the act, claiming that spectrum diagnoses have become "faddish." Many autistic self-advocates suspect the American Psychiatric Association is about to pull a diagnostic sleight-of-hand by shaving off a portion of the population that would have been eligible for an Asperger's diagnosis under the DSM-IV criteria, and give them a newly minted diagnosis of Social Communication Disorder, which has no legacy services or support systems. Some fear the APA is trying to finesse the increasing scarcity and overloading of services for autism, when budgets are being slashed in the name of austerity, by manipulating labels to lower demand.

There is no question that people diagnosed with Asperger's syndrome have an authentic need for help, long after they've "aged out" of the meager support provided to kids until they turn 21. Contrast Nugent's breezy anecdotes about pissing off his schoolmates by "trying to speak like an E.M. Forster narrator" with this description of attempting to absorb an ordinary conversation written by Rachel Cohen-Rottenberg, one of the most passionate and articulate disability-rights bloggers, and author of a new ebook called Blazing My Trail: Living and Thriving with Autism. "When I hear, I see the spelled-out words in my mind, and I have to internally read and translate those words in order to understand their meanings. As a result, even in quiet environments, I cannot keep up with verbal input for more than five or 10 minutes without falling behind, unless the other person slows down his or her speech and leaves a number of pauses in which I can respond. Pacing is everything."

Or consider this list of activities that Cohen-Rottenberg identifies as particularly challenging:

Food shopping

Sweeping and mopping the floor

Cooking

Driving

Running errands

Going to appointments

Planning, executing, and transitioning between tasks

Working at a job

Making friends

Autistic people in Cohen-Rottenberg's generation never got the chance to be diagnosed with Asperger's as kids, because the diagnosis didn't exist. It's easy to forget that just 40 years ago, there was no concept of a broad, inclusive spectrum that encompassed accomplished professionals like Temple Grandin and autistic people who may never learn to speak or put on their clothes in the morning without help. (Indeed, Grandin's debut memoir, Emergence, was initially billed as the first book by a "recovered" autistic person, because the idea that an autistic woman could enroll at a university, earn an advanced degree, and become a leader in a demanding field didn't seem possible.)

On websites for parents, an autism diagnosis is often framed as a heartbreaking event, an occasion for grieving the typical child they'd planned for. That's understandable and human, but it's illuminating to read statements like "Don't Mourn for Us" by autistic adults like Jim Sinclair, one of the pioneers who has inspired a generation of self-advocates to view their autism as an essential part of who they are, rather than as a pathology they might be cured of someday.

For Cohen-Rottenberg, who was 50 years old when she was diagnosed, the label arrived as a blessing. She felt she finally understood why she had been relentlessly bullied and teased when she was young; why she found certain environments that other people enjoyed (such as crowded restaurants) unendurable; why her first marriage went off the rails; and why she had to work so hard to parse non-verbal cues that her peers can take for granted. "I was like a person with mobility issues trying to run a marathon every day and keep up with people whose bodies worked differently from mine. Burnout was inevitable," she writes. "In a few short years, I seemed to go from a lifetime of being super-functional to struggling with basic things… It was my lifelong ignorance of being autistic that was catching up with me."

Unlike many of the ebooks reviewed at Download the Universe, Blazing My Trail offers no multimedia bells and whistles; it's just text with a few family photographs. But it represents the promising potential of the form to provide a venue for highly skilled writers who might never have been able to convince a corporate publisher that their message was capable of engaging a mainstream audience.

Cohen-Rottenberg's first ebook, The Uncharted Path, available as a PDF, recounted her difficult upbringing and her path to diagnosis. "My attempts at making contacts always felt a bit like trying to drive a car by gripping the steering wheel with my teeth," she wrote. Blazing My Trail continues the story, and addresses how she and her second husband, Bob, have worked as a team to manage her sensory sensitivities and social challenges while building a happy life together. Her unaffected honesty makes Blazing My Trail an uplifting journey -- not in the usual sense of being a heroic saga of a narrator "overcoming" disability with pluck and guile; but by bearing witness to the power of accepting and celebrating oneself exactly as one is.

Cohen-Rottenberg comes through in her writing as a wise elder of her tribe and a role model for young people, as well as a smart critic of social attitudes toward disabilities, both visible and invisible. "If we lived in a society that took human diversity for granted, that made room for difference as a deeply held value, every one of us would benefit," she says. "Our view of one another would become much more expansive, much more respectful, and much more compassionate. Ultimately, we might even see one another as perfectly different and perfectly human."

Steve Silberman is writing a book about autism and neurodiversity called NeuroTribes: Thinking Smarter About People Who Think Differently for Avery/Penguin (2013). He is a contributing editor of Wired magazine and one of Time's selected science tweeters (@stevesilberman). He lives with his husband in San Francisco.

A little after 10 pm on May 2, 2011, the Army Corps of Engineers detonated explosives along a two-mile stretch of the Bird's Point levee, just below the confluence of the Ohio and Mississippi Rivers. The goal was to save the city of Cairo, Illinois, which was facing such severe flooding that all but 100 of Cairo's 2,831 residents had already been evacuated. It was a dramatic event; pictures of the explosions, like the one below, have a vaguely apocalyptic feel.

Since the initial explosions took place at night, reporters sequestered a half-mile away weren't able to see how fast the water from the swollen river was flowing. In all, officials estimated up to three trillion gallons of water -- that's 3,000,000,000,000 gallons -- poured onto the Bird's Point-New Madrid floodway, comprising approximately 130,000 acres of farmland and 90 homes.

05/09/2012

Farthest North: America's First Arctic Hero and His Horrible, Wonderful Voyage to the Frozen Top of the World. Byliner Originals. $1.99. Publisher site.

Reviewed by David Dobbs

When people today imagine scientists, they tend to picture a man in a white lab coat, glasses, and a scraggly beard. A century and a half ago, however, people imagining a scientist were more likely to conjure a man with a heavy fur coat, a telescope, and a beard twisted not by eccentricity but by the gales of distant places. It was the great age of exploration, when many scientists did their work afoot or at sea. The scientist was a person not just of thought but of action.

In America, no one typified this scientist-explorer image more thoroughly than Elisha Kane, an unlikely explorer who trained formally in neither science nor seamanship; who led one of the era's most extraordinary and influential polar journeys; who was ill much of his life but found extraordinary strength during his severest trials; and who convinced himself and others, for a time, that he had made one of the most important discoveries of his era, only to be largely forgotten. He vividly occupies Todd Balf's Farthest North: America's First Arctic Hero and His Horrible, Wonderful Voyage to the Frozen Top of the World.

This is great material, and Balf, a former editor at Outside, handles it deftly. He gets the scientific dilemmas spot-on while telling a gripping, overlooked tale. He also paints a wonderful picture of how a person's qualities, applied with energy and savvy, can find the doors of opportunity in an era and knock them open.

For the restless Kane, the exploration of the Arctic proved an irresistible draw. The voyage examined here was his second, but the first under his command. The prior journey, which he took as a naval officer, went so badly from an exploratory point of view that its leader happily left the traditional captain's account to Kane. Kane, with a romantic's heart and a novelist's touch for earthy detail, seduced the American public with an Arctic world they had paid little heed to before; his treatment was half Twain, half Whitman, says Balf. Their mission had been to find and rescue the lost British explorer John Franklin, who had disappeared years before while seeking the Northwest Passage. Kane's poignant description of the traces they found of Franklin's path — an abandoned camp with three sailors' graves, an armorer's forge, and a pair of officer's gloves washed and set out to dry — fanned enough interest in Franklin's fate to generate funding for a second rescue attempt, this one led by Kane.

So in May 1853 he set sail. He would search not just for Franklin, but for the "Open Polar Sea" — a coveted passage to the North, and ultimately the Pacific. Kane suspected Franklin may have found this sea but not lived to report or take credit for it. A British adventurer named Inglefield, thinking likewise, set sail from England at about the same time that Kane did, and on the same mission. Kane's trip was at once an attempt at rescue, a test of a hypothesis, a bid for fame, and a race.

As a scientific venture, his search for an Open Polar Sea posed all the seductions and dangers of any powerful idea. It tempted not only extremes of action but the perceptual warping we're all subject to — the tendency to see what one wants to see. The expedition's naturalist-surgeon, Isaac Hayes, encountering in the hills around Baffin Bay a "lush summer bloom," thought it presaged mild weather and open water ahead. Likewise, as they worked their way up through Baffin's ice floes that July of 1853, both Hayes and Kane found hope in seeing many animals moving northward, as if warmth lay there.

They soon found otherwise. Above Baffin they met cold gales that sent the ship careening among ice floes. The sea glazed over. Two weeks later, the ice seized them. They were farther north than anyone had ever wintered and survived — 78 degrees, 44 minutes. And though it was only September, it soon became apparent that winter was coming early and hard. Over the next 18 months, locked in ice the whole time, the men suffered a near-continuous stretch of arctic torments: weeks on end of darkness and subzero temperatures; scurvy that turned old wounds into open sores; frostbite that forced amputations. Kane's journal through those winters, writes Balf, "is a record of unbroken misery."

Kane's great feat is that he got 14 of his 17 men through an ordeal that should have killed them all. Through the second winter, Kane, who actually felt stronger then than in the winter before, relentlessly nursed and cajoled and supported his men, even as he himself sometimes bordered on delirium. It was a spectacular triumph of deadening, dumb, determined endurance. Finally, in the spring of 1855, they abandoned the ship. After weeks of dragging two lifeboats southward over 300 miles of brutal terrain to reach open water, they sailed 1200 miles to Greenland and safety.

That October, Kane returned to the United States to a hero's welcome, his book-won fame spread explosively by news of his survival. But his health deteriorated. When he died in 1857 in Cuba, where he'd gone hoping to recuperate, it made all the front pages. His funeral procession from New Orleans back home to Philadelphia was watched by thousands -- the biggest public mourning the young country had yet seen. It wouldn't be topped until Lincoln was shot. His status is suggested by a banner overhanging Fifth Avenue: "Science Weeps, Humanity Weeps, the World Weeps."

Now few know of Kane. He's rarely mentioned in short lists of great Arctic explorers. Balf's tale serves both as an historical corrective and a sort of fable of the fickleness of fame and the cruel risk of reaching for but failing to bring home a big idea. "Like the earliest, most ambitious pioneers to any new land, he got some things wrong," writes Balf. "He also got a lot right." He found new ways to survive cold and hunger. He returned "by a smart retreat and an unprecedented alliance with the native Inuit; he worked tirelessly to nurse his party back to strength."

This contrasts, Balf notes, with Franklin, who died early on and left his men to march to their deaths. Kane's program for surviving an Arctic winter "was brilliant … and duplicated by almost all future Arctic expeditions," including Shackleton's more famous escape. A notable exception is Scott's disastrous but romantic failure at the South Pole, which arose partly because he ignored some of Kane's lessons and innovations. Yet both Franklin and Scott remain far better known, probably because they did not return. And Shackleton's name far outshines Kane's, even though Kane accomplished something every bit as difficult and unlikely. They both did the impossible. Shackleton's impossible was just more obvious.

It didn't help that someone else largely solved the mystery of Franklin's party. Kane also had the bad luck to get the science wrong.

In that spring of 1855 in which he finally took his men south and home, he first sent two of the strongest men north to take one more shot at finding the Open Polar Sea. They marched 200 punishing miles, all the way to 81 degrees, 22 minutes north, "shedding everything" to get that far. There they encountered a 500-foot bluff. Only one of the men, steward William Morton, had the strength to climb it. When he reached the top, he saw before him an "unfrozen sea" with "waves, … surging from the furthest north, breaking at my feet." A northerly gale blew in his face — but carried no ice toward him. The open water stretched north to the horizon.

From this tantalizing data point — a big, fat, seemingly infinite n of 1 — Kane drew an understandable conclusion: He had found the Open Polar Sea. Balf properly forgives Kane this error. And when he reveals the freakishly unique alignment of forces and events from which this false finding rose — an assembly that starts with Franklin and ends with an astonishing satellite photo taken in 2010 — it's hard not to join him. For the full, strange, richly told story, steer your browser to Farthest North.

05/04/2012

There's no point in beating around the bush. Leonardo da Vinci: Anatomy is simply the best ebook about science that I have ever encountered. To me, it is the exemplar of what ebooks can be.

Leonardo da Vinci: Anatomy comes from Touch Press, whose lavish apps we've reviewed before at Download the Universe (Gems, The Solar System, The Elements). I've personally toyed around with all three of those apps, and while each offered a number of pleasures, each felt limited in one way or another. Gems, for example, lets you twirl diamonds and rubies, but, as Virginia Hughes noted in her review, it doesn't tell you much about them or about their place in human history. The Solar System, reviewed by Jennifer Ouellette, has some very impressive features for navigating among the planets, but Jennifer noted that it lacks a clear story.

Given this track record, I launched Leonardo da Vinci: Anatomy expecting a good-looking but flawed production. No shortcomings came to light, so I tried looking for them. I looked hard. And I couldn't find any. Leonardo da Vinci: Anatomy has everything I could ask for in an ebook about one of the greatest stories in the history of science: a pioneering work on anatomy that was lost for over four hundred years.

Living during the Renaissance, Leonardo initially drew his understanding of the human body from ancient scholars like Galen and Aristotle. He was taught that animal spirits traveled through giant holes in the head and then flowed into the nerves. He was taught that blood was produced in the liver and then flowed outward to the ends of the body. One reason that these obviously wrong ideas persisted for over a thousand years was that medieval scholars did not conduct their own autopsies or experiments. Galen and company, it was assumed, had figured out everything there was to know about anatomy, so the best thing a scholar could do was read, not conduct research.

With the Renaissance, that obedience began to crumble. Leonardo was the quintessential do-it-yourself-er. He conceived of new kinds of vehicles and weapons; he investigated optics and geology. Wikipedia has set aside a separate page for a startlingly long list of his accomplishments.

Leonardo also became obsessed with human anatomy, and did not hesitate to make up his own mind about it. He dissected human cadavers. To figure out how the heart worked, he created a glass model of it. To probe the brain, he injected hot wax into the head of a freshly slaughtered ox.

As I wrote in my book Soul Made Flesh, Leonardo had a hard time breaking free from the old notions of how the body worked. Even when he discovered that the head did not contain three linked chambers, he couldn't break free from the old theory of animal spirits. He could not accept that perhaps the brain itself was responsible for thought. Likewise, although Leonardo discovered a valve in the aorta, he did not recognize that blood circulates around the body, pumped by the heart. Nevertheless, his drawings were the greatest anatomical works that existed in his time. Not only were they anatomically correct, but they displayed his artistic mastery.

Leonardo actually came close to publishing a textbook of anatomy while he was living in Milan, but battles in 1511 drove him from the city and he never quite managed to finish it before his death in 1519. Instead, his drawings remained hidden away until the twentieth century.

Today, the Royal Collection is unveiling the largest ever exhibition of Leonardo’s anatomical drawings at The Queen’s Gallery in Buckingham Palace. They also teamed up with Touch Press to create an app based on the show. All the members of the team brought their A game to this undertaking. Leonardo da Vinci: Anatomy contains a richly informative narrative about the artist's hidden career as an anatomist, written by Martin Clayton, Senior Curator of Prints and Drawings at the Royal Collection. It is illustrated elegantly with Leonardo's drawings, as well as interactive images of human anatomy as we know it now. You can see for yourself just how good his drawings of the heart or uterus were. You can turn arms to see how well Leonardo appreciated the body's biomechanics.

These components are copious but never intrusive, and they always answer the questions the text raises in the reader's mind. Videos from historians and scientists end each chapter--usually I hate these features, but in Leonardo, the talking heads actually have something to say.

The app also contains Leonardo's notebooks themselves. The interface for this part is nothing short of brilliant. You can search through the pages by organ or system. Each page is presented in its original state, scanned to exquisite resolution. Tap the screen, and the app instantly translates the inscrutable notes Leonardo scribbled by his drawings. Each page is also annotated with useful explanations of what Leonardo was contemplating with each image.

Three decades after Leonardo's death, Andreas Vesalius published Fabrica, which has long been considered the first modern work of anatomy. Leonardo da Vinci: Anatomy demonstrates that Fabrica was not the only masterpiece of the body to come out of the Renaissance. Not many ebooks can claim such achievements.

Carl Zimmer writes frequently about science for the New York Times and is the author of 13 books, including A Planet of Viruses.

05/02/2012

Infinity can be cruel. Tablet computers have become so powerful that it's practically impossible to reach the limits of what you can do while creating an ebook. You can embed videos, sprinkle music and voices here and there, let people post a book-inspired thought to Twitter, manipulate a simulated bat, incorporate an encyclopedia of information about chemistry, and on and on. Unfortunately, this virtual infinity of possibilities may leave ebook creators with a virtual infinity of work. Some science ebooks we've reviewed rise to that challenge. They sport a well-integrated collection of features. Other ebooks seem like wild acts of desperation. And others still are acts of wise self-restraint. Yes, you could make a science ebook that does many things poorly. Or you could make a science ebook that does one thing well. One such ebook is Fragile Earth.

Fragile Earth got its start in 2006 as a beautifully disturbing coffee table book published by Collins, filled with satellite images showing how humanity is reworking the planet. "Before" and "after" photographs were paired to show how deforestation, climate change, and other factors have changed the face of Earth. Now Collins has turned it into an app. The YouTube video below gives you a good run-through of its features. The most important one is that instead of putting images side by side, the app does what a book cannot: it lays them on top of each other. You can then use a screen button to slide one picture away to reveal the other one underneath. I'm not quite sure of the visual neuroscience behind this effect, but it works very well. Seeing the same landscape in Alaska, covered by glaciers a few decades ago and now turned to mostly bare earth, is a sobering experience.

Fragile Earth is not perfect, though. The book and the app alike are presented as a way to see how we're changing the planet. But the app is loaded with other images that show natural changes, such as the devastation caused by the eruption of Mount Saint Helens. Combining these images makes no thematic sense; the only thing that joins them together is an elegant slider. There's no introduction where you might find an explanation for what a volcano and the deforested Amazon have in common. Instead, Fragile Earth has short captions that describe the specifics of each set of images but leave you wanting more.

That's too bad, because Fragile Earth illustrates a profoundly important fact about our species. In 2005, Bruce Wilkinson, a University of Michigan geologist, published a paper called "Humans as geologic agents: A deep-time perspective." [free pdf] "Humans are now an order of magnitude more important at moving sediment than the sum of all other natural processes operating on the surface of the planet," Wilkinson concluded. We're also having other huge effects on the planet--acidifying the oceans faster than at any time in the last 300 million years, for example. Humanity isn't just leaving a mark on the Earth you can see from space. It's a mark that will be preserved in the fossil record for millions of years to come.

Before and after pictures can go a long way to showing the magnitude of that change. But without context, they can also oversimplify it. If you were to select a before and after pair of pictures of glaciers in the Himalayas, for example, you would not see a frightening retreat over the past decade. In fact, you might not see any change at all. Such specific cases are ripe for cherry-picking by global warming denialists. This one case does not refute global warming because the evidence of changes on a planetary scale--such as the overall loss of ice from the entire Arctic Ocean--is overwhelming.

Fragile Earth would thus be a better app if its pictures had more context and a more coherent point. But I'm also glad that its creators didn't try to grasp for infinity.

Carl Zimmer writes frequently about science for the New York Times and is the author of 13 books, including A Planet of Viruses.