Wednesday, May 9, 2007

Media lab hopes to create humans, the next version

By Elizabeth Cooney, Globe Correspondent

Dan Ellsey brought down the house at the MIT Media Lab's symposium today on augmenting the human body.

Using an infrared tracker and HyperScore software developed by MIT music and media professor Tod Machover, the 33-year-old man with cerebral palsy used head movements to perform his electronic music composition "My Eagle Song." Its harmonies were translated into rippling waves of color on the screen behind him.

There were cheers and even tears in the audience, which had just seen athlete, model and actress Aimee Mullins say she can't imagine trading her life with two artificial legs for legs of flesh and bones.

"People say I have no legs, but I say I have 10," she said, pointing to a row of prosthetics, including a carved wooden one, a carbon fiber set based on cheetah dynamics and the set with stiletto heels that she was wearing. "My interaction with them has transformed me."

And then Hugh Herr, MIT professor, developer of human-powered artificial legs and also a bilateral amputee, scaled a climbing wall on stage.

"Doctors said I would never climb again," said Herr. "They were wrong."

It was a dizzying end to a day of thinking differently about how technology can forge "new minds, new bodies, new identities," as the conference was billed.

Media Lab director Frank Moss called it "hacking the human" when he introduced today's session, designed to show how scientists are melding human and machine to invent a better future not just for people who have lost the ability to walk or see or interpret facial cues, but for all people.

"Today we'll discuss designs to unleash an era of human adaptability to forever change our notions of abilities and disabilities," he said.

John Hockenberry, former NBC News journalist and distinguished fellow at the Media Lab, set the tone, saying he was looking for an upgrade for himself.

He has used a wheelchair for 30 years after being paralyzed in a car accident when he was 19. He said he has no trouble integrating the machine that helps him get around with the person he has become in this second life. Typewriters were created as a tool to help the blind, he reminded the audience.

Other speakers included neurologist Oliver Sacks, who sounded a note of caution. He told the story of a congenitally blind man whose life was turned into turmoil when he was surgically given sight but his brain could not interpret it.

MIT professor Rosalind Picard wired several audience members to get feedback from their facial expressions when her talk was boring them. The work has implications for people with autism, like her former student whose mother once told Picard that he learned math as easily as most people read social cues, and learned social cues with as much difficulty as most people learn math.

Deb Roy, an MIT professor, has lived, along with his wife and 21-month-old son, under near-constant video surveillance in their home. Videocams in the ceiling provide minute-by-minute details of how his son has learned the basics of speech – the first time speech acquisition has been analyzed so closely.

John Donoghue of Brown University and Cyberkinetics Neurotechnology Systems showed the familiar but still astonishing images of quadriplegic Matthew Nagle moving a computer cursor with his thoughts.

Architect Michael Graves made an eloquent plea for simple solutions to problems born of design done without much thought.

"It doesn't cost much," he said. "It's just a matter of using your mind and the strength of your convictions."

At the end of the day, after the music ended and the cheers subsided, Hockenberry wheeled on stage on a Segway, which he said was a hacked version adapted to fit the user. This was not something dumped on people, he said, but a device that users fashioned to fit their own needs and evolving identities.

"There is no such thing as normal," Hockenberry said. "With devices such as this, I'm liberated. I'm set free."