On June 12 in São Paulo's Corinthians Arena, Juliano Pinto wore a robotic suit that allowed him to kick a soccer ball a short distance. The 29-year-old Brazilian, completely paralyzed in the lower trunk and limbs since a car accident eight years ago, delivered the ceremonial first kick of the World Cup using a brain-machine interface.

Back in 2011, SmartPlanet described a brain-machine-brain interface developed by Nicolelis and colleagues that translated brain signals into movements and returned artificial touch feedback to the brain. Macaque monkeys manipulated objects on a computer display using a mind-controlled virtual arm while "feeling" the textures of the objects on the screen. The work was published in the journal Nature.

Nicolelis told Nature News at the time that he and the Walk Again consortium hoped to build a robotic suit that could restore mobility and the ability to sense texture to severely paralyzed patients -- and to demonstrate it at the 2014 World Cup in his homeland, with the opening kick delivered by a young Brazilian with paralysis.

A few years, more publications, and nearly $15 million from the Brazilian government later, here we are. But scientifically, the project is a departure, Science reports. Nicolelis originally intended the exoskeleton to read signals from implanted electrodes but decided instead to use a noninvasive EEG cap; previous results, he says, haven't been worth the risks of implantation.

The EEG cap's electrodes pick up nerve impulses from the brain; the signals are amplified, decoded by processors, and relayed to hydraulics in the exoskeleton strapped to the legs, The Scientist explains.
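The staged flow The Scientist describes -- electrodes pick up faint signals, processors amplify and decode them, and the result drives the hydraulics -- can be sketched as a simple loop. Everything here (gain, thresholds, command names) is an invented illustration, not the Walk Again consortium's actual software:

```python
# Illustrative sketch of the EEG -> amplify -> decode -> actuate pipeline
# described in the article. All values and names are assumptions made for
# illustration; the real system is far more sophisticated.

def amplify(raw_samples, gain=1000.0):
    """EEG signals are tiny (microvolt scale); boost them before decoding."""
    return [s * gain for s in raw_samples]

def decode(amplified):
    """Toy decoder: map the mean signal level to a high-level command."""
    mean = sum(amplified) / len(amplified)
    if mean > 0.5:
        return "start_walking"
    if mean < -0.5:
        return "stop_walking"
    return "hold"

def actuate(command):
    """Relay the decoded command to the exoskeleton's hydraulics (stubbed)."""
    return f"hydraulics: {command}"

# One pass through the pipeline on a burst of fake raw samples (~1 mV).
raw = [0.0008, 0.0012, 0.0009]
print(actuate(decode(amplify(raw))))  # prints "hydraulics: start_walking"
```

A real decoder would classify patterns across many channels over time, but the division of labor -- sensing, decoding, actuation -- is the same.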

In a Q&A with Science, Nicolelis describes how a person moves the exoskeleton and what they can control:

The person has to imagine movements, and these movements are translated into commands that enact the movements in the exo. It's a concept that we published way back in 2002 called shared control. Part of the higher order decision is done by the brain, and the low-level movement is enacted by the robot. [High-order decisions include] "start walking," "stop walking," "accelerate," "slow down," "turn left," "turn right," "kick the ball."
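The "shared control" split Nicolelis describes -- the brain supplies only high-order decisions while the robot enacts the low-level movement -- can be sketched as a lookup from commands to motion primitives. The command names come from his quote; the motion primitives are hypothetical placeholders:

```python
# Hypothetical sketch of shared control: the brain issues a high-order
# command, and the robot expands it into low-level movements. The
# primitive names below are invented for illustration.

LOW_LEVEL = {
    "start walking": ["shift_weight", "swing_left_leg", "swing_right_leg"],
    "stop walking":  ["plant_feet", "balance"],
    "turn left":     ["shift_weight", "pivot_left"],
    "turn right":    ["shift_weight", "pivot_right"],
    "kick the ball": ["plant_left_leg", "swing_right_leg"],
}

def enact(high_order_command):
    """The robot, not the brain, works out the movement details."""
    try:
        return LOW_LEVEL[high_order_command]
    except KeyError:
        raise ValueError(f"unknown command: {high_order_command}")

print(enact("kick the ball"))  # prints ['plant_left_leg', 'swing_right_leg']
```

The point of the design is bandwidth: a noisy EEG decoder only has to distinguish a handful of discrete intentions, while balance and stepping are delegated to the machine.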

Other mind-machine interfaces and EEG-based exoskeletons exist, and Nicolelis's critics didn't think the demo advanced the science. Others are concerned that the high-profile demonstration might give the public unrealistic expectations.

"We are going to demonstrate a very beautiful thing, but the demonstration for the scientific community will come in the papers that will come afterward," he replies. "We are not preparing this for peer review. This is a show for the world."

Janet Fang has written for Nature, Discover and the Point Reyes Light. She is currently a lab technician at Lamont-Doherty Earth Observatory. She holds degrees from the University of California, Berkeley and Columbia University. She is based in New York.