Watch this video, and witness a breakthrough in the field of brain-machine interfaces. Researchers have been improving upon BrainGate — a brain-machine interface that allows users to control an external device with their minds — for years, but what you see here is the most advanced incarnation of the implant system to date. It is nothing short of remarkable.

Starting at around 3:10, you can watch Cathy Hutchinson — who has been paralyzed from the neck down for 15 years — drink her morning coffee by controlling a robotic arm using only her mind. According to research published in today's issue of Nature, Hutchinson is one of two quadriplegic patients — both of them stroke victims — who have learned to control the device by means of the BrainGate neural implant. The New York Times reports that it's the first published demonstration that humans with severe brain injuries can control a sophisticated prosthetic arm with such a system:

Scientists have predicted for years that this brain-computer connection would one day allow people with injuries to the brain and spinal cord to live more independent lives. Previously, researchers had shown that humans could learn to move a computer cursor with their thoughts, and that monkeys could manipulate a robotic arm.

The technology is not yet ready for use outside the lab, experts said, but the new study is an important step forward, providing dramatic evidence that brain-controlled prosthetics are within reach.

"It is a spectacular result, in many respects," said John Kalaska, a neuroscientist at the University of Montreal who was not involved in the study, "and really the logical next step in the development of this technology. This is the kind of work that has to be done, and it's further confirmation of the feasibility of using this kind of approach to give paralyzed people some degree of autonomy."

Hutchinson's control over the robotic arm is not perfect, but it's damn impressive. As the video points out, the arm is currently programmed to compensate for lurches and unexpected collisions by entering a "safety mode" and ceasing movement, but future versions will presumably be capable of finer, more delicate motions.

What remains to be seen is how such precision will be achieved. One of the things that makes the arm and hand movements of able-bodied people so precise is their ability to actually feel objects in the real world and to sense the position of their limbs in space (a sense known as proprioception). The interface between our brains and our limbs is therefore bi-directional: we not only reach for something with our hands, we also receive sensory feedback that lets us adjust our movements on the fly, giving rise to greater dexterity and more purposeful, calculated motion.

A bi-directional brain-machine-brain interface may sound like blue-sky technology, but it was successfully demonstrated in monkeys less than a year ago. Could the technology shown in the video up top be married to such an interface? That's probably still a ways off... but it certainly seems like the right direction to be heading.