Alisa Brownlee, ATP, CAPS blog offers recent articles and web information on ALS, assistive technology--augmentative and alternative communication (AAC), computer access, and other electronic devices that can impact and improve the quality of life for people with ALS.
Email--abrownlee@alsa-national.org.
Any views or opinions presented on this blog are solely those of the author and do not necessarily represent those of the ALS Association.


Thursday, June 14, 2012

Researchers at the Federal Institute of Technology in Lausanne have developed a system that enables paralyzed patients to control a robot using only their thoughts. The system features a cap that transmits the electrical signals emitted by the brain, when a user imagines performing a task, to a computer, where the signal is almost instantly decoded. However, background noise has emerged as a major challenge in brain-computer interface research, says Lausanne's Jose Millan. The researchers solved this problem by programming the system to work in a way similar to the brain's subconscious. Once a command such as "walk forward" has been sent, the computer will execute it until it receives a command to stop or the robot encounters an obstacle. This allows the user to focus on other things instead of always having to focus on telling the robot to walk forward. The Lausanne team's research appears to mark an advance in the field, says University of Washington professor Rajesh Rao, "especially if the system can be used by the paraplegic person outside the laboratory."
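The control idea described above — a decoded command keeps running until the user overrides it or the robot's own sensors force a stop — can be sketched in a few lines. This is a hypothetical illustration of that "shared control" loop, not the Lausanne team's actual software; all names and the simulated command stream are assumptions.

```python
def shared_control(commands, obstacles):
    """Simulate the shared-control loop described in the article.

    `commands` is a per-step stream of decoded brain commands
    (None = the user is thinking about something else), and
    `obstacles` is a per-step stream of booleans from the robot's
    sensors. Returns the action the robot takes at each step.
    """
    current = "stop"
    actions = []
    for cmd, obstacle in zip(commands, obstacles):
        if cmd is not None:
            current = cmd       # user issued a new command; it persists
        if obstacle:
            current = "stop"    # safety override: robot halts on its own
        actions.append(current)
    return actions

# The user issues "walk forward" once, then thinks about other things;
# the robot keeps walking until an explicit "stop" arrives.
print(shared_control(["walk forward", None, None, "stop"],
                     [False, False, False, False]))
# → ['walk forward', 'walk forward', 'walk forward', 'stop']
```

The key design point is that the user's attention is only needed at decision moments; between them, the last command persists, much as the article's comparison to the brain's subconscious suggests.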

Monday, June 11, 2012

PHOENIX — A Paradise Valley teenager has won a national award for his work in developing technology that helps people with amyotrophic lateral sclerosis, or Lou Gehrig's disease.

Ben Mattinson, 18, who graduated recently from Phoenix Country Day School, won the FIRST Future Innovator Award in May.

Mattinson spent more than a year writing software code that allows immobilized people to write and surf the Internet. He won the award for creating the EyeWriter 2.1, an improvement over an existing eye-tracking device.

Nearly everything he did was self-taught.

"I was looking it up on the Internet and teaching myself as I went along," said Mattinson, who will attend the Massachusetts Institute of Technology this fall. "That probably delayed things a bit."

His interest in writing computer code was piqued in eighth grade, when he joined the Phoenix Country Day School's robotics team and started fiddling around with programming the robot. That year, the team beat several high school teams in competition, and that inspired him even more. Ben's father, Rob Mattinson, is the club mentor.

A mechanical engineer, the elder Mattinson said his son's work "is way beyond me."

A parent at the school, who is also a doctor at Barrow Neurological Institute, gave Mattinson the idea of working on the EyeWriter and put him in contact with Dr. Suraj Muley, director of the Neuromuscular Program at Barrow. Muley works with people who have ALS, a debilitating disease that leads to progressive weakness and eventual paralysis of the limbs. It also affects speech, which many patients lose.

Because ALS typically spares eye movement, the EyeWriter eye-tracking technology was developed as a way for patients to communicate. But commercial versions of devices that use the technology are very expensive — as much as $20,000.

"My goal was to develop a comparable product at a tenth of the cost," Mattinson said. And he wanted it to do more.

So last summer he spent hours every day writing code and working on a prototype to improve the original EyeWriter software, which is open source, meaning it's free on the Internet. His first result ended up being a camera mounted on a pair of glasses — which he realized would be too bulky and impractical for a person who is immobilized.

After more work and feedback from the Barrow doctors, Mattinson had his current version ready by late winter, when he created the video that he submitted for the FIRST award. The award goes to the entrant who best solves a complex engineering challenge facing the world today.

Mattinson's device, run off a laptop, is a 21-inch monitor atop a stand, with two small arms on either side to which several small infrared lights are attached, along with a PlayStation 3 camera. The whole setup, including the laptop, cost less than $2,000.

The camera tracks the eye, discerning the colored iris from the white of the eyeball. The user's eyes control the mouse cursor, a green dot on the screen. The user "drags" the cursor by looking around the screen, and can fix the cursor on an icon to left-click or double-click, like a regular hand-held mouse. He also added a virtual keyboard.
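Fixing the gaze on an icon to trigger a click is commonly implemented as "dwell clicking": when the estimated gaze point stays within a small radius for long enough, a click fires. The sketch below is an illustrative version of that idea, not Mattinson's actual code; the radius and dwell-time parameters are assumptions.

```python
def dwell_click(gaze_points, radius=20, dwell_frames=5):
    """Return the indices of frames where a 'click' fires: the gaze
    has stayed within `radius` pixels of an anchor point for
    `dwell_frames` consecutive samples.
    """
    clicks = []
    anchor = None
    count = 0
    for i, (x, y) in enumerate(gaze_points):
        if anchor is not None and (x - anchor[0]) ** 2 + (y - anchor[1]) ** 2 <= radius ** 2:
            count += 1              # gaze is still near the anchor
        else:
            anchor = (x, y)         # gaze moved: restart the dwell timer
            count = 1
        if count == dwell_frames:
            clicks.append(i)        # fire a click at this frame
            anchor = None           # require a fresh dwell for the next click
            count = 0
    return clicks

# Five near-identical gaze samples in a row produce one click.
steady = [(100, 100), (101, 100), (100, 101), (100, 100), (100, 100)]
print(dwell_click(steady))  # → [4]
```

In practice the same mechanism distinguishes a single click from a double click by how long the dwell continues, which matches the article's description of fixing the cursor on an icon to left-click or double-click.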

The Barrow doctors gave Mattinson feedback, suggesting he use technology that could "predict" what word a person might type based on the first few letters, similar to the iPhone's auto-correct function. In Mattinson's EyeWriter 2.1, several word choices are given and the writer can choose one with his or her eyes.
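The word-suggestion feature described above amounts to prefix matching against a vocabulary. Here is a minimal sketch of that idea under stated assumptions: the vocabulary list and the function name are illustrative, and a real system would rank candidates by word frequency rather than list order.

```python
def suggest(prefix, vocabulary, n=3):
    """Return up to `n` vocabulary words that start with `prefix`
    (case-insensitive). If the vocabulary is ordered by frequency,
    the most likely completions come first.
    """
    p = prefix.lower()
    return [w for w in vocabulary if w.lower().startswith(p)][:n]

# Illustrative vocabulary, roughly ordered by how common each word is.
vocab = ["the", "they", "there", "then", "hello", "help", "helmet"]
print(suggest("the", vocab))  # → ['the', 'they', 'there']
print(suggest("he", vocab))   # → ['hello', 'help', 'helmet']
```

Offering several candidates at once, as the article describes, lets the writer pick the intended word with a single gaze selection instead of typing every letter — a large savings when each keystroke requires a dwell click.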

Next, Mattinson will work on eliminating the lights, which can fatigue the eyes after prolonged use, and simplifying the setup.

He also wants to test the device with patients at Barrow.

As part of the award, Mattinson will meet with a venture-capital company in California, where he will present his invention and learn what it takes to bring it to market.
___
Information from: The Arizona Republic, http://www.azcentral.com