According to the researchers, current eye-tracking input systems for people with ALS or other motor impairments are expensive, unreliable in sunlight, and require frequent recalibration and bulky, relatively immobile setups, while the low-tech alternative, e-tran letter boards, offers only limited communication bandwidth.

“To mitigate the drawbacks of these two status quo approaches, we created GazeSpeak, an eye gesture communication system that runs on a smartphone, and is designed to be low-cost, robust, portable, and easy-to-learn, with a higher communication bandwidth than an e-tran board,” the researchers wrote in the abstract of a paper scheduled to be presented in May at the Conference on Human Factors in Computing Systems (CHI) in Colorado, US.

“GazeSpeak can interpret eye gestures in real time, decode these gestures into predicted utterances, and facilitate communication, with different user interfaces for speakers and interpreters,” the researchers said.
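To give a sense of how decoding eye gestures into "predicted utterances" can work, the sketch below uses a T9-style predictive scheme: letters are split into four groups (as on an e-tran board), each eye gesture direction selects a group, and a gesture sequence is matched against a word list. The specific letter grouping, direction names, and vocabulary here are illustrative assumptions, not details from the paper.

```python
# Illustrative sketch of gesture-to-utterance decoding, in the spirit of
# T9-style predictive text. The four-way letter grouping and word list are
# assumptions for demonstration, not GazeSpeak's actual scheme.

GROUPS = {
    "up": set("abcdef"),
    "down": set("ghijkl"),
    "left": set("mnopqrs"),
    "right": set("tuvwxyz"),
}

# Toy vocabulary; a real system would rank candidates with a language model.
VOCAB = ["hello", "help", "water", "yes", "no", "thanks"]


def letter_group(ch):
    """Return the gesture direction whose letter group contains ch."""
    for direction, letters in GROUPS.items():
        if ch in letters:
            return direction
    return None


def decode(gestures):
    """Return vocabulary words consistent with the gesture sequence."""
    return [
        word for word in VOCAB
        if len(word) == len(gestures)
        and all(letter_group(c) == g for c, g in zip(word, gestures))
    ]


# "help" spelled by group: h -> down, e -> up, l -> down, p -> left
print(decode(["down", "up", "down", "left"]))  # -> ['help']
```

Because several words can share one gesture sequence, a practical decoder would also use word frequencies or context to rank the candidates it offers to the interpreter.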

GazeSpeak uses artificial intelligence to convert eye movements into speech, and runs on the listener’s device so that they can follow what is being said in real time, New Scientist reported.

The app will be available on the Apple App Store before the conference in May, according to the report.