Electrooculogram-based system for computer control using a multiple-feature classification model

This paper discusses the development of a system for computer-aided communication through automated analysis and processing of electrooculogram (EOG) signals. Disease or trauma can leave a person unable to communicate through standard means such as speech or typing. For people with neurodegenerative disorders such as amyotrophic lateral sclerosis (ALS), also known as Lou Gehrig's disease, eye movement tends to be one of the last remaining active muscle capabilities, so there is a need for eye-movement-based communication systems. To meet this need, the Telepathix system was designed to accept three eye movement commands (looking left, looking right, and looking straight ahead) to navigate a virtual keyboard. Using a ternary virtual keyboard layout and a multiple-feature classification model, a typing speed of 6 letters per minute was achieved.
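The abstract does not specify the keyboard's selection logic, but a ternary layout driven by three gaze commands is naturally a three-way recursive narrowing of the character set. The sketch below is an assumption-laden illustration, not the paper's implementation: it assumes the alphabet is split into three near-equal groups at each step, with each gaze command selecting one group until a single character remains.

```python
import math

def split3(chars):
    """Split a character list into three near-equal contiguous groups."""
    n = len(chars)
    a = math.ceil(n / 3)
    b = math.ceil((n - a) / 2)
    return chars[:a], chars[a:a + b], chars[a + b:]

def select_letter(commands, alphabet="abcdefghijklmnopqrstuvwxyz "):
    """Narrow the alphabet with a sequence of gaze commands
    ('left', 'center', 'right') until one character remains.
    Returns None if the command sequence is too short to decide."""
    group = list(alphabet)
    for cmd in commands:
        left, center, right = split3(group)
        group = {"left": left, "center": center, "right": right}[cmd]
        if len(group) == 1:
            return group[0]
    return None

# A 27-character set needs at most ceil(log3(27)) = 3 commands per letter.
print(select_letter(["left", "left", "left"]))    # first character: 'a'
print(select_letter(["right", "right", "right"])) # last character: ' '
```

Under this assumed scheme, each letter costs about three gaze commands, so the reported 6 letters per minute would correspond to roughly one recognized command every 3 to 4 seconds, a plausible pace for EOG classification with rest intervals.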