2019-05-25
The Unlock Project: A Python-based framework for practical brain-computer interface communication “app” development
http://hdl.handle.net/1808/26518
Brumberg, Jonathan S.; Lorenz, Sean D.; Galbraith, Byron V.; Guenther, Frank H.
In this paper we present a framework for reducing the development time needed to create applications for non-invasive brain-computer interfaces (BCIs). Our framework primarily focuses on facilitating rapid software “app” development akin to current efforts in consumer portable computing (e.g., smartphones and tablets). This is accomplished by handling intermodule communication without direct user or developer implementation, relying instead on a core subsystem that communicates standard, internal data formats. We also provide a library of hardware interfaces for common mobile EEG platforms for immediate use in BCI applications. A use-case example is described in which a user with amyotrophic lateral sclerosis participated in an electroencephalography-based BCI protocol developed with the proposed framework. We show that our software environment is capable of running in real time, with updates occurring 50–60 times per second with limited computational overhead (5 ms system lag), while providing accurate data acquisition and signal analysis.
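The decoupling described above, where app modules exchange standard internal messages through a core subsystem rather than calling each other directly, can be sketched as a minimal publish-subscribe core. This is an illustrative assumption about the architecture, not the Unlock Project's actual API; the class and method names here are hypothetical.

```python
"""Minimal sketch of a core subsystem that routes standard internal
messages between app modules, so apps never wire point-to-point
connections themselves. Hypothetical; not the Unlock API."""
from collections import defaultdict


class Core:
    """Routes messages by topic; modules register handlers with the
    core instead of depending on each other directly."""

    def __init__(self):
        self._handlers = defaultdict(list)

    def subscribe(self, topic, handler):
        """Register a callable to receive payloads published on topic."""
        self._handlers[topic].append(handler)

    def publish(self, topic, payload):
        """Deliver payload to every handler subscribed to topic."""
        for handler in self._handlers[topic]:
            handler(payload)


# Example: a signal-analysis module publishes decoded commands; a
# display app consumes them without knowing who produced them.
core = Core()
received = []
core.subscribe("decoded_command", received.append)
core.publish("decoded_command", {"choice": 2})
```

With this pattern, adding a new app is a matter of subscribing to the topics it needs, which is one way a framework can cut per-app development time.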
2012-08-01
Brain-Machine Interfaces for Real-time Speech Synthesis
http://hdl.handle.net/1808/26517
Guenther, Frank H.; Brumberg, Jonathan S.
This paper reports on studies involving brain-machine interfaces (BMIs) that provide near-instantaneous audio feedback from a speech synthesizer to the BMI user. In one study, neural signals recorded by an intracranial electrode implanted in a speech-related region of the left precentral gyrus of a human volunteer suffering from locked-in syndrome were transmitted wirelessly across the scalp and used to drive a formant synthesizer, allowing the user to produce vowels. In a second, pilot study, a neurologically normal user was able to drive the formant synthesizer with imagined movements detected using electroencephalography. Our results support the feasibility of neural prostheses that have the potential to provide near-conversational synthetic speech for individuals with severely impaired speech output.
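The core idea of driving a formant synthesizer from decoded neural activity can be illustrated as a mapping from a low-dimensional decoded control signal onto the first two formant frequencies, the parameters a formant synthesizer needs to render a vowel. The ranges and vowel targets below are illustrative assumptions, not values from the paper.

```python
"""Hypothetical sketch: map a normalized 2-D decoded neural state onto
(F1, F2) formant frequencies for a formant speech synthesizer.
All numeric values are illustrative, not taken from the study."""

# Approximate formant targets (Hz) for three English vowels.
VOWELS = {"i": (270.0, 2290.0), "a": (730.0, 1090.0), "u": (300.0, 870.0)}


def decode_to_formants(x, y, f1_range=(200.0, 900.0), f2_range=(800.0, 2400.0)):
    """Linearly map a decoded state (x, y) in [0, 1]^2 to (F1, F2) in Hz.

    In a real-time system this runs on every decoder update, and the
    resulting (F1, F2) pair is sent to the synthesizer for immediate
    auditory feedback.
    """
    f1 = f1_range[0] + x * (f1_range[1] - f1_range[0])
    f2 = f2_range[0] + y * (f2_range[1] - f2_range[0])
    return f1, f2
```

Because the map is continuous, small changes in the decoded state produce small changes in the synthesized sound, which is what makes near-instantaneous auditory feedback usable for learning.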
2011-08-01
A Noninvasive Brain-Computer Interface for Real-Time Speech Synthesis: The Importance of Multimodal Feedback.
http://hdl.handle.net/1808/26516
Brumberg, Jonathan S.; Pitt, Kevin M.; Burnison, Jeremy Dean
We conducted a study of a motor imagery brain-computer interface (BCI) using electroencephalography to continuously control a formant frequency speech synthesizer with instantaneous auditory and visual feedback. Over a three-session training period, sixteen participants learned to control the BCI for production of three vowel sounds (/i/ [heed], /ɑ/ [hot], and /u/ [who'd]) and were split into three groups: those receiving unimodal auditory feedback of synthesized speech, those receiving unimodal visual feedback of formant frequencies, and those receiving multimodal, audio-visual (AV) feedback. Audio feedback was provided by a formant frequency artificial speech synthesizer, and visual feedback was given as a 2-D cursor on a graphical representation of the plane defined by the first two formant frequencies. We found that combined AV feedback led to the greatest performance in terms of percent accuracy, distance to target, and movement time to target compared with either form of unimodal feedback. These results indicate that performance is enhanced when multimodal feedback is meaningful for the BCI task goals, rather than serving as a generic biofeedback signal of BCI progress.
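The performance measures named above can be made concrete as simple computations in the (F1, F2) plane: distance from a produced formant pair to a vowel target, and a hit-rate over trials. This is a hedged sketch; the function names and the tolerance radius are illustrative assumptions, not the study's scoring procedure.

```python
"""Sketch of performance measures in the formant plane: Euclidean
distance to a vowel target and a per-target hit rate. All names and
tolerance values are illustrative assumptions."""
import math


def distance_to_target(produced, target):
    """Euclidean distance (Hz) between produced and target (F1, F2)."""
    return math.hypot(produced[0] - target[0], produced[1] - target[1])


def trial_accuracy(trajectories, target, radius_hz=150.0):
    """Fraction of trials whose final formant sample lands within
    radius_hz of the target vowel; each trajectory is a list of
    (F1, F2) samples over one trial."""
    hits = sum(distance_to_target(traj[-1], target) <= radius_hz
               for traj in trajectories)
    return hits / len(trajectories)


# Example: one trial ends on the /i/ target, one ends far away.
target_i = (270.0, 2290.0)
acc = trial_accuracy([[(270.0, 2290.0)], [(800.0, 1500.0)]], target_i)
```

Movement time, the third measure, would simply be the number of samples from trial start until the trajectory first enters the target radius, divided by the update rate.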
2018-04-01
Examining sensory ability, feature matching, and assessment-based adaptation for a brain-computer interface using the steady-state visually evoked potential
http://hdl.handle.net/1808/26515
Brumberg, Jonathan S.; Nguyen, Anh; Pitt, Kevin M.; Lorenz, Sean D.
PURPOSE: We investigated how overt visual attention and oculomotor control influence successful use of a visual feedback brain-computer interface (BCI) for accessing augmentative and alternative communication (AAC) devices in a heterogeneous population of individuals with profound neuromotor impairments. BCIs are often tested within a single patient population, limiting generalization of results. This study focuses on examining individual sensory abilities with an eye toward possible interface adaptations to improve device performance.
METHODS: Five individuals with a range of neuromotor disorders participated in a four-choice BCI control task involving the steady-state visually evoked potential. The BCI graphical interface was designed to simulate a commercial AAC device to examine whether an integrated device could be used successfully by individuals with neuromotor impairment.
RESULTS: All participants were able to interact with the BCI, and the highest performance was found for participants able to employ an overt visual attention strategy. For participants with visual deficits due to impaired oculomotor control, effective performance increased after accounting for mismatches between the graphical layout and participant visual capabilities.
CONCLUSION: As BCIs are translated from research environments to clinical applications, the assessment of BCI-related skills will help facilitate proper device selection and give individuals who use BCIs the greatest likelihood of immediate and long-term communicative success. Overall, our results indicate that adaptations can be an effective strategy for reducing barriers and increasing access to BCI technology. These efforts should be directed by comprehensive assessments that match individuals to the most appropriate device to support their complex communication needs.
Implications for Rehabilitation: Brain-computer interfaces using the steady-state visually evoked potential can be integrated with an augmentative and alternative communication device to provide access to language and literacy for individuals with neuromotor impairment. Comprehensive assessments are needed to fully understand the sensory, motor, and cognitive abilities of individuals who may use brain-computer interfaces, for proper feature matching and selection of the most appropriate device, including optimization of device layouts and control paradigms. Oculomotor impairments negatively impact brain-computer interfaces that use the steady-state visually evoked potential, but modifications that place interface stimuli and communication items in the intact visual field can improve outcomes.
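A four-choice SSVEP task like the one described assigns each selection item a distinct flicker frequency and classifies the frequency with the strongest response in the user's EEG. The sketch below shows a minimal frequency-power version of that idea (real SSVEP systems typically use more robust methods such as canonical correlation analysis); the sampling rate and flicker frequencies are illustrative assumptions.

```python
"""Minimal SSVEP classification sketch: pick the candidate flicker
frequency with the most spectral power in an EEG segment. Illustrative
only; parameter values are assumptions, not the study's settings."""
import numpy as np


def classify_ssvep(eeg, fs, flicker_freqs):
    """Return the flicker frequency whose nearest FFT bin has the most
    power in the 1-D EEG segment sampled at fs Hz."""
    spectrum = np.abs(np.fft.rfft(eeg)) ** 2
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    powers = [spectrum[np.argmin(np.abs(freqs - f))] for f in flicker_freqs]
    return flicker_freqs[int(np.argmax(powers))]


# Synthetic check: 2 s of a 12 Hz "SSVEP" plus noise should be
# classified as the 12 Hz item among four candidate frequencies.
rng = np.random.default_rng(0)
fs = 250
t = np.arange(0, 2.0, 1.0 / fs)
eeg = np.sin(2 * np.pi * 12 * t) + 0.1 * rng.standard_normal(t.size)
choice = classify_ssvep(eeg, fs, [8.57, 10.0, 12.0, 15.0])
```

The paper's adaptation finding maps naturally onto this scheme: for a user with an oculomotor impairment, the same classifier is kept, but the flickering items are repositioned into the intact visual field so the attended stimulus can actually drive the measured response.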
This is an Accepted Manuscript of an article published by Taylor & Francis in Disability and Rehabilitation: Assistive Technology on 01/31/2018, available online: http://www.tandfonline.com/10.1080/17483107.2018.1428369.