Rachael Burns

Rachael Burns (née Bevill) joined the Haptic Intelligence Department in 2017 as a Whitaker International Fellow. Now a Ph.D. student, she continues to develop her current project, a Haptic Empathetic Robot Animal. She has research experience in human-robot interaction, neurodevelopmental disorders, and tissue engineering.

Rachael received her BSc and MSc degrees in Biomedical Engineering from the George Washington University in 2015 and 2017, respectively. Her engineering education in the nation's capital further shaped her research through an interest in medical device accessibility and regulatory law.

Rachael conducted her graduate research with Dr. Chung Hyuk Park in the Assistive Robotics and Telemedicine (ART-Med) Lab at GW. As an undergraduate, she studied 3D-bioprinting with Dr. Lijie Grace Zhang in the GW Bioengineering Laboratory for Nanomedicine and Tissue Engineering.

Online Articles

The Pelton Competition for Outstanding Senior Project: Rachael's team received 3rd place out of the senior student body for their work. An article from the 2016 edition of GW Synergy (link) describes the competition in greater detail (the article appears on page 17 of the magazine).

Recognitions

Congressman Frank Guinta of New Hampshire commended Rachael during a U.S. House general session. Video and transcript versions (transcript at the center-top of the first page) are available.

Smart with Heart: The Human Face of the Fourth Industrial Revolution

A program produced in March 2017 by the Arirang TV documentary series Arirang Special. This episode highlights the work being done by researchers across the globe to improve quality of life using robotics and assistive technology. Rachael and her colleagues at the ART-Med lab are featured beginning at 6:52.

The Pelton Competition for Outstanding Senior Project

A fifteen-minute business pitch from Rachael and her teammates explaining their undergraduate capstone project. The group's presentation begins at 13:15. Rachael's team received 3rd place out of the entire engineering senior class for their work on an improved communication device for users with special needs.

Congressman Frank Guinta recognizes Rachael Bevill

Congressman Frank Guinta of New Hampshire commends Rachael for her leadership and biomedical engineering focus during a U.S. House general session. As the 2015 New Hampshire Cherry Blossom Princess, Rachael served as a goodwill ambassador in Washington DC, meeting with international ambassadors and elected officials. Studying engineering in DC inspired Rachael to pursue research with both disability advocacy and regulatory law in mind.

Children with autism often endure sensory overload, may be nonverbal, and have difficulty understanding and relaying emotions. These experiences result in heightened stress during social interaction. Animal-assisted intervention has been found to improve the behavior of children with autism during social interaction, but live animal companions are not always feasible. We are thus in the process of designing a robotic animal to mimic some successful characteristics of animal-assisted intervention while trying to improve on others. The overarching hypothesis of this research is that an appropriately designed robot animal can reduce stress in children with autism and empower them to engage in social interaction.

Imitation is a powerful component of communication between people, and it has important implications for improving the quality of interaction in the field of human–robot interaction (HRI). This paper discusses a novel framework designed to improve human–robot interaction through robotic imitation of a participant’s gestures. In our experiment, a humanoid robotic agent socializes with and plays games with a participant. For the experimental group, the robot additionally imitates one of the participant’s novel gestures during a play session. We hypothesize that the robot’s use of imitation will increase the participant’s openness towards engaging with the robot. Experimental results from a user study of 12 subjects show that post-imitation, experimental subjects displayed a more positive emotional state, had higher instances of mood contagion towards the robot, and interpreted the robot to have a higher level of autonomy than their control group counterparts did. These results point to an increased participant interest in engagement fueled by personalized imitation during interaction.

2017

In International Conference on Intelligent Robots and Systems (IROS) 2017, September 2017 (inproceedings)

Abstract

This paper discusses a novel framework designed to provide sensory stimulation to children with Autism Spectrum Disorder (ASD). The setup consists of multi-sensory stations to stimulate visual/auditory/olfactory/gustatory/tactile/vestibular senses, together with a robotic agent that navigates through each station responding to the different stimuli. We hypothesize that the robot’s responses will help children learn acceptable ways to respond to stimuli that might otherwise trigger sensory overload. Preliminary results from a pilot study conducted to examine the effectiveness of such a setup were encouraging and are described briefly in this text.

In International Conference on Intelligent Robots and Systems (IROS) 2017, September 2017 (inproceedings)

Abstract

This abstract (and poster) is a condensed version of Burns' Master's thesis and related journal article. It discusses the use of imitation via robotic motion learning to improve human-robot interaction. It focuses on the preliminary results from a pilot study of 12 subjects.
We hypothesized that the robot's use of imitation would increase the user's openness towards engaging with the robot. Post-imitation, experimental subjects displayed a more positive emotional state, had higher instances of mood contagion towards the robot, and interpreted the robot to have a higher level of autonomy than their control group counterparts. These results point to an increased user interest in engagement fueled by personalized imitation during interaction.

This paper discusses a novel framework designed to improve human-robot interaction through robotic imitation of the user's gestures. The setup consists of a humanoid robotic agent that socializes with and plays games with the user. For the experimental group, the robot also imitates one of the user's novel gestures during a play session. We hypothesize that the robot's use of imitation will increase the user's openness towards engaging with the robot. Preliminary results from a pilot study of 12 subjects are promising in that post-imitation, experimental subjects displayed a more positive emotional state, had higher instances of mood contagion towards the robot, and interpreted the robot to have a higher level of autonomy than their control group counterparts. These results point to an increased user interest in engagement fueled by personalized imitation during interaction.

In 2017 GW Research Days, Department of Biomedical Engineering Posters and Presentations, April 2017 (inproceedings)

Abstract

The number of middle-aged to elderly patients receiving shoulder surgery is increasing. However, statistically, very few of these patients perform the necessary at-home physical therapy regimen they are prescribed post-surgery. This results in longer recovery times and/or incomplete healing. We propose the use of a robotic therapist, with customized training and encouragement regimens, to increase physical therapy adherence and improve the patient’s recovery experience.

In 2017 GW Research Days, Department of Biomedical Engineering Posters and Presentations, April 2017 (inproceedings)

Abstract

We aim to use motion learning to teach a robot to imitate people's unique gestures. Our robot, ROBOTIS-OP2, can ultimately use imitation to practice social skills with children with autism.
In this abstract, two methods of motion learning were compared: dynamic motion primitives with weighted least squares (DMP with WLS) and dynamic motion primitives with Gaussian mixture regression (DMP with GMR). Movements with sharp turns were most accurately reproduced using DMP with GMR. Additionally, more states are required to accurately recreate more complex gestures.
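To make the DMP-with-WLS variant concrete, here is a rough one-dimensional sketch: it fits the forcing-term weights of a dynamic motion primitive to a demonstrated trajectory with one weighted-least-squares regression per Gaussian basis function, then replays the motion. This is an illustrative simplification, not the authors' implementation; the function names, gains, and basis-width heuristic are all assumptions.

```python
import numpy as np

def train_dmp(y, dt, n_basis=20, alpha=25.0, beta=6.25, alpha_s=4.0):
    """Fit 1-D DMP forcing-term weights to a demonstration via
    weighted least squares (one local regression per basis)."""
    y = np.asarray(y, dtype=float)
    t = np.arange(len(y)) * dt
    tau = t[-1]                                  # movement duration
    yd = np.gradient(y, dt)
    ydd = np.gradient(yd, dt)
    y0, g = y[0], y[-1]
    s = np.exp(-alpha_s * t / tau)               # canonical phase, 1 -> ~0
    # forcing term implied by the demonstration
    f_target = tau**2 * ydd - alpha * (beta * (g - y) - tau * yd)
    c = np.exp(-alpha_s * np.linspace(0, 1, n_basis))  # centers in phase
    h = n_basis**1.5 / c                         # width heuristic (assumed)
    psi = np.exp(-h * (s[:, None] - c)**2)       # (T, n_basis) activations
    xi = s * (g - y0)                            # phase-scaled regressor
    # weighted least squares, solved independently per basis function
    w = (psi * (xi * f_target)[:, None]).sum(0) / \
        ((psi * (xi**2)[:, None]).sum(0) + 1e-10)
    return w, c, h, y0, g, tau

def rollout(w, c, h, y0, g, tau, dt, steps,
            alpha=25.0, beta=6.25, alpha_s=4.0):
    """Euler-integrate the learned DMP to reproduce the motion."""
    y, yd, s = y0, 0.0, 1.0
    out = []
    for _ in range(steps):
        psi = np.exp(-h * (s - c)**2)
        f = (psi @ w) / (psi.sum() + 1e-10) * s * (g - y0)
        ydd = (alpha * (beta * (g - y) - tau * yd) + f) / tau**2
        y, yd = y + yd * dt, yd + ydd * dt
        s -= alpha_s * s / tau * dt
        out.append(y)
    return np.array(out)
```

A gesture with sharp turns would demand either more basis functions here or, per the abstract's finding, the GMR variant, which models the forcing term with a full Gaussian mixture instead of fixed basis centers.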

2016

Workshop paper (5 pages) at the RO-MAN Workshop on Behavior Adaptation, Interaction and Learning for Assistive Robotics, August 2016 (misc)

Abstract

In this full workshop paper, we discuss the positive impacts of robot, music, and imitation therapies on children with autism. We also discuss the use of Laban Movement Analysis (LMA) to identify emotion through movement and posture cues. We present our preliminary studies of the "Five Senses" game that our two robots, Romo the penguin and Darwin Mini, partake in. Using an LMA-focused approach (enabled by our Kinect-based skeletal tracking algorithm), we find that our participants show increased frequency of movement and speed when the game has a musical accompaniment. Therefore, participants may have increased engagement with our robots and game if music is present. We also begin exploring motion learning for future work.
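The movement-frequency and speed measures mentioned above can be sketched as simple features computed from tracked skeleton joints. This is a minimal illustration, not the paper's algorithm; the function name, threshold, and array layout are assumptions.

```python
import numpy as np

def motion_features(joints, dt, speed_thresh=0.05):
    """Simple LMA-inspired movement features from skeleton tracking.

    joints: (frames, n_joints, 3) array of joint positions in metres,
    e.g. from a Kinect skeletal tracker sampled every dt seconds.
    Returns the mean per-joint speed (m/s) and the fraction of frames
    in which any joint moves faster than speed_thresh."""
    joints = np.asarray(joints, dtype=float)
    vel = np.diff(joints, axis=0) / dt           # finite-difference velocity
    speed = np.linalg.norm(vel, axis=2)          # (frames-1, n_joints)
    mean_speed = float(speed.mean())
    moving_fraction = float((speed.max(axis=1) > speed_thresh).mean())
    return mean_speed, moving_fraction
```

Comparing these features across sessions with and without music is one way to quantify the increased movement the study reports.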

It is known that children with autism have difficulty with emotional communication. As the population of children with autism increases, it is crucial we create effective therapeutic programs that will improve their communication skills. We present an interactive robotic system that delivers emotional and social behaviors for multi-sensory therapy for children with autism spectrum disorders. Our framework includes emotion-based robotic gestures and facial expressions, as well as tracking and understanding the child’s responses through Kinect motion capture.

In this video submission, we are introduced to two robots, Romo the penguin and Darwin Mini. We have programmed these robots to perform a variety of emotions through facial expression and body language, respectively. We aim to use these robots with children with autism, to demonstrate safe emotional and social responses in various sensory situations.

In Proceedings of the Eleventh ACM/IEEE International Conference on Human Robot Interaction (HRI), pages: 421-422, March 2016 (inproceedings)

Abstract

In this abstract, we present the overarching goal of our interactive robotic framework: to teach emotional and social behavior to children with autism spectrum disorders via multi-sensory therapy.
We introduce our robot characters, Romo and Darwin Mini, and the "Five Senses" scenario they will undergo. This sensory game will develop the children's interest and model safe and appropriate reactions to typical sensory overload stimuli.
