Researchers: Myounghoon “Philart” Jeon, PI
Sponsor: National Institutes of Health through the National Robotics Initiative
Amount of Support: $258,362
Abstract: The purpose of this research is to design novel forms of musical interaction, combined with physical activities, for improving the social interactions and emotional responses of children with autism spectrum disorder (ASD). Individuals with ASD commonly show deficits in emotional and social interaction. We propose to address two aspects of this issue: physio-musical stimuli for initiating engagement, and empathizing for deepening interaction and thereby enhancing a child’s emotional and social interactions. Children with or without ASD between the ages of 5 and 10 may join this study.
Summary: In the United States, the rapid growth in the population of children with autism spectrum disorder (ASD) has revealed a shortage of accessible therapies addressing emotion and social interaction for these children. A number of approaches have been explored, including several robotic therapeutic systems, but most of these efforts have centered on speech interaction and task-based turn-taking scenarios. Unfortunately, the diversity of the autism spectrum is so vast that many current approaches, novel and intriguing as they are, remain insufficient to provide parameterized therapeutic tools and methods.
To overcome this challenge, new techniques must be developed that enable robots to interact autonomously and to effectively stimulate the emotional and social interactivity of children. We build on recent studies revealing strong overlap in the premotor cortex among the neural domains for music, emotion, and motor behavior. We propose that musical interaction and activities can provide a new therapeutic domain for effectively developing children’s emotional and social interaction. Of key importance within this proposed work is providing the robotic system with capabilities to monitor the emotional and interactive states of children and to deliver effective musical and behavioral stimuli in response. Our major research questions are: (1) What kinds of music-based signals and music-coordinated activities can play effective roles in encouraging emotional interaction and social engagement? (2) How can robotic learning of human behaviors during musical activities increase human–robot interaction and reinforce emotional and social engagement? (3) What metrics can effectively measure and evaluate changes in emotional and social engagement through musical interaction activities?
Intellectual Merits: Designing and evaluating core techniques that fuse music, emotion, and socio-physical interaction will advance affective computing, robotics, and engineering psychology theory, as well as provide guidelines for developing an effective therapeutic robot companion. Through this research we will identify the musical components most crucial for stimulating children’s emotional and interactive intentions, and the most effective ways to correlate those components with motion behaviors to maximize children’s social engagement and development. The findings of the proposed work will also inform the design of interactive scenarios for natural, creative therapy with an individualized and systematic approach.
Broader Impacts: The successful development of our music-based framework could create a new domain of pediatric therapy and greatly increase the ability of robots to interact with children in a safe and natural manner. While this approach targets therapeutic design for children with ASD, the foundation of our interactive and adaptive reinforcement scheme can be extended to other pediatric populations and developmental studies. We plan to incorporate this knowledge and these approaches into courses on robotics and affective computing. Furthermore, we plan to encapsulate many of these ideas into an “outreach” workshop for underrepresented students. Undergraduate research projects and demonstrations are expected to inspire the next generation of engineers and scientists to envision a robot-coexistent world.
Main Results: We have developed an emotion-based robotic motion platform that encapsulates spatial components as well as emotional dimensions in robotic movement behaviors. The Romo robot and DARwIn-OP were the first platforms used in our project (Figure 1). The robotic characters and interfaces were newly designed to accommodate the characteristics of children with ASD while satisfying our software design specifications. We have also developed a multi-modal analysis system to monitor and quantify children’s physical engagement and emotional reactions through a facial-expression recognition app (Figure 2), a Kinect-based movement analysis system, a voice analysis app, and music analysis software. All systems are designed for mobile computing environments, so our therapy sessions can be deployed anywhere with adequate space and connectivity. We have implemented a sonification server (Figure 3) with 600 emotional sound cues covering 30 emotional keywords, along with a real-time sonification generation platform. We have also carried out extensive user studies with American and Korean participants to validate our sounds and compare cultural differences; the results show commonalities in emotional sound preferences between the two countries, allowing us to narrow down our set of emotional sound cues for further research. Finally, we have created robotic scenarios for a pilot study (Typical day of Senses & Four Seasons) and will expand these scenarios with diverse genres of music and motion library sets.
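To make the sonification server's keyword-to-cue mapping concrete, the following is a minimal illustrative sketch, not the project's actual implementation (whose internals are not detailed in this summary). It assumes a simple lookup table from emotional keywords to pre-rendered sound-cue files, reflecting the reported ratio of 600 cues to 30 keywords (about 20 per keyword); all names and file paths are hypothetical.

```python
import random

# Hypothetical lookup table: each emotional keyword maps to ~20 sound cues,
# matching the reported ratio of 600 cues across 30 keywords. File paths
# are illustrative placeholders, not the project's actual assets.
EMOTION_CUES = {
    "joy":     ["cues/joy_%02d.wav" % i for i in range(20)],
    "sadness": ["cues/sadness_%02d.wav" % i for i in range(20)],
    "calm":    ["cues/calm_%02d.wav" % i for i in range(20)],
    # ... the remaining emotional keywords would follow the same pattern.
}

def select_cue(keyword, rng=None):
    """Return one sound-cue path for the given emotional keyword.

    A seeded random.Random can be passed for reproducible selection;
    unknown keywords raise ValueError rather than failing silently.
    """
    rng = rng or random.Random()
    try:
        return rng.choice(EMOTION_CUES[keyword])
    except KeyError:
        raise ValueError("unknown emotional keyword: %r" % keyword)

# Example: pick a cue for "joy" with a fixed seed for reproducibility.
print(select_cue("joy", random.Random(0)))
```

In a real-time setting, a server process would receive the recognized emotional state from the multi-modal analysis system and stream the selected cue to the robot; the narrowing of the cue set described above would correspond to pruning each keyword's list to the cues validated in the cross-cultural study.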