Deep Inside the Mind Music Machine Lab

Cognitive science is a relatively new interdisciplinary field weaving neuroscience, psychology, linguistics, anthropology, and philosophy with computer science. Cognitive scientist Myounghoon “Philart” Jeon, whose nickname translates to “love art,” studies how the human mind reacts to technology. Inside a unique research lab at Michigan Tech, Philart teaches digital devices how to recognize and react to human emotion.

Art Meets Technology

Humans respond to technology, but can technology respond to humans? That’s the goal of the Mind Music Machine Lab. Together with Computer Science and Human Factors students, Philart looks at both the physical and internal states of artists at work. He asks: What goes on in an artist’s brain while making art?


Reflective tracking markers are attached to performance artists, a group that has included dancers, visual artists, robots, and even puppies, and 12 infrared cameras capture their movement. From those movements, the immersive Interactive Sonification Platform (iISoP) detects four primary emotions: happy, sad, angry, and content. The result is a system that recognizes movement and emotion to generate real-time music and art.
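To make that pipeline concrete, here is a minimal, illustrative sketch of the idea: simple movement features from one tracked marker are mapped to one of the four emotions, and the emotion is mapped to musical parameters. The features, thresholds, and sonification values are invented for illustration and are not the lab’s actual iISoP model.

```python
# Illustrative sketch only: a toy rule-based mapping from marker motion to one of
# the four emotions named in the article, then to simple musical parameters.
# Features, thresholds, and values are assumptions, not the lab's real system.
from dataclasses import dataclass
import math

@dataclass
class Frame:
    x: float
    y: float
    z: float   # position of one reflective marker, in metres, for one captured frame

def classify(frames: list[Frame], fps: float = 120.0) -> str:
    """Guess an emotion from average speed and net vertical drift of one marker."""
    if len(frames) < 2:
        return "content"
    dist = sum(
        math.dist((a.x, a.y, a.z), (b.x, b.y, b.z))
        for a, b in zip(frames, frames[1:])
    )
    speed = dist * fps / (len(frames) - 1)          # rough average speed in m/s
    rise = frames[-1].y - frames[0].y               # net upward movement
    if speed > 1.5:
        return "happy" if rise >= 0 else "angry"    # fast and rising vs. fast and falling
    return "content" if rise >= 0 else "sad"        # slow and rising vs. slow and falling

def sonify(emotion: str) -> dict:
    """Map an emotion to toy musical parameters (tempo in BPM, mode)."""
    return {
        "happy":   {"tempo": 140, "mode": "major"},
        "angry":   {"tempo": 150, "mode": "minor"},
        "content": {"tempo": 90,  "mode": "major"},
        "sad":     {"tempo": 60,  "mode": "minor"},
    }[emotion]

if __name__ == "__main__":
    walk = [Frame(0.01 * i, 1.0 + 0.002 * i, 0.0) for i in range(120)]
    mood = classify(walk)
    print(mood, sonify(mood))
```

In the lab itself, twelve cameras track many markers per performer, so a real classifier would draw on far richer features than the speed and drift used here.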

Robotic Friends for Kids with Autism

Just as technology may not pick up subtle emotional cues, children with autism spectrum disorder (ASD) can have difficulty with social interaction and with verbal and nonverbal communication. In this National Institutes of Health-funded project, Jeon uses technology in the form of interactive robots to provide feedback and stimuli to children with ASD.


Studies indicate that children with ASD prefer simple animals and robots to complex humans. “These children have difficulty expressing emotions. And robots can help express and read emotion,” he says.

The robots are programmed to say phrases with different emotional inflections. Cameras and a Microsoft Kinect detect the children’s facial expressions, and sound cues reinforce what each emotion is. All the while, parents and clinicians monitor the interaction between child and robot.
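As a rough illustration of that interaction loop, the sketch below pairs a detected facial expression with a spoken phrase and a reinforcing sound cue. The expression detector is a stand-in (the real project uses cameras and a Microsoft Kinect, whose APIs are not shown), and the labels, phrases, and file names are hypothetical.

```python
# Illustrative sketch only: pairing a detected facial expression with a spoken
# phrase and a reinforcing sound cue, as the article describes at a high level.
# The detector is stubbed out; labels, phrases, and cue file names are made up.
import random

PHRASES = {
    "happy":   ("I can see you are smiling!",   "bright_chime.wav"),
    "sad":     ("You look a little sad.",       "soft_tone.wav"),
    "angry":   ("It is okay to feel upset.",    "low_drone.wav"),
    "neutral": ("Hello, let us play a game.",   "gentle_bell.wav"),
}

def detect_expression() -> str:
    """Stand-in for the camera/Kinect pipeline: returns a made-up label."""
    return random.choice(list(PHRASES))

def respond(expression: str) -> tuple[str, str]:
    """Choose the phrase the robot speaks and the sound cue that reinforces it."""
    return PHRASES.get(expression, PHRASES["neutral"])

if __name__ == "__main__":
    for _ in range(3):
        seen = detect_expression()
        phrase, cue = respond(seen)
        # A session log like this is the kind of record parents and clinicians could review.
        print(f"child looks {seen}: robot says {phrase!r}, plays {cue}")
```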
