Tag: jeon

    Myounghoon “Philart” Jeon (CLS/CS) and his collaborators recently published “Robotic Motion Learning Framework to Promote Social Engagement”

    Myounghoon “Philart” Jeon (CLS/CS) and his collaborators recently published “Robotic Motion Learning Framework to Promote Social Engagement” (http://www.mdpi.com/2076-3417/8/2/241) in the journal Applied Sciences, per Tech Today.


    Associate Professor Jeon Receives Korea Automobile Testing and Research Institute Grant

    Philart’s grant is a four-year award with a total budget of $350,000 from the Korea Automobile Testing & Research Institute. Two graduate students will be supported by the grant each year. The project is titled “Development of the safety assessment technique for take-over in automated vehicles.”

    The goal of the project is to design and evaluate intelligent auditory interactions that improve safety and user experience in automated vehicles. Research tasks include developing a driving simulator for automated driving, modeling driver states in automated vehicles, designing and evaluating discrete auditory alerts for safety, and developing real-time sonification systems for overall user experience. Congratulations, Philart!


    Deep Inside the Mind Music Machine Lab

    Cognitive science is a relatively new interdisciplinary field weaving neuroscience, psychology, linguistics, anthropology, and philosophy with computer science. Cognitive scientist Myounghoon “Philart” Jeon, whose nickname translates to “love art,” studies how the human mind reacts to technology. Inside a unique research lab at Michigan Tech, Philart teaches digital devices how to recognize and react to human emotion.

    Art Meets Technology

    Humans respond to technology, but can technology respond to humans? That’s the goal of the Mind Music Machine Lab. Together with Computer Science and Human Factors students, Philart looks at both physical and internal states of artists at work. He asks: What goes on in an artist’s brain while making art?


    Reflective tracking markers are attached to performers (dancers, visual artists, robots, and even puppies), and 12 infrared cameras capture their movement, which the system visualizes and sonifies. From the movements, the immersive Interactive Sonification Platform (iISoP) detects four primary emotions: happy, sad, angry, and content. The result is a system that recognizes movement and emotion to generate real-time music and art.
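    The article does not describe the lab’s software, but a minimal Python sketch of the general idea (turning tracked motion features into one of the four emotion labels and then into coarse musical parameters) might look like the code below. The feature names, thresholds, and emotion-to-sound mappings are illustrative assumptions, not the actual iISoP pipeline.

        # Illustrative sketch only: map simple motion features from tracked markers
        # to one of the four emotions named above, then to coarse sound parameters.
        # Feature names, thresholds, and mappings are assumptions, not iISoP code.

        from dataclasses import dataclass


        @dataclass
        class MotionFeatures:
            speed: float      # average marker speed (arbitrary units)
            expansion: float  # how spread out the body posture is (0..1)


        def classify_emotion(f: MotionFeatures) -> str:
            """Toy rule-based classifier over the four emotions named in the article."""
            if f.speed > 1.0 and f.expansion > 0.5:
                return "happy"
            if f.speed > 1.0:
                return "angry"
            if f.expansion > 0.5:
                return "content"
            return "sad"


        def sonify(emotion: str) -> dict:
            """Map each emotion to coarse musical parameters (tempo in BPM, mode)."""
            table = {
                "happy":   {"tempo": 140, "mode": "major"},
                "angry":   {"tempo": 150, "mode": "minor"},
                "content": {"tempo": 90,  "mode": "major"},
                "sad":     {"tempo": 60,  "mode": "minor"},
            }
            return table[emotion]


        if __name__ == "__main__":
            frame = MotionFeatures(speed=1.4, expansion=0.7)
            emotion = classify_emotion(frame)
            print(emotion, sonify(emotion))  # -> happy {'tempo': 140, 'mode': 'major'}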

    Robotic Friends for Kids with Autism

    Just as technology may not pick up subtle emotional cues, children with autism spectrum disorder (ASD) have difficulty with social interaction and with verbal and nonverbal communication. In this National Institutes of Health-funded project, Jeon uses technology in the form of interactive robots to provide feedback and stimuli to children with ASD.


    Studies indicate that autistic children prefer simpler animals and robots to complex humans. “These children have difficulty expressing emotions. And robots can help express and read emotion,” he says.

    Robots are programmed to say phrases with different emotional inflections. Cameras and a Microsoft Kinect sensor detect the child’s facial expressions, and sound cues reinforce what each emotion is. All the while, parents and clinicians monitor the interaction between the child and robot.
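    As a rough illustration of this kind of interaction loop, the Python sketch below has the robot express an emotion, reads the child’s (simulated) facial expression, and plays a reinforcing sound cue. The phrases, emotion labels, and stub functions are assumptions for illustration, not the project’s actual software or hardware interfaces.

        # Illustrative sketch of a robot-child interaction loop like the one described
        # above. Phrases, labels, and I/O stubs are assumptions; a real system would
        # drive a physical robot and read expressions from cameras and a Kinect sensor.

        import random

        EMOTIONS = ["happy", "sad", "angry", "content"]

        PHRASES = {
            "happy": "I'm so glad to see you!",
            "sad": "I feel a little down today.",
            "angry": "That makes me upset!",
            "content": "I feel calm and relaxed.",
        }


        def robot_say(emotion: str) -> None:
            # Stand-in for text-to-speech with emotional inflection on the robot.
            print(f"[robot, {emotion} voice] {PHRASES[emotion]}")


        def read_child_expression() -> str:
            # Stand-in for facial-expression recognition from cameras / Kinect.
            return random.choice(EMOTIONS)


        def play_sound_cue(emotion: str) -> None:
            # Stand-in for the auditory cue that labels the detected emotion.
            print(f"[sound cue] This is what '{emotion}' sounds like.")


        def session(turns: int = 3) -> None:
            """One short session: the robot expresses an emotion, the child's
            expression is read, and a sound cue reinforces the detected emotion."""
            for _ in range(turns):
                robot_say(random.choice(EMOTIONS))
                play_sound_cue(read_child_expression())
                # Parents and clinicians would monitor each exchange in the study.


        if __name__ == "__main__":
            session()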



    Faculty and Students Attend Conference

    Myounghoon “Philart” Jeon (CLS/CS) and his seven students attended the 8th International Conference on Automotive User Interfaces and Interactive Vehicular Applications (AutomotiveUI) Oct. 24-26 at the University of Michigan.

    Jeon and his students hosted a tutorial on “In-Vehicle Auditory Interactions: Design and Application of Auditory Displays, Speech, Sonification and Music.” Jeon and international collaborators hosted a workshop on “Ethically Inspired User Interfaces for Decision Making in Automated Driving.”

    They had two demos at the conference: “Listen to Your Drive: An In-vehicle Sonification Prototyping Tool for Driver State and Performance Data” and “Development Tool for Rapid Evaluation of Eyes-free In-Vehicle Gesture Controls.”

    This travel was supported by CLS, CS, ICC, MTTI, and HMC.


    Associate Professor Myounghoon “Philart” Jeon received a research award from Hyundai Motor Company

    Associate Professor Philart Jeon received a research award from Hyundai Motor Company in the amount of $130,236.

    The project is entitled “Novel In-vehicle Interaction Design and Evaluation.”

    Philart and his students will investigate the effectiveness of an in-vehicle control system and culture-specific sound preferences.