Myounghoon “Philart” Jeon (CLS/CS) and his collaborators recently published “Robotic Motion Learning Framework to Promote Social Engagement” (http://www.mdpi.com/2076-3417/8/2/241) in the journal Applied Sciences. – from Tech Today
Myounghoon (Philart) Jeon (CLS/CS) and his three graduate students are attending the Human Factors and Ergonomics Society 2017 International Annual Meeting, which runs from Monday through today in Austin, Texas.
Philart’s grant is a four-year award with a total budget of $350,000 from the Korea Automobile Testing & Research Institute. Two graduate students will be supported by the grant each year. The project is titled “Development of the safety assessment technique for take-over in automated vehicles.”
The goal of the project is to design and evaluate intelligent auditory interactions that improve safety and user experience in automated vehicles. Research tasks include developing a driving simulator for automated driving, modeling driver states in automated vehicles, designing and evaluating discrete auditory alerts for safety purposes, and developing real-time sonification systems for the overall user experience. Congratulations Philart!
Cognitive science is a relatively new interdisciplinary field weaving neuroscience, psychology, linguistics, anthropology, and philosophy with computer science. Cognitive scientist Myounghoon “Philart” Jeon, whose nickname translates to “love art,” studies how the human mind reacts to technology. Inside a unique research lab at Michigan Tech, Philart teaches digital devices how to recognize and react to human emotion.
Art Meets Technology
Humans respond to technology, but can technology respond to humans? That’s the goal of the Mind Music Machine Lab. Together with Computer Science and Human Factors students, Philart looks at both physical and internal states of artists at work. He asks: What goes on in an artist’s brain while making art?
Reflective tracking markers are attached to performance artists (who have included dancers, visual artists, robots, and even puppies), and 12 infrared cameras track their movement, which the system then visualizes and sonifies. From the movements, the immersive Interactive Sonification Platform (iISoP) detects four primary emotions: happy, sad, angry, and content. The result is a system that recognizes movement and emotion to generate real-time music and art.
Robotic Friends for Kids with Autism
Just as technology may not pick up subtle emotional cues, children with autism spectrum disorder (ASD) have difficulties with social interaction and with verbal and nonverbal communication. In this National Institutes of Health-funded project, Jeon uses technology in the form of interactive robots to provide feedback and stimuli to children with ASD.
Studies indicate autistic children prefer simple animals and robots to complex humans. “These children have difficulty expressing emotions. And robots can help express and read emotion,” he says.
The robots are programmed to say phrases with different emotional inflections. Cameras and a Microsoft Kinect detect the children’s facial expressions, and sound cues reinforce what each emotion is. All the while, parents and clinicians monitor the interaction between child and robot.
Myounghoon “Philart” Jeon (CLS/CS) and his seven students attended the 8th International Conference on Automotive User Interfaces and Interactive Vehicular Applications (AutomotiveUI) Oct. 24-26 at the University of Michigan.
Jeon and his students hosted a tutorial, “In-vehicle Auditory Interactions: Design and Application of Auditory Displays, Speech, Sonification, and Music.” Jeon and international collaborators also hosted a workshop, “Ethically Inspired User Interfaces for Decision Making in Automated Driving.”
They had two demos at the conference: “Listen to Your Drive: An In-vehicle Sonification Prototyping Tool for Driver State and Performance Data” and “Development Tool for Rapid Evaluation of Eyes-free In-Vehicle Gesture Controls.”
This travel was supported by CLS, CS, ICC, MTTI, and HMC.
Associate Professor Philart Jeon received a research award from Hyundai Motor Company in the amount of $130,236. The project is entitled “Novel In-vehicle Interaction Design and Evaluation.” Philart and his students will investigate the effectiveness of an in-vehicle control system and culture-specific sound preferences.
Zhenlin received an NSF research award with a total budget of $375,000. This is a three-year project titled “CSR: Small: Effective Sampling-Based Miss Ratio Curves: Theory and Practice.” In this project, Zhenlin and his students will use miss ratio curves (MRCs), which relate cache miss ratio to cache size, to model working sets and cache locality. The project develops a new cache locality theory for constructing MRCs efficiently and then applies it to several caching and memory management systems.
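As background on what an MRC is (this sketch is illustrative only and is not code from Zhenlin’s project): under an LRU cache, an access hits if its stack distance, the number of distinct items touched since the last access to the same item, is smaller than the cache size, so an exact MRC can be read off a reference trace. Sampling-based approaches like those the project studies approximate this far more cheaply than the naive computation below.

```python
import math
from collections import Counter

def lru_miss_ratio_curve(trace, max_size):
    """Compute an exact LRU miss ratio curve from a reference trace.

    An access hits at cache size c iff its stack distance is < c;
    first-touch accesses are cold misses at every size.
    """
    stack = []             # most recently used item at the front
    distances = Counter()  # stack distance -> number of accesses
    for item in trace:
        if item in stack:
            d = stack.index(item)  # distinct items since last use
            stack.pop(d)
        else:
            d = math.inf           # cold (compulsory) miss
        distances[d] += 1
        stack.insert(0, item)
    n = len(trace)
    # Miss ratio at size c = fraction of accesses with distance >= c.
    return [sum(cnt for d, cnt in distances.items() if d >= c) / n
            for c in range(1, max_size + 1)]

# A cyclic trace "a b c a b c" thrashes until the cache holds all
# three items, so the miss ratio only drops at size 3.
mrc = lru_miss_ratio_curve(list("abcabc"), 3)
print(mrc)  # [1.0, 1.0, 0.5]
```

The quadratic cost of `stack.index` on long traces is exactly why efficient MRC construction, e.g. by sampling, is a research problem in its own right.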
Tim received a DoD Army Research Office (ARO) research award with a budget of $99,779 for the first year. This is also a three-year project, with a total budget of $1,066,799. The project is titled “Multisensor Analysis and Algorithm Development for Detection and Classification of Buried and Obscured Targets.” Tim and his students will develop new algorithms to detect and classify buried objects, an important research area for ARO.
Congratulations Zhenlin, Tim, and Philart! Thanks for the great job!
Myounghoon “Philart” Jeon’s (CLS/CS) students Jason Sterkenburg, Steven Landry, and Joshua Johnson won the Best Student Paper Award at the International Conference on Auditory Display (ICAD), held at the Australian National University in Canberra, Australia, July 3 through July 7. The paper was titled “Towards an in-vehicle sonically-enhanced gesture control interface: A pilot study.” – from Tech Today
Professors Keith Vertanen, Nilufer Onder, Scott Kuhl, and Philart Jeon have been identified as four of only 85 instructors who received an exceptional “Average of 7 Dimensions” student evaluation score during the Spring 2016 semester.
Their scores are in the top 10% of similarly sized sections across all courses and sections on campus. These achievements reflect the tremendous effort and commitment Keith, Nilufer, Philart, and Scott have put into their teaching.
Myounghoon “Philart” Jeon received the Best Paper Award at the EAI (European Alliance for Innovation) conference on art and technology integration. Philart’s paper is about “Aesthetic Computing” and its case projects.
Congratulations Philart on a job well done!