Keith Vertanen and Scott Kuhl Awarded $499K NSF Grant

Scott Kuhl
Keith Vertanen

Keith Vertanen (HCC), assistant professor of computer science, and Scott Kuhl (HCC), associate professor of computer science, are principal investigators of a recently funded three-year National Science Foundation grant for their project, “CHS: Small: Rich Surface Interaction for Augmented Environments.” The expected funding over the three years is $499,552.

Vertanen and Kuhl are members of Michigan Tech’s Institute of Computing and Cybersystems (ICC) Center for Human-Centered Computing. A 2018 ICC research seed grant funded by ECE alumnus Paul Williams supported some of the preliminary results used in the successful proposal. More information about the Williams Seed Grant can be found here: https://blogs.mtu.edu/icc/2019/07/16/appropriating-everyday-surfaces-for-tap-interaction/.

A related video can be found here: https://youtu.be/sF7aeXMfsIQ.

Abstract: Virtual Reality (VR) and Augmented Reality (AR) head-mounted displays are increasingly being used in computing-related activities such as data visualization, education, and training. Currently, VR and AR devices lack efficient and ergonomic ways to perform common desktop interactions such as pointing-and-clicking and entering text. The goal of this project is to transform flat, everyday surfaces into rich interactive surfaces. For example, a desk or a wall could be turned into a virtual keyboard. Flat surfaces not only afford haptic feedback, but also provide an ergonomic advantage: a place to rest the arms. This project will develop a system in which microphones are placed on a surface to sense when and where a tap has occurred. Further, the system aims to distinguish different types of touch interaction, such as tapping with a fingernail, tapping with a finger pad, or making a short swipe gesture.
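
To give a sense of how an array of surface microphones can localize a tap, the sketch below illustrates a simple time-difference-of-arrival fit. It is a hypothetical illustration only, not the project's actual system: the microphone positions, the wave speed in the surface, and the solver setup are all assumptions made for the example.

```python
import numpy as np
from scipy.optimize import least_squares

# Hypothetical setup: four microphones at known (x, y) positions (meters)
# on a desk surface, and an assumed wave speed in the surface material.
MIC_POSITIONS = np.array([[0.0, 0.0], [0.6, 0.0], [0.6, 0.4], [0.0, 0.4]])
WAVE_SPEED = 500.0  # m/s; a placeholder, real values depend on the material

def tdoa_residuals(tap_xy, arrival_times):
    """Residuals for a time-difference-of-arrival fit.

    The tap's absolute emission time is unknown, so we compare the
    differences in measured arrival times against the differences in
    propagation delays predicted from the candidate tap location.
    """
    dists = np.linalg.norm(MIC_POSITIONS - tap_xy, axis=1)
    predicted_delays = dists / WAVE_SPEED
    # Use microphone 0 as the reference for both measured and predicted delays.
    measured_diffs = arrival_times - arrival_times[0]
    predicted_diffs = predicted_delays - predicted_delays[0]
    return measured_diffs[1:] - predicted_diffs[1:]

def locate_tap(arrival_times, initial_guess=(0.3, 0.2)):
    """Estimate the (x, y) tap position from per-microphone arrival times."""
    result = least_squares(tdoa_residuals, x0=np.asarray(initial_guess),
                           args=(np.asarray(arrival_times),))
    return result.x

# Example: synthesize arrival times for a tap at (0.45, 0.25) and recover it.
true_tap = np.array([0.45, 0.25])
times = np.linalg.norm(MIC_POSITIONS - true_tap, axis=1) / WAVE_SPEED
print(locate_tap(times))  # approximately [0.45 0.25]
```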

This project will investigate different machine learning algorithms for producing a continuous coordinate for taps on a surface, along with associated error bars. Using the confidence of sensed taps, the project will investigate ways to intelligently inform aspects of the user interface, e.g., guiding the autocorrection algorithm of a virtual keyboard decoder. Initially, the project will investigate sensing via an array of surface-mounted microphones and design “surface algorithms” to determine and compare the location accuracy of finger taps on the virtual keyboard. These algorithms will experiment with different models, including an existing time-of-flight model, a new model based on Gaussian Process Regression, and a baseline classifier using support vector machines. For all models, the project will investigate the impact of the amount of training data from other users and of varying the amount of adaptation data from the target user. The project will compare surface microphones with approaches that use cameras and wrist-based inertial sensors. The project will generate human-factors results on the accuracy, user preference, and ergonomics of interacting in midair versus on a rigid surface. By examining different sensors, input surfaces, and interface designs, the project will map the design space for future AR and VR interactive systems. The project will disseminate software and data allowing others to outfit tables or walls with microphones to enable rich interactive experiences.
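
One of the models named above, Gaussian Process Regression, naturally produces both a continuous coordinate and an uncertainty estimate (the "error bars" that could inform a keyboard decoder's autocorrection). The following minimal sketch, using scikit-learn, shows that idea with placeholder features and labels; the feature representation, data, and kernel choice are assumptions for illustration, not the project's actual pipeline.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Placeholder training data: each row is an acoustic feature vector extracted
# from one tap (e.g., inter-microphone time differences or spectral features),
# and the target is the known (x, y) tap position from a calibration session.
rng = np.random.default_rng(0)
X_train = rng.normal(size=(200, 6))             # hypothetical 6-D tap features
y_train = rng.uniform(0.0, 0.6, size=(200, 2))  # hypothetical (x, y) labels in meters

# RBF kernel plus a noise term; hyperparameters are fit to the training data.
kernel = RBF(length_scale=1.0) + WhiteKernel(noise_level=1e-2)
gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X_train, y_train)

# For a new tap, predict a continuous coordinate and a standard deviation that
# could serve as the per-tap confidence passed to a virtual keyboard decoder.
x_new = rng.normal(size=(1, 6))
mean_xy, std_xy = gpr.predict(x_new, return_std=True)
print("predicted tap location:", mean_xy[0])
print("uncertainty (std dev):", std_xy[0])
```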