
Keith Vertanen Is PI on $225K NSF Grant, “Improving Mobile Device Input for Users Who are Blind or Low Vision”

Keith Vertanen

Keith Vertanen (CS/ICC-HCC) is the principal investigator on a three-year project that has received a $225,663 research and development grant from the National Science Foundation. The project is entitled, “CHS: Small: Collaborative Research: Improving Mobile Device Input for Users Who are Blind or Low Vision.”

Abstract: Smartphones are an essential part of our everyday lives. But for people with visual impairments, basic tasks like composing text messages or browsing the web can be prohibitively slow and difficult. The goal of this project is to develop accessible text entry methods that will enable people with visual impairments to enter text at rates comparable to those of sighted people. This project will design new algorithms and feedback methods for today’s standard text entry approaches of tapping on individual keys, gesturing across keys, or dictating via speech. The project aims to: 1) help users avoid errors by enabling more accurate input via audio and tactile feedback, 2) help users find errors by providing audio and visual annotation of uncertain portions of the text, and 3) help users correct errors by combining the probabilistic information from the original input, the correction, and approximate information about an error’s location. Improving text entry methods for people who are blind or have low vision will enable them to use their mobile devices more effectively for work and leisure. Thus, this project represents an important step toward achieving equity for people with visual impairments.

This project will contribute novel interface designs to the accessibility and human-computer interaction literature. It will advance the state of the art in mobile device accessibility by: 1) studying text entry accessibility for people with low vision in addition to people who are blind, 2) studying and developing accessible gesture typing input methods, and 3) studying and developing accessible speech input methods. This project will produce design guidelines, feedback methods, input techniques, recognition algorithms, user study results, and software prototypes that will guide improvements to research and commercial input systems for users who are blind or have low vision. Further, the project’s work on the error correction and revision process will improve the usability and performance of touchscreen and speech input methods for everyone.
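The third aim above, combining probabilistic information from the original input, the correction, and an approximate error location, can be illustrated with a toy noisy-channel sketch. The candidate words, probabilities, and function names below are invented for illustration and are not from the project:

```python
import math

def best_correction(candidates, input_lik, correction_lik, location_prior):
    """Combine three log-probability sources to rank candidate corrections:
    the original (possibly erroneous) input, the user's correction input,
    and a prior over where in the text the error likely occurred."""
    def score(cand):
        text, err_pos = cand
        return (math.log(input_lik[text])
                + math.log(correction_lik[text])
                + math.log(location_prior[err_pos]))
    return max(candidates, key=score)[0]

# Toy example: the decoder originally favored "cat", but a correction
# gesture favors "car", and the error is believed to be near position 2.
candidates = [("car", 2), ("cap", 2), ("cat", 0)]
input_lik = {"car": 0.3, "cap": 0.2, "cat": 0.5}       # from the original keystrokes
correction_lik = {"car": 0.6, "cap": 0.3, "cat": 0.1}  # from the correction input
location_prior = {0: 0.1, 2: 0.8}                      # approximate error location
best = best_correction(candidates, input_lik, correction_lik, location_prior)
```

The point of the sketch is only that multiplying (adding in log space) the three sources lets a weak correction signal override the decoder's original top choice.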

Keith Vertanen and Scott Kuhl Awarded $500K NSF Grant

Scott Kuhl
Keith Vertanen

Keith Vertanen, assistant professor of computer science, and Scott Kuhl, associate professor of computer science, both members of the Center for Human-Centered Computing (HCC), are principal investigators of a recently funded three-year National Science Foundation grant for their project, “CHS: Small: Rich Surface Interaction for Augmented Environments.” The expected funding over three years is $499,552.

Vertanen and Kuhl are members of Michigan Tech’s Institute of Computing and Cybersystems (ICC) Center for Human-Centered Computing. A 2018 ICC research seed grant funded by ECE Alumnus Paul Williams was used to produce some of the preliminary results in the successful proposal. More info about the Williams Seed Grant can be found here: https://blogs.mtu.edu/icc/2019/07/16/appropriating-everyday-surfaces-for-tap-interaction/.

A related video can be found here: https://youtu.be/sF7aeXMfsIQ.

Abstract: Virtual Reality (VR) and Augmented Reality (AR) head-mounted displays are increasingly being used in different computing related activities such as data visualization, education, and training. Currently, VR and AR devices lack efficient and ergonomic ways to perform common desktop interactions such as pointing-and-clicking and entering text. The goal of this project is to transform flat, everyday surfaces into a rich interactive surface. For example, a desk or a wall could be transformed into a virtual keyboard. Flat surfaces afford not only haptic feedback, but also provide ergonomic advantages by providing a place to rest your arms. This project will develop a system where microphones are placed on surfaces to enable the sensing of when and where a tap has occurred. Further, the system aims to differentiate different types of touch interactions such as tapping with a fingernail, tapping with a finger pad, or making short swipe gestures.

This project will investigate different machine learning algorithms for producing a continuous coordinate for taps on a surface, along with associated error bars. Using the confidence of sensed taps, the project will investigate ways to intelligently inform aspects of the user interface, e.g., guiding the autocorrection algorithm of a virtual keyboard decoder. Initially, the project will investigate sensing via an array of surface-mounted microphones and design “surface algorithms” to determine and compare the location accuracy of finger taps on the virtual keyboard. These algorithms will experiment with different models, including an existing time-of-flight model, a new model based on Gaussian Process Regression, and a baseline of classification using support vector machines. For all models, the project will investigate the impact of the amount of training data from other users and of varying the amount of adaptation data from the target user. The project will compare surface microphones with approaches utilizing cameras and wrist-based inertial sensors. The project will generate human-factors results on the accuracy, user preference, and ergonomics of interacting in midair versus on a rigid surface. By examining different sensors, input surfaces, and interface designs, the project will map the design space for future AR and VR interactive systems. The project will disseminate software and data allowing others to outfit tables or walls with microphones to enable rich interactive experiences.
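The time-of-flight idea mentioned above can be sketched in a few lines: with known microphone positions and an assumed in-surface sound speed, the arrival-time differences (TDOAs) of a tap constrain its location, which even a brute-force search can recover. The microphone layout, sound speed, and grid search below are illustrative assumptions, not the project's actual algorithm:

```python
import itertools
import math

# Hypothetical setup: four microphones at the corners of a 30 cm square
# region of a tabletop. The in-surface sound speed of 1000 m/s is an
# assumed figure for illustration, not a measured value.
MICS = [(0.0, 0.0), (0.3, 0.0), (0.0, 0.3), (0.3, 0.3)]
SPEED = 1000.0  # meters per second

def tdoas(tap):
    """Arrival-time differences of a tap, relative to the first microphone."""
    times = [math.dist(tap, mic) / SPEED for mic in MICS]
    return [t - times[0] for t in times]

def locate(observed):
    """Brute-force stand-in for a time-of-flight model: grid-search the
    tap position that minimizes the squared TDOA residuals."""
    xs = [i * 0.005 for i in range(61)]  # 5 mm grid over the 30 cm square
    best, best_err = None, float("inf")
    for x, y in itertools.product(xs, xs):
        predicted = tdoas((x, y))
        err = sum((p - o) ** 2 for p, o in zip(predicted, observed))
        if err < best_err:
            best, best_err = (x, y), err
    return best

# Simulate a tap at (0.12, 0.20) and recover its location from TDOAs alone.
estimate = locate(tdoas((0.12, 0.20)))
```

The abstract's Gaussian Process Regression and SVM baselines would replace the grid search here; the brute-force version is just the simplest way to show how TDOAs pin down a continuous tap coordinate.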

Vertanen Teaches Workshop in Mumbai, India

Keith Vertanen

Keith Vertanen (CS/HCC), associate professor of computer science, traveled to Mumbai, India, in July to co-facilitate a three-day workshop on best practices for writing conference papers. The workshop was presented by ACM SIGCHI and its Asian Development Committee, which works to increase SIGCHI’s engagement with researchers and practitioners from Asia. The aim of the workshop was to encourage researchers from Asia to submit papers to the ACM CHI 2021 Conference on Human Factors in Computing Systems.

Workshop Students and Instructors

Vertanen, who is co-chair of the Usability Subcommittee for CHI 2020, presented lectures on paper writing and experimental design to 20 PhD candidates from universities in India, Sri Lanka, and South Korea. Vertanen also presented a talk on his text entry research and served on an advisory panel that offered feedback to the PhD students on their research in a forum similar to a doctoral consortium. Also co-facilitating the workshop were faculty members from the University of Central Lancashire in the UK, KAIST in South Korea, and the Georgia Institute of Technology in Atlanta. Visit https://www.indiahci.org/sigchischool/paperCHI2021/ to learn more about the workshop.

Appropriating Everyday Surfaces for Tap Interaction

Zachary Garavet and Siva Kakula

Researchers

Scott Kuhl (Associate Professor, CS)

Keith Vertanen (Assistant Professor, CS)

Sponsor: ECE Alumnus Paul Williams ’61

Amount of Support: $44,000

Duration of Support: 1 year

What if an everyday surface, like a table, could be transformed into a rich, interactive surface that can remotely operate things like computers, entertainment systems, and home appliances?

That’s what Michigan Tech Institute of Computing and Cybersystems (ICC) researchers Keith Vertanen and Scott Kuhl set out to do with a $44K seed grant from Electrical and Computer Engineering alumnus Paul Williams ’61.

Vertanen, assistant professor of computer science, and Kuhl, associate professor of computer science, are members of the ICC’s Center for Human-Centered Computing, which integrates art, people, design, technology, and human experience in the research of multiple areas of human-centered computing. They were assisted in this research by PhD candidate Siva Krishna Kakula, Computer Science, and undergraduate Zachary Garavet, Computer Engineering.

The team’s research goals were threefold: to create machine learning models that can precisely locate a user’s taps on a surface using only an array of inexpensive surface microphones; demonstrate the feasibility and precision of the models by developing a virtual keyboard interface on an ordinary wooden table; and conduct user studies to validate the system’s usability and performance.

The researchers are working on a related technical conference paper to present to their peers. Their outcomes included a prototype virtual keyboard that supports typing at rates comparable to a touchscreen device; possibly the first-ever acoustic sensing algorithm that infers a continuous two-dimensional tap location; and novel statistical models that quickly adapt to individual users and varied input surfaces.

Further, their results, hardware, and data sets can be applied to future collaborative work, and were used in the researchers’ $500K National Science Foundation proposal, “Text Interaction in Virtual and Augmented Environments,” which is under review.

Future applications of the research include enriched interactions in Virtual Reality (VR) and Augmented Reality (AR), compared to existing vision-only based sensing; and on-body interaction, like using your palm as an input surface.

Vertanen and Kuhl plan to continue this research, working to improve the accuracy of tap location inference, build richer interactions like swiping or tapping with multiple fingers, develop wireless sensor pods that can be quickly and easily deployed on any flat surface, and explore the display of virtual visual content on surfaces via Augmented Reality smartglasses.

View a video about this research at https://youtu.be/sF7aeXMfsIQ.

Seed grant donor Paul Williams is also the benefactor of the Paul and Susan Williams Center for Computer Systems Research, located on the fifth floor of the Electrical Energy Resources Center. The 10,000-square-foot, high-performance computing center, the home of the ICC, was established to foster close collaboration among researchers across multiple disciplines at Michigan Tech.

The ICC, founded in 2015, promotes collaborative, cross-disciplinary research and learning experiences in the areas of cyber-physical systems, cybersecurity, data sciences, human-centered computing, and scalable architectures and systems. It provides faculty and students the opportunity to work across organizational boundaries to create an environment that mirrors contemporary technological innovation.

Five research centers comprise the ICC. The ICC’s 50 members, who represent 15 academic units at Michigan Tech, are collaborating to conduct impactful research, make valuable contributions in the field of computing, and solve problems of critical national importance.

Visit the ICC website at mtu.edu/icc. Contact the ICC at icc-contact@mtu.edu or 906-487-2518.

Download a summary of this research.

Williams Seed Grant Funds Virtual Keyboard Research

Siva Krishna Kakula and Zachary Garavet

By Karen Johnson, ICC Communications Director


View a video about this research at https://youtu.be/sF7aeXMfsIQ. Download a summary of the research from the ICC website at icc.mtu.edu/downloads.


Computer Science Workshop Held April 5-7

Explore CSR Group

Michigan Tech hosted the workshop “Exploring Computer Science Research” last Friday through Sunday (April 5-7). The workshop was one of 15 sponsored by Google in the U.S. and was organized by four CS faculty: Leo Ureel, Linda Ott, Jean Mayo, and Laura Brown; Jean Mayo and Laura Brown are members of the ICC. The workshop invited women and members of underrepresented groups to explore research and graduate school opportunities in computer science.

There were 26 attendees from six universities and colleges across Michigan and Wisconsin. Over the course of the weekend each student participated in a research experience, investigating a research question with a faculty mentor. Topics included:

Machine Vision – Robert Pastel, ICC Center for Human-Centered Computing

Data Science in Energy Systems – Laura Brown, ICC Center for Data Sciences

Cybersecurity and Privacy in Storage Systems – Bo Chen, ICC Center for Cybersecurity

Agent-based Simulations in Education – Leo Ureel

Human Computer Interactions: Natural Language Processing for Assistive Technologies – Keith Vertanen, ICC Center for Human-Centered Computing

After learning about and working on their research topics, the students presented their results to the group. In addition to their research experiences, attendees learned about job opportunities after graduate school, heard how to apply to graduate schools, and talked to current graduate students about the graduate school experience and their research.

Guest speakers included Niloofar Gheissari and Anja Gruenheid of Google; Pushpalatha Murthy, Dean of the Graduate School; and keynote speaker Robin Hunicke of the University of California Santa Cruz and Funomena.