Archives—April 2017

The Making of a Citizen Science App

Astronomy is a citizen’s science. Its foundation is ordinary people who help answer serious scientific questions by providing vital data—sightings of nebulae, supernovae, and gamma-ray events—to the astronomical community.

The availability of smartphones makes collecting and sharing scientific data easier, faster, and more accurate.

These days, former astronomy teacher Robert Pastel isn’t as interested in the stars, but he is serious about environmental science—and about using computer science and smartphones to capture more data from citizen scientists.

The availability of smartphones makes collecting and sharing scientific data easier, faster, and more accurate. Pastel works with Alex Mayer, professor of civil and environmental engineering at Michigan Tech, students in both computer science and humanities, and scientists around the world to build mobile apps that feed real-world projects.

It starts in the summer, with scientists. “We reach out to them, or they find us. They share an idea and how citizen science can be used,” Pastel explains. “Then the app building begins; it’s about a two-year process.”

When the academic year rolls around, Pastel challenges his Human-Computer Interactions class to build the initial app prototype. In the following year, during Pastel’s Senior Design course, the app undergoes a makeover—from mobile app to a web-based tool. “By this time the scientists have likely changed their minds or solidified their ideas, and more changes are made,” Pastel adds.


An interactive mushroom mapper is the group’s most successful accomplishment to date. Hikers, bikers, or climbers—anyone with a smartphone and an affinity for fungi—capture a photo of the fungus, specify the type, describe the location, and hit submit. All via the app. The mushroom observation data reaches Eric Lilleskoz, a research ecologist with the United States Department of Agriculture. Mushroom Mapper has more than 250 observations from around the country. The app is also used for natural science education in local middle schools.
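An observation like the ones Mushroom Mapper collects reduces to a small structured record—a photo, a species label, and a location—submitted to a project server. A minimal sketch in Python; the field names and `to_payload` helper are illustrative assumptions, not the app’s actual schema:

```python
import json
from dataclasses import dataclass, asdict

# Hypothetical shape of a citizen-science observation record;
# field names are assumptions for illustration only.
@dataclass
class Observation:
    species: str
    latitude: float
    longitude: float
    photo_url: str
    notes: str = ""

def to_payload(obs):
    """Serialize an observation as JSON for upload to a project server."""
    return json.dumps(asdict(obs))

obs = Observation("Morchella esculenta", 47.12, -88.55,
                  "https://example.org/photo.jpg", "under a birch")
```

Keeping the record this simple is part of why such apps work: a hiker fills in a handful of fields and hits submit, and the scientist receives uniformly structured data.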

In addition to creating apps for citizen science, this NSF-supported effort has spawned student-initiated software development and offline apps.


Student Success in Computer Science

Redeveloping Michigan Tech’s introductory computer science courses has not been an easy feat. But for Leo Ureel, it’s meaningful work. “It’s about setting the right environment,” he says.

Humans learn best when we communicate with others. We’ve taken what we know works in industry and applied it to the classroom.

In the old model, instructors lectured, then assigned independent tasks. Teaching assistants graded the projects and returned them to students two or three weeks later. In a new model Ureel helped create, students work in groups of two to four to mimic workforce settings. “We are no longer just feeding information. Humans learn best when we communicate with others. We’ve taken what we know works in industry and applied it to the classroom,” Ureel explains.

With support from a Jackson Blended Learning Grant, Ureel implemented a web-based teaching assistant to tighten the feedback loop for students. Students submit code via a web portal and receive instant feedback. “They continue submitting work until they get it right. It’s mastery learning,” Ureel adds.
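The mastery-learning loop Ureel describes—submit, get instant feedback, resubmit until correct—can be sketched as a tiny autograder. This is an illustrative assumption about how such a system might work, not Ureel’s actual implementation; the `grade` function and test format are invented for the example:

```python
def grade(submission_src, tests):
    """Execute a student's submitted source, run each (name, check)
    test against it, and return (passed_all, feedback_lines).
    Students resubmit until every check passes."""
    namespace = {}
    try:
        exec(submission_src, namespace)
    except Exception as e:
        return False, [f"Your code did not run: {e}"]
    feedback = []
    passed_all = True
    for name, check in tests:
        try:
            ok = check(namespace)
        except Exception:
            ok = False
        feedback.append(f"{'PASS' if ok else 'FAIL'}: {name}")
        passed_all = passed_all and ok
    return passed_all, feedback

# A first submission fails; the corrected one passes.
tests = [("square(3) == 9", lambda ns: ns["square"](3) == 9)]
ok1, fb1 = grade("def square(x): return x + x", tests)
ok2, fb2 = grade("def square(x): return x * x", tests)
```

The pedagogical point is the tightened loop: feedback arrives in seconds rather than the two or three weeks of the old grading model.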

Authentic Learning Experiences

When first-year Michigan Tech student Lauren Brindley received a Google Ignite Computer Science grant to fund 10 robots, Ureel knew it was an opportunity to provide a rich learning experience for students. “After graduation, it’s likely students will build robots in their careers; we’re providing real-world, hands-on learning from day one.” Ureel is developing an inquiry-based curriculum in which first-year computer science students explore how to program the rover robots to move about the room.


Ureel’s next challenge is to assess each first-year student to ensure they’re in the proper course. “Nonmajors often come in with little to no programming experience; meanwhile computer science majors are off and running, ready for a challenge,” Ureel says. To help several hundred students determine the best courses, Ureel is creating an online course sample so students get a taste of course content before making any decisions.

Preliminary data indicates Ureel’s efforts are working. “Engagement, retention, and grades are improving.”


Advancements in Eyes-free Text Entry

For Keith Vertanen, the satisfaction of helping people with visual impairments is a byproduct of the challenge he seeks.

Vertanen’s research will offer more texting options not only to the blind community, but to the situationally impaired, too.

“My interest stemmed from sighted text entry research. The decoder (a touchscreen keyboard recognizer) is so accurate—we craved a bigger undertaking,” Vertanen explains. So he dug into literature and consulted with users who are blind to determine the need for better eyes-free text-entry options.

Existing accessibility solutions are slow. “There is a delay because users have to search for the target, key, or graphic and wait for audio feedback,” Vertanen says. As users slide a finger around the touchscreen, the system announces via text-to-speech what their finger is over. When they find the element they want (it could be a key on a touchscreen keyboard), they double tap with their searching finger or they “split tap” by tapping with a second finger. The interaction technique was developed out of research at the University of Washington and is now a standard accessibility feature on iPhone and Android phones.

With Vertanen’s prototype, users with visual impairments imagine the size, position, and orientation of the Qwerty keyboard. They are asked to tap out letters, and eventually sentences. So far, users accurately tap their intended text on the imaginary display about 50 percent of the time.


There’s more work to be done. From this noisy data, Vertanen asks two questions: Can we develop new and improved algorithms to more accurately recognize the user’s intended text? And can we find ways users can provide the recognizer with a better signal while still allowing fast entry?
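One standard way to recognize intended text from noisy taps is statistical decoding: model each tap as a noisy observation of a key position and combine the per-tap likelihoods with a language prior. The sketch below is a generic illustration of that idea, not Vertanen’s decoder; the keyboard geometry, noise model, and vocabulary are all simplifying assumptions:

```python
import math

# Illustrative QWERTY key centers (x, y) in key-width units;
# a real decoder would model the user's imagined keyboard geometry.
KEY_POS = {}
for row_y, offset, row in [(0.0, 0.0, "qwertyuiop"),
                           (1.0, 0.25, "asdfghjkl"),
                           (2.0, 0.75, "zxcvbnm")]:
    for i, ch in enumerate(row):
        KEY_POS[ch] = (i + offset, row_y)

SIGMA = 0.8  # assumed standard deviation of tap noise, in key widths

def tap_log_likelihood(tap, ch):
    """Log-likelihood of a tap (x, y) given the intended key ch,
    under an isotropic Gaussian noise model."""
    kx, ky = KEY_POS[ch]
    d2 = (tap[0] - kx) ** 2 + (tap[1] - ky) ** 2
    return -d2 / (2 * SIGMA ** 2)

def decode(taps, vocabulary):
    """Return the word that best explains the tap sequence,
    scoring each candidate by language prior + tap likelihoods."""
    best_word, best_score = None, float("-inf")
    for word, log_prior in vocabulary.items():
        if len(word) != len(taps):
            continue
        score = log_prior + sum(
            tap_log_likelihood(t, c) for t, c in zip(taps, word))
        if score > best_score:
            best_word, best_score = word, score
    return best_word

# Noisy taps roughly near h-e-l-l-o on the imaginary keyboard.
vocab = {"hello": math.log(0.6), "jelly": math.log(0.4)}
taps = [(5.4, 1.1), (2.2, 0.1), (8.3, 1.0), (8.2, 0.9), (8.9, 0.2)]
```

This is why even a 50 percent per-tap accuracy can be workable: the decoder does not need every tap to land on the right key, only for the whole sequence plus the language model to single out one word.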

Vertanen’s research will offer more texting options not only to the blind community, but to the situationally impaired, too: “Those times when you cannot attend to your phone, like when you’re walking. Or perhaps we can treat your airline tray table as a touch-typing surface—but without a visual display.”

His research will also impact the devices of the future, which may be designed without a visual text display.

“These are hard problems to solve. The other challenge is how to make error correction efficient and pleasant. This is especially true if people are entering difficult text such as proper names or acronyms. A complementary question: how do you design text-entry interfaces that allow users to be more explicit (albeit slower) about parts of their text they anticipate will be difficult to recognize?” Vertanen asks.