Huskies are all too familiar with snowy winter roads—but artificial intelligence (AI) isn’t. That means snow, in all its forms, is especially challenging for autonomous vehicles and the many sensors they use to navigate. But where one sensor may be confounded, there’s usually another that can see more clearly. The trick is to teach the sensors to work as a team, coached by AI.
Together with their graduate students, Nathir Rawashdeh, assistant professor of computing, and Jeremy Bos, assistant professor of electrical and computer engineering, are using algorithms to teach the sensors to work together through an AI process called sensor fusion.
“Every sensor has limitations, and every sensor covers another one’s back,” says Rawashdeh.
The team gathered light detection and ranging (lidar), radar, and image data from snowy places around the globe (including the Keweenaw). After cleaning the data and verifying its labels, they are using it to teach vehicle AI what snow looks like—and how to fuse data from its sensors to drive snow-covered roads successfully.
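The article doesn't describe the team's actual algorithms, but the core idea—each sensor "covers another one's back" by counting for more when conditions favor it—can be illustrated with a minimal late-fusion sketch. Everything here (the class, the numbers, the reliability weights) is hypothetical, invented purely for illustration.

```python
from dataclasses import dataclass

@dataclass
class SensorReading:
    """One sensor's confidence that an obstacle is present (0.0-1.0),
    plus a reliability weight for the current conditions (e.g., snow)."""
    name: str
    confidence: float
    reliability: float

def fuse(readings):
    """Weighted late fusion: each sensor contributes to the combined
    estimate in proportion to how trustworthy it is right now."""
    total_weight = sum(r.reliability for r in readings)
    if total_weight == 0:
        raise ValueError("no reliable sensors available")
    return sum(r.confidence * r.reliability for r in readings) / total_weight

# Hypothetical heavy-snow scenario: the camera is nearly blinded (low
# reliability), so the fused estimate leans on lidar and especially radar.
readings = [
    SensorReading("camera", confidence=0.2, reliability=0.1),
    SensorReading("lidar",  confidence=0.6, reliability=0.5),
    SensorReading("radar",  confidence=0.9, reliability=0.9),
]
print(round(fuse(readings), 3))  # → 0.753
```

Here the weakened camera barely moves the result, so the vehicle still "sees" the obstacle—the team-of-sensors behavior the researchers describe, though their production systems are far more sophisticated than this weighted average.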
Both Rawashdeh and Bos are members of the Institute of Computing and Cybersystems’ (ICC) Center for Data Science (DataS).