by Allison Mills, University Marketing and Communications
A major challenge for fully autonomous vehicles is navigating bad weather. Snow especially confounds crucial sensor data that helps a vehicle gauge depth, find obstacles and keep on the correct side of the yellow line, assuming it is visible. Averaging more than 200 inches of snow every winter, Michigan’s Keweenaw Peninsula is the perfect place to push autonomous vehicle tech to its limits.
In two papers presented at SPIE Defense + Commercial Sensing 2021, researchers from Michigan Technological University discuss solutions for snowy driving scenarios that could help bring self-driving options to snowy cities like Chicago, Detroit, Minneapolis and Toronto.
The team includes Nathir Rawashdeh and doctoral student Abu-Alrub (CC) as well as Jeremy Bos and student researchers Akhil Kurup, Derek Chopp and Zach Jeffries (ECE).
Read more about their collaborative mobility research on mtu.edu/news.
Nathir Rawashdeh (AC) led the publication of a paper presented at the recent online SPIE Defense + Commercial Sensing / Autonomous Systems 2021 Conference.
The paper, entitled “Drivable path detection using CNN sensor fusion for autonomous driving in the snow,” targets the problem of drivable path detection in poor weather conditions including on snow-covered roads. The authors used artificial intelligence to perform camera, radar and LiDAR sensor fusion to detect a drivable path for a passenger car on snow-covered streets. A companion video is available.
Co-authors include Jeremy Bos (ECE).
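The announcement doesn’t detail the paper’s network architecture, but one common way to fuse camera, radar and LiDAR data for a convolutional network is “early fusion”: project each sensor’s measurements onto a shared 2D grid and stack the grids as input channels. The sketch below only illustrates that general idea, not the authors’ method; the grid size, channel choices and variable names are all assumptions.

```python
import numpy as np

H, W = 64, 64  # grid resolution (assumed for illustration)

# Each sensor's measurements projected onto a shared top-down grid
# (dummy zero-filled data stands in for real projections here).
camera_intensity = np.zeros((H, W))  # e.g., grayscale pixel intensity
radar_return     = np.zeros((H, W))  # e.g., radar return strength per cell
lidar_height     = np.zeros((H, W))  # e.g., max LiDAR point height per cell

# Early fusion: stack the modalities as channels of a single CNN input.
fused = np.stack([camera_intensity, radar_return, lidar_height], axis=0)
print(fused.shape)  # (3, 64, 64): channels-first tensor for a segmentation CNN
```

A drivable-path network would then label each grid cell as drivable or not from this multi-channel input.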
A conference paper by Assistant Professor Nathir Rawashdeh (Applied Computing), entitled “Interfacing Computing Platforms for Dynamic Control and Identification of an Industrial KUKA Robot Arm,” has been published in IEEE Xplore.
In this work, a KUKA robotic arm controller was interfaced with a PC using open-source Java tools to record the robot’s axis movements and implement a 2D printing/drawing feature.
The paper was presented at the 2020 21st International Conference on Research and Education in Mechatronics (REM). Details are available in the IEEE Xplore database.
A paper co-authored by Assistant Professor Nathir Rawashdeh (DataS, Applied Computing) on skin cancer image feature extraction has been published this month in the EurAsian Journal of BioSciences.
View the open access article, “Visual feature extraction from dermoscopic colour images for classification of melanocytic skin lesions,” here.
Additional authors are Walid Al-Zyoud, Athar Abu Helou, and Eslam AlQasem, all with the Department of Biomedical Engineering, German Jordanian University, Amman, Jordan.
Citation: Al-Zyoud, Walid, et al. “Visual feature extraction from dermoscopic colour images for classification of melanocytic skin lesions.” Eurasian Journal of Biosciences, vol. 14, no. 1, 2020, pp. 1299-1307.
Rawashdeh’s interests include unmanned ground vehicles, electromobility, robotics, image analysis, and color science. He is a senior member of the IEEE.
The College of Computing is pleased to announce that it has awarded five faculty seed grants, which will provide immediate funding in support of research projects addressing critical needs during the current global pandemic.
Tim Havens, College of Computing associate dean for research, said that the faculty seed grants will enable progress in new research that has the potential to make an impact during the current pandemic. Additional details will be shared soon.
Congratulations to the winning teams!
Guy Hembroff (AC, HI): “Development of a Novel Hospital Use Resource Prediction Model to Improve Local Community Pandemic Disaster Planning”
Bo Chen (CS): “Mobile Devices Can Help Mitigate Spreading of Coronavirus”
Nathir Rawashdeh (AC, MERET): “A Tele-Operated Mobile Robot for Sterilizing Indoor Space Using UV Light” (A special thanks to Paul Williams, whose generous gift to support AI and robotics research made this grant possible)
An SAE technical paper, co-authored by Nathir Rawashdeh, assistant professor, CMH Division, College of Computing, has been accepted for publication at the WCX SAE World Congress Experience, April 21-23, 2020, in Detroit, MI. The title of the paper is “Mobile Robot Localization Evaluations with Visual Odometry in Varying Environments using Festo-Robotino.”
Abstract: Autonomous ground vehicles can use a variety of techniques to navigate the environment and deduce their motion and location from sensory inputs. Visual odometry can provide a means for an autonomous vehicle to gain orientation and position information from camera images recorded as the vehicle moves. This is especially useful when global positioning system (GPS) information is unavailable or wheel encoder measurements are unreliable. Feature-based visual odometry algorithms extract corner points from image frames and track the movement of those feature points over time. From this information, it is possible to estimate the motion of the camera, and hence of the vehicle. Visual odometry has its own set of challenges, such as detecting an insufficient number of feature points, a poor camera setup, and fast-passing objects interrupting the scene. This paper investigates the effects of various disturbances on visual odometry. Moreover, it discusses the outcomes of several experiments performed using the Festo-Robotino robotic platform. The experiments are designed to evaluate how changing the system’s setup affects the overall quality and performance of an autonomous driving system. Environmental effects such as ambient light, shadows, and terrain are also investigated. Finally, possible improvements, including varying camera options and programming methods, are discussed.
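The core idea of feature-based visual odometry described in the abstract, estimating motion from how tracked feature points move between frames, can be sketched in a simplified 2D form. This is an illustrative least-squares rigid fit, not the paper’s implementation (real pipelines recover 3D motion from calibrated images, e.g. via essential-matrix decomposition); the function name and test points are our own.

```python
import numpy as np

def estimate_planar_motion(pts_prev, pts_curr):
    """Estimate the 2D rotation R and translation t that best map
    matched feature points from one frame to the next (Kabsch fit)."""
    # Center both point sets on their centroids.
    c_prev = pts_prev.mean(axis=0)
    c_curr = pts_curr.mean(axis=0)
    p = pts_prev - c_prev
    q = pts_curr - c_curr
    # Least-squares rotation from the SVD of the cross-covariance matrix.
    U, _, Vt = np.linalg.svd(p.T @ q)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ np.diag([1.0, d]) @ U.T
    t = c_curr - R @ c_prev
    return R, t

# Pure translation of four tracked corner points by (5, -2):
pts_prev = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [2.0, 3.0]])
R, t = estimate_planar_motion(pts_prev, pts_prev + np.array([5.0, -2.0]))
print(np.round(R, 3), np.round(t, 3))  # R ≈ identity, t ≈ [5, -2]
```

Chaining such frame-to-frame estimates over time yields a trajectory, which is exactly where the disturbances studied in the paper (too few feature points, lighting, fast-moving objects) degrade accuracy.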
A conference paper co-authored by Nathir Rawashdeh (CC/MERET), has been accepted for presentation and publication at the 5th International Conference on Advances in Mechanical Engineering, December 17-19, 2019, in Istanbul, Turkey.
The paper is entitled, “Effect of Camera’s Focal Plane Array Fill Factor on Digital Image Correlation Measurement Accuracy.” Co-authors are Ala L. Hijazi of German Jordanian University, and Christian J. Kähler of Universität der Bundeswehr München.
Nathir Rawashdeh, College of Computing Assistant Professor of Mechatronics, Electrical, and Robotics Engineering Technology, will present a talk this Friday, December 6, from 3:00 to 4:00 p.m., in Rekhi 214. Rawashdeh will present a review of recent advancements in Unmanned Ground Vehicle (UGV) applications, hardware, and software with a focus on vehicle localization and autonomous navigation. Refreshments will be served.
Abstract: Unmanned Ground Vehicles (UGVs) are being applied in many scenarios, including indoor, outdoor, and even extraterrestrial environments. Advancements in hardware and software algorithms reduce their cost and enable the creation of complete UGV platforms designed for custom application development, as well as research into new sensors and algorithms.