Safer autonomous vehicles


Sept. 21, 2018

MSU is perfecting sensing and perception algorithms to guide autonomous cars through challenging weather and traffic

If every road were straight and the weather always sunny and warm, self-driving, or autonomous, vehicles would have little trouble getting from point A to point B.

“Radar, when fused with other sensing modalities, could achieve not only a human level of perception, but a super-human level of perception.” – Hayder Radha

If only.

In reality, making autonomous vehicles part of our future will require technology that not only keeps self-driving vehicles on the road in perfect conditions, but also guides them safely through snow, fog and other challenging weather and traffic conditions.

That’s where Michigan State University comes in.

Working in a College of Engineering-based research center called Connected and Autonomous Networked Vehicles for Active Safety, or CANVAS, MSU researchers are perfecting the sensing and perception algorithms that will tell an MSU-created autonomous car where to go, no matter the weather or traffic conditions.

“We are studying the challenges associated with driving autonomously in severe weather conditions,” said Hayder Radha, CANVAS director and professor of electrical and computer engineering. “That includes heavy snow, fog and even extreme low temperatures, which can affect navigation.”

Radha said MSU is working to perfect radar sensing, “which could be one of the most dependable sensors that an autonomous vehicle could rely on.”

In addition to helping the car make its way through snow, rain or gloom of night, the radar detects objects and can distinguish what they are, a process known as “classification.”

“It will be able to tell if that object is a pedestrian or a bike or another vehicle,” Radha said. “Is it something that is going to move? Then it can help the car make the decision on how to respond.”

Radar could become almost as good as a camera in helping to achieve what he calls “situational awareness,” he continued.
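The classify-then-respond logic Radha describes can be illustrated with a toy example. The sketch below is purely hypothetical: the rule-based classifier, thresholds and response labels are illustrative assumptions, not MSU's actual algorithms, which fuse radar with other sensing modalities.

```python
# A minimal, hypothetical sketch of radar-based classification and response.
# The features, thresholds, and labels are illustrative assumptions only.

from dataclasses import dataclass

@dataclass
class RadarDetection:
    range_m: float        # distance to the object, meters
    speed_mps: float      # radial speed relative to the car, m/s
    cross_section: float  # radar cross-section, a rough size cue (m^2)

def classify(det: RadarDetection) -> str:
    """Toy rule-based classifier: pedestrian, bike, or vehicle."""
    if det.cross_section < 1.0 and abs(det.speed_mps) < 3.0:
        return "pedestrian"
    if det.cross_section < 3.0 and abs(det.speed_mps) < 10.0:
        return "bike"
    return "vehicle"

def plan_response(label: str, det: RadarDetection) -> str:
    """Toy decision rule: slow down for anything that may move into the car's path."""
    if label in ("pedestrian", "bike") and det.range_m < 30.0:
        return "slow_down"
    if label == "vehicle" and det.range_m < 10.0:
        return "brake"
    return "maintain_speed"

if __name__ == "__main__":
    det = RadarDetection(range_m=22.0, speed_mps=1.2, cross_section=0.6)
    label = classify(det)
    print(label, plan_response(label, det))   # pedestrian slow_down
```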

MSU also is partnering with a New York-based startup called CARMERA to develop three-dimensional maps that can assist greatly in the navigation process. And while it’s too early to proclaim mission accomplished, great strides are being made.

Last fall, the CANVAS team experimented with maps developed on the MSU campus. Later in the year, after the leaves had fallen and the landscape looked significantly different, the team tested those maps with MSU's autonomous driving algorithms, which passed with flying colors.

Another focus of MSU’s research is what’s called “connected sensor fusion,” which essentially entails vehicles communicating with one another and sharing sensor data. Imagine, for example, an autonomous car approaching an intersection where a pedestrian is waiting to cross. That car, autonomous car A, can’t detect the person because another car, autonomous car B, is blocking its view. But car B can detect the pedestrian.

“The vehicle that perceives the pedestrian is able to share that information with other nearby vehicles,” Radha said.
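A toy sketch of that scenario might look like the following. The message format and fusion step are illustrative assumptions, not MSU's actual protocol: car B detects a pedestrian that car A cannot see and broadcasts the detection, which car A folds into its own world model.

```python
# A minimal, hypothetical sketch of "connected sensor fusion" between two cars.
# The message format and fusion logic are illustrative assumptions only.

from dataclasses import dataclass, field

@dataclass
class SharedDetection:
    sender_id: str
    object_type: str
    x: float              # position in a shared map frame, meters
    y: float
    confidence: float

@dataclass
class VehiclePerception:
    vehicle_id: str
    local_detections: list = field(default_factory=list)
    received_detections: list = field(default_factory=list)

    def broadcast(self) -> list:
        """Package local detections as messages for nearby vehicles."""
        return [
            SharedDetection(self.vehicle_id, d["type"], d["x"], d["y"], d["conf"])
            for d in self.local_detections
        ]

    def receive(self, messages: list) -> None:
        """Fuse detections shared by other vehicles into the local world model."""
        self.received_detections.extend(
            m for m in messages if m.sender_id != self.vehicle_id
        )

    def world_model(self) -> list:
        """Everything this vehicle knows about: its own detections plus shared ones."""
        own = [(d["type"], d["x"], d["y"]) for d in self.local_detections]
        shared = [(m.object_type, m.x, m.y) for m in self.received_detections]
        return own + shared

if __name__ == "__main__":
    car_a = VehiclePerception("A")                      # cannot see the pedestrian
    car_b = VehiclePerception("B", local_detections=[
        {"type": "pedestrian", "x": 12.0, "y": 3.5, "conf": 0.9},
    ])
    car_a.receive(car_b.broadcast())
    print(car_a.world_model())   # [('pedestrian', 12.0, 3.5)]
```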

An autonomous vehicle created at MSU determines its precise location within its environment using live lidar measurements, advanced localization algorithms and high-resolution 3D maps.

Despite some drivers’ concerns about letting go of the wheel, the use of autonomous vehicles is expected to make roads safer. Improving sensing and perception technologies is intended to reduce human error, which, according to recent studies, causes 90 percent of car accidents.
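The localization described in the caption above, matching live lidar measurements against a high-resolution 3D map, can be sketched in miniature. The 2D "map," grid search and error metric below are illustrative assumptions; real systems use far richer maps and scan-matching algorithms.

```python
# A minimal, hypothetical sketch of map-based localization:
# score candidate positions by how well a live lidar scan lines up with map points.

import numpy as np

def scan_map_error(scan_xy: np.ndarray, map_xy: np.ndarray, pose_xy: np.ndarray) -> float:
    """Mean distance from each (shifted) scan point to its nearest map point."""
    shifted = scan_xy + pose_xy        # place the scan at the candidate pose
    dists = np.linalg.norm(shifted[:, None, :] - map_xy[None, :, :], axis=2)
    return float(dists.min(axis=1).mean())

def localize(scan_xy: np.ndarray, map_xy: np.ndarray, candidates: np.ndarray) -> np.ndarray:
    """Pick the candidate pose whose scan best matches the map."""
    errors = [scan_map_error(scan_xy, map_xy, c) for c in candidates]
    return candidates[int(np.argmin(errors))]

if __name__ == "__main__":
    map_xy = np.array([[5.0, 0.0], [5.0, 1.0], [5.0, 2.0]])   # a wall in the stored map
    true_pose = np.array([2.0, 0.0])
    scan_xy = map_xy - true_pose                               # the wall as seen from the car
    candidates = np.array([[x, 0.0] for x in np.arange(0.0, 4.01, 0.5)])
    print(localize(scan_xy, map_xy, candidates))               # -> [2. 0.]
```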

Self-driving cars also may benefit car owners by increasing productivity. Radha pointed out that if someone is stuck in traffic but riding in a self-driving car, he or she can do some work, relax a bit, even take a nap.

“It could significantly improve our quality of life,” he added.