
Snow, fog, and ice spotlight ADAS system weakness

Inclement weather continues to challenge ADAS system performance, potentially affecting driver trust in autonomous driving. By Byron Stanley

As the seasons change, weather and road conditions become ever more unpredictable. From thick fog to treacherous snow and ice, driving becomes a challenge as sightlines are reduced and roadways turn slick—making safety top of mind for everyone on the road.

In these extreme and often quickly deteriorating conditions, drivers would like to turn to their vehicle’s advanced driver assistance systems (ADAS), such as lane-keeping, automatic emergency braking, adaptive cruise control, and proximity monitors, to help them navigate more safely and confidently. When a driver’s vision is challenged by heavy snow, dense fog, or pouring rain, they should be able to activate ADAS capabilities to improve awareness and help them stay in their lane, avoid hazards, and move smoothly through turns.

In recent years, an increasing number of ADAS technologies have been added to vehicles with the goal of creating a safer driving experience; however, these technologies are often rendered unreliable or inoperable in difficult road conditions. In fact, even in clear weather, automotive researchers found that over the course of 4,000 miles travelled, vehicles’ driving assistance systems experienced some type of issue roughly every eight miles, and often more frequently. These safety-related systems tend to disengage with little notice, especially in the challenging road conditions where the driver is most likely to need assistance, putting drivers, their passengers, and others on the road at increased risk.

Limitations of today’s ADAS technology

ADAS functionality can typically be divided into perception, such as observing people or cars for automated braking, and localisation, such as determining vehicle position for lane keeping applications. Sensors—cameras, radar, GPS, and some simple forms of LiDAR—make both functions possible but are often the fundamental cause of ADAS system failures.
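To make that split concrete, the short sketch below shows the two kinds of output an ADAS stack produces and how each is consumed. It is purely illustrative: all of the class names, fields and thresholds are assumptions for the example, not any real product’s interface.

from dataclasses import dataclass
from typing import List

@dataclass
class DetectedObject:
    """Perception output: something to react to, e.g. for automated braking."""
    kind: str            # "pedestrian", "vehicle", ...
    range_m: float       # distance ahead of the ego vehicle
    confidence: float    # 0..1, drops as sensors are obscured

@dataclass
class LanePose:
    """Localisation output: where the vehicle sits within its lane."""
    lateral_offset_m: float   # signed distance from the lane centre
    confidence: float

def emergency_brake_needed(objects: List[DetectedObject]) -> bool:
    # Perception consumer: brake on a confident, close-range detection.
    return any(o.range_m < 15.0 and o.confidence > 0.7 for o in objects)

def steering_correction(pose: LanePose, gain: float = 0.5) -> float:
    # Localisation consumer: steer back toward the lane centre.
    return -gain * pose.lateral_offset_m

Both consumers depend entirely on the quality of the sensor data feeding them, which is why obscured or degraded sensors matter so much.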

Although the automotive industry has come a long way in the development of driver assistance and automated features, the fact remains that most only work in near-ideal conditions

Cameras, one of the most commonly used ADAS sensors, work by passively observing light reflected or emitted by the surrounding environment. While cameras provide a high-resolution view of that environment, anything that blocks light from reaching the lens will degrade their performance or render them non-functional. These obscurations can be caused by sun glare, bright lights, a truck or other object blocking the view, snow or ice on the lens, rain or snow in the air, drops of water on the windshield, or even fog.

LiDAR, which operates by sending out light pulses and determining distance and intensity from the reflections, works at night, detects objects, and can provide map-based positioning. In addition to being exceptionally expensive, however, LiDAR is susceptible to many of the same obscurations as cameras.
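The ranging principle itself is simple, as the sketch below illustrates. It is a simplified, hypothetical calculation rather than any particular sensor’s firmware, and the function name is invented for the example.

C = 299_792_458.0  # speed of light in m/s

def lidar_range_m(round_trip_s: float) -> float:
    # A pulse travels out to the target and back, so halve the round trip.
    return C * round_trip_s / 2.0

# A reflection arriving 200 nanoseconds after the pulse left implies ~30 m.
print(lidar_range_m(200e-9))  # prints 29.9792458

Snow, fog, and road spray scatter or absorb that return pulse, which is why the same conditions that blind a camera also shorten LiDAR’s effective range.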

Although automotive radar provides a lower resolution image of the environment than a camera does, it offers several advantages. Radar can transmit and receive radio waves through falling snow and rain with less loss; however, current radar technologies are still very limited in penetration depth and are sensitive to ice or snow build-up because of the frequencies at which they operate.

Ground penetrating radar can map the road structure beneath a vehicle

GPS works by receiving highly accurate timing signals from a constellation of satellites and then calculating the vehicle’s current position on Earth. The advantage of GPS is that it can find a vehicle’s position almost anywhere; however, it is typically too inaccurate and unreliable for functions like lane keeping. GPS positioning also fails when signals aren’t available (next to tall buildings, in valleys, under overpasses, inside parking garages, under tree cover, or in tunnels) or when signals bounce off nearby objects such as surrounding buildings, overhead road signs, and other vehicles. Additionally, GPS signals are very weak and easily degraded by interference.
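Under the hood, each satellite’s timing signal yields a pseudorange: the measured distance inflated by the receiver’s clock error. From four or more satellites, the receiver solves for its position and that clock error together. The sketch below is a minimal, illustrative solver only; the function name, the initial-guess handling and the synthetic satellite geometry are assumptions, and a real receiver also corrects for satellite clock error, atmospheric delay and other effects.

import numpy as np

C = 299_792_458.0  # speed of light in m/s

def solve_position(sat_pos, pseudoranges, x0, iters=10):
    """Estimate receiver position (m) and clock bias (m) from pseudoranges.

    sat_pos: (N, 3) satellite positions in metres, N >= 4
    pseudoranges: (N,) measured ranges, each inflated by the same clock bias
    x0: coarse initial position guess, e.g. the last known position
    """
    x = np.array(x0, dtype=float)
    b = 0.0                      # receiver clock bias, expressed in metres
    for _ in range(iters):
        diffs = x - sat_pos
        dists = np.linalg.norm(diffs, axis=1)        # geometric ranges
        residuals = pseudoranges - (dists + b)
        # Jacobian: unit vectors from satellites toward the receiver,
        # plus a column of ones for the shared clock-bias term.
        J = np.hstack([diffs / dists[:, None], np.ones((len(dists), 1))])
        delta, *_ = np.linalg.lstsq(J, residuals, rcond=None)
        x, b = x + delta[:3], b + delta[3]
    return x, b

# Synthetic demo: four satellites, a receiver near the surface, and a
# 30-microsecond receiver clock error.
sats = np.array([[15600e3,  7540e3, 20140e3],
                 [18760e3,  2750e3, 18610e3],
                 [17610e3, 14630e3, 13480e3],
                 [19170e3,   610e3, 18390e3]])
truth = np.array([-41_770.0, -16_790.0, 6_370_060.0])
clock_bias_m = 3e-5 * C
rho = np.linalg.norm(truth - sats, axis=1) + clock_bias_m
est, est_bias = solve_position(sats, rho, x0=[0.0, 0.0, 6.4e6])
print(np.round(est), round(est_bias))   # recovers the position and the bias

Even a correct solution of this kind is only as good as the signals feeding it, which is why blocked or reflected signals degrade GPS so badly.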

A typical approach to mitigating these failures is to combine sensors in the hope that at least one type will keep working, but this does not help when the same conditions degrade all of the critical sensors at once. Situations where sensors are blocked or degraded therefore remain a challenge for both reliable localisation and perception. As a result, one of the biggest limitations of today’s ADAS capabilities is susceptibility to weather.

Poor driving conditions are a high safety impact use case for ADAS technologies, yet they often lead to ADAS failures. Although the automotive industry has come a long way in the development of driver assistance and automated features, the fact remains that most only work in near-ideal conditions. A large gap exists between what is available today and what the industry needs to provide a safer driving experience when drivers need it the most.

Addressing today’s limitations

In 2020, 96% of vehicle models were equipped with at least one ADAS feature, such as automatic emergency braking, blind spot warning or lane keeping assistance. Drivers have grown accustomed to using these technologies, but they grow increasingly disappointed when the technologies fail to operate reliably, or at all. A survey conducted by AAA last year found that 80% of drivers want their current vehicle safety systems to work better. Some in the industry have warned that automakers must prioritise improving today’s ADAS safety systems rather than assume that developing autonomous vehicles (AVs) will solve these issues; AVs struggle with many of the same fundamental problems as ADAS, even with more expensive equipment to work with. By failing to address the technical issues and failing to deliver the value customers need, automakers risk lives and further customer disillusionment.

To unlock the next level of autonomy the industry first has to solve for ADAS system success in the snow, fog and ice

More and more drivers of ADAS-enabled vehicles are experiencing advanced vehicle technology and partially automated systems for the first time. These failures form their first impression and will set the baseline for how they judge the reliability and safety of other driver assistance technology, including the future of autonomy.

To extend the reliability and availability of ADAS, automakers may consider fusing additional sensors into the vehicle’s sensor stack. Complementary technologies can improve significantly on legacy sensors, resulting in greater reliability, availability, and precision.
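One way to picture the benefit is a minimal sketch of inverse-variance fusion for a single quantity such as lane offset. This illustrates the general idea only; it is not any production fusion algorithm, and the function name, sensor mix and numbers are assumptions.

from typing import Optional, Sequence, Tuple

def fuse_lateral_offset(
    estimates: Sequence[Tuple[Optional[float], float]]
) -> Optional[float]:
    """Fuse lane-offset estimates (value_m, std_dev_m) from several sensors.

    A blocked or degraded sensor reports None (or a very large std_dev) and
    simply stops influencing the result; any remaining sensor keeps the
    function available.
    """
    weights, weighted_sum = 0.0, 0.0
    for value, sigma in estimates:
        if value is None or sigma <= 0:
            continue                      # sensor unavailable: skip it
        w = 1.0 / (sigma * sigma)         # trust precise sensors more
        weights += w
        weighted_sum += w * value
    return weighted_sum / weights if weights > 0 else None

# Clear weather: camera, GPS and a map-based sensor all contribute.
print(fuse_lateral_offset([(0.32, 0.05), (0.50, 0.60), (0.30, 0.10)]))
# Heavy snow: the camera drops out, yet the fused estimate stays available.
print(fuse_lateral_offset([(None, 0.0), (0.50, 0.60), (0.30, 0.10)]))

The article’s caveat still applies: if every contributing sensor is degraded by the same snow or fog, fusion alone cannot rescue the estimate, which is why a sensor with an independent failure mode is so valuable.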

The industry is seeing an influx of innovative solutions that can drastically improve current ADAS capabilities. For example, technologies like ground penetrating radar and other subterranean mapping techniques will allow a vehicle to reliably determine its position even in the middle of winter. Such technology will increase the reliability, safety and precision of ADAS capabilities such as lane-keeping, no matter what driving or weather conditions lie ahead.
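The underlying idea is map-based positioning: the subsurface beneath a road has a stable fingerprint that falling snow does not change, so a live scan can be matched against a previously recorded map to recover the vehicle’s position. The sketch below is a deliberately simplified, hypothetical illustration of such matching using normalised correlation on a one-dimensional profile; it is not GPR’s actual algorithm, and the function names and data are invented for the example.

import numpy as np

def locate_in_map(prior_map: np.ndarray, live_scan: np.ndarray) -> int:
    """Find where a live subsurface scan best matches a previously mapped road.

    prior_map: 1-D profile of subsurface reflections recorded along the road
    live_scan: a shorter profile measured right now beneath the vehicle
    Returns the index along the mapped road where the match is strongest.
    """
    n = len(live_scan)
    scan = live_scan - live_scan.mean()
    best_idx, best_score = 0, -np.inf
    for i in range(len(prior_map) - n + 1):
        window = prior_map[i:i + n]
        # Normalised correlation is insensitive to overall signal strength,
        # so changes in amplitude from surface snow or moisture matter less.
        score = np.dot(window - window.mean(), scan)
        score /= (np.std(window) * np.std(scan) * n + 1e-12)
        if score > best_score:
            best_idx, best_score = i, score
    return best_idx

# Synthetic demo: a mapped subsurface profile and a noisy scan of part of it.
rng = np.random.default_rng(1)
road = rng.normal(size=1000)                      # the stored road fingerprint
scan = road[412:476] + 0.2 * rng.normal(size=64)  # vehicle is actually at 412
print(locate_in_map(road, scan))                  # expected to report ~412

Because the matching signal comes from beneath the road surface, it remains available when cameras and other surface-facing sensors are obscured, which is the property highlighted here.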

Technologies like ground penetrating radar will allow a vehicle to reliably determine its position even in the midst of winter

Fuelling tomorrow’s AV adoption

The autonomous driving industry faces many hurdles when it comes to both delivering a safe driving experience and improving drivers’ perceptions of autonomy. Fortunately, there are new sensors and approaches the industry as a whole can rely on to make AVs safe, reliable and efficient, both in practice and in consumers’ eyes, especially when they are needed most.

With the arrival of winter in many markets, it is clear that to unlock the next level of autonomy the industry first has to solve for ADAS success in snow, fog and ice. The future of driving may be autonomous, but the safety and reliability of autonomous driving features, as well as drivers’ confidence in them, hinge on improving current ADAS technologies to operate seamlessly in all road and weather conditions. Rain, fog, snow and ice will persist, and they will remain a barrier to AVs until the industry leans into processes and technologies that can advance these capabilities in vehicles today. Complete adoption, which will save many lives, is well within reach. The industry just needs to get there as efficiently and safely as possible by improving the safety, reliability, and availability of ADAS, year round.


About the author: Byron Stanley is Cofounder and Chief Technology Officer of GPR 
