
Multi-modal HMI essential to build trust in AVs

The human-machine interface is evolving to incorporate artificial intelligence, allowing for the safe operation of self-driving technology. By Michael Nash

The rollout of highly automated vehicles is by no means just around the corner, but several OEMs are currently testing self-driving technology on public roads and some plan to bring their efforts to market before 2025. Aside from developing the technology and ironing out any glitches to ensure it operates safely and efficiently, OEMs and Tier 1 suppliers face a great number of issues before the widespread rollout of highly autonomous vehicles is realised.

One of the primary challenges is establishing trust between car and driver, something which could be achieved with a multi-modal human-machine interface (HMI) incorporating artificial intelligence (AI). This is a relatively new area, but one that could be critical to market acceptance of self-driving technology.

Next-gen evolution

Most HMI systems on the market today are relatively simplistic in comparison to advanced, multi-modal concepts that use AI, but several companies are eager to push the boundaries. Volvo Cars, for example, has a team of experts dedicated to HMI research and design. The team is presently working on technology that bridges the gap between non-autonomous and autonomous driving.

“The system needs to be intuitive but also needs to make sure that it only adds to the safety of vehicle occupants and other road users” – Malin Ekholm, Volvo Cars Safety Centre

The OEM has plans for a specific manoeuvre for the human driver to transfer control responsibility to the car. Speaking to Automotive World in his previous capacity as Senior Technical Leader – Safety and Driver Support Technologies at Volvo Cars, Erik Coelingh said, “It’s about pushing two pedals. It has to be very clear when it’s on or off. We do not want it to be able to activate or deactivate by mistake.” Coelingh, who is now at the Autoliv-Volvo driver assistance systems joint venture Zenuity, added at the time, “Once the car is driving autonomously, we also have to provide trust to the driver. If you want to sit relaxed, read a book, do something else, then you need to trust the vehicle you are in.”
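Coelingh's requirement that the mode switch "has to be very clear when it's on or off" and must never "activate or deactivate by mistake" amounts to a deliberate two-input interlock. A minimal sketch of that idea, assuming a toggle that fires only when both inputs are engaged together (the class and field names are illustrative, not Volvo's actual implementation):

```python
from dataclasses import dataclass

@dataclass
class HandoverInterlock:
    """Toy model of a deliberate hand-over command: autonomy toggles only
    when BOTH inputs are engaged at the same moment, so a single accidental
    press can never change the driving mode."""
    autonomous: bool = False
    _both_held: bool = False  # tracks the previous sample for edge detection

    def update(self, left_engaged: bool, right_engaged: bool) -> bool:
        both = left_engaged and right_engaged
        # Toggle only on the rising edge of "both engaged", so holding the
        # inputs does not flip the mode back and forth.
        if both and not self._both_held:
            self.autonomous = not self.autonomous
        self._both_held = both
        return self.autonomous
```

The edge detection matters: without it, holding both inputs for more than one control cycle would repeatedly toggle the mode, defeating the "clearly on or clearly off" goal.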

Crucially, the HMI must be developed with safety at its core, said Malin Ekholm, Senior Director at Volvo Cars Safety Centre. “The system needs to be intuitive but also needs to make sure that it only adds to the safety of vehicle occupants and other road users,” she said. In other words, it cannot be detrimental to that safety.

As such, techniques like voice command and gesture control could play a greater role in minimising driver distraction. This, added Ekholm, will remain important regardless of how quickly autonomous driving technology rolls out, as vehicle occupants are likely to need to remain vigilant and aware behind the wheel for a long time to come.

A recent paper published by the US National Highway Traffic Safety Administration (NHTSA) states that distracted driving leads to over 420,000 injuries and 3,100 fatalities every year in the US alone, and that nearly one-third of all US drivers between 18 and 64 years old read or send text or email messages while driving.

Continental is one of many suppliers currently developing HMI technology with two goals in mind – reducing driver distraction and enabling autonomous driving. It recently announced that it was researching and testing a range of display and control concepts designed to optimise the interaction between car and driver.

“The use of multi-modal HMI allows drivers to access controls and functions much faster than with conventional control concepts involving buttons and switches” – Thomas Vöhringer-Kuhnt, Continental

“On the road towards fully automated driving, our biggest challenge will be the new role of drivers and the resulting new needs and requirements,” explained Karsten Michels, Head of System and Advanced Development in Continental’s Interior division. “Up to now, drivers have been solely occupied with the task of driving; in the future, however, they will become critical users and monitors in the cockpit. To meet this challenge, they have to know at all times how the vehicle is behaving and the vehicle’s current driving mode. Transparency and an awareness of the current situation are our watchwords when it comes to developing new concepts for a holistic human-machine dialogue. Only in this way can drivers place their trust in fully automated driving systems.”

A suite of multi-modal functions will be necessary to provide relevant information to vehicle occupants, making sure they know how the car is behaving and why. Thomas Vöhringer-Kuhnt, Head of HMI User Experience & Design at Continental, told Megatrends that the use of multi-modal HMI also “allows drivers to access controls and functions much faster than with conventional control concepts involving buttons and switches.”

Raising questions

This view was echoed by Arnd Weil, Senior Vice President and General Manager at Nuance Automotive. “If the driver wants to listen to music, for example, it’s tedious to go through various apps and select it with buttons on the steering wheel,” he explained to Megatrends. “It’s so much easier if we use natural language voice recognition technology. The same idea goes for inputting a destination into the navigation system, accessing the phone book and writing emails.”
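The convenience Weil describes is essentially a mapping from a natural-language request straight to a head-unit action, bypassing menu-and-button navigation. Production systems such as Nuance's use full speech recognition and natural-language understanding; the keyword matcher below is only a sketch of the idea, with invented intent names:

```python
import re

# Hypothetical intent patterns for a handful of in-car requests. A real NLU
# stack would handle far more phrasings; these regexes are illustrative only.
INTENT_PATTERNS = {
    "play_music": re.compile(r"\b(play|listen to)\b\s+(?P<what>.+)", re.I),
    "navigate":   re.compile(r"\b(navigate to|take me to|drive to)\b\s+(?P<what>.+)", re.I),
    "call":       re.compile(r"\b(call|phone)\b\s+(?P<what>.+)", re.I),
}

def parse_command(utterance: str):
    """Return (intent, argument) for the first matching pattern, else (None, None)."""
    for intent, pattern in INTENT_PATTERNS.items():
        match = pattern.search(utterance)
        if match:
            return intent, match.group("what").strip()
    return None, None
```

For example, `parse_command("Please play some jazz")` resolves in one step to a music request, where a button-driven interface would need several screens and presses to reach the same function.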

Weil is confident that multi-modal HMI will play an increasingly important role as consumers demand greater levels of connectivity, convenience and a more personalised driving experience. However, the greatest benefits will come from combining these different modes with AI.

“Transparency and awareness are our watchwords when developing new concepts for a holistic human-machine dialogue. Only in this way can drivers place their trust in fully automated driving systems” – Karsten Michels, Continental

“AI is a big field and probably needs a narrower definition,” he said. “If we’re talking about deep learning, which many people consider to be AI, it is dependent on the data obtained to actually do that learning. This raises questions of data privacy, data ownership, and who has access to what data and can leverage it.”

With the use of AI, the HMI can become a personal on-board assistant that learns about driver and vehicle occupant preferences. It can access data on previous driving scenarios stored in the cloud and identify patterns. “When connected cars become autonomous, these assistants must leverage AI to keep their passengers connected, informed and engaged in the event they need to take the wheel,” Weil noted. “But what happens if somebody hacks that history and knows the car? This person could basically control it remotely, which is a huge issue and a bottleneck for rolling out the technology.”
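The pattern-finding Weil describes can be illustrated very simply: an assistant mining past trips (which in his scenario would live in the cloud) to predict where the occupant is likely heading at a given time of day. The trip records and field names below are invented for illustration; a real system would use far richer data and models:

```python
from collections import Counter

def suggest_destination(trip_history, hour):
    """Suggest the most frequent past destination within +/- 1 hour of 'hour'.

    trip_history: list of dicts like {"destination": str, "hour": int},
    a made-up record format standing in for cloud-stored trip data.
    Returns None when no past trips fall in that time window.
    """
    nearby = [trip["destination"] for trip in trip_history
              if abs(trip["hour"] - hour) <= 1]
    if not nearby:
        return None
    # Most common destination in the window is the assistant's suggestion.
    return Counter(nearby).most_common(1)[0][0]
```

Even this toy version shows why Weil's hacking concern matters: the same stored history that lets the assistant anticipate a morning commute would tell an attacker exactly where and when the occupant travels.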

Bold expectations

Investment in AI across the automotive industry has been growing. In January 2017, Nissan’s Chief Executive Carlos Ghosn confirmed the launch of Seamless Autonomous Mobility (SAM) – a system that marries cameras, radar and LiDAR with AI, allowing cars to become smart enough to know when they should or should not attempt to negotiate difficult driving tasks. If deemed necessary, the car can request help via a command centre, and a person can assess the situation before taking action over the wireless network.

A month later, Ford announced that it would spend US$1bn over the course of five years on developing a virtual driver system for autonomous vehicles with Pittsburgh-based start-up Argo AI. At the time, the then Ford Chief Executive Mark Fields said that this would help to strengthen the company’s “leadership in bringing self-driving vehicles to market in the near term.”

At CES in 2017, Mercedes-Benz declared that it was working with Nvidia to launch an AI-powered car “within 12 months”. This is just part of an on-going collaboration between the two companies that is focused on deep learning and AI.


“The hottest trend in developing advanced driver assistance systems (ADAS) seems to be deep learning technology, which is used for image recognition and understanding the car’s surroundings,” Alex Mankowsky, a Futurist at Daimler’s Futures Studies & Ideation unit, recently told Automotive World. “Advancements in this field have exceeded our boldest expectations from a few years back.”

Weil thinks that on-going investments in AI from both OEMs and suppliers will be extremely important for the safe rollout of self-driving technology. “If we look five or ten years out, the HMI must be trustworthy, intelligent and users must be able to interact with it just as they would with another human,” he suggested. “Think about movies where intelligent car systems talk to drivers and can provide a long list of helpful services, from taking over the driving task to paying for fuel. This is the direction the industry is heading in.”

In the autonomous car of the future, occupants could give the vehicle a variety of commands through a range of media, from voice and gesture to haptic control. A concept car recently showcased by ZF at its 2017 Global Press Event was able to navigate around a track without input from the driver, but the passenger could take control by using a round haptic controller positioned on the centre console.

While only a prototype, it provided a glimpse into the future and the evolving interaction between vehicle occupants and their cars. With a variety of new, smart HMIs on the cusp of entering the market, the possibilities seem limited only by the rate at which consumers are ready to accept innovative technologies.

This article appeared in the Q3 2017 issue of Automotive Megatrends Magazine.
