What role will AI play in the cockpit of the future?

The cockpit of the future will likely be dominated by voice control, and Affectiva believes this will require human perception artificial intelligence. By Freddie Holmes

The cockpit of today is becoming increasingly digital, but drivers remain in control of the vehicle and its creature comforts. In a driverless future, artificial intelligence (AI) will be required to shape the user experience (UX) and ensure the ride is as enjoyable as ever.

To this end, Boston-headquartered Affectiva is developing what it calls Emotion AI, which measures facial expressions using computer vision, as well as vocal analysis through speech science, to recognise the emotional and cognitive state of both human drivers and passengers.

An early use case will be to reduce driver distraction, highlighting when a driver is looking at his or her phone behind the wheel, or misusing a semi-autonomous driving feature. Numerous incidents have already occurred where drivers have abused highway pilot systems, and the start-up believes that Emotion AI could help encourage drivers to pay attention. The technology is also being honed for fully driverless vehicles of the future.

The system works by identifying, isolating and tracking a human face. Vision-based algorithms then analyse and categorise facial expressions to judge human emotions and states. Future iterations will also be able to detect objects such as smartphones. Affectiva does not develop its own cameras or microphones, leaving developers free to integrate the technology into their vehicles with whichever sensors they see fit.
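
As a rough illustration of that pipeline, the sketch below strings together OpenCV’s stock Haar-cascade face detector with a placeholder expression classifier. It is not Affectiva’s implementation, and classify_expression is a hypothetical stand-in for a trained model:

    import cv2

    # OpenCV's bundled Haar cascade handles the "identify and isolate" step
    face_detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

    def classify_expression(face_img):
        # Hypothetical stand-in: a real system would run a trained
        # expression model over the cropped face here
        return "neutral"

    cap = cv2.VideoCapture(0)  # in-cabin camera feed
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # Detect faces frame by frame
        faces = face_detector.detectMultiScale(gray, scaleFactor=1.1,
                                               minNeighbors=5)
        for (x, y, w, h) in faces:
            # Categorise the expression on each cropped face
            state = classify_expression(gray[y:y + h, x:x + w])
            print(f"face at ({x},{y}): {state}")
    cap.release()

Object detection for items such as smartphones would slot in as an additional model running over the same frames.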

A little more conversation

While early use cases are certainly promising, Affectiva also has a keen eye on the future. The company aims to improve the in-vehicle UX in a world where drivers become passengers, and thus voice control becomes the norm.

“We believe that the human-machine interface (HMI) in the cabin will become significantly smarter within the next few years, but it also has to become more conversational, sophisticated and intuitive,” said Abdo Mahmoud, Senior Product Manager at Affectiva. “You can draw parallels with the changes that have taken place with smartphones; they used to be simple devices to make calls, but are now very advanced machines.”

A future of voice control would be a far cry from today. Despite the best intentions of many automakers, voice recognition is rarely favoured over physical switches and buttons. That said, the technology has come on in leaps and bounds of late, to the point where a system can interpret casual language and make relevant suggestions—“I’m hungry” can be understood as “find me a restaurant nearby that is open now and suits my preferences.”
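
The “I’m hungry” mapping can be made concrete with a toy sketch. The Intent structure and phrase table below are illustrative assumptions, not any vendor’s system; a production assistant would use a trained natural-language model rather than exact string matches:

    from dataclasses import dataclass, field

    @dataclass
    class Intent:
        action: str
        filters: dict = field(default_factory=dict)

    # Hypothetical phrase table mapping casual speech to structured requests
    CASUAL_PHRASES = {
        "i'm hungry": Intent("find_restaurant",
                             {"open_now": True, "nearby": True,
                              "match_user_preferences": True}),
    }

    def interpret(utterance):
        return CASUAL_PHRASES.get(utterance.strip().lower())

    print(interpret("I'm hungry"))
    # Intent(action='find_restaurant', filters={'open_now': True, ...})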

But if drivers no longer have control of the vehicle themselves—and are even sharing a robotaxi with strangers—what role can advanced Human Perception AI play? “The cabin experience within the vehicle will become more of a critical differentiator across different brands,” Mahmoud suggested. “In robotaxis, the cabin can provide information that is critical to their operations.”

For example, it is useful for the vehicle to know where people are sitting, whether their seat belts are fastened and whether they are using a smartphone. The technology could also be used to alert a passenger that he or she has left a wallet behind. All this information can help to ensure that passengers feel comfortable: the vehicle will give the impression that it understands exactly what is going on, and may even come across as helpful and friendly. That is a big plus for AV service operators, which will initially have to win over sceptics or scared first-time riders.
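
A minimal sketch of the cabin state such a system might maintain, including the left-behind-item alert described above (all names and fields are assumptions for illustration):

    from dataclasses import dataclass, field

    @dataclass
    class Occupant:
        seat: str              # e.g. "rear_left"
        belt_fastened: bool
        using_smartphone: bool

    @dataclass
    class CabinState:
        occupants: list = field(default_factory=list)
        items_seen: set = field(default_factory=set)  # objects spotted in the cabin

        def left_behind(self, items_after_exit):
            # Anything seen during the ride and still visible after the
            # passengers leave was probably forgotten
            return self.items_seen & items_after_exit

    state = CabinState(
        occupants=[Occupant("rear_left", belt_fastened=True,
                            using_smartphone=False)],
        items_seen={"wallet", "umbrella"},
    )
    for item in state.left_behind({"wallet"}):
        print(f"Alert: a passenger may have left their {item} behind")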

The AI should also be able to recognise how passengers are feeling. “A good example is if you get into a robotaxi and fall asleep,” said Mahmoud. “The car should recognise that, wake you up before you reach your destination and ensure you exit the ride safely with all your belongings.”

Acts such as this will help the car to build a rapport with users over time. Consider the wave of relief when a fellow passenger hands back the wallet that slipped out of your pocket, or the taxi driver who advises that the destination is just a few minutes away. It’s the little things that make all the difference, but digitising those micro-interactions is a sizeable task.

Public opinion

Numerous AV developers have prototype vehicles running on public roads today. In August 2019, more than 60 players held permits to test with a safety driver in California alone. The vast majority of AV pilots do not take members of the public along, but a small handful do. In these cases, Emotion AI can play a useful role.

For example, Waymo’s Early Rider programme in Phoenix may have a safety driver behind the wheel, but it also carries members of the public in the back seat. Footage of each ride can show how riders react to certain events while the car is driving, and how the experience can be tweaked. Mahmoud explained that Affectiva’s Emotion AI technology can help developers to sift through hours of AV test footage and catalogue emotions.

“We can use data from the vehicle in order to understand their experience, as well as how they react to the operation of the vehicle itself. A good example is that many people experience nausea in AV test cars, and facial and vocal expressions can help developers to adapt the way the car drives,” he explained. While the technology has not been trained to recognise a passenger physically vomiting just yet, recognising that a passenger feels sick in advance of such a situation—and driving more gently as a result—is a bonus for all involved.
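
Sifting that footage largely reduces to tagging sampled frames with a predicted state and indexing the timestamps. A minimal sketch of that idea follows, where predict_state is a hypothetical stub for a trained classifier and test_ride.mp4 a placeholder file:

    import cv2
    from collections import defaultdict

    def predict_state(frame):
        # Hypothetical stub: a real model would return labels such as
        # "neutral", "surprise" or "discomfort" for the frame
        return "neutral"

    def catalogue(video_path, sample_every=30):
        """Tag every Nth frame; return a map of label -> timestamps."""
        cap = cv2.VideoCapture(video_path)
        fps = cap.get(cv2.CAP_PROP_FPS) or 30.0
        index, timeline = 0, defaultdict(list)
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            if index % sample_every == 0:
                timeline[predict_state(frame)].append(index / fps)
            index += 1
        cap.release()
        return timeline

    # Developers can then jump straight to the flagged moments, e.g.
    # discomfort during a hard braking event
    results = catalogue("test_ride.mp4")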

The smart cabin of the future

Somewhat fuelled by millennial demands for on-demand mobility, the next generation of shared vehicles will focus less on driving dynamics and more on usability and comfort. That naturally places a greater emphasis on how passengers interact with devices in the cabin, and further in the future, how the car interacts with its passengers.

Mahmoud believes that smart in-vehicle communication is the next frontier for vehicle HMI, with Affectiva nicely positioned to reap the benefits. Indeed, the start-up recently secured an additional round of funding to the tune of US$26m, led by Tier 1 AV specialist Aptiv and several venture capital firms.

“We are in somewhat of a unique position, having built similar systems in another vertical for years. In automotive, we are building the next generation of AI-based systems for the smart cabin of the future,” he concluded. “That holistic system will understand not only where the driver is looking, but also how they are feeling and which objects they are interacting with. And not only for the driver, but for all passengers in the vehicle.”
