
Affectiva and Nuance to bring emotional intelligence to AI-powered automotive assistants

Affectiva, the global leader in Artificial Emotional Intelligence (Emotion AI), and Nuance Communications, Inc. (NASDAQ: NUAN), the leader in conversational AI innovations, today announced their work together to further humanise automotive assistants and in-car experiences.

Affectiva Automotive AI, the first multi-modal in-cabin AI sensing solution, will be integrated with Nuance’s conversational AI-powered Dragon Drive automotive assistant platform. The integrated solution will deliver the industry’s first interactive automotive assistant that understands drivers’ and passengers’ complex cognitive and emotional states from face and voice and adapts behaviour accordingly.

The integration of Affectiva’s technology with Dragon Drive will expand the breadth and depth of contextual, emotional and cognitive data that automotive assistants can detect and account for. Affectiva Automotive AI measures facial expressions of emotions such as joy, anger and surprise, as well as vocal expressions of anger, engagement and laughter, in real time. Affectiva Automotive AI also provides key indicators of drowsiness, such as yawning, eye closure and blink rates, as well as physical distraction and mental distraction arising from cognitive load or anger.

Nuance’s Dragon Drive powers more than 200 million cars on the road today across more than 40 languages, creating highly customised, fully branded experiences for Audi, BMW, Daimler, Fiat, Ford, GM, Hyundai, SAIC, Toyota, and more. Powered by conversational AI, Dragon Drive enables the in-car assistant to interact with passengers based on verbal and non-verbal modalities, including gesture, touch, gaze detection, voice recognition powered by natural language understanding (NLU), and now, through its work with Affectiva, emotion and cognitive state detection.

“As our OEM partners look to build the next generation of automotive assistants for the future of connected and autonomous cars, integration of additional modes of interaction will be essential not just for effectiveness and efficiency, but also safety,” said Stefan Ortmanns, executive vice president and general manager, Nuance Automotive. “Leveraging Affectiva’s technology to recognise and analyse the driver’s emotional state will further humanise the automotive assistant experience, transforming the in-car HMI and forging a stronger connection between the driver and the OEM’s brand.”

“We’re seeing a significant shift in the way that people today want to interact with technology, whether that’s a virtual assistant in their homes, or an assistant in their cars,” said Dr. Rana el Kaliouby, CEO and co-founder of Affectiva. “OEMs and Tier 1 suppliers can now address that desire by deploying automotive assistants that are highly relatable, intelligent and able to emulate the way that people interact with one another. This presents a significant opportunity for them to differentiate their offerings from the competition in the short term, and plan for consumer expectations that will continue to shift over time. We’re thrilled to be partnering with Nuance to build the next generation of HMIs and conversational assistants that will have significant impacts on road safety and the transportation experience in the years to come.”

In the near term, Affectiva and Nuance’s integrated solution will enable the automotive assistant to further learn and understand driver and passenger emotion and behaviour, as conveyed through vocal and facial expressions of emotion. For example, if an automotive assistant detects that a driver is happy based on their tone of voice, it can mirror that emotional state in its responses and recommendations.
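The mirroring behaviour described above can be sketched as a simple mapping from detected emotion scores to a response style. This is an illustrative sketch only, not Affectiva's or Nuance's actual API; all names, fields and thresholds here are hypothetical assumptions.

```python
from dataclasses import dataclass


@dataclass
class EmotionEstimate:
    """Hypothetical per-utterance emotion scores in [0, 1]."""
    joy: float
    anger: float


def response_style(estimate: EmotionEstimate, threshold: float = 0.6) -> str:
    """Pick a tone for the assistant's reply based on the driver's state."""
    if estimate.joy >= threshold:
        return "upbeat"   # mirror positive affect back to the driver
    if estimate.anger >= threshold:
        return "calm"     # de-escalate rather than mirror anger
    return "neutral"


# A driver whose voice registers as happy gets an upbeat reply.
print(response_style(EmotionEstimate(joy=0.8, anger=0.1)))
```

In a real system the scores would come from multi-modal face and voice analysis rather than hand-set values, and the chosen style would shape the assistant's wording and prosody.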

In the future, the integrated solution is anticipated to address safety-related use cases, particularly in cars across the autonomous vehicle spectrum. Using Affectiva’s and Nuance’s technologies, automotive assistants could detect unsafe driver states like drowsiness or distraction and respond accordingly. In semi-autonomous vehicles, the assistant may intervene by taking over control of the vehicle if a driver is exhibiting signs of physical or mental distraction.

Affectiva and Nuance will be speaking today at Affectiva’s Emotion AI Summit at the State Room in Boston, MA. Please visit Affectiva’s blog for learnings and insights from the event, including future updates on its work with Nuance.

SOURCE: NUANCE COMMUNICATIONS
