Biometric recognition systems are rapidly gaining ground across numerous industries, and automotive is no exception. Automakers and suppliers are actively exploring ways to improve the in-vehicle experience for drivers and passengers, both in terms of safety and convenience. Initial applications have centred around unlocking and starting the vehicle, but as the industry gradually moves towards shared and autonomous mobility, these could prove just the tip of the iceberg.
Fingerprints and facial recognition
With biometric authentication, the owner’s body essentially becomes the key to the vehicle. Hyundai is preparing to introduce fingerprint access on the 2019 Santa Fe. A sensor on the door handle scans the driver’s fingerprint and then unlocks the door; a second sensor on the ignition allows the driver to start the vehicle. Once entry is granted, biometrics can be harnessed to personalise the in-vehicle experience as well. Hyundai intends to enhance this application in the near future so the driver’s fingerprint can also be used to automatically adjust the seat position, steering wheel position, and angle of the rear-view mirror based on personalised settings.
Biometrics is also being used in real time to detect the state of the driver, particularly to determine whether they are distracted or fatigued. Harman, now a Samsung company, is developing a digital assistant system that measures an individual’s pupils and looks at facial expressions to determine their mood. “Your pupils can give insight into your cognitive mode,” explained Jason Johnson, Director of User Experience Design & Studio Lead – Detroit/Novi, at Harman. “They reflect not what you’re thinking but how much you’re thinking, and how much activity is happening.” If the system detects that the driver is highly stressed, for example, it may delay relaying a non-urgent message about wiper fluid levels.
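The deferral behaviour Johnson describes can be pictured as a simple gate on a notification queue: urgent messages always get through, while non-urgent ones are held back until the driver’s estimated stress drops. The sketch below is an illustrative toy, not Harman’s implementation; the stress score, threshold, and class names are all assumptions.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Notification:
    message: str
    urgent: bool = False

@dataclass
class AssistantQueue:
    # Hypothetical stress scale: 0.0 (calm) to 1.0 (highly stressed).
    stress_threshold: float = 0.7
    deferred: List[Notification] = field(default_factory=list)

    def handle(self, note: Notification, stress_level: float) -> bool:
        """Deliver a notification now or hold it back.

        Returns True if the message was delivered immediately."""
        if note.urgent or stress_level < self.stress_threshold:
            return True                 # deliver right away
        self.deferred.append(note)      # e.g. the wiper-fluid reminder
        return False

    def flush(self, stress_level: float) -> List[Notification]:
        """Release held messages once the driver has calmed down."""
        if stress_level < self.stress_threshold:
            released, self.deferred = self.deferred, []
            return released
        return []
```

A safety-critical warning would be marked `urgent=True` and bypass the gate entirely; everything else waits for a calmer moment.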
Nuance Communications is also looking to harness biometric data to determine the driver’s emotional state. On this front it has been partnering with Affectiva, a specialist in artificial emotional intelligence (Emotion AI). Using a camera to view the driver’s face, the system can deduce their cognitive state. That could determine exactly how, and to what degree, the digital assistant then interacts with the driver.
“Biometrics is a massive space,” observed Krishna Jayaraman, Program Manager – Connectivity & Telematics in Frost & Sullivan’s Automotive & Transportation practice. “It won’t be long before these systems will be able to understand when a driver is stressed. Maybe the seatbelt will feature sensors to pick up on respiration and as soon as the driver’s breathing starts to increase, it understands he is stressed and responds to that. Or maybe it just soothes the driver and doesn’t offer too much non-essential information,” he predicted. “Eventually, these are the sorts of things that biometrics and artificial intelligence will do for you.”
Security from the inside
As with many new technologies, some of these biometric systems raise concerns about potential hacking. Anyone with an identical twin faces a risk of having their vehicle stolen or used without permission. In some cases, a different makeup routine or a new beard could throw off the facial recognition system. And what if an emergency arises and the owner needs to access or start the car immediately, and can’t wait for the facial verification process to complete?
“So far, all biometric systems have been easily hacked or breached because they make use of a database or because they compare images,” commented Martin Zizi, Chief Executive of Aerendir, a biometric authentication developer. “It’s very easy; there are more than 400 tutorials on the internet about this.” With an abundance of instructional videos out there like ‘Instructions on fooling facial recognition’ and ‘How to copy a fingerprint like a Spy’, it’s clear the current systems are far from foolproof. According to some videos, systems can be hacked with simple tools like Play-Doh and pressure-sensitive tape combined with a lipstick print or a fingerprint.
For Hyundai, the answer lies in capacitance recognition, which detects differences in electrical capacitance across different parts of the fingertip. The automaker estimates that the system has a one in 50,000 chance of granting access to someone other than the owner.
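Hyundai’s figure is a per-attempt false-accept rate, and a short calculation shows how that risk accumulates over repeated attempts: the chance of at least one false accept across n tries is 1 − (1 − p)^n. The function below is back-of-the-envelope arithmetic only, and assumes attempts are independent.

```python
FALSE_ACCEPT_RATE = 1 / 50_000  # per-attempt probability quoted by Hyundai

def cumulative_false_accept(attempts: int, far: float = FALSE_ACCEPT_RATE) -> float:
    """Probability of at least one false accept across `attempts` tries,
    assuming each attempt is independent (a simplifying assumption)."""
    return 1.0 - (1.0 - far) ** attempts

print(f"{cumulative_false_accept(1):.6f}")      # prints 0.000020
print(f"{cumulative_false_accept(1_000):.4f}")  # prints 0.0198
```

Even after a thousand independent attempts, the cumulative risk stays under two per cent, which gives a feel for what the headline figure means in practice.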
Silicon Valley’s Aerendir takes a similar inside-out approach to the challenge. Instead of looking at external physical features, it taps into a person’s nervous system activity, specifically the micro-vibrational patterns in a user’s hands. The company’s NeuroPrint technology is based on proprioceptive neurophysiology. “Muscle fibres are connected to the brain’s neural network, and no two brains are alike,” Zizi told M:bility. “We can start to play with this and use an individual’s brain pattern to identify them.”
These signals are generated within all individuals and uniquely shaped by their brain. They can be picked up by the incredibly sensitive sensors in modern smartphones or tablets and authenticated by simply holding the device for three to four seconds, or less than one second if embedded into a car seat. Importantly, Zizi claims that it is nearly impossible to hack. “The body is difficult to hack when you use a live signal like ours,” he added. The system operates entirely on the specific device involved, meaning that personal data need not be transferred to an external server during the authentication process, eliminating the risk of interception by hackers.
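The on-device design Zizi describes can be illustrated with a toy matcher: a reference signal is enrolled and every fresh capture is compared against it locally, so no biometric data ever leaves the device. NeuroPrint’s actual signal processing is proprietary; the normalised cross-correlation below is a stand-in chosen purely for illustration, and all names are assumptions.

```python
import math

def _normalise(signal):
    """Centre a signal and scale it to unit length."""
    mean = sum(signal) / len(signal)
    centred = [s - mean for s in signal]
    norm = math.sqrt(sum(c * c for c in centred)) or 1.0
    return [c / norm for c in centred]

class LocalAuthenticator:
    def __init__(self, threshold: float = 0.9):
        self.threshold = threshold
        self._template = None  # held in device memory, never transmitted

    def enrol(self, capture):
        """Store the owner's reference signal on the device itself."""
        self._template = _normalise(capture)

    def authenticate(self, capture) -> bool:
        """Compare a fresh capture against the local template using
        normalised cross-correlation (a stand-in for the real matcher)."""
        if self._template is None:
            return False
        fresh = _normalise(capture)
        score = sum(a * b for a, b in zip(self._template, fresh))
        return score >= self.threshold
```

The architectural point is the absence of any network call: because both the template and the comparison live on the device, there is no biometric payload in transit for an attacker to intercept.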
NeuroPrint technology is already developed and operational, and is currently being adapted for vehicles.
In addition to security, the NeuroPrint system also offers inherent safety benefits for drivers. Along with brain activity it also picks up a driver’s heartbeat, from which it can deduce the respiration rate. Combined with the data on muscle tone, this makes for a fully fledged physiological monitoring system. Such a system could theoretically be used to safely stop a car when it detects the driver is unwell, perhaps suffering a heart attack or severely impaired by alcohol. “In the short time before someone loses consciousness, their muscle tone changes radically. The system could detect this,” he added.
Assuming a future of autonomous vehicles, the system could potentially redirect the vehicle to a hospital for emergency medical treatment in the case of a crash or a general medical emergency. This could dramatically improve the chances of survival in urgent cases. Emergency services require an average of 12-14 minutes to reach an incident site in an urban location. Outside of the city the average is 17-19 minutes. The brain can only survive for six to eight minutes without oxygen. “If the vehicle knows the occupant’s physiology it could have the ability to triage,” suggested Zizi. “It could then direct emergency responders on a priority basis. If it can move itself, it could drive towards the first responder.”
And it all stems from a cheap sensor costing less than a dollar. For the full system, Zizi estimates that it would add US$1-US$10 to the cost of a vehicle, depending on scale. “It’s trivial compared to the rest of the car. We could even be quoted on ASIC (application specific integrated circuit), and because of the way we programme, we could bring the price of a dedicated chip down to US$0.20. If I were to make a do-it-yourself kit, it would be around US$2-$3 because of the sensor and microcontroller. However, the future of this technology is to be fully embedded inside the electronic system of a car.”
In the longer term, the rise of artificial intelligence (AI) could make accurate identification all the more important. Zizi is particularly concerned about the potential threat from AI robots and their ability to impersonate humans.
“In a world where everyone’s data has been shared online, even biometry eventually, what will prevent an AI bot from impersonating you, creating contracts on blockchain behind your back?” he asked. “The one thing that can prevent that is physiology. No matter how a robot is programmed or the AI is refined, at the end of the day the AI is not alive and it has no heartbeat, it has no brain waves. The fact that I’m alive is the final frontier between human and machine.”
This article appeared in the Q2 2019 issue of M:bility | Magazine.