At International CES 2015, Intel’s booth featured a Jaguar F-Type. The purpose of its presence on the booth was not, however, to promote the car or to highlight Intel-branded technology, but rather to demonstrate the eye tracking technology installed on the vehicle by Melbourne-headquartered supplier of intelligent sensing technologies, Seeing Machines.
In September 2014, Seeing Machines signed a 15-year strategic alliance with TK Holdings, the US subsidiary of Japanese automotive safety equipment supplier Takata. Under the alliance, the two companies have agreed to further develop driver monitoring technologies, an area in which Seeing Machines has already been working for two years with clients in the mining industry.
At CES, Megatrends caught up with Nick Langdale-Smith, Vice President, OEM Relationships, to discuss Seeing Machines’ plans to take its eye tracking technology out of the mines and onto public roads.
Why mining? “Because mining has big trucks, big problems and deep pockets,” grins Langdale-Smith. The mining industry approached Seeing Machines for its technology, which had been identified as a way to solve some of the major issues faced by companies running mining assets, the key one of which is driver fatigue. “Our technology found a very good match in the mining industry, where we were able to put it in the big yellow mining trucks that turn into unguided missiles when their driver is asleep. Remember, they are 500 tons worth of metal and iron,” says Langdale-Smith.
Seeing Machines developed a solution that issues alerts if a driver dozes off at the wheel, using both an audible alert and a seat vibration alert. “And that’s allowed our customers in the mining industry to save time, money and lives in accidents.” An added benefit was a reduction in running costs, because a fatigued driver is also a bad driver – and a bad driver grinds his wheels against the mine walls. “One of our customers saved 50% on their tyre wear and tear cost in a year by implementing our fatigue solution,” says Langdale-Smith. “Those tyres cost up to 200,000 bucks apiece. If you scale that up over their entire global operation, that’s a US$21.9m saving in rubber costs. So we’re seeing not only savings through preventing accidents and those final catastrophic events where there’s loss of life, but, in an industry that’s driven by increasing productivity, we’re able to show improvements in running costs. And we’re looking at fuel consumption next.”
It’s fascinating to hear that such technology has come from mining, an industry which has also been the source of much of the semi-autonomous drive technology in trucking. “It was a sensible place for us to start, because these mining trucks cost many millions of dollars. We can’t move directly into series production in automotive, where the price points are so much lower, while we’re still proving out the technology, so we’ve been able to create really good partnerships in mining. There is such a large problem with fatigue and such valuable assets that we were able to find considerable value in there for the customer. The return on investment just made sense.”
Seeing Machines has proven out the technology in the mining industry, and is now moving into trucking and buses, where it is operating on the Intel Atom platform, explains Langdale-Smith. “We’re obviously using some of Intel’s core processors in the mining and the transportation industry, so we’ve got this good relationship with Intel in the aftermarket area of our business.”
In terms of a proposed timescale for bringing this to market, Langdale-Smith says the CES Jaguar F-Type demonstrator is merely a concept demonstrator. “It’s probably going to be two to four years before you’re able to see this Gen 1 driver monitoring technology hit showrooms. But we’re going to see it get more and more sophisticated from there. Intel has a phrase for this: Perceptual Computing. The idea is that the machine can perceive its user, and that is exactly what we’re trying to build here. The car becomes imbued with an intelligence of its driver and is able to make intelligent decisions, alert you more rapidly if it detects something that you should be alerted to, or get out of your way so as not to be a nuisance when it doesn’t need to be there.”
The decisions regarding specific applications of the Seeing Machines technology are up to the OEM to make, says Langdale-Smith. “At Seeing Machines, we are pursuing aftermarket business in mining and transportation. We see opportunities with commercial road transport operators. But over the last 12 to 24 months we’ve really been pulled very rapidly by a large number of OEMs into needing a solution for passenger vehicles. And the time is coming, to use the Perceptual Computing phrase again, where we’re marrying understanding what’s on the outside with the understanding of what’s going on inside to make these things smarter and safer.”
However, there are many different applications beyond smart alerts that can be supported once there’s a camera looking at the driver, including advanced head-up displays, augmented reality and gesture-based interaction. “These are all potential applications that leverage an understanding of the driver’s face and eyes, and facial expression. What we are trying to do at Seeing Machines is to be the go-to driver monitoring solution, and to be able to understand as much of the driver and the context that the driver is operating in as possible from a camera or cameras inside a vehicle.”
Indeed, Langdale-Smith concedes that mood tracking, already under development elsewhere, is something his company could potentially be asked to roll out by its automotive customers. “We think there’s some interesting information in facial expression tracking. For instance, if someone is yawning, that could signal fatigue. There’s also research that suggests the slackening of facial muscles can be a precursor to fatigue. Road rage could be signalled by someone who is really tense. We feel that there’s information contained in that signal, and our R&D teams are probably going to start looking at that in more detail as we progress. We’re a one-stop shop for tracking everything from the neck up and understanding the human face and eyes.”
Longer-term, driver monitoring technology like facial expression recognition and eye tracking could be used to monitor heart rate, says Langdale-Smith. “If you look at certain channels of the image, you can start looking at the blood pumping into and out of the face. Those are the sort of things we’re considering looking at in the future that might give us an even better contextual understanding of the driver’s fatigue levels, awareness levels, distraction levels and cognitive workload levels. And that’s going to allow the machine to change the experience to better serve the driver. If you want to get into real science fiction, you can start talking about pupil dilation. I talked to a performance car company which wanted to do some research into keeping the driver on the edge of their seat at all times by looking at the pupils so that they were constantly dilated. There’s a lot of science fiction around that one – but who knows what the future holds?”
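The camera-based pulse measurement Langdale-Smith describes is broadly the idea behind remote photoplethysmography: as blood pumps in and out of the face, the skin’s colour shifts very slightly, most visibly in a camera’s green channel. A minimal sketch of that idea, assuming a per-frame mean green value over a face region is already available (the function name, parameters and frequency band here are illustrative, not Seeing Machines’ implementation):

```python
# Hypothetical sketch of camera-based heart-rate estimation (rPPG).
# Assumes some upstream face tracker supplies, per video frame, the mean
# green-channel value over the driver's face region.
import numpy as np

def estimate_heart_rate_bpm(green_means, fps):
    """Estimate heart rate from per-frame mean green-channel values.

    The green channel typically carries the strongest pulse signal in
    ordinary RGB cameras.
    """
    signal = np.asarray(green_means, dtype=float)
    signal -= signal.mean()                       # remove the DC offset
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    # Restrict to plausible human heart rates: 42-240 bpm (0.7-4.0 Hz)
    band = (freqs >= 0.7) & (freqs <= 4.0)
    peak_freq = freqs[band][np.argmax(spectrum[band])]
    return peak_freq * 60.0

# Synthetic check: a 72 bpm (1.2 Hz) pulse sampled at 30 fps for 10 s,
# with a little camera noise on top.
rng = np.random.default_rng(0)
fps, seconds = 30, 10
t = np.arange(fps * seconds) / fps
pulse = 0.5 * np.sin(2 * np.pi * 1.2 * t) + 0.05 * rng.standard_normal(t.size)
print(round(estimate_heart_rate_bpm(pulse, fps)))  # ~72
```

In practice the hard part is everything upstream of this spectrum step – stable face tracking, motion and illumination compensation – which is exactly where an eye- and face-tracking supplier already has the building blocks.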