
The autonomous car is coming, and it runs on software

Megan Lampinen talks to Intel’s Elliot Garbus about the role of the autonomous car in the Internet of Things

Preparing for the car of the future requires suppliers to transition their focus areas today – and that is just what Intel has been doing. From an in-vehicle infotainment (IVI) focus, the company has expanded its remit to include advanced driver assistance systems (ADAS) and what it calls the ‘software-defined cockpit’.

“We have been driving a set of technology in the automobile for quite a few years now, and we’ve seen an intersection of a set of trends that value what we’re delivering,” Elliot Garbus, Vice President of Intel’s Internet of Things Solutions Group, and General Manager of the company’s Automotive Solutions division, told Megatrends. “That value continues to expand over time,” he added. With a view of the technology of the future, Intel is laying out a path that will move it steadily in that direction.

In-vehicle computing

While many industry players differ in their idea of the steps towards autonomous driving vehicles, most agree that the end goal is the same. “The technology portfolio needed to deliver a fully autonomous vehicle requires an interesting set of capabilities,” observed Garbus, “some obvious and some perhaps less obvious.”

To begin with, there needs to be a significant level of computing in the vehicle. This involves sensor fusion – pulling together data from multiple sensors to create 360-degree situational awareness – as well as data probing, in which information about the vehicle’s surroundings is sent to the cloud.
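The article does not detail how sensor fusion works, but a minimal sketch of one common approach is inverse-variance weighting: when two noisy sensors (say, radar and camera – hypothetical values below) report the same quantity, their estimates are combined in proportion to each sensor's confidence. The function name and numbers are illustrative, not from the source.

```python
def fuse(estimates):
    """Fuse noisy estimates of one quantity via inverse-variance weighting.

    estimates: list of (value, variance) pairs, one per sensor.
    Returns the fused value and its (reduced) variance.
    """
    weights = [1.0 / var for _, var in estimates]
    total = sum(weights)
    value = sum(w * v for w, (v, _) in zip(weights, estimates)) / total
    return value, 1.0 / total

# Hypothetical example: radar says the car ahead is 42.0 m away (variance 4.0),
# the camera says 40.0 m (variance 1.0). The fused estimate leans towards the
# more confident camera reading.
dist, var = fuse([(42.0, 4.0), (40.0, 1.0)])
```

The fused variance is always smaller than either input variance, which is the statistical payoff of combining sensors rather than trusting any single one.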

“There is a real need for high definition maps, an ability to really accurately position the vehicle. In fact, if we have a high definition map, and there is nothing else on the road, autonomous driving would be trivial,” he noted. The challenge lies in the unpredictability of other road users. “We need that 360 notion of situational awareness for the vehicle, and we need to be continually taking some of that data and uploading it to the cloud to continually learn, retrain and then redistribute information back to the vehicle,” Garbus added.

Importantly, the vehicle needs to be able to respond correctly to unanticipated events. “I’ve never seen a sewer worker climb out of a manhole in front of me while I’m driving, but I did have a customer tell me of a similar scenario that one of their autonomous vehicles came upon, so making sure that’s part of the learning is going to be important,” he said.

Communications

Intel is a big player in the cloud and in-vehicle computing segments, as well as in communications, and Garbus believes this third area glues the other two together. The company has been growing its offering here, particularly its portfolio of modems and products targeting the automotive industry, as well as making significant investments in 5G technology. “We see work both in LTE Advanced and 5G as critically important for delivering the kind of connectivity needed for autonomous driving, as well as being a very scalable and economically attractive way to drive V2X,” explained Garbus, using the term V2X to refer to vehicle-to-everything communication.

He also pointed to new opportunities along this front that bring together some of the broader trends seen in IT with the automobile, allowing automotive companies to monetise information and take advantage of that connectivity with Big Data. One simple trend, for example, involves changing to a more direct relationship with the customer.

Intel is looking at technology that could identify whether a consumer is at risk of defecting to another brand, or is changing usage patterns, enabling an automotive OEM to market to that consumer more effectively. “If I have an entry level sports car and I stop at an elementary school on my way to work every day, it might be interesting to start selling me on the benefits of a four door sedan for a growing family,” he suggested as an example. “This is the sort of thing we think about for Google, but not something we necessarily associate with the automotive industry – but that change is happening.”

This Internet of Things approach can be applied to proactive maintenance and warranty costs as well. “If I can early identify patterns that suggest there are components that will have a broader liability issue, can I change them earlier in my manufacturing cycle to help improve warranty costs overall? Leveraging this broad connectivity and the opportunity to take that data, and use data analytics to look for those kinds of patterns, is a rather remarkable value opportunity,” he suggested.
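The warranty idea Garbus describes – spotting components whose field-failure rate is running ahead of expectations – can be sketched very simply: count failure reports per component across a connected fleet and flag anything exceeding a baseline rate. The function, component names, and thresholds below are hypothetical illustrations, not anything described by Intel.

```python
from collections import Counter

def flag_components(failure_reports, fleet_size, baseline_rate=0.01):
    """Flag components whose observed failure rate exceeds a baseline.

    failure_reports: list of component names, one entry per reported failure.
    fleet_size: number of vehicles reporting in.
    Returns an alphabetically sorted list of flagged component names.
    """
    counts = Counter(failure_reports)
    return sorted(name for name, n in counts.items()
                  if n / fleet_size > baseline_rate)

# Hypothetical fleet data: 30 water-pump and 5 alternator failures in 1,000 cars.
reports = ["water_pump"] * 30 + ["alternator"] * 5
flagged = flag_components(reports, fleet_size=1000)
```

In practice such a check would control for vehicle age and mileage, but even this crude version shows how fleet-wide connectivity turns isolated warranty claims into an early-warning signal for manufacturing.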

The software-defined cockpit

A software-defined cockpit refers to the consolidation of cluster displays with in-vehicle infotainment systems. The approach takes connected experiences from both inside and outside the vehicle and brings them into a centralised control console. Intel believes that it opens new opportunities for infotainment, customised user interfaces and improved safety features.

Reinventing HMI

The human-machine interface (HMI) is one of the central areas undergoing change for driverless cars, in which steering wheels and brake or accelerator pedals potentially have no place. Companies like Intel are developing environments that will respond to the needs of the future, which could be very different from those of today. For instance, what happens when the passenger ‘in charge’ of a vehicle in autonomous mode changes his mind – how does he interact with the vehicle to adjust its course? The example Garbus provides is a family in the middle of a specified journey that decides unexpectedly to stop for a meal along the motorway. “What do I do, as the passenger in control? What is that user experience? Do I push a button and redirect the car? Do I grab hold of a joystick or the steering wheel and just pull over? Do I simply turn around and tell the kids to pipe down, and that we have to have lunch at Grandma’s house?” asked Garbus.

Intel sees opportunities for continuing to advance the HMI towards what Garbus refers to as ‘a software-defined cockpit’. As he elaborated: “We see the instrument cluster moving to a screen, as well as the centre stack screen continuing to grow, and the opportunities for more screens in the vehicle as well to provide information.” These will not only provide entertainment but more importantly bring some of the safety technologies front and centre into the driving experience.

“Imagine how mobile augmented reality could help keep us safer on the road by alerting us to unsafe driving distances or a lapsed behaviour of drivers,” he suggested. “Think of how fusing multiple cameras together could create virtual rear view mirrors with no blind spots, how the elimination of side view mirrors, and their replacement with cameras, could enable us to put those side views in different places, or integrate with additional information to show a car coming up very quickly from behind. There’s a whole evolution of HMI going on. We’re very much participating in that, and its elevation towards the software-defined cockpit.”

Fascinating world of the future

The advent of autonomous vehicles, with the dramatically improved safety that they promise, opens new doors for exterior vehicle design evolution as well. “If we move into a world where cars no longer have collisions, then my design point for a vehicle could change fundamentally,” said Garbus. “I could make it out of much lighter material, perhaps plastic. And because it is so much lighter, fuel efficiency would be much greater.”

Combined with changes in the interior, which turn it more into an office or a living room space, the vehicle could become an altogether different machine. “It opens up really a fascinating world of the future,” he added.

This article appeared in the Q1 2016 issue of Automotive Megatrends Magazine. Follow this link to download the full issue.
