In August 2016, Uber’s footing in the autonomous drive space became significantly more assured when it penned a deal with Volvo Cars to jointly develop the technology. In November 2017, a framework was established that would allow for an order of up to 24,000 Volvo vehicles for self-driving development. In August 2018, Toyota Motor Corporation made a US$500m investment in the ride-share giant, along with a partnership that will see Uber’s self-driving technology installed in purpose-built Toyota vehicles.
Part of the attraction is Uber’s grounding in ride-sharing—a segment it rejuvenated as a start-up back in 2010. The company has garnered swathes of data on how, where and when riders travel, how long each trip takes and the cost of doing so. In theory, you couldn’t ask for a better foundation in understanding how a robo-taxi service might work. “All that experience helps a tremendous amount,” Stephen Lesh, Head of Hardware and Vehicle Programmes at Uber Advanced Technologies Group (ATG), told M:bility. “There’s almost an infinite number of self-driving scenarios to solve, so we use that pool of data from the Uber ride-share network to prioritise our autonomous vehicle development.”
Having spent two decades working at Ford Motor Company, Lesh made the swap to Uber in October 2016. Ironically, Ford is now pushing to become a mobility company. Based in Detroit, Lesh handles the development of Uber’s self-driving hardware and its integration within the vehicle, including the sensors, computers, electronics and connections, as well as all of the integration work with Uber’s OEM partners—power, cooling, mounting structures, vehicle safety and integrity.
As one of the world’s largest automakers, Ford gave Lesh valuable insight into the requirements of large-scale engineering programmes, and it is here that he underlines Uber’s lofty ambitions. “In the auto industry everything starts with a clay model, but develops to the point where every part in the vehicle can be mass produced in multiple assembly plants around the world,” he said. “Here at Uber we’re still in the development phase, but everything we do is aimed towards designs, components and assemblies that can eventually be mass produced.”
The company has bulked out its expertise in various elements of self-driving technology. Experts in anything from mapping and LiDAR to artificial intelligence (AI) and motion planning can be found at the ATG head office in Pittsburgh, where testing recently resumed on public roads—nine months after operations came to an abrupt halt following a fatal collision with a pedestrian.
On 18 March 2018, 49-year-old Elaine Herzberg was pushing her bicycle across the road in Tempe, Arizona. At around 10pm that night, an Uber self-driving test vehicle, operating in autonomous mode with a test driver behind the wheel, struck and killed her. According to a 318-page police report, the safety driver, seemingly distracted by an episode of The Voice, had also failed to notice Herzberg in time.
The event sent a shockwave across the entire autonomous drive sector. In conversations with various industry experts over recent years, the overriding opinion had been that a fatality during public testing was inevitable. However, very few had expected the day to come so soon.
In July 2018, Uber test vehicles resurfaced in Pittsburgh but under manual control. The company’s voluntary safety report was later issued to the US National Highway Traffic Safety Administration (NHTSA) in November, with company Chief Executive Dara Khosrowshahi noting ‘deep regret’ for the event that transpired in Tempe. However, he conceded that whilst the firm is “committed to anticipating and managing risks that may come with this type of testing… We cannot—as no self-driving developer can—anticipate and eliminate every one.”
As things stand, Uber’s test vehicles are back on public roads in Pittsburgh in autonomous mode, but under significantly stricter guidelines and with a number of stringent measures introduced across the board. “We felt it was very important before we went back on the road that we not only made changes—having learned from experiences—but also that we publicly communicate those changes,” said Lesh.
A new driver monitoring system has been introduced to gauge the driver’s alertness, and whether they are looking at the road ahead. The system, says Lesh, can provide immediate feedback to the driver if they are adjudged to be distracted, with that information also sent back to a supervisory team at HQ. Enabled via a front-facing camera, a real-time feed of each trip can also be viewed remotely. The prior monitoring system was fairly rudimentary, simply recording a view of the cabin via a dash-mounted camera.
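The feedback loop Lesh describes can be sketched in miniature. The snippet below is a hypothetical illustration, not Uber’s actual system: it assumes the camera pipeline yields simple gaze samples, and that there is some in-cabin alert channel and a reporting channel back to the supervisory team.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class GazeSample:
    eyes_on_road: bool   # estimate derived from the front-facing cabin camera
    timestamp_s: float

def check_alertness(samples: List[GazeSample], window_s: float = 2.0,
                    threshold: float = 0.5) -> bool:
    """True if the driver had eyes on the road for at least `threshold`
    of the samples inside the last `window_s` seconds."""
    if not samples:
        return False
    latest = samples[-1].timestamp_s
    recent = [s for s in samples if latest - s.timestamp_s <= window_s]
    return sum(s.eyes_on_road for s in recent) / len(recent) >= threshold

def monitor_step(samples, alert_driver, notify_hq):
    """On a distraction verdict: immediate in-cabin feedback, plus an
    event forwarded to the remote supervisory team."""
    if samples and not check_alertness(samples):
        alert_driver("Please keep your eyes on the road")
        notify_hq({"event": "distraction", "t": samples[-1].timestamp_s})
```

The windowed vote is one plausible way to debounce noisy per-frame gaze estimates so a single glance at the mirrors does not trigger an alert.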
Speaking to M:bility in October 2018, Gabi Zijderveld of Boston-headquartered Affectiva, noted that its AI-based driver monitoring system “can help to identify whether a safety driver testing an autonomous vehicle is truly engaged.” While Zijderveld could not name specific partners, she advised that the company has seen significant interest from various parties testing autonomous drive technology on public roads.
Every standard Volvo XC90 on sale today comes with automatic emergency braking (AEB). However, in Uber’s modified XC90 test vehicles, certain driver assistance features—including AEB—were disabled. Uber test vehicles now utilise the vehicle’s collision mitigation system, which is engaged regardless of whether the car is in Uber’s self-defined ‘manual’ or ‘auto’ driving mode.
An extra safety driver, or ‘Mission Specialist’, has also been added to each vehicle. “They can work with each other to maintain alertness,” explained Lesh. In addition, a more extensive training and selection process for its Mission Specialists has been employed. “We created a new job description that was more technical to source people with the skill-sets that better understand how the vehicles work and behave in certain situations,” he explained. An advert posted in January 2019 described the role as being ‘responsible for monitoring all systems in the vehicle’ and ‘providing clear and concise feedback to developers.’
Much like any commercial driving role, hours of service—how long a driver can be in the vehicle before a break is legally required—have also been limited. Driver fatigue is a leading cause of heavy truck crashes globally, and the trend has also been recognised in the burgeoning market of self-driving test vehicles. “There has been a comprehensive package of changes that we feel are now state-of-the-art in safe on-road testing,” said Lesh.
In Pittsburgh, the cars run on a set loop known as an ‘operational domain’, passing a number of Uber’s engineering facilities on the way. The loop has been selected as it contains traffic scenarios the self-driving software has passed in simulation and during closed track tests. Only once the car has passed both sets of tests can it run on the road. In addition, the cars do not drive the loop during inclement weather such as snow or ice. In San Francisco and Toronto—Uber’s other test beds—autonomous test cars are on the road but only in ‘manual mode.’ It is a somewhat confusing expression, which Lesh is keen to clear up.
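The gating rule described above (a simulation pass, a closed-track pass, and no inclement weather) reduces to a simple conjunction. This is a hypothetical sketch of that logic, not Uber’s actual release process:

```python
# Weather conditions under which the loop is never driven autonomously.
INCLEMENT = {"snow", "ice", "sleet"}

def cleared_for_autonomous_loop(passed_simulation: bool,
                                passed_track_tests: bool,
                                weather: str) -> bool:
    """A car may drive the loop autonomously only once its software has
    passed the loop's traffic scenarios both in simulation and on the
    closed track, and never during inclement weather."""
    return (passed_simulation
            and passed_track_tests
            and weather.lower() not in INCLEMENT)
```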
“With manual testing, the sensor suite and computer are fully up and running, taking in data just as they would in autonomous mode,” he explained, “but the vehicle is controlled by our Mission Specialists.” This, he says, allows Uber to gather data from the on-board LiDAR, camera and radar sensors, which can then be leveraged by computer simulation programmes. With sensors continuously scanning the surrounding environment, it also aids the development of high-definition maps, which give the car a better idea of where it is on the road.
The user experience
While the focus is on ensuring public road tests resume without incident, engineering teams behind the scenes are also working hard to craft the user experience.
Even with the current Uber app, a focus has been placed on creating a consistent customer service, allowing riders to change their destination mid-drive, rate how the trip is going or share their route with friends. Then there is the cashless payment system, with a ride ordered and paid for via a smartphone. The idea, says Lesh, is for riders to manage their experience without interacting with a driver. Sound familiar? It should—it is the basis of all proposed robo-taxi services so far. “Uber today has already made significant efforts to automate the customer experience. Not having a driver in the car will be a final step, but it won’t be the first time that Uber has tried to automate parts of the customer experience,” he explained.
Automating the customer experience is one thing, but removing the driver entirely is a different story. With a traditional Uber, there is still an option to interact with the driver if necessary; the rider can observe his or her driving style, whether they are distracted and if they have recognised a potential hazard ahead. Replicating the core elements of a comfortable journey will be pivotal for any driverless service. “This will be especially true early on, when customers are trying to build up comfort with the technology,” said Lesh. “We’ve been experimenting with displays that show a primarily LiDAR-generated view of what the vehicle sees—identifying vehicles and pedestrians and other things,” he continued. “It brings comfort to the passenger. They can see that the vehicle has spotted another car nudging out of an intersection, or that a pedestrian might step out from the crosswalk. We’re still experimenting with what kind of information would be useful for the rider in order to become comfortable.”
Buckle up – it’s going to be a boring ride
Driverless cabs present an interesting dynamic in which comfort will be defined not only physically, but also emotionally. To achieve the former, engineers can tweak the vehicle’s handling, change the seating arrangement and provide individual air conditioning zones. For the latter, a dull drive is almost the desired effect. “Our vehicles are programmed to drive safely, cautiously and always within the law, which tends to make them very boring. It’ll never be an exciting experience, and any commute that’s rather boring is probably a good one,” said Lesh. “There will be a curve of acceptance. Riders will be excited at first, and once they see it performing safely, reliably and very much like a normal car, it becomes boring.”
Part of that comes down to ensuring the ride is essentially uneventful—you simply get in and get out without incident. This can only be facilitated if the system itself is cautious, predictive and quick to respond. “My driving instructor always talked about being alert, cautious and able to respond to the actions of other road users. That type of thinking is how we are programming our vehicles,” explained Lesh.
The difficulty is that a ‘good’ driver is subjective; while some may prefer a brisk drive, others may feel unsafe. “This is an extremely important issue for self-driving technology. The industry as a whole needs to collaborate on a set of metrics that define what self-driving is,” he said. “That will go far beyond factors like cornering speed, but if we’re really going to get this technology on the road, we must have an industry standard. There’s a lot of work that needs to be done on that.”
Don’t bin the bread and butter
While the plan is not to supplant human-driven Ubers—the bread and butter of the business—numerous opportunities for driverless services are becoming clear. This view is not exclusive to Uber; various players have agreed that autonomous vehicles will form an additional part of the broader transportation network, running alongside traditional taxis, privately owned cars, trains, buses and even e-scooters. “There are certain operational domains where it’s going to be very difficult to have a self-driving car operate, potentially ever,” said Lesh. “Human drivers could do those trips instead.”
One use case for driverless cars has arisen in the food delivery industry. Ford and Toyota have both voiced an interest in using driverless cars to deliver pizza, and Uber Eats could prove an early use case for autonomous vehicles outside of ride-sharing. “We’ve had a lot of discussions around that, and what else we could do with the technology,” said Lesh. “Right now our focus is on the core ride-share business, but once the self-driving technology is fully proven out it is easy to see how it could be extended to other parts of the Uber business.”
Uber is not alone in its pursuits, with countless test programmes across Europe and the US already underway. As of January 2019, more than 60 individual players hold licences to test autonomous vehicles on public roads in California alone. In China, Baidu continues to refine its Apollo platform, which is notably being leveraged by Ford and Daimler in the country. Aptiv-owned nuTonomy ran an autonomous cab service in Singapore’s one-north business district back in August 2016, with a paid-for service planned for 2019.
This article appeared in the Q2 2019 issue of M:bility | Magazine.