
How AR could offer a glimpse into the mind of a robot

Bleeding-edge augmented reality technology could bring human workers and industrial robots closer together than ever before, learns Freddie Holmes

The subject of machine automation naturally raises a number of questions for those who work side by side with mechanical leviathans: how can humans understand what a robot is about to do next, and be sure that it will not accidentally cause harm? Augmented reality (AR) could prove pivotal in alleviating these concerns, with real-time 3D graphics helping workers to understand the intentions of their robot colleagues with little more than a glance.

Outside of the factory, AR is being considered for in-vehicle navigation. Rather than displaying directions in the driver’s peripheral vision on an infotainment screen, arrows and instructions can be overlaid directly on the road via an AR windshield. At CES 2019, Hyundai revealed that it has been working with WayRay, a Swiss start-up that has developed just such a system. Also at the show, Nissan laid out plans to introduce digital passengers in the cockpit—computer-generated avatars that can ride alongside the driver.

AR has found numerous uses elsewhere, even directing retail shoppers to their preferred items and saving them the pain of hunting aisle to aisle. In 2014, a DARPA-funded prototype headset was tested to improve military situational awareness by tagging and tracking objects and overlaying information into a soldier’s line of sight. AR is also widespread in consumer electronics, with mobile apps such as Snapchat and Pokémon Go taking the spotlight. In 2016, AR was included in Facebook’s ten-year roadmap as a key technology for growth up to 2026. However, AR offers far more than just entertainment value. Introduced in the right areas within the factory, efficiency and safety could hit new heights.

Say hello to your digital twin

Simon Mayer is Professor for Interaction- and Communication-based Systems at the University of St. Gallen in Switzerland. A computer scientist by training, he describes the profession as ‘putting magic into the real world.’

His team has been investigating how mixed reality can help workers to interact with an assembly line. Leveraging the Microsoft HoloLens AR headset, the proposed solution—dubbed ‘HoloMiracle’—allows the user to make vocal queries and control a digital cursor to manipulate 3D graphics. This means that decisions can be made on the spot rather than from a computer lab. In future, it could also provide an insight into a robot’s intentions, or more specifically, the tasks it has been programmed to perform.

“Augmented reality overlays 3D images over a particular space. A digital twin attached to a physical robot, which is only visible through an AR lens, could show what that machine will be up to in a few seconds’ time,” explained Mayer. The problem, he says, is that this form of technology within the manufacturing sphere is not just cutting-edge; it is at such an early stage that he describes the solution as ‘bleeding-edge’.

Working with giants

AR could also prove helpful in the realm of collaborative robots, or co-bots, which are far smaller and easier to move than heavy-duty industrial robots. Today, co-bots already assist human workers with tasks that are particularly strenuous, such as fitting heavy components on a vehicle or other ‘dirty, dangerous and dull’ tasks. Workers at BMW’s plant in Dingolfing, Germany use Kuka co-bots to take the strain of mounting heavy gearboxes, for example, while Skoda’s Vrchlabí plant in the Czech Republic uses co-bots to insert gear actuator pistons into transmissions.


In order for robots and humans to work together in more complex situations, improved communication between man and machine will be key. Safety and efficiency must continue to rise as machines become more capable and modular. If current projections hold true, the so-called smart factory of the future will leverage highly flexible robots in order to facilitate quick changeover times and the production of a greater range of products. According to Rolf Najork, President of the Executive Board at Bosch Rexroth, there will only be six fixed parts to the smart factory: four walls, the ceiling and the floor.

Large robots are typically housed in the safety of a manufacturing cell—a confined area in which a towering machine operates out of harm’s way. Such robots can be tasked with handling materials well over 1,000kg (2,200lbs), and are capable of moving objects in wide sweeping motions at speed. It is a dangerous place for any human, particularly where movements may be sudden and unpredictable. While this is likely to remain standard procedure for the majority of tasks, Mayer believes that AR could help to bring the human into the loop in a safer and more efficient way.

“Today, even partially autonomous robots are fenced in a box and no one is allowed to enter. As soon as it detects a human entering the cell, the entire system switches off. This is the kind of setup we are trying to get away from,” he explained. “If humans are working among heavy, dangerous and autonomous machines that interact with each other, they need to be informed about the intentions of all these robots.” The issue has not been ignored by the industry; in the summer of 2019, industrial robotics giant Kuka will launch a programme to investigate, implement and test use cases for AR and VR glasses in robotics.

Follow the bubbles

AR could not only show the actions a robot may be about to perform, but also a real-time view of the devices and networks with which it is wirelessly connected. Devices in smart environments, explains Mayer, form complex networked systems. However, communication between each system essentially occurs invisibly, or ‘behind the back’ of human workers. “The more autonomous behaviour instilled in any given system that interacts with humans, the more we need to keep the human in the loop and informed,” he said.

Prior to his current role, Mayer served as a Research Scientist with Siemens Corporate Technology in Berkeley, California between 2014 and 2017. During this time he co-authored a report into how the interactions of ‘autonomous cognitive machines’ could be visualised through a technology called HoloInteractions. The idea is for workers to “directly observe which devices interact with each other, and what data is transmitted between them at any given moment.”

This AR system was tested in a manufacturing cell outfitted with a Universal Robots UR5 collaborative robot, a programmable logic controller (a computer designed to control manufacturing processes) and a Microsoft Kinect motion-sensing device. When viewed through an AR headset, streams of blue bubbles show the various wireless interactions that take place as the robot is instructed to pick up a box.

Making AR a reality

It is worth highlighting the difference between virtual reality (VR) and augmented reality. AR overlays the real world with 3D images—computer-generated graphics that are not really there—whilst VR uses a purely digital landscape, like a videogame. The benefit of AR is that it allows work to be carried out in situ, rather than in front of a computer screen elsewhere.

“The use of augmented reality can enable remote asset inspection and reduce the costs of downtime,” notes Arthur D. Little’s Future of Mobility 3.0 study. Indeed, workers at a number of ZF plants have investigated how AR headsets can be used to troubleshoot issues on the production line in real-time. Specialists are often in remote locations away from the plant, but an AR headset allows both parties to demonstrate changes that may need to be made using overlaid 3D images and instructions. Metal components supplier Gestamp has been trialling AR that allows workers at its Abrera plant in Barcelona to step inside a digital 3D model of a vehicle—useful for analysing the result of a crash test.


Toyota has used Microsoft HoloLens headsets to aid inspection processes—such as checking the thickness of exterior coatings—and to optimise the layout of the factory floor. PSA is considering how AR could assist the way in which robots used in welding lines and paint shops are programmed. “Augmented reality could prove a way for workers to gain knowledge of how and where they will work in the future, and exactly how the environment around them will change,” noted Yann Vincent, Groupe PSA Executive Vice President, Supply Chain & Manufacturing, in Automotive World’s 2018 special report ‘Vehicle manufacturing & Industry 4.0’.

As machines become increasingly autonomous and almost self-aware—able to adapt their actions on the fly rather than simply perform set, repeated tasks—the ability to visualise their interactions and intentions could prove invaluable.

This article appeared in the Q2 2019 issue of M:bility | Magazine. Follow this link to download the full issue.
