Transforming Human-Robot Collaboration: Four Popular Ways to Utilize Extended Reality
Publish Date:
January 31, 2023
Do you like to work and communicate with robots in manufacturing? Are you part of the community that imagines a future where technology enhances our abilities and expands our capabilities? Let's explore the potential of utilizing Extended Reality in shared human-robot workplaces from the research community's perspective.

The term Extended Reality (XR) encompasses Virtual Reality (VR), Augmented Reality (AR), and Mixed Reality (MR). As an umbrella term, it will also cover related technologies that emerge in the future. Among emerging technologies, XR has gained a large market across different sectors: in 2021, its global market value reached 42.86 billion US$ [1]. Statista reports that the manufacturing and resources segment alone accounted for 13.8% of the forecast worldwide investment in AR/VR technology, with forecast spending of 1.66 billion dollars in 2020 [2].

Now, let's take a closer look at what has happened to the manufacturing industry in recent decades. With the Industry 4.0 revolution in manufacturing, the use of industrial robots grew rapidly, as they provide faster production and increase product quality through precision. Meanwhile, operators took responsibility for complex tasks outside the closed space of the robot cell. Later, the demands in the manufacturing and assembly sector changed, for example toward customization of products or changes in delivery timelines based on customers' requests. These demands require assembly lines with small lot sizes, which can lead to increased costs if previous solutions are used. Such changes call for more flexible solutions in which operators and robots can cooperate. The concept of Human-Robot Collaboration (HRC) was thus shaped to allocate the two resources (operator and robot) alongside each other and create a more flexible solution. In recent years, this concept has become more popular, even in larger industries such as automotive.

Looking over these changes, it caught my interest to see what emerging technologies can be integrated into manufacturing, and how. I therefore devoted my research to going through the literature on applications of XR in HRC to answer these questions: What are the common application types for utilizing XR in HRC? What kinds of technologies or devices are used in development? What kinds of robots are targeted for investigation? Are there any common development tools that scholars utilize?

In this study, I did a meta-analysis of scholarly research on the usage of VR and AR in robotics. I then selected 26 articles by narrowing the research focus to HRC applications. After reviewing the articles, I categorized them into four common applications, which I discuss briefly here:

  1. Operator support: The main idea of these studies is to improve communication and provide the required guidance between operator and robot. Researchers used sensors and cameras to monitor robots and detect objects. The main goal of communicating with the robot system was to demonstrate the robot’s intention and inform the user. Another perspective was highlighting objects with augmentation, so that the user can communicate with and send commands to the robot.
  2. Simulation: In these applications, researchers utilized simulation software to create a virtual representation of the system. As a result, they allowed the user to walk through and explore environments. This requires applying the robot’s kinematics and inverse kinematics to demonstrate robot movements. Users in these studies could interact with the robot to acquire information such as the robot’s reachability.
  3. Instruction: Another focus of scholarly research was to augment operator instructions. Researchers used sensors to monitor working zones and captured information about robot status. In an augmented environment, an operator can therefore see real-time feedback on assembly execution status, such as the active and upcoming tasks. Users can use virtual buttons to control or confirm the robot’s movement.
  4. Manipulation: In the last approach, the goal is to utilize XR technologies to navigate the robot remotely. This can be achieved by teaching target points through user interaction, such as gestures or touch. Here, the robot’s movements are calculated within the robot controllers themselves, following the program provided by the user. Users can manipulate virtual objects for a required task, and their collisions can be monitored virtually with XR features such as spatial mapping.
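To make the simulation category above more concrete, here is a minimal, hypothetical sketch of the kinematics it relies on: forward and inverse kinematics for an idealized planar two-link arm. The link lengths, function names, and elbow-down solution are my own illustrative assumptions, not taken from any of the reviewed systems; real robots have more joints and use library solvers.

```python
import math

def forward_kinematics(l1, l2, theta1, theta2):
    """End-effector position (x, y) of a planar two-link arm
    with link lengths l1, l2 and joint angles theta1, theta2."""
    x = l1 * math.cos(theta1) + l2 * math.cos(theta1 + theta2)
    y = l1 * math.sin(theta1) + l2 * math.sin(theta1 + theta2)
    return x, y

def inverse_kinematics(l1, l2, x, y):
    """Joint angles that reach (x, y), elbow-down solution.
    Returns None when the target lies outside the workspace --
    the kind of reachability feedback a VR simulation can show."""
    c2 = (x * x + y * y - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    if not -1.0 <= c2 <= 1.0:
        return None  # target unreachable
    theta2 = math.acos(c2)
    theta1 = math.atan2(y, x) - math.atan2(
        l2 * math.sin(theta2), l1 + l2 * math.cos(theta2))
    return theta1, theta2
```

A virtual walkthrough can call `inverse_kinematics` for each point the user touches and either animate the pose or flag the point as out of reach.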

Here, to answer the rest of my questions, I provide the figure below to share a summary of my results:

Human-Robot Collaboration scenarios have mostly been implemented with collaborative robots. These robots have an inherently safe design; for example, they include force sensors in their joints. These sensors capture forces in case of impact and, as one option, can stop the robot’s movement. The problem is that these robots have low payloads compared to heavier industrial robots. If we want to apply the HRC concept with a fast, heavy industrial robot, the perspective on using XR will be different. After all, these systems must ensure the user’s safety before implementation.
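As a rough illustration of the stop-on-impact behavior described above, the sketch below checks per-joint torque readings against a limit. The limit value, function name, and readings are illustrative assumptions on my part, not any vendor's API; real controllers implement this in certified firmware with far more sophisticated contact detection.

```python
# Assumed per-joint torque limit in newton-metres (illustrative only).
TORQUE_LIMIT_NM = 25.0

def should_stop(joint_torques, limit=TORQUE_LIMIT_NM):
    """Return True if any joint senses a torque above the limit,
    e.g. because the arm has made unexpected contact with a person."""
    return any(abs(t) > limit for t in joint_torques)

# A spike on one joint triggers a protective stop:
print(should_stop([3.0, 4.0, 30.0]))  # True
# Nominal readings let the motion continue:
print(should_stop([3.0, 4.0, 5.0]))   # False
```

Heavier industrial robots lack such joint-level sensing, which is why HRC with them shifts the safety burden to external monitoring.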

Nowadays, more companies show interest in utilizing these technologies everywhere. Even in automation and manufacturing, we can see live demos at public or industrial fairs. I would say the hype around the technology keeps users interested in testing these applications. I had an opportunity to take part in an AR application study on shared assembly tasks with a collaborative robot and to observe students' feedback. The study was designed by Dr. Antti Hietanen; you can find his findings comparing the AR approach against the traditional approach in [3]. Surprisingly, students were eager to use AR headsets to complete assembly tasks at the factory station, even every day.

Personally, I believe these technologies still have limitations, such as battery capacity and field of view. These limitations may slow the transition to utilizing XR, but the technologies are at least mature enough to be integrated for user training. The main power of XR is a pleasing user interface and a feeling of presence that keeps users motivated to follow instructions, compared to paper manuals or video materials. I think proper training applications in XR are still missing here, specifically in HRC. Training is a broad topic and needs to be broken down with the help of industry necessities. Thereafter, we may lean toward utilizing simulation applications for the HRC cell design process.

In the end, I believe this study has built a step towards understanding the potential of Extended Reality to improve the efficiency and effectiveness of human-robot collaboration. You can find my article, titled "Review on existing VR/AR solutions in human–robot collaboration", which was published in the 8th CIRP Conference on Assembly Technology and Systems [4].

What do you think? Which applications are more interesting for you? Do you see other opportunities to use XR in your company, research, or projects?
Share your thoughts and feedback in the comment section.

Morteza Dianatfar