Working safely entails not just following procedures, but also understanding the work environment and circumstances, and anticipating what other people will do next. A new approach gives robots this degree of context awareness, allowing them to work alongside people on production lines more efficiently and with fewer interruptions.
Rather than just judging the distance between itself and its human coworkers, the human-robot collaboration system can identify each worker it works with, as well as the person's skeleton model, an abstraction of body volume, according to Hongyi Liu, a researcher at KTH Royal Institute of Technology. Using this information, the context-aware robot system can recognize the worker's pose and even predict the next one. These abilities give the robot a context to take into account when it interacts.
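The paper's implementation is not reproduced here, but the idea of reasoning over a skeleton model rather than a single distance reading can be sketched roughly as follows. The joint representation, the constant-velocity predictor, and the function names are illustrative assumptions, not the authors' code.

```python
import numpy as np

def min_separation(robot_points: np.ndarray, skeleton_joints: np.ndarray) -> float:
    """Smallest distance (metres) between any robot point and any tracked skeleton joint."""
    # robot_points: (R, 3) sampled points on the robot; skeleton_joints: (J, 3) joint positions.
    diffs = robot_points[:, None, :] - skeleton_joints[None, :, :]
    return float(np.linalg.norm(diffs, axis=-1).min())

def predict_next_pose(joint_history: np.ndarray, dt: float = 0.1) -> np.ndarray:
    """Extrapolate each joint one time step ahead with a simple constant-velocity model."""
    # joint_history: (T, J, 3) joint positions over the last T frames, most recent last.
    velocity = (joint_history[-1] - joint_history[-2]) / dt
    return joint_history[-1] + velocity * dt
```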
According to Liu, the technology operates with artificial intelligence that requires less computing power and smaller datasets than typical machine learning approaches. Rather than training a model from scratch, it relies on transfer learning, a form of machine learning that reuses knowledge gained in earlier training and adapts it into an operational model.
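As a minimal sketch of what transfer learning looks like in practice, the snippet below reuses a backbone pretrained on a large generic image dataset and trains only a small new head for the target task. The framework, model, and number of classes are assumptions for illustration, not details from the study.

```python
import torch
import torchvision

# Reuse a backbone pretrained on a large generic dataset (here ImageNet).
backbone = torchvision.models.resnet18(weights="IMAGENET1K_V1")

# Freeze the pretrained weights so the knowledge they encode is kept as-is.
for param in backbone.parameters():
    param.requires_grad = False

# Replace the final layer with a small head sized for the new task,
# e.g. a handful of worker pose or activity classes (illustrative number).
num_classes = 5
backbone.fc = torch.nn.Linear(backbone.fc.in_features, num_classes)

# Only the new head is optimised, so far less data and compute are needed
# than training the whole network from scratch.
optimizer = torch.optim.Adam(backbone.fc.parameters(), lr=1e-3)
```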
Professor Lihui Wang of KTH was a co-author of the study, which was recently published in Robotics and Computer-Integrated Manufacturing. According to Liu, the technology is ahead of today's International Organization for Standardization (ISO) standards for collaborative robot safety, so its adoption would require those standards to be updated. However, he argues that context awareness is more efficient than the one-dimensional interaction that workers now have with robots.
The terms “agents,” “co-working,” and “human-centered technological systems” highlight new aspects of human-computer interaction (HCI). As the number and complexity of human-technology interfaces grow, the scope for human intervention may narrow, raising new issues.
“Under the ISO standard and technical specification, when a human approaches a robot it slows down, and if he or she comes close enough it will stop. If the person moves away it resumes. That’s a pretty low level of context awareness,” Liu says. “It jeopardizes efficiency. Production is slowed and humans cannot work closely to robots.”
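The behaviour Liu describes, slowing and stopping purely as a function of the current separation distance, can be illustrated with a toy speed-scaling rule in the spirit of speed-and-separation monitoring. The thresholds and the linear ramp below are made up for illustration and are not values taken from the ISO documents.

```python
def baseline_speed_scale(separation_m: float,
                         stop_dist: float = 0.5,
                         slow_dist: float = 1.5) -> float:
    """Speed multiplier in [0, 1] based only on the current human-robot separation."""
    if separation_m <= stop_dist:
        return 0.0  # protective stop
    if separation_m <= slow_dist:
        # Ramp linearly from stopped to full speed between the two thresholds.
        return (separation_m - stop_dist) / (slow_dist - stop_dist)
    return 1.0      # full speed when the person is far away
```

Because a rule like this sees only the instantaneous distance, it cannot tell an approaching worker from one who is merely reaching past the robot, which is exactly the lack of context Liu points to.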
Liu compares the context-aware robot system to a self-driving car that recognizes how long a stoplight has been red and anticipates when it will move again. Rather than stopping or downshifting, it begins adjusting its speed so that it cruises into the intersection, sparing the brakes and transmission further strain.
Robotics is used here to illustrate the challenges surrounding workplace automation and the need for new HCI techniques that take social ramifications into account. A genuine participation strategy requires a shift in workplace design. This suggests that research should focus on the relationship between technology and social factors as a whole, rather than treating them as distinct entities in the design of an interaction system.
Experiments with the technology have shown that, given context, a robot can operate more safely and efficiently without slowing production. In one test of the system, a robot arm's path was unexpectedly blocked by someone's hand. Instead of stopping, the robot predicted the hand's future trajectory and the arm moved around it.
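The test behaviour described above can be sketched as a two-step check: predict where the hand is heading, then decide whether the planned arm path needs a detour rather than a stop. The constant-velocity forecast, clearance value, and function names are illustrative assumptions; the study's own prediction and replanning methods are not reproduced here.

```python
import numpy as np

def predict_hand_path(hand_pos: np.ndarray, hand_vel: np.ndarray,
                      horizon: float = 1.0, dt: float = 0.1) -> np.ndarray:
    """Constant-velocity forecast of hand positions over the next `horizon` seconds."""
    steps = int(horizon / dt)
    return np.array([hand_pos + hand_vel * dt * (i + 1) for i in range(steps)])

def needs_detour(arm_waypoints: np.ndarray, hand_path: np.ndarray,
                 clearance: float = 0.15) -> bool:
    """True if any planned arm waypoint passes within `clearance` metres of the predicted hand path."""
    dists = np.linalg.norm(arm_waypoints[:, None, :] - hand_path[None, :, :], axis=-1)
    return bool((dists < clearance).any())
```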
“This is safety not just from the technical point of view in avoiding collisions, but being able to recognize the context of the assembly line,” he says. “This gives an additional layer of safety.”
The study was a follow-up to the Symbiotic Human Robot Collaborative Assembly project, which concluded in 2019.