Fraunhofer IFF enables human collaboration with cognitive robots
Researchers at Fraunhofer IFF are leveraging AI to equip robots with cognitive abilities that enable them to operate autonomously in unstructured, changing environments, as well as to automate complex processes such as assembly and disassembly in industrial settings or handling objects in healthcare environments.
The technology will be demonstrated at automatica 2025 in Munich. At the event, Fraunhofer IFF is also unveiling PARU and computer-aided safety (CAS), the first safety technologies and planning tools for close human-machine collaboration that also ensure the safety of AI-generated robot movements.
Projection-based and camera-based safety technologies allow robots with AI-based motion control to respond reliably to changes, adapt to new tasks and operate applications safely. This opens up a broad range of new fields of application.
“Cognitive robots can learn from experience, make independent decisions and adapt to various scenarios. For pick-and-place tasks involving picking up components and placing them where they need to go, a cognitive robot no longer needs to learn what the individual workpieces look like before it can grab them. Instead, it uses a camera to register the object’s size, shape, texture and condition and adjusts its behaviour accordingly. In the process, it can handle different environmental conditions and even different packaging materials,” says Magnus Hanses, head of the Cognitive Robotics group at Fraunhofer IFF.
AI models are typically trained in simulated environments, for example of assembly and disassembly processes such as removing a motherboard from a computer. Any number of virtual robots can operate in digital space simultaneously and at a much faster pace, without any safety concerns. Learning in a digital simulation has many advantages, but the virtual learning environment is never exactly the same as the real world.
The challenge for researchers is to bridge this reality gap, or Sim2Real gap, as much as possible. Two possible approaches are designing the simulation to be as realistic as possible, or ensuring it encompasses the broadest possible range of real-world scenarios so that the neural network used for the AI learns to generalise and navigate unfamiliar environments. One such method, domain randomisation, creates numerous simulated environments with varying properties and trains a single model that operates effectively within all of them.
“There are many different parameters, such as lighting, that affect the simulation. We can change this set of parameters during the training. The robot doesn’t learn to solve the exact simulation. Instead, it comes to understand the abstract concept behind it. The reality becomes just another version of a simulation for the AI, if you will,” Hanses explains.
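Domain randomisation as Hanses describes it can be sketched in a few lines: each training episode is drawn from wide parameter ranges so the model never sees the exact same simulation twice. The parameter set and ranges below are illustrative assumptions, not Fraunhofer IFF's actual training setup.

```python
import random
from dataclasses import dataclass

@dataclass
class SimConfig:
    """Parameters of one randomised training environment (illustrative)."""
    light_intensity: float   # relative brightness of the scene lighting
    light_angle_deg: float   # direction the light comes from
    surface_friction: float  # friction of the work surface
    camera_noise: float      # std. dev. of simulated sensor noise

def sample_domain() -> SimConfig:
    """Draw one simulated environment from wide parameter ranges,
    so the policy cannot overfit to a single, exact simulation."""
    return SimConfig(
        light_intensity=random.uniform(0.3, 1.5),
        light_angle_deg=random.uniform(0.0, 360.0),
        surface_friction=random.uniform(0.4, 1.2),
        camera_noise=random.uniform(0.0, 0.05),
    )

# During training, every episode runs in a freshly randomised domain:
for episode in range(3):
    cfg = sample_domain()
    print(f"episode {episode}: {cfg}")
```

Because no single parameter combination recurs, the network is pushed to learn the task itself rather than the quirks of one simulation, so the real world becomes, as Hanses puts it, "just another version of a simulation".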
Patented speed and separation monitoring
Another key challenge in cognitive robotics is that there is currently no established way to certify AI-generated robot movements against safety standards. To enable AI-based robots to interact safely with humans in a shared workspace, researchers at Fraunhofer IFF have developed PARU, a patented workspace-monitoring technology. PARU uses advanced projector and camera technology to project visible warnings and protective fields directly around the machine and recognises when people enter the safety zones.
“After the projector and the two cameras are calibrated, virtual expectation images are generated as the first step. Then, the projector projects a visible light curtain around the robot and the component to be picked up, following the distance formula set out in the relevant standard, ISO/TS 15066. This light curtain acts as a safety line, visualising for employees the protective space that humans must keep clear,” explains Norbert Elkmann, head of the Robotic Systems department at Fraunhofer IFF. “If any part of a worker’s body comes into contact with the line, the line is interrupted. The cameras recognise that there is a discrepancy between what they expect to see and the real-world image. Depending on the situation, the robot halts its movement right away or slows its speed.”
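The discrepancy check Elkmann describes can be illustrated with a minimal sketch: compare the expected projection image against what the cameras actually see, and flag the light curtain as interrupted when the deviation grows too large. The thresholding scheme and all values here are assumptions for illustration, not the PARU algorithm.

```python
import numpy as np

def curtain_intact(expected: np.ndarray, observed: np.ndarray,
                   threshold: float = 0.15) -> bool:
    """Compare the virtual expectation image with the camera image.
    If the mean deviation along the light curtain exceeds the
    threshold, something (e.g. a hand) has interrupted the line."""
    deviation = np.abs(expected.astype(float) - observed.astype(float))
    return float(deviation.mean()) / 255.0 < threshold

# Intact curtain: the observed image matches the expectation
expected = np.full((4, 16), 200, dtype=np.uint8)
print(curtain_intact(expected, expected))   # True

# Interrupted curtain: part of the projected line is occluded (dark)
occluded = expected.copy()
occluded[:, 6:10] = 0
print(curtain_intact(expected, occluded))
```

A detected interruption would then trigger the behaviour described above: the robot halts right away or reduces its speed, depending on the situation.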
The safety areas are adjusted dynamically to the machine’s movements, making PARU ideal for use in cognitive robotics. “No other system allows for a smaller distance between humans and robots while observing the specifications set by the applicable standards and also needing so little space. This is possible because the cameras and sensors recognise not only torsos, arms, and heads but even fingers,” Elkmann says.
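The distances PARU maintains follow the speed and separation monitoring formula in ISO/TS 15066, whose structure can be sketched as below. All numeric defaults are illustrative assumptions, not normative values from the standard.

```python
def min_protective_distance(
    v_human: float = 1.6,       # human approach speed, m/s
    v_robot: float = 0.5,       # robot speed towards the human, m/s
    t_react: float = 0.1,       # sensing/control reaction time, s
    t_stop: float = 0.3,        # robot stopping time, s
    s_stop: float = 0.15,       # robot stopping distance, m
    c_intrusion: float = 0.10,  # intrusion distance allowance C, m
    z_uncertainty: float = 0.05,  # position/sensor uncertainties Z, m
) -> float:
    """Simplified protective separation distance for speed and
    separation monitoring, following the structure of the
    ISO/TS 15066 formula:
        S = v_H*(T_R + T_S) + v_R*T_R + S_S + C + Z
    Numeric defaults are illustrative, not normative."""
    s_human = v_human * (t_react + t_stop)  # distance the human covers
    s_robot = v_robot * t_react             # robot travel before braking
    return s_human + s_robot + s_stop + c_intrusion + z_uncertainty

print(f"minimum separation: {min_protective_distance():.2f} m")
```

The ability to detect fingers, not just torsos and arms, shrinks the uncertainty terms in such a formula, which is what allows PARU to keep the human-robot distance so small while still meeting the standard.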
Another advantage is that the projection can also show the worker where the robot will move next, further enhancing trust in working with machines. Because the visible safety lines are additionally coded, they work independently of ambient lighting angles and conditions. If the cameras or projectors stop working, the entire system shuts down automatically.
CAS — intelligent software for adaptive robots
Computer-aided safety (CAS) is a suite of digital safety solutions that enables efficient, cost-effective and safe human-robot collaboration (HRC) applications. Product-ready software modules are available for the efficient calculation of safe distances and speeds. Digital assistants support the risk assessment and safety approval processes, making it easier for newcomers to comply with the full spectrum of obligations under the EU Machinery Directive. Unlike the collision measurement feature, the safety approval tool operates entirely digitally: it considers parameters such as collision force and pain threshold to determine the robot's maximum permitted speed. The modules can optionally be incorporated into any robot control system or existing simulation environment for planning purposes, ensuring that economic specifications are precisely aligned with the applicable safety requirements. This approach prevents planning errors and reduces engineering costs.
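How a collision force limit translates into a maximum permitted robot speed can be illustrated with the simplified two-mass contact model from ISO/TS 15066 Annex A. This is a sketch of that general relationship, not the CAS implementation; the body-region parameters in the example are assumptions for illustration.

```python
import math

def max_transient_speed(f_max: float, m_human: float, m_robot: float,
                        k_contact: float) -> float:
    """Maximum relative speed (m/s) keeping a transient collision
    below the force limit f_max (N), using the simplified two-mass
    contact model of ISO/TS 15066:
        v_max = F_max / sqrt(mu * k),  mu = (1/m_H + 1/m_R)^-1
    where k is the effective contact stiffness (N/m)."""
    mu = 1.0 / (1.0 / m_human + 1.0 / m_robot)  # reduced mass, kg
    return f_max / math.sqrt(mu * k_contact)

# Example with assumed (illustrative) body-region parameters:
v = max_transient_speed(f_max=280.0, m_human=0.6, m_robot=12.0,
                        k_contact=75_000.0)
print(f"maximum permitted speed: {v:.2f} m/s")
```

A stricter force limit or a stiffer contact region lowers the permitted speed, which is exactly the trade-off the CAS planning tools let engineers explore digitally before building the cell.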
CAS was developed from data gathered in years of unique human-subject tests, which have produced new threshold limits and other key biomechanical indicators for safe HRC. Collision and clamping loads, applied with a specially designed pendulum in tests involving more than 100 human subjects, were used to determine pain thresholds. The ethics committee and the Department of Trauma Surgery at Otto von Guericke University Magdeburg supported Fraunhofer IFF throughout the studies.
Image: Cognitive robotics in use: A collaborative robot with AI-based controls removes the motherboard from a computer, demonstrating complex disassembly processes that were previously difficult to automate. Copyright: Fraunhofer IFF.
