Handheld interface makes training robots easy and intuitive
Engineers are designing robotic helpers that can learn through demonstration. Teaching is typically done in one of three ways: by remotely manoeuvring the robot, by physically moving it through the motions, or by performing the task in person while the robot watches and mimics.
To aid this process, MIT engineers have developed a three-in-one training interface that allows robots to learn a task through any of these three methods. The interface takes the form of a handheld, sensor-equipped tool that can attach to many standard collaborative robotic arms. Users can then teach a robot to carry out a task by remotely controlling it, physically manipulating it, or demonstrating the task themselves.
The MIT team tested the new tool, which they refer to as a “versatile demonstration interface” (VDI), on a standard collaborative robotic arm. The interface is equipped with a camera and markers that track the tool’s position and movements over time, along with force sensors that measure the pressure applied during a given task.
When the interface is attached, a user can remotely control the robot while the interface’s camera records its movements, which the robot can use as training data to learn the task independently. Similarly, a person can physically move the robot through a task with the interface attached. The VDI can also be detached and held by a person to perform the desired task while the camera records its motions. When the VDI is reattached, the robot can use the recording to mimic the motions.
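The paper’s actual data pipeline is not detailed here, but as a rough illustration, a demonstration recording of the kind described above, tool pose over time plus applied force, captured in any of the three modes, might be structured like the following sketch. All names and fields are assumptions for illustration, not the authors’ code:

```python
from dataclasses import dataclass, field

@dataclass
class DemoSample:
    """One timestamped reading from the hypothetical interface."""
    t: float                                          # seconds since start of demo
    position: tuple[float, float, float]              # tool position (m), camera/marker tracking
    orientation: tuple[float, float, float, float]    # orientation quaternion (w, x, y, z)
    force: tuple[float, float, float]                 # applied force (N), from force sensors

@dataclass
class Demonstration:
    """A single teaching episode, tagged with how it was captured."""
    mode: str                                         # "teleoperation", "kinesthetic", or "handheld"
    samples: list[DemoSample] = field(default_factory=list)

    def log(self, sample: DemoSample) -> None:
        self.samples.append(sample)

    def duration(self) -> float:
        return self.samples[-1].t - self.samples[0].t if self.samples else 0.0

# Usage: record two samples of a short handheld (detached-tool) demonstration
demo = Demonstration(mode="handheld")
demo.log(DemoSample(0.0, (0.0, 0.0, 0.10), (1.0, 0.0, 0.0, 0.0), (0.0, 0.0, 0.0)))
demo.log(DemoSample(0.5, (0.0, 0.0, 0.05), (1.0, 0.0, 0.0, 0.0), (0.0, 0.0, 4.2)))
print(demo.duration())  # 0.5
```

Tagging each episode with its capture mode is what would let a downstream learner treat teleoperated, kinesthetic, and handheld demonstrations uniformly, which is the flexibility the interface is designed to provide.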
To test the attachment’s usability, the team brought the interface, along with a collaborative robotic arm, to a local innovation centre where manufacturing experts learn about and test technology that can improve factory-floor processes. The researchers set up an experiment in which they asked volunteers at the centre to use the robot and all three of the interface’s training methods to complete two everyday manufacturing tasks: press-fitting and moulding. In press-fitting, the user trained the robot to press and fit pegs into holes, similar to many fastening tasks. For moulding, a volunteer trained the robot to push and roll a rubbery, dough-like substance evenly around the surface of a centre rod, similar to some thermomoulding tasks.
The researchers say the new interface offers increased training flexibility, which could expand the types of users and “teachers” who interact with robots. It may also enable robots to acquire a broader range of skills. For instance, a person could remotely train a robot to handle toxic substances, while another person further down the production line could physically move the robot through the motions of boxing up a product. At the end of the line, someone else could use the attachment to draw a company logo as the robot watches and learns to do the same.
“We are trying to create intelligent and skilled teammates that can effectively work with humans to get complex work done,” says Mike Hagenow, a postdoc at MIT in the Department of Aeronautics and Astronautics. “We believe flexible demonstration tools can help far beyond the manufacturing floor, in other domains where we hope to see increased robot adoption, such as home or caregiving settings.”
Hagenow will present a paper detailing the new interface at the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) in October.
Paper: “Versatile Demonstration Interface: Toward More Flexible Robot Demonstration Collection”
https://arxiv.org/abs/2410.19141v2 or https://doi.org/10.48550/arXiv.2410.19141.
