And the use of multiple cumbersome motion-capture cameras to provide precise information about a robot's 3D movement and position somewhat defeats the purpose of designing a soft robot. Now, researchers from MIT have leveraged Artificial Intelligence (AI) algorithms to analyse the output of flexible kirigami-shaped sensors integrated into the skin of a soft robotic trunk, enabling proprioception: the ability of the soft robot to "feel" how it is twisted or bent and understand its own position in space.
Described in a paper published in the journal IEEE Robotics and Automation Letters, the skin sensors consist of sheets of conductive materials used for electromagnetic interference shielding, which the researchers hollowed out or cut into precise kirigami patterns that make the sheets much more flexible and stretchable.
Because of their piezoresistive properties (their electrical resistance varies when strained), these materials turn out to make effective soft sensors, deforming in response to the trunk's stretching and compressing. The electrical resistance of each sensor is converted to an output voltage, which is then fed into a novel deep-learning model that sifts through the noise and captures clear signals to estimate the robot's 3D configuration, trained against ground-truth movement data captured with a motion-capture system. The researchers validated their system on a soft robotic arm resembling an elephant trunk, which can predict its own position as it autonomously swings around and extends.
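To make the pipeline concrete, here is a toy sketch of the signal chain described above: a piezoresistive sensor's resistance is read out as a voltage (modelled here with a simple voltage divider), and a learned mapping then estimates 3D position from the sensor voltages. All numbers, the voltage-divider readout, and the model are illustrative assumptions; the paper's actual system uses a deep-learning model rather than the linear least-squares stand-in below.

```python
import numpy as np

def divider_voltage(r_sensor, r_ref=10_000.0, v_in=3.3):
    """Hypothetical voltage-divider readout: strain changes the piezoresistive
    sensor's resistance r_sensor, which shifts the measured output voltage."""
    return v_in * r_sensor / (r_sensor + r_ref)

rng = np.random.default_rng(0)

# Synthetic data: 16 kirigami sensors whose resistance varies with strain.
n_samples, n_sensors = 200, 16
resistances = 10_000.0 + rng.normal(0.0, 500.0, (n_samples, n_sensors))
voltages = divider_voltage(resistances)

# Synthetic "ground truth" 3D positions (standing in for motion capture).
true_xyz = (voltages @ rng.normal(0.0, 1.0, (n_sensors, 3))
            + rng.normal(0.0, 0.01, (n_samples, 3)))

# Stand-in for the deep model: linear least-squares from voltages to xyz.
X = np.hstack([voltages, np.ones((n_samples, 1))])  # add a bias column
W, *_ = np.linalg.lstsq(X, true_xyz, rcond=None)
pred_xyz = X @ W
rmse = np.sqrt(np.mean((pred_xyz - true_xyz) ** 2))
```

The point of the sketch is the data flow, resistance to voltage to estimated configuration, not the regressor itself; the real system's deep network is what lets it cope with the noisy, nonlinear response of the kirigami sensors.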