Combining low-cost LEDs and dyes, the stretchable sensor, say the researchers, could give soft robotic systems – and anyone using augmented reality (AR) technology – the ability to feel the same rich, tactile sensations that mammals depend on to navigate the natural world. The researchers drew inspiration from silica-based distributed fiber-optic sensors, which detect minor wavelength shifts as a way to identify multiple properties, such as changes in humidity, temperature, and strain.

However, while such sensors have been used for monitoring mechanical deformations in stiff infrastructure such as bridges, roads, and buildings, silica fibers aren’t compatible with soft and stretchable electronics. Intelligent soft systems also present their own structural challenges, say the researchers.

“We know that soft matters can be deformed in a very complicated, combinational way, and there are a lot of deformations happening at the same time,” says doctoral student Hedan Bai, a co-lead author of a paper on the research. “We wanted a sensor that could decouple these.”

The researchers’ solution was to make a stretchable lightguide for multimodal sensing (SLIMS) – a long tube that contains a pair of polyurethane elastomeric cores. One core is transparent, while the other is filled with absorbing dyes at multiple locations and connects to an LED. Each core is coupled with a red-green-blue sensor chip to register geometric changes in the optical path of light.

The dual-core design, say the researchers, increases the number of outputs by which the sensor can detect a range of deformations – pressure, bending or elongation – by lighting up the dyes, which act as spatial encoders. The researchers then paired that technology with a mathematical model that can decouple, or separate, the different deformations and pinpoint their exact locations and magnitudes.
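The paper’s actual model is more sophisticated – it also localizes deformations along the length of the lightguide – but the core decoupling idea can be sketched with a simple linear model. In this hypothetical example, each deformation mode (pressure, bending, elongation) is assumed to produce a characteristic, calibrated change across the RGB channels; a combined reading is then separated into its component magnitudes by least squares. The matrix values below are illustrative, not from the paper.

```python
import numpy as np

# Hypothetical calibration matrix: each column is the RGB-channel
# response measured when one deformation mode (pressure, bending,
# elongation) is applied alone at unit magnitude.
# Shape: (n_channels, n_modes). Values are illustrative only.
A = np.array([
    [0.8, 0.1, 0.3],   # red-channel response
    [0.2, 0.9, 0.1],   # green-channel response
    [0.1, 0.2, 0.7],   # blue-channel response
])

def decouple(reading):
    """Estimate the magnitude of each deformation mode from a
    combined RGB reading, via linear least squares."""
    x, *_ = np.linalg.lstsq(A, reading, rcond=None)
    return x  # [pressure, bending, elongation] estimates

# A combined deformation: 1.0 unit of pressure plus 0.5 of bending.
mixed = A @ np.array([1.0, 0.5, 0.0])
print(decouple(mixed))  # recovers approximately [1.0, 0.5, 0.0]
```

Because the assumed calibration matrix is full rank, a combination of the three modes maps to a unique channel signature, which is what allows simultaneous deformations to be told apart.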

While distributed fiber-optic sensors require high-resolution detection equipment, SLIMS sensors can operate with small optoelectronics that have lower resolution, making them less expensive, simpler to manufacture, and more easily integrated into small systems. For example, say the researchers, a SLIMS sensor could be incorporated into a robot’s hand to detect slippage.

The technology is also wearable. The researchers designed a 3D-printed glove with a SLIMS sensor running along each finger. The glove is powered by a lithium battery and equipped with Bluetooth so it can transmit data to basic software, designed by the researchers, that reconstructs the glove’s movements and deformations in real time.

“Right now, sensing is done mostly by vision,” says lead researcher Rob Shepherd, associate professor of mechanical and aerospace engineering in the College of Engineering. “We hardly ever measure touch in real life. This skin is a way to allow ourselves and machines to measure tactile interactions in a way that we now currently use the cameras in our phones. It’s using vision to measure touch. This is the most convenient and practical way to do it in a scalable way.”

The researchers are currently working to commercialize the technology for physical therapy and sports medicine. While both fields have leveraged motion-tracking technology, until now, say the researchers, they have lacked the ability to capture force interactions.

The researchers say they are also looking into the ways SLIMS sensors can boost virtual and augmented reality experiences.

“VR and AR immersion is based on motion capture,” says Shepherd. “Touch is barely there at all. Let’s say you want to have an augmented reality simulation that teaches you how to fix your car or change a tire. If you had a glove or something that could measure pressure, as well as motion, that augmented reality visualization could say, ‘Turn and then stop, so you don’t overtighten your lug nuts.’ There’s nothing out there that does that right now, but this is an avenue to do it.”

For more, see “Stretchable distributed fiber-optic sensors.”

Related articles:
Stretchable optical ‘lace’ gives robots heightened sensory ability
Soft robot technology combines fiber-optic sensors, machine learning
Stretchable, washable sensor can be woven into materials
