The plastic ‘skin’ can detect how hard it is being pressed and generate an electric signal that delivers this sensory input directly to a living brain cell. The material is the work of Zhenan Bao, a professor of chemical engineering at Stanford, who has spent a decade trying to develop a material that mimics skin’s ability to flex and heal while also serving as the sensor net that sends touch, temperature, and pain signals to the brain.
Ultimately she wants to create a flexible electronic fabric, embedded with sensors, that could cover a prosthetic limb and replicate some of skin’s sensory functions. The new work takes another step toward that goal by replicating one aspect of touch: the sensory mechanism that enables us to distinguish the pressure difference between a limp handshake and a firm grip.
"This is the first time a flexible, skin-like material has been able to detect pressure and also transmit a signal to a component of the nervous system," says Bao, who led the 17-person research team responsible for the achievement.
The heart of the technique is a two-ply plastic construct: the top layer creates a sensing mechanism and the bottom layer acts as the circuit to transport electrical signals and translate them into biochemical stimuli compatible with nerve cells. The top layer in the new work features a sensor that can detect pressure over the same range as human skin, from a light finger tap to a firm handshake.
Five years ago, Bao’s team members first described how to use plastics and rubbers as pressure sensors by measuring the natural springiness of their molecular structures. They then increased this natural pressure sensitivity by indenting a waffle pattern into the thin plastic, which further compresses the plastic’s molecular springs.
To exploit this pressure-sensing capability electronically, the team scattered billions of carbon nanotubes through the waffled plastic. Putting pressure on the plastic squeezes the nanotubes closer together and enables them to conduct electricity.
This allowed the plastic sensor to mimic human skin, which transmits pressure information to the brain as short pulses of electricity, similar to Morse code. Increasing pressure on the waffled nanotubes squeezes them even closer together, allowing more electricity to flow through the sensor, which converts that flow into a faster train of short pulses. Reduce the pressure and the pulse rate falls, indicating light touch. Remove all pressure and the pulses cease entirely.
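The Science paper calls this arrangement a ‘digital mechanoreceptor’ because pressure ends up encoded as the frequency of electrical pulses. As a rough illustration of that encoding only, not the device’s actual electronics, the sketch below assumes a simple linear relationship between pressure, nanotube conductance and pulse rate; every constant in it is an illustrative guess rather than a measured value.

```python
# Toy model of pressure-to-pulse-frequency encoding.
# All constants (the 0.01 slope, the 100 kPa "firm grip" reference, the
# 200 Hz ceiling) are illustrative assumptions, not values from the paper.

def conductance(pressure_kpa: float) -> float:
    """More pressure squeezes the nanotubes closer together, so conductance
    rises with pressure (modeled here as a simple linear relationship)."""
    return max(0.0, pressure_kpa) * 0.01  # arbitrary units

def pulse_rate_hz(pressure_kpa: float, max_rate_hz: float = 200.0) -> float:
    """Map conductance to an output pulse frequency: zero pressure gives no
    pulses, firmer pressure a faster pulse train, capped at max_rate_hz."""
    g_firm_grip = conductance(100.0)  # assume ~100 kPa corresponds to a firm grip
    return min(max_rate_hz, max_rate_hz * conductance(pressure_kpa) / g_firm_grip)

if __name__ == "__main__":
    for label, kpa in [("no touch", 0.0), ("light tap", 5.0), ("firm grip", 80.0)]:
        print(f"{label:>9}: {pulse_rate_hz(kpa):6.1f} pulses per second")
```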
The team then hooked this pressure-sensing mechanism to the second ply of their artificial skin, a flexible electronic circuit that could carry pulses of electricity to nerve cells.
Bao’s team has been developing flexible electronics that can bend without breaking. For this project, team members worked with researchers from PARC, a Xerox company, whose technology uses an inkjet printer to deposit flexible circuits onto plastic. Covering a large surface is important to making artificial skin practical, and the PARC collaboration offered that prospect.
Finally, the team had to prove that the electronic signal could be recognized by a biological neuron. It did this by adapting a technique developed by Karl Deisseroth, a fellow Stanford professor of bioengineering who pioneered optogenetics, a field that combines genetics and optics. Researchers bioengineer cells to make them sensitive to specific frequencies of light, then use light pulses to switch the cells, or the processes inside them, on and off.
For this experiment the team members engineered a line of neurons to simulate a portion of the human nervous system. They translated the electronic pressure signals from the artificial skin into light pulses, which activated the neurons, proving that the artificial skin could generate a sensory output compatible with nerve cells.
To show that the signal could communicate reliably with the nervous system, Bao and her colleagues passed it to a blue LED and shined the light onto a slice of brain from a mouse. There, a subset of brain cells had been engineered to respond to this stimulation by expressing a light-sensitive channel that floods the cell with charge when hit by blue photons.
When the scientists measured the impulses of individual cells within the slice, they saw a faithful readout of the pulses produced by the touch sensor and flashed by the LED, even at a rate of 200 pulses per second.
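To make that chain of steps concrete, the following is a minimal toy model of the readout, assuming an idealized one-flash-per-pulse drive and a neuron that fires once per flash up to the 200 pulses per second reported here; the function names and rates are illustrative assumptions, not a description of the experimental apparatus.

```python
# Toy end-to-end check of the readout chain: sensor pulses -> LED flashes
# -> optogenetically evoked spikes. Purely illustrative; assumes one flash
# per sensor pulse and one spike per flash up to a 200 Hz following rate.

def led_flashes(sensor_pulse_rate_hz: float, duration_s: float = 1.0) -> int:
    """Assume the circuit drives one blue-LED flash per sensor pulse."""
    return round(sensor_pulse_rate_hz * duration_s)

def neuron_spikes(flashes: int, max_following_rate_hz: float = 200.0,
                  duration_s: float = 1.0) -> int:
    """Assume the light-gated neuron fires once per flash, up to a ceiling
    set by how fast it can follow the stimulation."""
    return min(flashes, round(max_following_rate_hz * duration_s))

if __name__ == "__main__":
    for rate in (10, 50, 100, 200):
        flashes = led_flashes(rate)
        spikes = neuron_spikes(flashes)
        print(f"sensor {rate:3d} Hz -> {flashes:3d} flashes -> {spikes:3d} spikes")
```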
Optogenetics was only used as an experimental proof of concept, Bao said, and other methods of stimulating nerves are likely to be used in real prosthetic devices. Bao’s team has already worked with Bianxiao Cui, an associate professor of chemistry at Stanford, to show that direct stimulation of neurons with electrical pulses is possible.
Bao’s team envisions developing different sensors to replicate, for instance, the ability to distinguish corduroy versus silk, or a cold glass of water from a hot cup of coffee. This will take time. There are six types of biological sensing mechanisms in the human hand, and the experiment described in Science reports success in just one of them.
But the current two-ply approach means the team can add sensations as it develops new mechanisms. And the inkjet printing fabrication process suggests how a network of sensors could be deposited over a flexible layer and folded over a prosthetic hand.
For more, see the paper in the journal Science: A skin-inspired organic digital mechanoreceptor.
Related articles:
Haptic prosthesis gives back missing limb’s natural feel
Wireless ‘thought into action’ brain sensor begins benchtop testing
Wearable haptics — feeling augmented