Scientists have devised a clever new method of allowing people to feel sensations that are transmitted to their skin. Beyond its applications in fields such as gaming and telepresence, the technology could also be used to guide the blind.
Back in 2019, we heard how a team co-led by Northwestern University's Prof. John A. Rogers developed a prototype device known as an "epidermal VR" patch. It took the form of a thin, soft, flexible and slightly tacky elastomer membrane containing an array of wirelessly powered, wirelessly controlled, disc-shaped electronic actuators.
When the 15-by-15-cm (5.9-by-5.9-in) patch was temporarily adhered to the skin, the actuators could be individually triggered to vibrate, replicating the sensation of being lightly touched in a given pattern. Rogers and colleagues have now taken that concept a step further.
Their new battery-powered prototype patch consists of 19 magnetic actuators encapsulated within a thin, flexible silicone-mesh membrane. Those wirelessly activated actuators are still capable of vibrating, but they can also twist – thus applying horizontal tension to the skin – and they can additionally move up and down, exerting and relieving vertical pressure on the skin.
Importantly, the actuators have a bistable design, meaning they can stay in either of two positions without requiring any energy to do so.
When they move down, they lock into that position, maintaining pressure on the skin without using any electricity. When a small pulse of electricity unlocks them, the elastic energy stored in the depressed skin is released, pushing them back up. They then stay up on their own, again drawing no power.
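The power-saving logic of that bistable design can be sketched in code. The following toy model is purely illustrative – the class name, the per-transition energy figure, and the method names are all assumptions, not details from the research – but it captures the key point: the battery is drawn on only during state transitions, never while a position is held.

```python
# Hypothetical sketch of a bistable actuator's power profile: energy is
# drawn only when switching states, never while holding either position.
# All names and the per-pulse energy figure are illustrative assumptions.

class BistableActuator:
    """Toy model of a latch that holds either state with zero power."""

    TRANSITION_ENERGY_MJ = 1.0  # assumed energy cost of one unlock pulse

    def __init__(self):
        self.state = "up"           # "up" = no skin pressure, "down" = pressing
        self.energy_used_mj = 0.0   # cumulative battery draw

    def toggle(self):
        # A brief electrical pulse unlocks the mechanical latch; stored
        # elastic energy then flips it into the other stable position.
        self.energy_used_mj += self.TRANSITION_ENERGY_MJ
        self.state = "down" if self.state == "up" else "up"

    def hold(self, seconds):
        # Holding either position is free -- the latch does the work.
        return 0.0  # zero additional energy, regardless of duration

patch = BistableActuator()
patch.toggle()      # press down: one pulse of energy
patch.hold(3600)    # maintain pressure for an hour at no cost
patch.toggle()      # release: one more pulse
print(patch.state, patch.energy_used_mj)  # -> up 2.0
```

A conventional actuator would instead drain the battery continuously while pressing, which is why the bistable design stretches each charge so far.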
As a result, one charge of the patch’s battery is good for much more runtime than would otherwise be possible.
Needless to say, the technology could be utilized in VR gaming systems, allowing users to actually feel the sensation of touching surfaces or being touched in virtual environments. It could also transmit touches between two remotely located users, or it could relay a sense of touch from a prosthetic hand to a user’s residual arm stump – thus letting the person feel what the hand is touching.
Beyond those uses, Rogers’ team experimented with yet another potential application.
Blindfolded test subjects wore the patch while trying to make their way past a variety of obstacles in their path. Although the people couldn’t see those obstacles, the items were detected by the lidar sensor on a smartphone that was linked to the patch via Bluetooth.
The patch was thus able to alert the volunteers to the locations of the obstacles, initially by exerting light pressure on the left- or right-hand side accordingly. If the person proceeded to move towards the obstacle instead of avoiding it, the pressure would become more intense and move to the middle of the patch.
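That escalating feedback scheme can be illustrated with a small sketch. The function below is a guess at the mapping, based only on the behavior described above – light pressure on the left or right side for a detected obstacle, shifting to stronger pressure in the middle of the patch as the wearer closes in. The threshold value, parameter names, and angle convention are all assumptions.

```python
# Illustrative mapping from one lidar detection to a haptic cue, based on
# the escalation behavior described in the article. The 1.5 m warning
# threshold and the angle convention are assumptions, not published values.

def haptic_cue(obstacle_angle_deg, distance_m, warning_distance_m=1.5):
    """Return a (region, intensity) cue for the patch.

    obstacle_angle_deg: bearing of the obstacle relative to straight ahead
                        (negative = left of the wearer, positive = right).
    distance_m:         range to the obstacle from the phone's lidar.
    """
    if distance_m <= warning_distance_m:
        # Wearer is closing in: intensify and shift to the patch's middle.
        return ("middle", "strong")
    # Obstacle detected but still at a safe range: a light directional cue.
    side = "left" if obstacle_angle_deg < 0 else "right"
    return (side, "light")

print(haptic_cue(-30, 3.0))  # obstacle off to the left, still far -> ('left', 'light')
print(haptic_cue(-5, 1.0))   # heading straight at it, close -> ('middle', 'strong')
```

In the real system the phone streams lidar depth data to the patch over Bluetooth continuously, so cues like these would be recomputed many times per second as the wearer moves.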
“We show that this system can support a basic version of ‘vision’ in the form of haptic patterns delivered to the surface of the skin based on data collected using the 3D imaging function (LiDAR) available on smartphones,” says Rogers. “This sort of ‘sensory substitution’ provides a primitive, but functionally meaningful, sense of one’s surroundings without reliance on eyesight – a capability useful for individuals with vision impairments.”
A paper on the research, which also involved scientists from China’s Westlake University and Dalian University of Technology, was recently published in the journal Nature.
Source: Northwestern University