Virtual Reality Goggles for the Skin

A haptic device translates information captured by smartphones into physical sensations on the skin.
Many people rely on their smartphones as navigational aids, using GPS-connected apps for turn-by-turn directions. A team of researchers has worked to extend that capability to people who are visually impaired, building a haptic device that translates visual information captured by smartphone technology into physical sensations on the skin.

The device, developed by a team led by bioelectronics professor John A. Rogers from Northwestern University, uses 19 magnetic actuators hexagonally arrayed within a flexible silicone mesh to send physical cues by stimulating the skin of the wearer. The team’s design and research were outlined in a recent issue of Nature.

“We're trying to reproduce a realistic and immersive sense of physical touch with these wearable devices,” said Matthew Flavin, first author of the paper and former postdoctoral researcher in Rogers’ lab.  

The research builds on technology already available in common smartphones, including sophisticated sensors such as LiDAR and software that captures visual cues, and delivers that information to the wearable haptic device over Bluetooth. Bluetooth also allows the information to be uploaded to the cloud, enabling remote operation.
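As a rough illustration of that pipeline, the sketch below simulates packing per-zone depth readings into a compact frame of the sort a phone might stream to the wearable over a low-bandwidth link. The packet format, function names, and sample values are assumptions for illustration; the article does not specify the actual protocol.

```python
import random
import struct

NUM_ACTUATORS = 19  # hexagonal actuator array described in the article

def read_depth_frame() -> list[float]:
    """Stand-in for a phone LiDAR read: one distance in meters per zone."""
    return [random.uniform(0.3, 5.0) for _ in range(NUM_ACTUATORS)]

def encode_packet(depths: list[float]) -> bytes:
    """Pack the depths into a compact little-endian frame for the radio link."""
    return struct.pack(f"<B{len(depths)}f", len(depths), *depths)

def decode_packet(packet: bytes) -> list[float]:
    """Inverse of encode_packet, as the wearable's firmware might run it."""
    (count,) = struct.unpack_from("<B", packet)
    return list(struct.unpack_from(f"<{count}f", packet, offset=1))

frame = read_depth_frame()
wire = encode_packet(frame)      # in practice, sent over Bluetooth
received = decode_packet(wire)   # decoded on the haptic device
print(f"{len(wire)} bytes carry {len(received)} depth samples")
```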

Using existing smartphone technology and its associated software libraries saved the research team the time it would otherwise have spent developing its own machine learning algorithms, allowing it to focus on perfecting the design of a unit actuator that could communicate effectively through touch.
 

Gaming technology 

“What we're doing with this actuator that other actuators can't do, or can't do very well, is we can deliver more stimuli than those other devices,” Flavin said. “One of the things that this lets us do, to be able to poke, to twist, to vibrate, is develop this sensory substitution system” for those with vision impairments.

The 19 magnetic actuators work together to selectively deliver sensations to the different receptors within the skin. Image credit: Northwestern University
Much the way the eye has different receptors sensitive to red, green, or blue light, the skin has different receptors that respond to different types of mechanical stimuli, such as vibration, low-frequency pressure, or shear force.

“It’s kind of like going from black and white vision to color vision for our skin,” said Flavin.  

“This wearable haptic device can deliver very complex mechanical input in a way that can help a person who doesn't have visual information, to tell them when there's objects in front of them, to help them navigate towards something that they're interested in,” he added. 
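To make the analogy concrete, here is a minimal sketch of how those mechanical “primaries” might be represented in software, with one channel per receptor type. The stimulus categories come from the article; the data structures, field names, and value ranges are illustrative assumptions, not the team's actual control scheme.

```python
from dataclasses import dataclass
from enum import Enum

class Stimulus(Enum):
    """Mechanical 'primaries' sensed by distinct skin receptors, loosely
    analogous to the eye's red, green, and blue channels."""
    VIBRATION = "vibration"         # high-frequency receptors
    PRESSURE = "low_freq_pressure"  # slow indentation (poking)
    SHEAR = "shear"                 # lateral or twisting force

@dataclass
class ActuatorCommand:
    """Drive state for one actuator; the ranges are illustrative."""
    index: int        # 0-18 in the 19-element hexagonal array
    mode: Stimulus
    intensity: float  # normalized 0.0 (off) to 1.0 (maximum)

def blend(commands: list[ActuatorCommand]) -> dict[int, list[Stimulus]]:
    """Group simultaneous stimuli per actuator, like mixing color channels."""
    channels: dict[int, list[Stimulus]] = {}
    for cmd in commands:
        channels.setdefault(cmd.index, []).append(cmd.mode)
    return channels

# One actuator can carry a "richer color": pressure plus vibration at once.
print(blend([ActuatorCommand(0, Stimulus.PRESSURE, 0.8),
             ActuatorCommand(0, Stimulus.VIBRATION, 0.3)]))
```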

LiDAR technology detects not only that there is an object in front of the user but also how far away it is.

A video demonstration of the technology shows an array of yellow dots on the screen, one for each magnetic actuator. Six “virtual detection windows” are projected at set distances within the smartphone camera’s field of view. When the user moves and an object crosses the boundary of one of those windows, the corresponding yellow dots turn green and a pattern of indentation is sent to the matching actuators, which press, twist, or otherwise stimulate the skin to help guide the user toward or around that object.
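The logic of that demonstration can be sketched in a few lines. The six-window count comes from the video; the trigger distance and the window-to-actuator mapping below are hypothetical placeholders, not the team's tuned parameters.

```python
NUM_WINDOWS = 6        # "virtual detection windows" from the demo
NUM_ACTUATORS = 19     # hexagonal actuator array
TRIGGER_RANGE_M = 1.5  # illustrative threshold; not from the paper

def windows_triggered(window_depths: list[float]) -> list[bool]:
    """True where an object has crossed a window's distance boundary."""
    return [d <= TRIGGER_RANGE_M for d in window_depths]

def actuator_pattern(triggers: list[bool]) -> list[int]:
    """Map triggered windows onto actuator drive levels (0 = off, 1 = on).
    This even split of actuators across windows is a simplifying assumption;
    the real device drives a tuned indentation pattern."""
    per_window = NUM_ACTUATORS // len(triggers)  # 19 // 6 = 3 per window
    pattern = [0] * NUM_ACTUATORS
    for w, hit in enumerate(triggers):
        if hit:
            start = w * per_window
            for i in range(start, min(start + per_window, NUM_ACTUATORS)):
                pattern[i] = 1
    return pattern

# Example: an object enters detection windows 2 and 3.
depths = [4.0, 3.2, 1.1, 0.8, 2.6, 5.0]
print(actuator_pattern(windows_triggered(depths)))
```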
 

Maintaining mobility 

Delivering this range of stimuli to the skin requires low-frequency pressure, however, and the team found that existing devices meeting that criterion were bulky and usually had to be tethered to a power supply, impractical for someone who wants to wear a haptic device while simply going about their day. To maintain user mobility, the team developed a bistable design that leverages the skin’s own stored elastic energy to cut power consumption and extend battery life per charge.

Bistability stores energy from the actuator in the skin and in the device’s internal structure; the actuator works almost like a light switch, drawing energy only when it changes from the on state to the off state (or vice versa).

“That was a critical element,” Flavin said. “We’re using the natural bio-elasticity of skin combined with some special magnetic materials, which gives rise to this bistable operation, where we're recovering the energy stored in skin and reapplying it during its operation.” 

“Since most of the time we're occupying those states without transitioning between them, we can save a lot of energy by having this bi-stable operation,” he added.   
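A toy model makes the light-switch analogy concrete: in a bistable actuator, energy is drawn only on transitions between the two stable states, so holding a sensation costs nothing extra. The energy figure below is an arbitrary placeholder, not a measured value from the paper.

```python
class BistableActuator:
    """Toy model of the light-switch analogy: energy is spent only on
    transitions between the two stable states, never to hold one."""

    TRANSITION_COST = 1.0  # arbitrary energy units per state flip

    def __init__(self) -> None:
        self.state = False  # False = retracted, True = extended
        self.energy_used = 0.0

    def set_state(self, target: bool) -> None:
        if target != self.state:  # only a flip costs energy
            self.energy_used += self.TRANSITION_COST
            self.state = target
        # Holding either state is free: the skin's elasticity and the
        # magnetic structure store the energy between transitions.

# Holding a "poke" across many control cycles costs the same as one flip:
a = BistableActuator()
for _ in range(1000):
    a.set_state(True)   # pressed the whole time
a.set_state(False)
print(a.energy_used)    # 2.0: one press, one release
```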

In addition to Flavin, Rogers’ team at Northwestern included Yonggang Huang, the Jan and Marcia Achenbach Professor in Mechanical Engineering at the McCormick School of Engineering, whose group, including Shupeng Li, conducted systematic computational modeling to ensure the device works across skin types. The team also included Hanqing Jiang of Westlake University in China, who built the small structures within the device that enable its twisting motions, and Zhaoqian Xie of Dalian University of Technology in China.

Flavin has since launched the Flavin Neuromachines Lab at the Georgia Institute of Technology, where he says he’ll continue his work on wearable bioelectronics, developing new devices and applications to help people, perhaps in areas such as pain management or treating stroke and spinal cord injuries.

“These are all based similarly on the idea of substituting and augmenting some missing sensory information with these haptic devices,” Flavin said. “What we're interested in is developing these things to help people.”

Nancy Kristof is a technology writer in Denver, Colo. 
