Prosthetic Limbs That Can Feel
Denise Oswalt, a bioengineering doctoral student in Arizona State University's Neural Engineering Lab. Image: Kevin O’Neill/ASU
One day in the not-too-distant future, prosthetic limbs may become much more useful and user friendly, thanks to the work of Bradley Greger’s Neural Engineering Lab at Arizona State University.
Greger, associate professor in the School of Biological and Health Systems Engineering, and his team recently reported on research that takes another step toward making prosthetics that give amputees a high degree of control. The key, according to Greger, is the interface between the prosthetic and the nerves that carry messages to the brain.
“It’s not just telling the fingers to move. The brain has to know the fingers have moved as directed,” Greger says.
The goal is to create limbs that patients will use as true extensions of themselves. To accomplish this, the team is seeking to establish two-way communication between the user and a new prosthetic limb capable of more than 20 different movements.
In the recent research, the team implanted an array of electrodes in two nerves in the arms of two amputees for 30 days. The electrodes were stimulated with varying degrees of amplitude and frequency to determine how the participants perceived the stimulation. Then, as the subjects controlled fingers of a virtual robotic hand, neural activity was recorded during intended movements of their phantom fingers, and 13 specific movements were decoded.
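To give a sense of what that decoding step involves, here is a minimal sketch that trains a simple classifier to map recorded neural features to one of 13 intended movements. The array shapes, the random stand-in data, and the choice of a linear model are illustrative assumptions, not the lab's actual pipeline.

```python
# Minimal sketch: train a classifier that maps neural features recorded from
# a microelectrode array to one of 13 intended finger/hand movements.
# Shapes, labels, and the linear model are illustrative assumptions only.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

n_trials, n_channels, n_movements = 650, 96, 13
# Hypothetical data: per-channel firing rates for each trial, plus the
# movement the participant intended during that trial.
firing_rates = rng.normal(size=(n_trials, n_channels))
intended_movement = rng.integers(0, n_movements, size=n_trials)

X_train, X_test, y_train, y_test = train_test_split(
    firing_rates, intended_movement, test_size=0.2, random_state=0)

decoder = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(f"decoding accuracy: {decoder.score(X_test, y_test):.2f}")
```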
Greger says the motor and sensory information from the microelectrode arrays indicated that patients outfitted with a highly dexterous prosthesis controlled with such a two-way interface might begin to think of the prosthesis as an extension of themselves. The idea is that a participant controls a virtual prosthetic hand by thinking about moving the amputated hand, and the nerve signals are recorded by microelectrodes. A computer algorithm decodes the signals and controls the virtual prosthetic hand.
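The control side of that idea can be sketched as a simple loop: read the nerve signals, decode the intended movement, and command the virtual hand. The function names below are hypothetical placeholders, not the lab's actual recording or rendering software.

```python
# Minimal sketch of the decode-and-control loop: nerve signals are read, a
# trained decoder infers the intended movement, and the virtual hand is
# commanded. All I/O functions here are hypothetical placeholders.
import time

def read_neural_features():
    """Placeholder: return the latest window of per-channel firing rates."""
    raise NotImplementedError

def command_virtual_hand(movement_id):
    """Placeholder: tell the virtual prosthetic hand to perform a movement."""
    raise NotImplementedError

def control_loop(decoder, update_hz=20):
    period = 1.0 / update_hz
    while True:
        features = read_neural_features()          # record from the microelectrodes
        movement = decoder.predict([features])[0]  # decode the intended movement
        command_virtual_hand(movement)             # drive the virtual hand
        time.sleep(period)
```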
“It’s not like we need any fundamental breakthrough,” Greger says. “We need some good engineering and sufficient resources.” He says the biggest engineering challenge is longevity, as the devices must last at least a decade.
“The issue is a robustly engineered electrode of the right materials that is also compliant,” he explains. “It has to be a little bit more biological. It’s got to move and shift and be flexible like the nerve that it is interfacing with. A lot of the approaches have come from an electrical engineering background where they approach it from a rigid circuit connector.”
Much of that work, coming as it does from an electrical engineering perspective, focuses on one-to-one mapping of individual finger movements, Greger says, rather than treating the hand as a mechanical system whose parts work synergistically.
“The real exciting opportunity is to think about the neural code not as one-to-one mapping when I move my index finger, but when I do this whole kind of posture with my hands, there is real synergy. The challenge is how to get the neural signal that’s operating in full synergy to talk to a mechanical device that’s set up to also move with the synergies,” he says.
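One common way researchers model such synergies is to describe hand postures with a few principal components rather than with independent joint commands. The sketch below assumes that approach purely for illustration; it is not necessarily the method Greger's team uses.

```python
# Illustrative sketch of synergy-based control: instead of decoding each
# finger independently, project hand postures onto a few principal
# components ("synergies") and work with those low-dimensional weights.
# PCA is an assumed stand-in for a synergy model, chosen for illustration.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)

n_postures, n_joints = 500, 20          # e.g. 20 joint angles of the hand
joint_angles = rng.normal(size=(n_postures, n_joints))

synergy_model = PCA(n_components=4).fit(joint_angles)

# A posture is then described by a handful of synergy weights rather than
# 20 independent joint commands.
weights = synergy_model.transform(joint_angles[:1])
reconstructed_posture = synergy_model.inverse_transform(weights)
print(weights.shape, reconstructed_posture.shape)   # (1, 4) (1, 20)
```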
In part because of his background, Greger is taking a different approach. He earned bachelor’s degrees from Washington State University in both philosophy (which led him to thinking about theory of mind) and biology. Those paved the way to neuroscience, and as a graduate student at Washington University in St. Louis, he developed an interest in studying neuronal encoding in primates and humans. “Like everyone did in the 1990s, I started thinking, ‘How can this help some people?’” he recalls. As a postdoc at Caltech, he focused on neuroengineering and began investigating the use of microelectrode arrays in neuroprosthetics. He joined ASU in 2013.
For the next phase of the study, markers will be applied to the patient’s healthy hand to record its movements, and then these measurements will be used to direct the virtual hand, with the patient using an Oculus Rift virtual reality headset. Another team member, Kevin O’Neill, a doctoral student at ASU, is developing technology that allows the patient to see what the virtual limb is doing and decodes the neural messages that enable the motion to happen.
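A rough sketch of how tracked markers could drive the virtual hand: three marker positions around a joint yield a flexion angle, which would then be sent to the rendered limb in the headset. The marker layout and the numbers here are illustrative assumptions, not the team's actual tracking setup.

```python
# Minimal sketch: convert three 3-D marker positions around a finger joint
# into a flexion angle that a virtual hand could be driven with.
# The marker placement and values are hypothetical.
import numpy as np

def joint_angle(p_proximal, p_joint, p_distal):
    """Angle at a joint, in degrees, from three 3-D marker positions."""
    v1 = p_proximal - p_joint
    v2 = p_distal - p_joint
    cos_a = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0)))

# Hypothetical marker positions for one finger (hand, knuckle, fingertip).
markers = np.array([[0.0, 0.0, 0.0],
                    [0.0, 4.0, 0.0],
                    [0.0, 6.5, 2.0]])

flexion = joint_angle(markers[0], markers[1], markers[2])
print(f"knuckle flexion: {flexion:.1f} degrees")  # would be sent to the virtual hand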
Once the team learns what information the signals hold, a neural decoding system can be built to direct the prosthesis so that it becomes almost intuitive. “There will be some learning curve because we will be introducing them to a fairly complex system that listens to the nerve and takes those signals that used to control the hand--now gone--and use them to control the prosthetic hand,” Greger says. “We are hopeful that it will be more intuitive and [it is] very important that there is some sensory feedback so when they touch something, they get some sense that they touched something. That will really help them have a sense of embodiment. It really becomes like ‘their hand.’
“We’re working toward limbs that are accessible both financially and in terms of usability... something that is maybe not quite as sophisticated [as the expensive ones] but certainly better than the current generation of prosthetic hands,” Greger says.
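The sensory-feedback half of the loop Greger describes can be sketched as a mapping from fingertip pressure on the prosthesis to the amplitude and frequency of electrode stimulation mentioned earlier. The parameter ranges and the linear mapping below are assumptions made purely for illustration.

```python
# Illustrative sketch of sensory feedback: pressure at the prosthetic
# fingertip is mapped to stimulation amplitude and frequency on the
# implanted electrodes, so a touch is felt as a touch.
# The ranges and the linear mapping are assumptions, not measured values.
def pressure_to_stimulation(pressure_norm,
                            amp_range=(10.0, 100.0),    # microamps
                            freq_range=(20.0, 200.0)):  # hertz
    """Map a normalized fingertip pressure (0..1) to stimulation parameters."""
    p = min(max(pressure_norm, 0.0), 1.0)
    amplitude = amp_range[0] + p * (amp_range[1] - amp_range[0])
    frequency = freq_range[0] + p * (freq_range[1] - freq_range[0])
    return amplitude, frequency

print(pressure_to_stimulation(0.5))   # (55.0, 110.0)
```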
He believes that the end technology may have much broader applications in controlling organ systems, dealing with, for example, high blood pressure. “It may not be as exciting but it can help a lot of people,” notes Greger.
Nancy S. Giges is an independent writer.