Thought-Controlled Robot Arm

The servile robots of the future will be controlled not with joysticks, smart phones, or voice commands, but with thoughts. Already, mind-controlled robots have allowed a paralyzed man to feel again through a robotic hand, and a quadriplegic woman to use a robotic arm to grab a bottle and drink. But both of these cases required a chip to be implanted in the brain of the robot controller, a costly and sensitive procedure, to say the least. Now there’s a way to make robots obedient to thoughts without requiring brain surgery.

Instead of a chip implant, the new method uses an EEG cap to measure brain waves. “It’s non-invasive, there’s no risk, and it’s cheap,” says Bin He, a biomedical engineering professor at the University of Minnesota, who led the research. With the cap strapped on their heads, 13 human volunteers were able to make an off-the-shelf robotic arm grab an object, move it to a shelf, and, finally, release it.

His subjects were all healthy, able-bodied students with full use of their limbs. So the signal they needed to generate was not the same as simply willing their actual arms and hands to move and grip as they normally would. Instead, He directed them to imagine they were performing these tasks without actually doing so. Previous work by He showed that imagining a movement and actually performing it are not that different, at least from the point of view of fMRI scans and the EEG cap. “What is the difference, or the similarity, between imagining you’re moving compared to actually moving? We found a very nice co-localization in terms of activity between fMRI findings and EEG imaging findings in both movement imagination and actual movement.”

 

Research subjects fitted with a specialized noninvasive brain cap were able to move the robotic arm by imagining moving their own arms. Image: University of Minnesota

 

But eager robot masters will need to do more than just slap on the cap and put their imagination to use. “It does require a relatively lengthy training process,” says He. Subjects go to He’s lab once a week for two-hour sessions. It takes 10 to 15 such sessions before a subject has mastered the technique. “For some subjects we ask them to execute the actual movement, to help them get a feel for how they should imagine moving. Usually we ask them to move an arm and then not to move it, but to imagine how you moved it before,” he says. “We’re teaching them the skill of motor imagination.” Once they’ve got a feel for that kind of imagining, they try to move a cursor on a screen from left to right. Then they graduate to grasping, releasing, and moving the arm, in a step-by-step process.

The subjects aren’t the only ones who are learning. The software adapts to them as well, learning the peculiarities of how each subject imagines movement.
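To make that adaptation concrete, here is a minimal sketch, in Python, of how a motor-imagery decoder of this general kind might work. It is an illustration, not He’s actual software: the band-power features, the nearest-mean classifier, the synthetic EEG, and the update rule are all assumptions chosen for brevity. The idea is that the decoder’s per-class templates are nudged toward each new trial, so the software gradually learns how a particular subject imagines movement.

```python
import numpy as np

rng = np.random.default_rng(0)

def band_power_features(eeg, fs=250.0, band=(8.0, 30.0)):
    """Per-channel log power in the mu/beta band (8-30 Hz), a common
    motor-imagery feature. `eeg` has shape (channels, samples)."""
    freqs = np.fft.rfftfreq(eeg.shape[1], d=1.0 / fs)
    spectrum = np.abs(np.fft.rfft(eeg, axis=1)) ** 2
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return np.log(spectrum[:, mask].mean(axis=1))

class AdaptiveImageryDecoder:
    """Nearest-mean classifier over band-power features.
    Class templates are updated after every trial so the decoder
    keeps adapting to the individual subject's imagery."""

    def __init__(self, n_features, lr=0.1):
        self.templates = {label: np.zeros(n_features) for label in ("left", "right")}
        self.lr = lr

    def predict(self, features):
        dists = {label: np.linalg.norm(features - t)
                 for label, t in self.templates.items()}
        return min(dists, key=dists.get)

    def update(self, features, label):
        # Move this class's template a small step toward the new trial.
        t = self.templates[label]
        self.templates[label] = (1 - self.lr) * t + self.lr * features

# Toy calibration session with synthetic EEG (a stand-in for real recordings).
fs, n_channels, n_samples = 250.0, 8, 500
decoder = AdaptiveImageryDecoder(n_features=n_channels)

for trial in range(20):
    label = "left" if trial % 2 == 0 else "right"
    # Fake EEG: noise plus a label-dependent change on a few channels.
    eeg = rng.standard_normal((n_channels, n_samples))
    if label == "left":
        eeg[:4] *= 1.5
    feats = band_power_features(eeg, fs)
    guess = decoder.predict(feats)
    decoder.update(feats, label)  # adapt to this subject
    print(f"trial {trial:2d}  cue={label:5s}  decoded={guess}")
```

In the real system, of course, the decoded output would drive the on-screen cursor or the robotic arm rather than a printout, and this kind of software adaptation runs alongside the subject’s own learning.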

He’s next step is to train amputees. Will the signals they generate resemble those of He’s able-bodied subjects so far? He thinks they will. “But we need to work out the details.”

The applications for non-invasive mind-controlled robots are vast. Those without the full use of their arms will be glad to have a new one at their disposal, to be sure, but there are many other people who stand to benefit. Imagine having an extra arm for playing the piano, piloting a plane, or performing any task where both flesh-and-blood arms are busy.

But before applications functional or futuristic become a reality, there’s much work to be done. He wants to improve the control resolution as well as shorten the training periods. But however the technology is eventually adapted and adopted, the first major achievement has been made. “It’s the first time, to our knowledge, that it’s been demonstrated that a human subject can perform sophisticated 3D control using EEG signals.”

Michael Abrams is an independent writer.

