From Science Life Blog

In Iron Man, Tony Stark engineers himself a robotic suit of armor that serves two purposes: fighting the terrorists who took him captive and keeping pieces of shrapnel from puncturing his heart. Based on a new study from a University of Chicago neuroscience laboratory, wearable robots like Iron Man’s suit may also serve a dual purpose for a different type of user: quadriplegic patients.
Scientists, in an effort worthy of comic books, have successfully developed brain-machine interfaces that allow people to move computer cursors and prosthetic arms with their thoughts alone. When paralysis occurs due to a spinal cord injury or neurological disease, signals from the brain fail to reach the muscles of the body. But the electrical brain activity normally responsible for movement remains intact, and brain-machine interfaces (BMIs) seek to translate that information into the operation of an external device. One such BMI, called BrainGate, was successfully tested in quadriplegic patients four years ago.
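To make the idea of "translating brain activity into device operation" concrete, here is a minimal sketch of one classic decoding approach, the population vector. This is an illustration of the general technique, not BrainGate's actual algorithm, and every number in it (neuron count, firing rates, tuning) is made up for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Each recorded neuron is assumed to have a "preferred direction": it fires
# fastest when the intended movement points that way (cosine tuning).
n_neurons = 50
preferred = rng.uniform(0, 2 * np.pi, n_neurons)  # preferred angles (radians)
baseline, gain = 10.0, 5.0                        # illustrative rates, spikes/s

def firing_rates(intent_angle):
    """Simulated firing rates for an intended movement direction."""
    return baseline + gain * np.cos(intent_angle - preferred)

def decode_direction(rates):
    """Population-vector decoder: each neuron 'votes' for its preferred
    direction, weighted by how far its rate sits above baseline."""
    w = rates - baseline
    x = np.sum(w * np.cos(preferred))
    y = np.sum(w * np.sin(preferred))
    return np.arctan2(y, x)

intended = np.pi / 3  # the user intends to move at 60 degrees
decoded = decode_direction(firing_rates(intended))
print(f"intended {np.degrees(intended):.1f} deg, decoded {np.degrees(decoded):.1f} deg")
```

With enough neurons, the weighted votes average out and the decoded angle lands close to the intended one; a real system would then turn that direction into cursor velocity, updated many times per second.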
The BrainGate™ Co. is a privately held firm focused on the advancement of the BrainGate™ Neural Interface System. The Company owns the intellectual property of the BrainGate™ system as well as new technology being developed by the Company. It also owns the intellectual property of Cyberkinetics, which it purchased in April 2009.
The goal of the BrainGate™ Company is to create technology that will allow severely disabled individuals, including those with traumatic spinal cord injury and loss of limbs, to communicate and control common everyday functions literally through thought.
However, while those patients were able to hit various computer targets and even type e-mails with their thoughts, their control of the cursor was somewhat shaky. When a person moves a computer cursor the old-fashioned way, with a hand on a mouse, information moves in two directions. Signals from the brain travel to the hand directing the movement, and sensory feedback returns to the brain reporting on the movement’s success, both from the eyes tracking the cursor and from the location and movement of the hand in space. This latter sense, called proprioception or kinesthetic feedback, was not present in the BrainGate trials; the patients had only visual feedback to help adjust their movements.
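The value of that feedback loop can be shown with a toy simulation, which is my illustration rather than a model of any actual BMI. A cursor tries to reach a target, but each commanded step is corrupted by noise; a closed-loop controller bases each command on the cursor's observed position, while an open-loop one issues commands blind.

```python
import numpy as np

rng = np.random.default_rng(1)
target = np.array([1.0, 0.0])
noise = 0.05   # execution noise per step (arbitrary units)
steps = 40

def run(closed_loop):
    pos = np.zeros(2)       # where the cursor actually is
    believed = np.zeros(2)  # where an open-loop controller thinks it is
    for _ in range(steps):
        # Closed loop: plan from the observed position (feedback).
        # Open loop: plan from the internal belief, never corrected.
        ref = pos if closed_loop else believed
        step = 0.2 * (target - ref)                 # move a fraction toward target
        believed = believed + step
        pos = pos + step + rng.normal(0, noise, 2)  # noise corrupts execution
    return np.linalg.norm(pos - target)             # final distance from target

print(f"open-loop error:   {run(False):.3f}")
print(f"closed-loop error: {run(True):.3f}")
```

Without feedback the noise accumulates like a random walk, so the open-loop cursor typically ends up far from the target, while the closed-loop run keeps correcting and settles near it. The BrainGate patients were in between: they had the visual half of the loop but not the proprioceptive half.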
“In the early days when we were doing this, we didn’t even consider sensory feedback as an important component of the system,” said Nicholas Hatsopoulos, professor and chair of computational neuroscience at the University of Chicago. “We really thought it was just one-way: signals were coming from the brain, and then out to control the limb. It’s only more recently that the community has really realized that there is this loop with feedback coming back.”