The difference between zero and one.
by W. Sweet, Clin Neurosurg. 1976; 23: 32-51.
No abstract available. PMID: 975685 [PubMed - indexed for MEDLINE]
Robot Arm Controlled Using Command Signals Recorded Directly from Brain Neurons.
by J. K. Chapin
Our laboratory employs multi-electrode-based brain interface technologies to investigate how the brain's sensory and motor systems control movement. We have recently demonstrated that experimental animals can learn to control a robot arm using brain-derived signals alone, recorded from neuronal populations in the motor cortex. This approach could be used to restore motor function in paralysis patients.
Monkey controls robotic arm using brain signals sent over Internet.
by Elizabeth A. Thomson, MIT News Office, December 6, 2000
Monkeys in North Carolina have remotely operated a robotic arm 600 miles away in MIT's Touch Lab -- using their brain signals. The feat is based on a neural-recording system reported in the November 16 issue of Nature. In that system, tiny electrodes implanted in the animals' brains detected their brain signals as they controlled a robot arm to reach for a piece of food.

The researchers even tested whether the signals could be transmitted over a standard Internet connection, controlling a similar arm in MIT's Laboratory for Human and Machine Haptics, informally known as the Touch Lab. According to the scientists from Duke University Medical Center, MIT and the State University of New York (SUNY) Health Science Center, the new system could form the basis for a brain-machine interface that would allow paralyzed patients to control the movement of prosthetic limbs.

The Internet experiment "was a historic moment, the start of something totally new," Mandayam Srinivasan, director of MIT's Touch Lab, said in a November 15 story in the Wall Street Journal. "When we initially conceived the idea of using monkey brain signals to control a distant robot across the Internet, we were not sure how variable delays in signal transmission would affect the outcome," said Dr. Srinivasan. "Even with a standard TCP/IP connection, it worked out beautifully. It was an amazing sight to see the robot in our lab move, knowing that it was being driven by signals from a monkey brain at Duke. It was as if the monkey had a 600-mile-long virtual arm."
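The transmission step described above can be illustrated in miniature. The sketch below is hypothetical (not the Duke/MIT code): it streams decoded velocity commands as newline-delimited JSON over a standard TCP/IP connection to a stand-in "remote lab" on the loopback interface, which is the basic pattern such an experiment implies.

```python
import json
import socket
import threading

def serve_robot(server_sock, received):
    """Remote side: accept one connection and collect velocity commands."""
    conn, _ = server_sock.accept()
    with conn:
        buf = b""
        while True:
            chunk = conn.recv(4096)
            if not chunk:
                break
            buf += chunk
    # One JSON-encoded command per line.
    for line in buf.splitlines():
        received.append(json.loads(line))

# Local stand-in for the remote lab, listening on loopback.
server = socket.socket()
server.bind(("127.0.0.1", 0))
server.listen(1)
received = []
t = threading.Thread(target=serve_robot, args=(server, received))
t.start()

# "Recording lab" side: send a short stream of decoded (vx, vy, vz) commands.
commands = [[0.1, 0.0, -0.2], [0.2, 0.1, -0.1], [0.0, 0.3, 0.0]]
client = socket.create_connection(server.getsockname())
with client:
    for cmd in commands:
        client.sendall((json.dumps(cmd) + "\n").encode())

t.join()
server.close()
print(received)  # every command arrives at the remote side, in order
```

TCP's in-order, reliable delivery is what makes the design tolerant of variable network delay, at the cost of added latency -- consistent with Srinivasan's remark that a standard TCP/IP connection sufficed.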
Cortical Ensemble Adaptation to Represent Velocity of an Artificial Actuator Controlled by a Brain-Machine Interface.
by Mikhail A. Lebedev, Jose M. Carmena, Joseph E. O'Doherty, Miriam Zacksenhouse, Craig S. Henriquez, Jose C. Principe, and Miguel A. L. Nicolelis
The Journal of Neuroscience, May 11, 2005, 25(19):4681-4693; doi:10.1523/JNEUROSCI.4088-04.2005
Monkeys can learn to directly control the movements of an artificial actuator by using a brain-machine interface (BMI) driven by the activity of a sample of cortical neurons. Eventually, they can do so without moving their limbs. Neuronal adaptations underlying the transition from control of the limb to control of the actuator are poorly understood. Here, we show that rapid modifications in neuronal representation of velocity of the hand and actuator occur in multiple cortical areas during the operation of a BMI. Initially, monkeys controlled the actuator by moving a hand-held pole. During this period, the BMI was trained to predict the actuator velocity. As the monkeys started using their cortical activity to control the actuator, the activity of individual neurons and neuronal populations became less representative of the animal's hand movements while representing the movements of the actuator. As a result of this adaptation, the animals could eventually stop moving their hands yet continue to control the actuator. These results show that, during BMI control, cortical ensembles represent behaviorally significant motor parameters, even if these are not associated with movements of the animal's own limb.
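The "BMI trained to predict actuator velocity" step can be sketched with a simple decoder. The toy example below is an assumption, not the study's actual pipeline: it fits an ordinary-least-squares linear map from binned firing rates of a synthetic neuronal population to 2-D velocity, mirroring the training phase in which hand movements and cortical activity are recorded together.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: 50 neurons over 1000 time bins, with firing rates
# generated from a hidden linear tuning to 2-D velocity plus noise.
n_bins, n_neurons = 1000, 50
velocity = rng.standard_normal((n_bins, 2))          # observed hand velocity
tuning = rng.standard_normal((2, n_neurons))         # hidden tuning weights
rates = velocity @ tuning + 0.5 * rng.standard_normal((n_bins, n_neurons))

# Training phase (pole control): fit a linear map from population
# activity to observed velocity by least squares.
X = np.hstack([rates, np.ones((n_bins, 1))])         # bias column appended
W, *_ = np.linalg.lstsq(X, velocity, rcond=None)

# Brain-control phase: the fitted map now converts cortical activity
# alone into actuator velocity commands.
decoded = X @ W
r = np.corrcoef(decoded[:, 0], velocity[:, 0])[0, 1]
print(f"correlation between decoded and actual velocity: {r:.2f}")
```

The paper's central observation is about what happens after this hand-off: with the decoder fixed, the neurons themselves adapt, coming to represent the actuator's velocity rather than the hand's, until overt hand movement is no longer needed.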