Tuesday, January 18, 2011

Electroencephalography, Eye tracking, and Robot Suits for Paraplegics

Assistive technology for paraplegia, locked-in syndrome, and other conditions that impair mobility is far behind where it could be. A Japanese company, Cyberdyne, has designed the HAL Robot Suit to help address this problem. Unfortunately, their Robot Suit only solves half of the problem: it senses the electrical activity in your limbs in order to know when to move, but if you can't move your limbs, that's no help at all. Immobilized people would also very much like to be able to speak, and I shall consider that capacity as well.

Control Methods
Two candidate methods for controlling a robot suit and generating speech are eye tracking and EEG (electroencephalography). In eye tracking, a camera watches your eye and figures out where you're looking. It's something we humans do all the time without any particular effort; we've evolved to be good at it, because it lets us figure out what someone else is thinking about or intending to do. The technology is already being used in assistive devices that let immobilized people speak by looking at words on a screen. But the technology is imperfect, and it is not being used to control robot suits. One could devise a system in which the patient wears video goggles instead of facing a fixed computer monitor. The patient could then look to a corner of the screen to switch between speech and movement modes, look at words to build sentences to be spoken, or look at areas of the world to tell the robot suit to locomote there.
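To make this concrete, here's a minimal sketch in Python of how such a gaze-driven mode switch might work. The screen size, the hot-corner dimensions, and the fake gaze samples are all placeholder assumptions standing in for whatever a real tracker's API would provide:

# Minimal sketch of gaze-driven mode switching. The gaze source and
# screen geometry are hypothetical stand-ins for a real tracker's API.

SCREEN_W, SCREEN_H = 1280, 720
CORNER = 100  # pixel size of the mode-switch hot corner (assumed)

def classify_gaze(x, y, mode):
    """Map a gaze point to an action: toggle modes in the hot corner,
    otherwise select a word (speech mode) or a destination (movement mode)."""
    if x > SCREEN_W - CORNER and y < CORNER:      # top-right hot corner
        return ("toggle_mode", None)
    if mode == "speech":
        return ("select_word", (x, y))            # word under the gaze point
    return ("walk_toward", (x, y))                # world location under gaze

mode = "speech"
for x, y in [(1250, 30), (400, 300), (1250, 30), (640, 500)]:  # fake gaze samples
    action, target = classify_gaze(x, y, mode)
    if action == "toggle_mode":
        mode = "movement" if mode == "speech" else "speech"
    print(action, target, "-> mode:", mode)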

EEG
The second proposed method for controlling a robot suit and generating speech is EEG. A shower cap filled with electrodes is placed on the head, and the electrical fields generated by brain activity are measured. This could be an even more promising method than eye tracking, since it holds out the promise of literally reading one's thoughts and transforming them into actions. But there are several problems with this approach. First, we don't yet know enough about how EEG signals correspond to particular thought patterns. Second, EEG has poor spatial resolution: it can't pinpoint which region of the brain a given signal is being generated from. How can these problems be overcome?
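As a concrete illustration of the measurement side, here's a sketch of extracting standard band-power features from one epoch of multi-channel EEG. The channel count, sampling rate, and band definitions are assumptions, not anyone's actual rig:

import numpy as np
from scipy.signal import welch

FS = 256          # assumed sampling rate (Hz)
BANDS = {"alpha": (8, 12), "beta": (13, 30), "mu": (8, 13)}

def band_powers(epoch):
    """epoch: (n_channels, n_samples) array of EEG voltages.
    Returns one mean power value per channel for each frequency band,
    using Welch's method to estimate the power spectral density."""
    freqs, psd = welch(epoch, fs=FS, nperseg=FS)   # psd: (n_channels, n_freqs)
    feats = {}
    for name, (lo, hi) in BANDS.items():
        mask = (freqs >= lo) & (freqs <= hi)
        feats[name] = psd[:, mask].mean(axis=1)    # mean power per channel
    return feats

epoch = np.random.randn(8, 2 * FS)   # fake 2-second, 8-channel recording
print({name: vals.shape for name, vals in band_powers(epoch).items()})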

1. Reverse Correlation
First we need to figure out which patterns of action and stimulation correspond to which patterns of brain activity. Clever scientists have begun this project by showing different types of stimuli while recording EEG. However, this approach is limited by the classes of stimuli that experimenters can think up, and it neglects the importance of action. Another approach would be to expose test subjects to a barrage of stimuli, have them perform a large set of actions, and then use reverse correlation (the spike-triggered-average family of techniques) to figure out which actions and stimuli correspond to which patterns of brain activity.
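Here's a toy version of that idea in Python. Since I can't query a real brain from a code snippet, a hidden linear "template" stands in for the EEG response, but the reverse-correlation step itself, response-weighted averaging of the stimuli, is the genuine article:

import numpy as np

# Reverse correlation sketch: recover which stimulus pattern drives a
# response by averaging the stimuli, weighted by the response each evoked.

rng = np.random.default_rng(0)
true_template = rng.standard_normal(64)           # hidden pattern the "brain" likes
stimuli = rng.standard_normal((5000, 64))         # barrage of random noise stimuli
responses = stimuli @ true_template + rng.standard_normal(5000)  # noisy responses

# Response-weighted average of the stimuli ~ the hidden template (up to scale)
estimate = (responses[:, None] * stimuli).mean(axis=0)

corr = np.corrcoef(estimate, true_template)[0, 1]
print(f"correlation with hidden template: {corr:.2f}")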

2. Genetic Algorithms
But how do we really know that these patterns of brain activity are for these patterns of action and stimulation? A novel way to test such claims is with a genetic algorithm: a computer science technique, inspired by natural selection, for searching a problem space for fitter and fitter solutions. Here's how this would work (a code sketch follows the list):

a. Begin with an initial population of noise images
b. Evaluate each image based on how strongly it elicits a particular EEG response
c. Fitter images survive and reproduce to form the next generation of images
d. Offspring images are created by mating the fittest images of the previous generation, with mutations allowed to creep in to keep the search exploring
e. Over time, the population of images evolves to look like the type of stimuli that that pattern of brain activity is for
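And here's the loop above as a toy Python sketch. The fitness function is a stand-in: a fixed hidden "target" pattern plays the role of the strength of the EEG response, since a real implementation would have to score each image against live recordings:

import numpy as np

rng = np.random.default_rng(1)
SIZE, POP, GENS, MUT = 64, 100, 200, 0.05
target = rng.standard_normal(SIZE)                 # hypothetical preferred stimulus

def fitness(img):
    # Stand-in for "how strongly this image elicits the EEG pattern"
    return img @ target

pop = rng.standard_normal((POP, SIZE))             # (a) initial noise images
for gen in range(GENS):
    scores = np.array([fitness(img) for img in pop])        # (b) evaluate
    parents = pop[np.argsort(scores)[-POP // 2:]]           # (c) fittest survive
    moms = parents[rng.integers(0, len(parents), POP)]
    dads = parents[rng.integers(0, len(parents), POP)]
    mask = rng.random((POP, SIZE)) < 0.5
    pop = np.where(mask, moms, dads)                        # (d) crossover...
    pop += MUT * rng.standard_normal((POP, SIZE))           #     ...plus mutation

best = pop[np.argmax([fitness(img) for img in pop])]        # (e) evolved stimulus
print("match to target:", np.corrcoef(best, target)[0, 1])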

Connecting the Dots
Having identified a set of EEG patterns that correspond to particular actions and sensations, we can use these as control signals for moving a robot suit and generating speech. If eye tracking is used at the same time, the control software should have a pretty good idea of where the patient intends to move or what he or she intends to say.
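A sketch of what that fusion might look like, with the intent labels and the confidence threshold as made-up assumptions:

# Fusing the two channels: gaze supplies the target, EEG supplies the intent.

def decide(gaze_target, eeg_intent, confidence):
    """gaze_target: (x, y) the patient is fixating.
    eeg_intent: decoded class, e.g. 'move', 'speak', or 'rest' (assumed labels).
    confidence: the decoder's probability for that class."""
    if confidence < 0.8 or eeg_intent == "rest":
        return None                       # do nothing on weak or absent intent
    if eeg_intent == "move":
        return ("walk_toward", gaze_target)
    return ("speak_word_at", gaze_target)

print(decide((320, 240), "move", 0.93))   # -> ('walk_toward', (320, 240))
print(decide((320, 240), "move", 0.55))   # -> None (too uncertain to act)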

-Webb

2 comments:

fernando said...

As Nietzsche put it, "Man is something that shall be overcome."

webb said...

What, are you afraid of paraplegic cyborgs taking over the world?