A Brain-Computer Interface for Robotic Arm Control


From a historical point of view, computers appeared on the stage only recently. Since the invention of what was arguably the first programmable computer, the Z1 built by Konrad Zuse around 1938, the number of computing devices has been growing at an enormous pace. The number of computer users is clearly growing to the same extent, and nowadays the ability to use a computer with ease is a mandatory skill. Primarily intended as a tool to automate calculations, the computer has evolved from a simple tool into an all-rounder used in entertainment, multimedia applications and industrial control, and it has even almost replaced pencil and paper.

During the evolution of computers, numerous devices have been developed to allow for easy communication between humans and computers. These devices are often specifically tailored to a certain task. A prime example is the keyboard, which is almost exclusively used to communicate letters to the computer and is less suitable for manipulating graphical objects as they occur in graphical applications such as drawing programs, modern graphical user interfaces or games. For the latter task, 2D pointing devices like mice or trackballs are usually used, since this type of task demands non-symbolic continuous input data, for which pointing gestures are the most intuitive means. Numerous other input devices have been developed for even more specialized tasks, e.g. braille keyboards and displays (figure 1.1), intended for sight-impaired people as a replacement for the standard keyboard and computer screen. Yet all of these input devices require that the user retains at least some voluntary control over their limb movements.

For severely impaired people, as is the case with amyotrophic lateral sclerosis (ALS) or spinal cord injury, no movement of the muscles in the lower extremities, or even in the whole body, is possible. Therefore none of the preceding input devices is applicable to this group of people. Even though the target group of late-phase ALS patients is quite small, spinal cord injuries are rather common. People who have completely lost motor control over their body and are unable to talk, move or express their feelings in other ways are considered locked-in patients, since they are essentially prisoners in their own body. As their brain is usually not severely damaged, one possible way to gain access to their world is to extract relevant information about their intentions directly from their brain activity. Nowadays several methods exist to measure human brain activity, both invasive and non-invasive. Devices exploiting brain-activity data for communication are not even new. Considering the development timeline of the computer mouse,