Brain–computer interface: controlling a robotic arm using facial expressions

http://journals.tubitak.gov.tr/elektrik/issues/elk-18-26-2/elk-26-2-7-1606-296.pdf

Abstract: The aim of this paper is to develop a brain–computer interface (BCI) system that can control a robotic arm using EEG signals generated by facial expressions. The EEG signals are acquired using a neurosignal acquisition headset. The robotic arm consists of a 3-D printed prosthetic hand attached to a forearm and elbow made of craft wood. The arm is designed to perform four moves, each controlled by one facial expression; hence, four different EEG signals are used in this work. The performance of the BCI robotic arm is evaluated by testing it on 10 subjects. Initially, 14 electrodes were used to collect the EEG signals, yielding an accuracy of around 95%. We further analyzed the minimum number of electrodes the system requires, and found that seven electrodes (instead of 14), placed in the parietal, temporal, and frontal regions, are sufficient for proper operation. The accuracy of the system with 7 electrodes remains around 95%.
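The abstract describes four facial-expression classes, each driving one arm move. A minimal sketch of such a class-to-command mapping is shown below; the expression labels and move names are hypothetical illustrations, not taken from the paper:

```python
# Hypothetical sketch: mapping four classified facial-expression
# EEG classes to four robotic-arm moves, as the abstract outlines.
# Labels and move names are illustrative, not from the paper.

EXPRESSION_TO_MOVE = {
    "smile": "grip",
    "clench": "release",
    "raise_brow": "elbow_up",
    "blink": "elbow_down",
}

def dispatch(expression: str) -> str:
    """Return the arm command for a classified expression.

    An unrecognized class keeps the arm in its current pose,
    a safe default for a prosthetic controller.
    """
    return EXPRESSION_TO_MOVE.get(expression, "hold")

print(dispatch("smile"))    # grip
print(dispatch("frown"))    # hold
```

In a real pipeline, `dispatch` would sit downstream of the EEG classifier, translating its output into motor commands for the arm.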