Classifying Brain Waves of Left and Right Hand Movement Imagery with a Portable Electroencephalograph
Abstract

Objective: This study is a follow-up to research by Li, Xu, and Zhu (2015). It attempts to show that a portable electroencephalograph (EEG) with its electrodes placed on the forehead is capable of classifying eye movements, but not of classifying movement imagery.

Background: Brain-Computer Interfaces (BCIs) already enable severely disabled patients to interact with their environment (Chaudhary, Birbaumer, & Ramos-Murguialday, 2016). However, the traditional EEG systems they require are difficult and inconvenient to use in portable BCIs. A portable EEG system is therefore desirable in practice, but little research exists on using such systems to classify movement imagery.

Method: The portable EEG system Muse was used to record data while participants followed a stimulus on a screen. The stimulus alternated between appearing on the left and on the right. Participants followed the stimulus either with their eyes, or by imagining closing the hand on the same side as the stimulus. For each of these tasks, and for two additional tasks, a Support Vector Machine and a Neural Network were trained.

Results: Both algorithms classified looking to the left versus looking to the right with an accuracy above 80%. In contrast, neither of them could classify imagining closing the left hand versus the right hand.

Conclusion: These results show that a portable EEG system is capable of classifying the direction of eye movements. In addition, contrary to the claim by Li et al. (2015), this research suggests that Muse is not capable of classifying left versus right hand movement imagery.

Application: Patients with classical locked-in syndrome can still make vertical eye movements (Bauer, Gerstenbrand, & Rumpl, 1979). If this research's results generalize to all eye movements, these patients could potentially interact with the external world more easily.
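The classification step described in the Method section can be illustrated with a minimal sketch. This is not the authors' pipeline: the feature vectors below are synthetic stand-ins for EEG band-power features (Muse has four forehead/ear electrodes, hence four dimensions here), and the classifier is a simple Pegasos-style linear Support Vector Machine trained by subgradient descent on the hinge loss, assumed only for illustration.

```python
import numpy as np

# Hypothetical sketch: a linear SVM separating "look left" vs "look right"
# trials. Features are synthetic 4-dimensional vectors, NOT real Muse data.
rng = np.random.default_rng(0)

n = 200  # trials per class
left = rng.normal(loc=-1.0, scale=0.5, size=(n, 4))   # synthetic "left" trials
right = rng.normal(loc=+1.0, scale=0.5, size=(n, 4))  # synthetic "right" trials
X = np.vstack([left, right])
y = np.hstack([-np.ones(n), np.ones(n)])  # -1 = left, +1 = right

# Pegasos-style training: shrink weights each step, push on margin violations.
w = np.zeros(4)
b = 0.0
lam = 0.01  # L2 regularization strength
t = 0       # global step counter for the decaying learning rate
for epoch in range(50):
    for i in rng.permutation(2 * n):
        t += 1
        lr = 1.0 / (lam * t)
        if y[i] * (X[i] @ w + b) < 1:  # inside the margin: hinge subgradient
            w = (1 - lr * lam) * w + lr * y[i] * X[i]
            b += lr * y[i]             # unregularized bias term
        else:
            w = (1 - lr * lam) * w     # regularization shrink only

pred = np.sign(X @ w + b)
accuracy = (pred == y).mean()
```

On this well-separated synthetic data the classifier reaches high training accuracy; the study's above-80% result for real eye-movement data, and the failure on movement imagery, suggest the real features are far noisier and, for imagery, not separable at forehead electrode sites.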