Simultaneously exploring multi-scale and asymmetric EEG features for emotion recognition

13 Oct 2021  ·  Yihan Wu, Min Xia, Li Nie, Yangsong Zhang, Andong Fan

In recent years, emotion recognition based on electroencephalography (EEG) has received growing interest in the brain-computer interface (BCI) field. Neuroscience research indicates that the left and right brain hemispheres exhibit activity differences under different emotional states, which could serve as an important principle for designing deep learning (DL) models for emotion recognition. In addition, owing to the nonstationarity of EEG signals, convolution kernels of a single size may not sufficiently extract the rich features needed for EEG classification tasks. Motivated by these two considerations, we proposed a model termed Multi-Scales Bi-hemispheric Asymmetric Model (MSBAM), built on a convolutional neural network (CNN) structure. Evaluated on the public DEAP and DREAMER datasets, MSBAM achieved over 99% accuracy on the two-class classification of low-level versus high-level states in each of four emotional dimensions, i.e., arousal, valence, dominance and liking. This study further demonstrates the promising potential of designing DL models from the multi-scale characteristics of EEG data and the neural mechanisms of emotion cognition.
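
The two design principles described above (multi-scale convolution kernels and bi-hemispheric asymmetry) can be illustrated with a minimal sketch. The PyTorch module below is a hypothetical example, not the authors' MSBAM architecture: it assumes EEG segments arranged as symmetric left/right electrode pairs, takes the left-right difference as the asymmetric feature, and applies parallel temporal convolution branches with different kernel sizes. All shapes, kernel sizes, and layer choices are illustrative.

```python
# Hypothetical sketch (not the published MSBAM model): multi-scale temporal
# convolutions applied to a left-right hemispheric difference signal.
import torch
import torch.nn as nn

class MultiScaleAsymmetricNet(nn.Module):
    def __init__(self, n_pairs=14, n_classes=2):
        super().__init__()
        # Parallel branches with different kernel sizes capture temporal
        # patterns at multiple scales; sizes here are assumptions.
        self.branches = nn.ModuleList([
            nn.Sequential(
                nn.Conv1d(n_pairs, 16, kernel_size=k, padding=k // 2),
                nn.BatchNorm1d(16),
                nn.ELU(),
                nn.AdaptiveAvgPool1d(8),
            )
            for k in (3, 7, 15)
        ])
        self.classifier = nn.Linear(3 * 16 * 8, n_classes)

    def forward(self, left, right):
        # left, right: (batch, n_pairs, n_samples) EEG from symmetric
        # electrode pairs; their difference encodes hemispheric asymmetry.
        asym = left - right
        feats = [branch(asym).flatten(1) for branch in self.branches]
        return self.classifier(torch.cat(feats, dim=1))

# Example with random data shaped like a short EEG segment (assumed shapes).
left = torch.randn(4, 14, 128)
right = torch.randn(4, 14, 128)
logits = MultiScaleAsymmetricNet()(left, right)
print(logits.shape)  # torch.Size([4, 2])
```

In this sketch, each branch sees the same asymmetry signal but at a different temporal receptive field, and the concatenated features feed a single linear classifier for the binary low/high emotional-state decision.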
