An intelligent emotion recognition system based on electroencephalography (EEG) signals shows considerable potential in domains such as healthcare, entertainment, and education, thanks to EEG's portability, high temporal resolution, and real-time capability. However, existing research in this field is limited by the nonstationary nature and individual variability of EEG signals. In this study, we present a novel EEG emotion recognition model, named GraphEmotionNet, designed to improve the accuracy of EEG-based emotion recognition by incorporating a spatiotemporal attention mechanism and transfer learning. GraphEmotionNet learns the intrinsic connections between EEG channels and constructs an adaptive graph; this adaptivity optimizes the spatial–temporal graph convolutions, which strengthens spatial–temporal feature representation and supports emotion classification. In addition, an integrated domain adaptation module aligns the extracted features across domains, further mitigating the impact of individual EEG variability. We evaluate the model on two benchmark databases under two cross-validation protocols: within-subject and cross-subject cross-validation. The experimental results confirm the model's ability to extract EEG features linked to emotional semantics and demonstrate its promising performance in emotion recognition.
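The abstract does not specify implementation details. Purely as an illustration, the sketch below (PyTorch; the class name, tensor shapes, and the softmax row-normalization are assumptions, not taken from GraphEmotionNet) shows one way a learnable adjacency matrix over EEG channels can drive a spatial–temporal graph convolution block of the kind the abstract describes.

```python
# Hypothetical sketch: a learnable adjacency over EEG channels feeding a
# spatial-temporal graph convolution. Names and shapes are illustrative,
# not taken from the GraphEmotionNet paper.
import torch
import torch.nn as nn
import torch.nn.functional as F


class AdaptiveSTGraphConv(nn.Module):
    """One spatial-temporal block with a learned channel graph."""

    def __init__(self, num_channels: int, in_feats: int, out_feats: int,
                 kernel_t: int = 3):
        super().__init__()
        # Learnable adjacency logits; softmax row-normalizes them so each
        # EEG channel aggregates a convex combination of the others.
        self.adj_logits = nn.Parameter(torch.randn(num_channels, num_channels))
        self.spatial = nn.Linear(in_feats, out_feats)
        # Temporal convolution applied per channel along the time axis.
        self.temporal = nn.Conv1d(out_feats, out_feats, kernel_t,
                                  padding=kernel_t // 2)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time, channels, features)
        adj = F.softmax(self.adj_logits, dim=-1)        # adaptive graph
        x = torch.einsum("cd,btdf->btcf", adj, x)       # spatial aggregation
        x = F.relu(self.spatial(x))                     # feature transform
        b, t, c, f = x.shape
        x = x.permute(0, 2, 3, 1).reshape(b * c, f, t)  # fold channels into batch
        x = self.temporal(x)                            # temporal convolution
        return x.reshape(b, c, f, t).permute(0, 3, 1, 2)


if __name__ == "__main__":
    block = AdaptiveSTGraphConv(num_channels=62, in_feats=5, out_feats=16)
    eeg = torch.randn(8, 200, 62, 5)  # batch, time steps, channels, band features
    print(block(eeg).shape)           # torch.Size([8, 200, 62, 16])
```

Because the adjacency is a trainable parameter rather than a fixed electrode layout, the channel graph is shaped by the emotion classification loss itself; the paper's domain adaptation component would sit on top of the features such a block produces.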
Emotions, formed in the process of perceiving the external environment, directly affect daily human life, including social interaction, work efficiency, physical wellness, and mental health. In recent decades, emotion recognition has become a promising research direction with significant application value. Taking advantage of electroencephalogram (EEG) signals (i.e., high temporal resolution) and video-based external emotion evocation (i.e., rich media information), video-triggered emotion recognition with EEG signals has proven to be a useful tool for conducting emotion-related studies in a laboratory environment and provides constructive technical support for building real-time emotion interaction systems. In this paper, we focus on video-triggered EEG-based emotion recognition and present a systematic introduction to the currently available video-triggered EEG-based emotion databases and the corresponding analysis methods. First, the current video-triggered EEG databases for emotion recognition (e.g., DEAP, MAHNOB-HCI, and the SEED series) are presented in full detail. Then, the EEG feature extraction, feature selection, and modeling methods commonly used in video-triggered EEG-based emotion recognition are systematically summarized, and a brief review of the current state of video-triggered EEG-based emotion studies is provided. Finally, the limitations and possible prospects of the existing video-triggered EEG emotion databases are discussed.
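As context for the feature-extraction methods this review summarizes, the sketch below illustrates differential-entropy (DE) band features, one feature family commonly reported in work on DEAP and SEED. The band boundaries, sampling rate, and function names here are illustrative assumptions, not details from the paper.

```python
# Illustrative sketch (not from the paper): differential-entropy feature
# extraction per frequency band, a feature family commonly used with
# DEAP- and SEED-style EEG recordings.
import numpy as np
from scipy.signal import butter, filtfilt

# Standard EEG bands (Hz); exact boundaries vary slightly across studies.
BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 14),
         "beta": (14, 31), "gamma": (31, 50)}


def differential_entropy(signal: np.ndarray, fs: float, band: tuple) -> float:
    """DE of a band-passed signal, assuming an approximately Gaussian
    amplitude distribution: DE = 0.5 * ln(2 * pi * e * variance)."""
    low, high = band
    b, a = butter(4, [low / (fs / 2), high / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, signal)
    return 0.5 * np.log(2 * np.pi * np.e * np.var(filtered))


def extract_features(eeg: np.ndarray, fs: float = 200.0) -> np.ndarray:
    """eeg: (channels, samples) -> (channels, bands) DE feature matrix."""
    return np.array([[differential_entropy(ch, fs, band)
                      for band in BANDS.values()] for ch in eeg])


if __name__ == "__main__":
    fake_eeg = np.random.randn(32, 10 * 200)  # 32 channels, 10 s at 200 Hz
    print(extract_features(fake_eeg).shape)   # (32, 5)
```

A channels-by-bands matrix of this kind is a typical input to the feature selection and modeling stages the review goes on to survey.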