Research Article | Open Access

An improved EEGNet for single-trial EEG classification in rapid serial visual presentation task

Faculty of Intelligent Manufacturing, Wuyi University, Jiangmen 529020, Guangdong, China

Abstract

As a new type of brain-computer interface (BCI), the rapid serial visual presentation (RSVP) paradigm has attracted significant attention. RSVP works by detecting the P300 component evoked by the target image, enabling fast and accurate recognition. This paper proposed an improved EEGNet model that achieves good performance on both offline and online data. Specifically, the data were filtered with xDAWN to enhance the signal-to-noise ratio of the electroencephalogram (EEG) signals, and the focal loss function was used in place of the cross-entropy loss function to address the class imbalance between target and non-target samples. Additionally, subject-specific data were fed to the improved EEGNet model to obtain a subject-specific model. We applied the proposed model at the BCI Controlled Robot Contest in World Robot Contest 2021 and won second place; the average recall rate of the four participants reached 51.56% in triple classification. On the offline benchmark dataset (64-subject RSVP task), the average recall rates of groups A and B reached 76.07% and 78.11%, respectively. We provide an alternative method for identifying targets based on the RSVP paradigm.
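The focal loss mentioned in the abstract down-weights well-classified examples so that the rare target (P300) trials dominate training. A minimal NumPy sketch of the binary focal loss of Lin et al. is shown below; it is an illustration, not the paper's implementation, and the `gamma` and `alpha` values are the common defaults rather than the values used in the study:

```python
import numpy as np

def focal_loss(p, y, gamma=2.0, alpha=0.25):
    """Binary focal loss: -alpha_t * (1 - p_t)^gamma * log(p_t).

    p: predicted probability of the positive (target) class, shape (N,)
    y: binary labels in {0, 1}, shape (N,)
    With gamma = 0 this reduces to alpha-weighted cross-entropy.
    """
    p = np.clip(p, 1e-7, 1.0 - 1e-7)              # avoid log(0)
    p_t = np.where(y == 1, p, 1.0 - p)            # probability of the true class
    alpha_t = np.where(y == 1, alpha, 1.0 - alpha)
    return -alpha_t * (1.0 - p_t) ** gamma * np.log(p_t)

# An easy (confident, correct) trial contributes far less loss than a hard one,
# which is what keeps the abundant non-target trials from dominating training.
easy = focal_loss(np.array([0.95]), np.array([1]))[0]
hard = focal_loss(np.array([0.30]), np.array([1]))[0]
```

In practice the same expression is applied per class inside the network's training loop; the modulating factor `(1 - p_t)^gamma` is the only change relative to standard weighted cross-entropy.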

Brain Science Advances
Pages 111-126
Cite this article:
Zhang H, Wang Z, Yu Y, et al. An improved EEGNet for single-trial EEG classification in rapid serial visual presentation task. Brain Science Advances, 2022, 8(2): 111-126. https://doi.org/10.26599/BSA.2022.9050007


Received: 08 February 2022
Revised: 11 March 2022
Accepted: 21 March 2022
Published: 29 June 2022
© The authors 2022.

This article is published with open access at journals.sagepub.com/home/BSA

Creative Commons Non Commercial CC BY-NC: This article is distributed under the terms of the Creative Commons Attribution-NonCommercial 4.0 License (http://www.creativecommons.org/licenses/by-nc/4.0/), which permits non-commercial use, reproduction and distribution of the work without further permission provided the original work is attributed as specified on the SAGE and Open Access pages (https://us.sagepub.com/en-us/nam/open-access-at-sage).
