Open Access

p-Norm Broad Learning for Negative Emotion Classification in Social Networks

Laboratory of Language Engineering and Computing, Guangdong University of Foreign Studies, Guangzhou 510006, China
Guangdong Provincial Key Laboratory of Nanophotonic Functional Materials and Devices, South China Normal University, Guangzhou 511400, China
School of Computer Science and Software, Zhaoqing University, Zhaoqing 526000, China
School of Information Science and Technology, Guangdong University of Foreign Studies, Guangzhou 510006, China
School of Computing, University of Leeds, Leeds LS2 9JT, United Kingdom

Abstract

Negative emotion classification refers to the automatic classification of negative emotions expressed in social network texts. Most existing methods are based on deep learning models and face challenges such as complex structures and too many hyperparameters. To meet these challenges, this paper proposes a method for negative emotion classification that combines a Robustly Optimized BERT Pretraining Approach (RoBERTa) with p-norm Broad Learning (p-BL). The paper makes three main contributions. First, we fine-tune RoBERTa to adapt it to the task of negative emotion classification and then use the fine-tuned model to extract features from the original texts and generate sentence vectors. Second, we adopt p-BL to construct a classifier and use it to predict the negative emotions of texts. Compared with deep learning models, p-BL has a simple three-layer structure and fewer parameters to train; moreover, it can suppress the adverse effects of outliers and noise in the data by flexibly changing the value of p. Third, we conduct extensive experiments on public datasets, and the results show that the proposed method outperforms the baseline methods on the tested datasets.
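The two-stage pipeline described in the abstract can be illustrated with short sketches. The first sketch extracts sentence vectors from a (fine-tuned) RoBERTa encoder via the Hugging Face transformers library; the checkpoint name, pooling choice, and function names are placeholders for illustration and are not taken from the paper.

```python
# A minimal sketch of step 1: sentence-vector extraction with RoBERTa.
# "roberta-base" and the first-token pooling are illustrative assumptions.
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("roberta-base")   # placeholder checkpoint
encoder = AutoModel.from_pretrained("roberta-base")

def sentence_vectors(texts, batch_size=32):
    """Return one feature vector per text, taking the first-token hidden state."""
    feats = []
    encoder.eval()
    with torch.no_grad():
        for i in range(0, len(texts), batch_size):
            batch = tokenizer(texts[i:i + batch_size], padding=True,
                              truncation=True, return_tensors="pt")
            out = encoder(**batch)
            feats.append(out.last_hidden_state[:, 0, :])     # <s> / [CLS] position
    return torch.cat(feats).numpy()
```

The second sketch outlines a broad-learning-style classifier trained under a p-norm error criterion. The random mappings, hyperparameter values, and the iteratively reweighted least-squares solver below are illustrative assumptions; the authors' p-BL formulation may differ in detail.

```python
# A minimal sketch of step 2: a 3-layer broad learning classifier with a
# p-norm error criterion, assuming RoBERTa sentence vectors as input X.
import numpy as np

def train_p_bl(X, Y, n_feature_nodes=100, n_enhance_nodes=200,
               p=1.5, lam=1e-3, n_irls_iters=10, seed=0):
    """X: (n_samples, d) features; Y: (n_samples, n_classes) one-hot labels."""
    rng = np.random.default_rng(seed)
    # Feature nodes: random linear mapping of the input
    We = rng.standard_normal((X.shape[1], n_feature_nodes))
    Z = np.tanh(X @ We)
    # Enhancement nodes: nonlinear expansion of the feature nodes
    Wh = rng.standard_normal((n_feature_nodes, n_enhance_nodes))
    H = np.tanh(Z @ Wh)
    # Broad layer: concatenation of feature and enhancement nodes
    A = np.hstack([Z, H])
    # Solve min_W ||A W - Y||_p^p + lam ||W||_2^2 by iteratively
    # reweighted least squares: each sample gets weight |e|^(p-2).
    W = np.linalg.solve(A.T @ A + lam * np.eye(A.shape[1]), A.T @ Y)
    for _ in range(n_irls_iters):
        residual = np.linalg.norm(A @ W - Y, axis=1) + 1e-8
        w = residual ** (p - 2)          # for p < 2, large residuals get small weights
        Aw = A * w[:, None]
        W = np.linalg.solve(A.T @ Aw + lam * np.eye(A.shape[1]), Aw.T @ Y)
    return We, Wh, W

def predict_p_bl(X, We, Wh, W):
    Z = np.tanh(X @ We)
    A = np.hstack([Z, np.tanh(Z @ Wh)])
    return np.argmax(A @ W, axis=1)
```

With p = 2 the solver reduces to the ridge-regression solution of the standard broad learning system; choosing p < 2 down-weights samples with large residuals, which is the mechanism the abstract refers to for suppressing outliers and noise.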

Big Data Mining and Analytics
Pages 245-256
Cite this article:
Chen G, Peng S, Zeng R, et al. p-Norm Broad Learning for Negative Emotion Classification in Social Networks. Big Data Mining and Analytics, 2022, 5(3): 245-256. https://doi.org/10.26599/BDMA.2022.9020008

Views: 5705 | Downloads: 254
Citations: Crossref 8 | Web of Science 5 | Scopus 6 | CSCD 0

Received: 17 December 2021
Revised: 30 March 2022
Accepted: 31 March 2022
Published: 09 June 2022
© The author(s) 2022.

The articles published in this open access journal are distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/).
