Graph data have extensive applications in domains such as social networks, biological reaction networks, and molecular structures. Graph classification aims to predict the properties of entire graphs and plays a crucial role in many downstream applications. However, existing graph neural network methods require large amounts of labeled data during training. In real-world scenarios, labels are extremely costly to acquire, so labeled samples typically account for only a small portion of the training data, which limits model performance. Current semi-supervised graph classification methods, such as those based on pseudo-labels and knowledge distillation, still struggle to effectively exploit unlabeled graph data and to mitigate pseudo-label bias. To address these challenges, we propose Semi-supervised graph Contrastive learning based on Associative Memory network and Pseudo-label Similarity (SCoAMPS). SCoAMPS integrates pseudo-labeling with contrastive learning: it generates contrastive views through multiple encoders, selects positive and negative samples by pseudo-label similarity, and defines an associative memory network to alleviate pseudo-label bias. Experimental results demonstrate that SCoAMPS achieves significant performance improvements on multiple public datasets.
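To make the pseudo-label-similarity step concrete, the sketch below shows one plausible way to split a batch into positive and negative samples for a given anchor graph by comparing pseudo-label (class-probability) vectors. The function names, the cosine-similarity measure, and the threshold value are illustrative assumptions for exposition only, not the paper's actual implementation.

```python
import numpy as np

def pseudo_label_similarity(p_i, p_j):
    # Cosine similarity between two pseudo-label (class-probability) vectors.
    return float(np.dot(p_i, p_j) / (np.linalg.norm(p_i) * np.linalg.norm(p_j)))

def select_pairs(pseudo_labels, anchor, threshold=0.8):
    """Partition the other graphs in the batch into positives and negatives
    for the anchor, based on pseudo-label similarity (hypothetical rule)."""
    positives, negatives = [], []
    for j, p in enumerate(pseudo_labels):
        if j == anchor:
            continue
        sim = pseudo_label_similarity(pseudo_labels[anchor], p)
        (positives if sim >= threshold else negatives).append(j)
    return positives, negatives

# Toy batch of softmax outputs from one encoder (hypothetical values).
probs = np.array([
    [0.9, 0.1],    # anchor
    [0.85, 0.15],  # similar pseudo-label -> positive candidate
    [0.2, 0.8],    # dissimilar pseudo-label -> negative candidate
])
pos, neg = select_pairs(probs, anchor=0, threshold=0.8)
```

In a contrastive objective such as InfoNCE, the `pos` indices would then be pulled toward the anchor's embedding and the `neg` indices pushed away; the associative memory network described in the abstract would additionally correct unreliable pseudo-labels before this selection.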