[3]
H. Ma, C. Yan, Y. Guo, S. Wang, Y. Wang, H. Sun, and J. Huang, Improving molecular property prediction on limited data with deep multi-label learning, in Proc. 2020 IEEE Int. Conf. Bioinformatics and Biomedicine (BIBM), Seoul, Republic of Korea, 2020, pp. 2779–2784.
[8]
Y. Song, S. Zheng, Z. Niu, Z. H. Fu, Y. Lu, and Y. Yang, Communicative representation learning on attributed molecular graphs, in Proc. 29th Int. Joint Conf. Artificial Intelligence, Yokohama, Japan, 2020, pp. 2831–2838.
[9]
H. Li, D. Zhao, and J. Zeng, KPGT: Knowledge-guided pre-training of graph transformer for molecular property prediction, in Proc. 28th ACM SIGKDD Conf. Knowledge Discovery and Data Mining, Washington, DC, USA, 2022, pp. 857–867.
[11]
Y. Rong, Y. Bian, T. Xu, W. Xie, Y. Wei, W. Huang, and J. Huang, Self-supervised graph transformer on large-scale molecular data, in Proc. 34th Int. Conf. Neural Information Processing Systems, Vancouver, Canada, 2020, pp. 12559–12571.
[13]
S. Yin and G. Zhong, LGI-GT: Graph transformers with local and global operators interleaving, in Proc. 32nd Int. Joint Conf. Artificial Intelligence, Macao, China, 2023, pp. 4504–4512.
[14]
S. Luo, T. Chen, Y. Xu, S. Zheng, T. Y. Liu, L. Wang, and D. He, One transformer can understand both 2D & 3D molecular data, arXiv preprint arXiv: 2210.01765, 2023.
[18]
S. Liu, H. Wang, W. Liu, J. Lasenby, H. Guo, and J. Tang, Pre-training molecular graph representation with 3D geometry, arXiv preprint arXiv: 2110.07728, 2022.
[19]
S. Li, J. Zhou, T. Xu, D. Dou, and H. Xiong, GeomGCL: Geometric graph contrastive learning for molecular property prediction, in Proc. 36th AAAI Conf. Artificial Intelligence, Virtual Event, 2022, pp. 4541–4549.
[20]
J. Zhu, Y. Xia, L. Wu, S. Xie, T. Qin, W. Zhou, H. Li, and T. Y. Liu, Unified 2D and 3D pre-training of molecular representations, in Proc. 28th ACM SIGKDD Conf. Knowledge Discovery and Data Mining, Washington, DC, USA, 2022, pp. 2626–2636.
[21]
Z. Guo, W. Yu, C. Zhang, M. Jiang, and N. V. Chawla, GraSeq: Graph and sequence fusion learning for molecular property prediction, in Proc. 29th ACM Int. Conf. Information & Knowledge Management, Virtual Event, 2020, pp. 435–443.
[25]
Z. Hao, C. Lu, Z. Huang, H. Wang, Z. Hu, Q. Liu, E. Chen, and C. Lee, ASGN: An active semi-supervised graph neural network for molecular property prediction, in Proc. 26th ACM SIGKDD Int. Conf. Knowledge Discovery & Data Mining, Virtual Event, 2020, pp. 731–752.
[26]
F. Y. Sun, J. Hoffmann, V. Verma, and J. Tang, InfoGraph: Unsupervised and semi-supervised graph-level representation learning via mutual information maximization, arXiv preprint arXiv: 1908.01000, 2020.
[28]
Y. Sun, Y. Chen, W. Ma, W. Huang, K. Liu, Z. Ma, W. Y. Ma, and Y. Lan, PEMP: Leveraging physics properties to enhance molecular property prediction, in Proc. 31st ACM Int. Conf. Information & Knowledge Management, Atlanta, GA, USA, 2022, pp. 3505–3513.
[29]
W. Chen, A. Tripp, and J. M. Hernández-Lobato, Meta-learning adaptive deep kernel Gaussian processes for molecular property prediction, arXiv preprint arXiv: 2205.02708, 2023.
[30]
X. Zhuang, Q. Zhang, B. Wu, K. Ding, Y. Fang, and H. Chen, Graph sampling-based meta-learning for molecular property prediction, arXiv preprint arXiv: 2306.16780, 2023.
[35]
D. Weininger, A. Weininger, and J. L. Weininger, SMILES. 2. Algorithm for generation of unique SMILES notation, J. Chem. Inf. Comput. Sci., vol. 29, no. 2, pp. 97–101, 1989.
[36]
D. Weininger, SMILES. 3. DEPICT. Graphical depiction of chemical structures, J. Chem. Inf. Comput. Sci., vol. 30, no. 3, pp. 237–243, 1990.
[40]
A. D. McNaught and A. Wilkinson, Compendium of Chemical Terminology, 2nd ed. Oxford, UK: Blackwell Science, 1997.
[42]
G. Landrum, RDKit: A software suite for cheminformatics, computational chemistry, and predictive modeling, Greg Landrum, vol. 8, p. 31, 2013.
[46]
X. Q. Lewell, D. B. Judd, S. P. Watson, and M. M. Hann, RECAP-retrosynthetic combinatorial analysis procedure: A powerful new technique for identifying privileged molecular fragments with useful applications in combinatorial chemistry, J. Chem. Inf. Comput. Sci., vol. 38, no. 3, pp. 511–522, 1998.
[49]
F. Kruger, N. Stiefl, and G. A. Landrum, rdScaffoldNetwork: The scaffold network implementation in RDKit, J. Chem. Inf. Model., vol. 60, no. 7, pp. 3331–3335, 2020.
[54]
G. Zhou, Z. Gao, Q. Ding, H. Zheng, H. Xu, Z. Wei, L. Zhang, and G. Ke, Uni-Mol: A universal 3D molecular representation learning framework, ChemRxiv. doi: 10.26434/chemrxiv-2022-jjm0j-v4.
[55]
S. Chithrananda, G. Grand, and B. Ramsundar, ChemBERTa: Large-scale self-supervised pretraining for molecular property prediction, arXiv preprint arXiv: 2010.09885, 2020.
[56]
A. Yüksel, E. Ulusoy, A. Ünlü, and T. Doǧan, SELFormer: Molecular representation learning via SELFIES language models, Mach. Learn.: Sci. Technol., vol. 4, no. 2, p. 025035, 2023.
[57]
X. C. Zhang, J. C. Yi, G. P. Yang, C. K. Wu, T. J. Hou, and D. S. Cao, ABC-Net: A divide-and-conquer based deep learning architecture for SMILES recognition from molecular images, Brief. Bioinform., vol. 23, no. 2, p. bbac033, 2022.
[59]
P. Liu, X. Qiu, X. Chen, S. Wu, and X. Huang, Multi-timescale long short-term memory neural network for modelling sentences and documents, in Proc. 2015 Conf. Empirical Methods in Natural Language Processing, Lisbon, Portugal, 2015, pp. 2326–2335.
[60]
J. Chung, C. Gulcehre, K. Cho, and Y. Bengio, Gated feedback recurrent neural networks, in Proc. 32nd Int. Conf. Machine Learning, Lille, France, 2015, pp. 2067–2075.
[65]
X. Zhang, C. Chen, Z. Meng, Z. Yang, H. Jiang, and X. Cui, CoAtGIN: Marrying convolution and attention for graph-based molecule property prediction, in Proc. 2022 IEEE Int. Conf. Bioinformatics and Biomedicine (BIBM), Las Vegas, NV, USA, 2022, pp. 374–379.
[68]
H. Ma, Y. Bian, Y. Rong, W. Huang, T. Xu, W. Xie, G. Ye, and J. Huang, Multi-view graph neural networks for molecular property prediction, arXiv preprint arXiv: 2005.13607, 2020.
[70]
J. Feng, Z. Wang, Y. Li, B. Ding, Z. Wei, and H. Xu, MGMAE: Molecular representation learning by reconstructing heterogeneous graphs with a high mask ratio, in Proc. 31st ACM Int. Conf. Information & Knowledge Management, Atlanta, GA, USA, 2022, pp. 509–519.
[72]
S. Yang, Z. Li, G. Song, and L. Cai, Deep molecular representation learning via fusing physical and chemical information, in Proc. Annu. Conf. Neural Information Processing Systems, Virtual Event, 2021, pp. 16346–16357.
[78]
C. Lu, Q. Liu, C. Wang, Z. Huang, P. Lin, and L. He, Molecular property prediction: A multilevel quantum interactions modeling perspective, in Proc. 33rd AAAI Conf. Artificial Intelligence, Honolulu, HI, USA, 2019, pp. 1052–1060.
[79]
M. Fey, J. G. Yuen, and F. Weichert, Hierarchical inter-message passing for learning on molecular graphs, arXiv preprint arXiv: 2006.12179, 2020.
[80]
F. Wu, D. Radev, and S. Z. Li, Molformer: Motif-based transformer on 3D heterogeneous molecular graphs, in Proc. 37th AAAI Conf. Artificial Intelligence, Washington, DC, USA, 2023, pp. 5312–5320.
[81]
F. B. Fuchs, D. E. Worrall, V. Fischer, and M. Welling, SE(3)-transformers: 3D roto-translation equivariant attention networks, arXiv preprint arXiv: 2006.10503, 2020.
[82]
K. T. Schütt, O. T. Unke, and M. Gastegger, Equivariant message passing for the prediction of tensorial properties and molecular spectra, arXiv preprint arXiv: 2102.03150, 2021.
[83]
J. Brandstetter, R. Hesselink, E. van der Pol, E. J. Bekkers, and M. Welling, Geometric and physical quantities improve E(3) equivariant message passing, arXiv preprint arXiv: 2110.02905, 2022.
[84]
J. Gasteiger, F. Becker, and S. Günnemann, GemNet: Universal directional graph neural networks for molecules, arXiv preprint arXiv: 2106.08903, 2022.
[85]
J. Gasteiger, S. Giri, J. T. Margraf, and S. Günnemann, Fast and uncertainty-aware directional message passing for non-equilibrium molecules, arXiv preprint arXiv: 2011.14115, 2022.
[86]
M. Shuaibi, A. Kolluru, A. Das, A. Grover, A. Sriram, Z. Ulissi, and C. L. Zitnick, Rotation invariant graph neural networks using spin convolutions, arXiv preprint arXiv: 2106.09575, 2021.
[87]
S. Wang, Y. Guo, Y. Wang, H. Sun, and J. Huang, SMILES-BERT: Large scale unsupervised pre-training for molecular property prediction, in Proc. 10th ACM Int. Conf. Bioinformatics, Computational Biology and Health Informatics, Niagara Falls, NY, USA, 2019, pp. 429–436.
[88]
Y. Wang, X. Chen, Y. Min, and J. Wu, MolCloze: A unified cloze-style self-supervised molecular structure learning model for chemical property prediction, in Proc. 2021 IEEE Int. Conf. Bioinformatics and Biomedicine (BIBM), Houston, TX, USA, 2021, pp. 2896–2903.
[91]
Ł. Maziarka, T. Danel, S. Mucha, K. Rataj, J. Tabor, and S. Jastrzebski, Molecule attention transformer, arXiv preprint arXiv: 2002.08264, 2020.
[92]
W. Park, W. Chang, D. Lee, J. Kim, and S. W. Hwang, GRPE: Relative positional encoding for graph transformer, arXiv preprint arXiv: 2201.12787, 2022.
[93]
M. S. Hussain, M. J. Zaki, and D. Subramanian, Global self-attention as a replacement for graph convolution, in Proc. 28th ACM SIGKDD Conf. Knowledge Discovery and Data Mining, Washington, DC, USA, 2022, pp. 655–665.
[94]
D. Masters, J. Dean, K. Klaser, Z. Li, S. Maddrell-Mander, A. Sanders, H. Helal, D. Beker, L. Rampášek, and D. Beaini, GPS++: An optimised hybrid MPNN/transformer for molecular property prediction, arXiv preprint arXiv: 2212.02229, 2022.
[95]
Z. Chen, H. Tan, T. Wang, T. Shen, T. Lu, Q. Peng, C. Cheng, and Y. Qi, Graph propagation transformer for graph representation learning, arXiv preprint arXiv: 2305.11424, 2023.
[101]
D. Kuzminykh, D. Polykovskiy, A. Kadurin, A. Zhebrak, I. Baskov, S. Nikolenko, R. Shayakhmetov, and A. Zhavoronkov, 3D molecular representations based on the wave transform for convolutional neural networks, Mol. Pharm., vol. 15, no. 10, pp. 4378–4385, 2018.
[102]
H. Cai, H. Zhang, D. Zhao, J. Wu, and L. Wang, FP-GNN: A versatile deep learning architecture for enhanced molecular property prediction, Brief. Bioinform., vol. 23, no. 6, p. bbac408, 2022.
[105]
Y. Luo, K. Yang, M. Hong, X. Liu, and Z. Nie, MolFM: A multimodal molecular foundation model, arXiv preprint arXiv: 2307.09484, 2023.
[106]
Y. Sun, M. Islam, E. Zahedi, M. Kuenemann, H. Chouaib, and P. Hu, Molecular property prediction based on bimodal supervised contrastive learning, in Proc. 2022 IEEE Int. Conf. Bioinformatics and Biomedicine (BIBM), Las Vegas, NV, USA, 2022, pp. 394–397.
[107]
P. Liu, Y. Ren, J. Tao, and Z. Ren, GIT-Mol: A multi-modal large language model for molecular science with graph, image, and text, arXiv preprint arXiv: 2308.06911, 2024.
[112]
Y. Liu, L. Wang, M. Liu, X. Zhang, B. Oztekin, and S. Ji, Spherical message passing for 3D molecular graphs, arXiv preprint arXiv: 2102.05013, 2022.
[114]
J. Zhu, Y. Xia, L. Wu, S. Xie, W. Zhou, T. Qin, H. Li, and T. Y. Liu, Dual-view molecular pre-training, in Proc. 29th ACM SIGKDD Conf. Knowledge Discovery and Data Mining, Long Beach, CA, USA, 2023, pp. 3615–3627.
[115]
M. Sun, J. Xing, H. Wang, B. Chen, and J. Zhou, MoCL: Data-driven molecular fingerprint via knowledge-aware contrastive learning from molecular graph, in Proc. 27th ACM SIGKDD Conf. Knowledge Discovery & Data Mining, Singapore, 2021, pp. 3585–3594.
[117]
Y. You, T. Chen, Y. Sui, T. Chen, Z. Wang, and Y. Shen, Graph contrastive learning with augmentations, in Proc. 34th Int. Conf. Neural Information Processing Systems, Vancouver, Canada, 2020, pp. 5812–5823.
[118]
J. Xia, C. Zhao, B. Hu, Z. Gao, C. Tan, Y. Liu, S. Li, and S. Z. Li, Mole-BERT: Rethinking pre-training graph neural networks for molecules, ChemRxiv. doi: 10.26434/chemrxiv-2023-dngg4.
[122]
F. Wu, H. Qin, S. Li, S. Z. Li, X. Zhan, and J. Xu, InstructBio: A large-scale semi-supervised learning paradigm for biochemical problems, arXiv preprint arXiv: 2304.03906, 2023.
[123]
Q. Lv, G. Chen, Z. Yang, W. Zhong, and C. Y. C. Chen, Meta learning with graph attention networks for low-data drug discovery, IEEE Trans. Neural Netw. Learn. Syst.
[124]
J. Devlin, M. W. Chang, K. Lee, and K. Toutanova, BERT: Pre-training of deep bidirectional transformers for language understanding, in Proc. 2019 Conf. North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), Minneapolis, MN, USA, 2019, pp. 4171–4186.
[125]
L. Floridi and M. Chiriatti, GPT-3: Its nature, scope, limits, and consequences, Minds Mach., vol. 30, pp. 681–694, 2020.
[126]
X. C. Zhang, C. K. Wu, Z. J. Yang, Z. X. Wu, J. C. Yi, C. Y. Hsieh, T. J. Hou, and D. S. Cao, MG-BERT: Leveraging unsupervised atomic representation learning for molecular property prediction, Brief. Bioinform., vol. 22, no. 6, p. bbab152, 2021.
[127]
W. Ahmad, E. Simon, S. Chithrananda, G. Grand, and B. Ramsundar, ChemBERTa-2: Towards chemical foundation models, arXiv preprint arXiv: 2209.01712, 2022.
[129]
W. Hu, B. Liu, J. Gomes, M. Zitnik, P. Liang, V. Pande, and J. Leskovec, Strategies for pre-training graph neural networks, arXiv preprint arXiv: 1905.12265, 2020.
[130]
J. Godwin, M. Schaarschmidt, A. Gaunt, A. Sanchez-Gonzalez, Y. Rubanova, P. Veličković, J. Kirkpatrick, and P. Battaglia, Simple GNN regularisation for 3D molecular property prediction & beyond, arXiv preprint arXiv: 2106.07971, 2022.
[131]
S. Liu, H. Guo, and J. Tang, Molecular geometry pretraining with SE(3)-invariant denoising distance matching, arXiv preprint arXiv: 2206.13602, 2023.
[132]
S. Feng, Y. Ni, Y. Lan, Z. M. Ma, and W. Y. Ma, Fractional denoising for 3D molecular pre-training, in Proc. 40th Int. Conf. Machine Learning, Honolulu, HI, USA, 2023, pp. 9938–9961.
[133]
R. Jiao, J. Han, W. Huang, Y. Rong, and Y. Liu, Energy-motivated equivariant pretraining for 3D molecular graphs, in Proc. 37th AAAI Conf. Artificial Intelligence, Washington, DC, USA, 2023, pp. 8096–8104.
[134]
X. Gao, W. Gao, W. Xiao, Z. Wang, C. Wang, and L. Xiang, Supervised pretraining for molecular force fields and properties prediction, arXiv preprint arXiv: 2211.14429, 2022.
[135]
X. Wang, H. Zhao, W. W. Tu, and Q. Yao, Automated 3D pre-training for molecular property prediction, in Proc. 29th ACM SIGKDD Conf. Knowledge Discovery and Data Mining, Long Beach, CA, USA, 2023, pp. 2419–2430.
[136]
L. Zeng, L. Li, and J. Li, MolKD: Distilling cross-modal knowledge in chemical reactions for molecular property prediction, arXiv preprint arXiv: 2305.01912, 2023.
[137]
J. Broberg, M. Bånkestad, and E. Ylipää, Pre-training transformers for molecular property prediction using reaction prediction, arXiv preprint arXiv: 2207.02724, 2022.
[141]
X. Guan and D. Zhang, T-MGCL: Molecule graph contrastive learning based on transformer for molecular property prediction, IEEE/ACM Trans. Comput. Biol. Bioinform., vol. 20, no. 6, pp. 3851–3862, 2023.
[144]
J. Cui, H. Chai, Y. Gong, Y. Ding, Z. Hua, C. Gao, and Q. Liao, MocGCL: Molecular graph contrastive learning via negative selection, in Proc. 2023 Int. Joint Conf. Neural Networks (IJCNN), Gold Coast, Australia, 2023, pp. 1–8.
[145]
K. He, H. Fan, Y. Wu, S. Xie, and R. Girshick, Momentum contrast for unsupervised visual representation learning, in Proc. 2020 IEEE/CVF Conf. Computer Vision and Pattern Recognition, Seattle, WA, USA, 2020, pp. 9726–9735.
[146]
M. J. Zaki and W. Meira Jr., Data Mining and Analysis: Fundamental Concepts and Algorithms. Cambridge, UK: Cambridge University Press, 2014.
[147]
Y. Wang, Y. Min, E. Shao, and J. Wu, Molecular graph contrastive learning with parameterized explainable augmentations, in Proc. 2021 IEEE Int. Conf. Bioinformatics and Biomedicine (BIBM), Houston, TX, USA, 2021, pp. 1558–1563.
[148]
M. Liu, Y. Yang, X. Gong, L. Liu, and Q. Liu, HierMRL: Hierarchical structure-aware molecular representation learning for property prediction, in Proc. 2022 IEEE Int. Conf. Bioinformatics and Biomedicine (BIBM), Las Vegas, NV, USA, 2022, pp. 386–389.
[150]
K. Moon, H. J. Im, and S. Kwon, 3D graph contrastive learning for molecular property prediction, Bioinformatics, vol. 39, no. 6, p. btad371, 2023.
[151]
T. Kuang, Y. Ren, and Z. Ren, 3D-mol: A novel contrastive learning framework for molecular property prediction with 3D information, arXiv preprint arXiv: 2309.17366, 2024.
[153]
R. Hua, X. Wang, C. Cheng, Q. Zhu, and X. Zhou, A chemical domain knowledge-aware framework for multi-view molecular property prediction, in Proc. 7th China Conf. Knowledge Graph and Semantic Computing Evaluations, Qinhuangdao, China, 2022, pp. 1–11.
[154]
Y. Fang, Q. Zhang, H. Yang, X. Zhuang, S. Deng, W. Zhang, M. Qin, Z. Chen, X. Fan, and H. Chen, Molecular contrastive learning with chemical element knowledge graph, in Proc. 36th AAAI Conf. Artificial Intelligence, Virtual Event, 2022, pp. 3968–3976.
[155]
M. Xu, H. Wang, B. Ni, H. Guo, and J. Tang, Self-supervised graph-level representation learning with local and global structure, in Proc. 38th Int. Conf. Machine Learning, Virtual Event, 2021, pp. 11548–11558.
[158]
X. Luo, W. Ju, M. Qu, Y. Gu, C. Chen, M. Deng, X. S. Hua, and M. Zhang, CLEAR: Cluster-enhanced contrast for self-supervised graph representation learning, IEEE Trans. Neural Netw. Learn. Syst., vol. 35, no. 1, pp. 899–912, 2024.
[159]
R. Benjamin, U. Singer, and K. Radinsky, Graph neural networks pretraining through inherent supervision for molecular property prediction, in Proc. 31st ACM Int. Conf. Information & Knowledge Management, Atlanta, GA, USA, 2022, pp. 2903–2912.
[160]
G. Shi, Y. Zhu, J. K. Liu, and X. Li, HeGCL: Advance self-supervised learning in heterogeneous graph-level representation, IEEE Trans. Neural Netw. Learn. Syst.
[163]
R. Hadsell, S. Chopra, and Y. LeCun, Dimensionality reduction by learning an invariant mapping, in Proc. 2006 IEEE Computer Society Conf. Computer Vision and Pattern Recognition (CVPR’06), New York, NY, USA, 2006, pp. 1735–1742.
[164]
G. A. Pinheiro, J. L. F. Da Silva, and M. G. Quiles, SMICLR: Contrastive learning on multiple molecular representations for semisupervised and unsupervised representation learning, J. Chem. Inf. Model., vol. 62, no. 17, pp. 3948–3960, 2022.
[165]
C. Zhang, X. Yan, and Y. Liu, Pseudo-siamese neural network based graph and sequence representation learning for molecular property prediction, in Proc. 2022 IEEE Int. Conf. Bioinformatics and Biomedicine (BIBM), Las Vegas, NV, USA, 2022, pp. 3911–3913.
[166]
H. Stärk, D. Beaini, G. Corso, P. Tossou, C. Dallago, S. Günnemann, and P. Liò, 3D Infomax improves GNNs for molecular property prediction, in Proc. 39th Int. Conf. Machine Learning, Baltimore, MD, USA, 2022, pp. 20479–20502.
[167]
Y. Zhu, D. Chen, Y. Du, Y. Wang, Q. Liu, and S. Wu, Molecular contrastive pretraining with collaborative featurizations, J. Chem. Inf. Model., vol. 64, no. 4, pp. 1112–1122, 2024.
[168]
A. Tarvainen and H. Valpola, Mean teachers are better role models: Weight-averaged consistency targets improve semi-supervised deep learning results, in Proc. 31st Int. Conf. Advances in Neural Information Processing Systems, Long Beach, CA, USA, 2017, pp. 1195–1204.
[170]
D. Berthelot, N. Carlini, I. Goodfellow, A. Oliver, N. Papernot, and C. Raffel, MixMatch: A holistic approach to semi-supervised learning, in Proc. 33rd Int. Conf. Neural Information Processing Systems, Vancouver, Canada, 2019, pp. 5049–5059.
[172]
H. Ma, F. Jiang, Y. Rong, Y. Guo, and J. Huang, Robust self-training strategy for various molecular biology prediction tasks, in Proc. 13th ACM Int. Conf. Bioinformatics, Computational Biology and Health Informatics, Northbrook, IL, USA, 2022, pp. 1–5.
[173]
Z. Zhang and M. R. Sabuncu, Generalized cross entropy loss for training deep neural networks with noisy labels, in Proc. 32nd Int. Conf. Neural Information Processing Systems, Montréal, Canada, 2018, pp. 8792–8802.
[174]
G. Liu, T. Zhao, E. Inae, T. Luo, and M. Jiang, Semi-supervised graph imbalanced regression, arXiv preprint arXiv: 2305.12087, 2023.
[175]
A. R. Zamir, A. Sax, W. Shen, L. Guibas, J. Malik, and S. Savarese, Taskonomy: Disentangling task transfer learning, in Proc. 2018 IEEE/CVF Conf. Computer Vision and Pattern Recognition, Salt Lake City, UT, USA, 2018, pp. 3712–3722.
[177]
X. Chen and K. He, Exploring simple Siamese representation learning, in Proc. 2021 IEEE/CVF Conf. Computer Vision and Pattern Recognition, Nashville, TN, USA, 2021, pp. 15745–15753.
[180]
C. Q. Nguyen, C. Kreatsoulas, and K. M. Branson, Meta-learning GNN initializations for low-resource molecular property prediction, arXiv preprint arXiv: 2003.05996, 2020.
[182]
H. S. de Ocáriz Borde and F. Barbero, Graph neural network expressivity and meta-learning for molecular property regression, arXiv preprint arXiv: 2209.13410, 2022.
[184]
Z. Meng, Y. Li, P. Zhao, Y. Yu, and I. King, Meta-learning with motif-based task augmentation for few-shot molecular property prediction, in Proc. 2023 SIAM Int. Conf. Data Mining (SDM), Minneapolis-St. Paul Twin Cities, MN, USA, 2023, pp. 811–819.
[185]
Z. Guo, C. Zhang, W. Yu, J. Herr, O. Wiest, M. Jiang, and N. V. Chawla, Few-shot graph learning for molecular property prediction, in Proc. Web Conf. 2021, Ljubljana, Slovenia, 2021, pp. 2559–2567.
[186]
Y. Wang, A. Abuduweili, Q. Yao, and D. Dou, Property-aware relation networks for few-shot molecular property prediction, arXiv preprint arXiv: 2107.07994, 2021.
[187]
S. Yao, Z. Feng, J. Song, L. Jia, Z. Zhong, and M. Song, Chemical property relation guided few-shot molecular property prediction, in Proc. 2022 Int. Joint Conf. Neural Networks (IJCNN), Padua, Italy, 2022, pp. 1–8.
[188]
J. Dong, N. N. Wang, Z. J. Yao, L. Zhang, Y. Cheng, D. Ouyang, A. P. Lu, and D. S. Cao, ADMETlab: A platform for systematic ADMET evaluation based on a comprehensively collected ADMET database, J. Cheminform., vol. 10, p. 29, 2018.
[190]
Y. Ji, L. Zhang, J. Wu, B. Wu, L. Li, L. K. Huang, T. Xu, Y. Rong, J. Ren, D. Xue, et al., DrugOOD: Out-of-distribution dataset curator and benchmark for AI-aided drug discovery – a focus on affinity prediction problems with noise annotations, in Proc. 37th AAAI Conf. Artificial Intelligence, Washington, DC, USA, 2023, pp. 8023–8031.
[192]
C. Morris, N. M. Kriege, F. Bause, K. Kersting, P. Mutzel, and M. Neumann, TUDataset: A collection of benchmark datasets for learning with graphs, arXiv preprint arXiv: 2007.08663, 2020.
[193]
W. Hu, M. Fey, M. Zitnik, Y. Dong, H. Ren, B. Liu, M. Catasta, and J. Leskovec, Open graph benchmark: Datasets for machine learning on graphs, in Proc. 34th Int. Conf. Neural Information Processing Systems, Vancouver, Canada, 2020, pp. 22118–22133.
[196]
K. Xu, W. Hu, J. Leskovec, and S. Jegelka, How powerful are graph neural networks? arXiv preprint arXiv: 1810.00826, 2019.
[197]
B. Su, D. Du, Z. Yang, Y. Zhou, J. Li, A. Rao, H. Sun, Z. Lu, and J. R. Wen, A molecular multimodal foundation model associating molecule graphs with natural language, arXiv preprint arXiv: 2209.05481, 2022.
[198]
X. Tang, A. Tran, J. Tan, and M. B. Gerstein, MolLM: A unified language model to integrate biomedical text with 2D and 3D molecular representations, bioRxiv. doi: 10.1101/2023.11.25.568656.