Unsupervised domain adaptation (UDA) aims to transfer knowledge from a labeled source domain to an unlabeled target domain. The main challenge of UDA stems from the domain shift between the source and target domains. Currently, for discrete classification problems, most existing UDA methods adopt a distribution alignment strategy while forcing uncertain instances to pass through low-density areas. However, the scenario of ordinal regression (OR) has rarely been studied in UDA, and traditional UDA methods cannot handle OR well because they do not preserve the order relationships among data labels, as in human age estimation. To address this issue, we propose a structure-oriented adaptation strategy, namely structure-preserved ordinal unsupervised domain adaptation (SPODA). More specifically, on the one hand, the global structure information is modeled and embedded into an auto-encoder framework via a low-rank transferred structure matrix. On the other hand, the local structure information is preserved through a weighted pair-wise strategy in the latent space. Guided by both the local and global structure information, a well-performing latent space is generated, whose geometric structure is further exploited to obtain a more discriminative ordinal regressor. To further enhance generalization, a counterpart of SPODA with a deep architecture is developed. Finally, extensive experiments indicate that SPODA is more effective than existing related domain adaptation methods in addressing the OR problem.
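For intuition only, the sketch below illustrates how an objective of the kind described above could be organized: an auto-encoder reconstruction term, a low-rank "transferred structure" matrix relating source and target latent codes (global structure), and a weighted pair-wise term over latent codes (local structure). All module names, shapes, and trade-off weights here are illustrative assumptions, not the authors' implementation.

```python
# Minimal, hypothetical sketch of a structure-preserving adaptation objective.
# Assumptions: a learnable transfer matrix of shape (n_src, n_tgt), and
# pair_weights derived from ordinal label distances; both are illustrative.
import torch
import torch.nn as nn

class LatentAutoEncoder(nn.Module):
    def __init__(self, in_dim: int, latent_dim: int):
        super().__init__()
        self.encoder = nn.Linear(in_dim, latent_dim)
        self.decoder = nn.Linear(latent_dim, in_dim)

    def forward(self, x):
        z = self.encoder(x)
        return z, self.decoder(z)

def structure_preserving_loss(model, transfer, x_src, x_tgt, pair_weights,
                              lam_rank=0.1, lam_local=0.1):
    """`transfer` is a learnable (n_src, n_tgt) matrix whose nuclear norm is
    penalized so target codes are expressed as a low-rank combination of
    source codes; `pair_weights[i, j]` encodes how close source codes i and j
    should stay (e.g., based on ordinal label distance)."""
    z_src, rec_src = model(x_src)
    z_tgt, rec_tgt = model(x_tgt)

    # Auto-encoder reconstruction on both domains.
    rec = ((x_src - rec_src) ** 2).mean() + ((x_tgt - rec_tgt) ** 2).mean()

    # Global structure: target codes approximated by source codes through a
    # low-rank transfer matrix (nuclear norm as the convex rank surrogate).
    global_term = ((z_tgt - transfer.T @ z_src) ** 2).mean() \
        + lam_rank * torch.linalg.matrix_norm(transfer, ord='nuc')

    # Local structure: weighted pair-wise distances between source codes.
    local_term = (pair_weights * torch.cdist(z_src, z_src) ** 2).mean()

    return rec + global_term + lam_local * local_term
```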