[3]
A. Enthaler, T. Weustenfeld, F. Gauterin, and J. Koehler, Thermal management consumption and its effect on remaining range estimation of electric vehicles, in Proc. 2014 International Conference on Connected Vehicles and Expo (ICCVE), Vienna, Austria, 2014, pp. 170–177.
[7]
S. Park and C. Ahn, Stochastic model predictive controller for battery thermal management of electric vehicles, in Proc. 2019 IEEE Vehicle Power and Propulsion Conference (VPPC), Hanoi, Vietnam, 2019, pp. 1–5.
[12]
T. Fischer, T. Kraus, C. Kirches, and F. Gauterin, Nonlinear model predictive control of a thermal management system for electrified vehicles using FMI, in Proc. of the 12th International Modelica Conference. doi: 10.3384/ecp17132255.
[18]
Y. Liu and J. Zhang, Self-adapting intelligent battery thermal management system via artificial neural network based model predictive control. doi: 10.1115/DETC2019-98205.
[20]
P. Engel, S. Lempp, A. Rausch, and W. Tegethoff, Improving thermal management of electric vehicles by prediction of thermal disturbance variables, in Proc. ADAPTIVE 2018, A. Rausch, C. Knieke, and M. Schranz, eds. Barcelona, Spain: IARIA, 2018, pp. 75–83.
[25]
N. Moniz, P. Branco, and L. Torgo, Resampling strategies for imbalanced time series, in Proc. 2016 IEEE International Conference on Data Science and Advanced Analytics (DSAA), Montreal, Canada, 2016, pp. 282–291.
[28]
M. Saripuddin, A. Suliman, S. Syarmila Sameon, and B. N. Jorgensen, Random undersampling on imbalance time series data for anomaly detection, in Proc. MLMI 2021: The 4th International Conference on Machine Learning and Machine Intelligence, Hangzhou, China, 2021, pp. 151–156.
[29]
C. C. Aggarwal, Outlier Analysis, 2nd ed. Cham, Switzerland: Springer, 2017.
[33]
F. T. Liu, K. M. Ting, and Z.-H. Zhou, Isolation forest, in Proc. 2008 Eighth IEEE International Conference on Data Mining, Pisa, Italy, 2008, pp. 413–422.
[34]
R. J. Hyndman, E. Wang, and N. Laptev, Large-scale unusual time series detection, in Proc. 2015 IEEE International Conference on Data Mining Workshop (ICDMW), Atlantic City, NJ, USA, 2015, pp. 1616–1619.
[37]
S. Jia, H. Xianglin, Q. Sijun, and S. Qing, A bi-directional sampling based on k-means method for imbalance text classification, in Proc. 2016 IEEE/ACIS 15th International Conference on Computer and Information Science (ICIS), Okayama, Japan, 2016, pp. 1–5.
[41]
L. P. Silvestrin, L. Pantiskas, and M. Hoogendoorn, A framework for imbalanced time-series forecasting, in Machine Learning, Optimization, and Data Science, G. Nicosia, V. Ojha, E. La Malfa, G. La Malfa, G. Jansen, P. M. Pardalos, G. Giuffrida, and R. Umeton, eds. Cham, Switzerland: Springer International Publishing, 2022, pp. 250–264.
[49]
Q. Wen, L. Sun, F. Yang, X. Song, J. Gao, X. Wang, and H. Xu, Time series data augmentation for deep learning: A survey, in Proc. of the Thirtieth International Joint Conference on Artificial Intelligence. doi: 10.48550/arXiv.2002.12478.
[50]
G. Forestier, F. Petitjean, H. A. Dau, G. I. Webb, and E. Keogh, Generating synthetic time series to augment sparse datasets, in Proc. 2017 IEEE International Conference on Data Mining (ICDM), New Orleans, LA, USA, 2017, pp. 865–870.
[52]
B. Fu, F. Kirchbuchner, and A. Kuijper, Data augmentation for time series: Traditional vs generative models on capacitive proximity time series, in Proc. of the 13th ACM International Conference on PErvasive Technologies Related to Assistive Environments, Corfu, Greece, 2020, pp. 107–116.
[53]
B. Liu, Z. Zhang, and R. Cui, Efficient time series augmentation methods, in Proc. 2020 13th International Congress on Image and Signal Processing, BioMedical Engineering and Informatics (CISP-BMEI), Chengdu, China, 2020, pp. 1004–1009.
[54]
B. K. Iwana and S. Uchida, Time series data augmentation for neural networks by time warping with a discriminative teacher, in Proc. 2020 25th International Conference on Pattern Recognition (ICPR), Milan, Italy, 2021, pp. 3558–3565.
[57]
J.-S. Ang, K.-W. Ng, and F.-F. Chua, Modeling time series data with deep learning: A review, analysis, evaluation and future trend, in Proc. 2020 8th International Conference on Information Technology and Multimedia (ICIMU), Selangor, Malaysia, 2020, pp. 32–37.
[63]
C. M. Liapis, A. Karanikola, and S. Kotsiantis, Energy load forecasting: Investigating mid-term predictions with ensemble learners, in Artificial Intelligence Applications and Innovations, I. Maglogiannis, L. Iliadis, J. Macintyre, and P. Cortez, eds. Cham, Switzerland: Springer International Publishing, 2022, pp. 343–355.
[64]
C. M. Liapis and S. Kotsiantis, Energy balance forecasting: An extensive multivariate regression models comparison, in Proc. of the 12th Hellenic Conference on Artificial Intelligence, Corfu, Greece, 2022, pp. 1–7.
[76]
R. Wen, K. Torkkola, B. Narayanaswamy, and D. Madeka, A multi-horizon quantile recurrent forecaster, in Proc. 31st Annual Conference on Neural Information Processing Systems (NIPS): Time Series Workshop. doi: 10.48550/arXiv.1711.11053.
[80]
M. Feurer and F. Hutter, Hyperparameter optimization, in Automated Machine Learning: Methods, Systems, Challenges, F. Hutter, L. Kotthoff, and J. Vanschoren, eds. Cham, Switzerland: Springer International Publishing, 2019, pp. 3–33.
[83]
T. T. Joy, S. Rana, S. Gupta, and S. Venkatesh, Hyperparameter tuning for big data using Bayesian optimisation, in Proc. 2016 23rd International Conference on Pattern Recognition (ICPR), Cancun, Mexico, 2016, pp. 2574–2579.
[84]
L. Wang, M. Feng, B. Zhou, B. Xiang, and S. Mahadevan, Efficient hyper-parameter optimization for NLP applications, in Proc. of the 2015 Conference on Empirical Methods in Natural Language Processing, Lisbon, Portugal, 2015, pp. 2112–2117.
[86]
S. Falkner, A. Klein, and F. Hutter, BOHB: Robust and efficient hyperparameter optimization at scale, in Proc. of the 35th International Conference on Machine Learning. doi: 10.48550/arXiv.1807.01774.
[87]
J. Kong, W. Kowalczyk, D. A. Nguyen, T. Bäck, and S. Menzel, Hyperparameter optimisation for improving classification under class imbalance, in Proc. 2019 IEEE Symposium Series on Computational Intelligence (SSCI), Xiamen, China, 2019, pp. 3072–3078.
[90]
K. Miao, Q. Hua, and H. Shi, Short-term load forecasting based on CNN-BiLSTM with Bayesian optimization and attention mechanism, in Parallel and Distributed Computing, Applications and Technologies, Y. Zhang, Y. Xu, and H. Tian, eds. Cham, Switzerland: Springer International Publishing, 2021, pp. 116–128.
[91]
A. Vysala and J. Gomes, Evaluating and validating cluster results, in Proc. 9th International Conference on Data Mining & Knowledge Management Process (CDKP 2020), Toronto, Canada, 2020, pp. 37–47.
[93]
V. Kuleshov, N. Fenner, and S. Ermon, Accurate uncertainties for deep learning using calibrated regression, in Proc. of the 35th International Conference on Machine Learning. doi: 10.48550/arXiv.1807.00263.