Open Access

Improved Quantile Convolutional and Recurrent Neural Networks for Electric Vehicle Battery Temperature Prediction

Karlsruhe Institute of Technology (KIT), Institute of Vehicle System Technology, Karlsruhe 76131, Germany, and also with Bayerische Motoren Werke (BMW) AG, Munich 80788, Germany
Technical University of Munich (TUM), Munich 80333, Germany
BMW AG, Munich 80788, Germany
KIT, Institute of Vehicle System Technology, Karlsruhe 76131, Germany

Abstract

The battery thermal management of electric vehicles can be improved using neural networks that predict quantile sequences of the battery temperature. This work extends a method for the development of Quantile Convolutional and Quantile Recurrent Neural Networks (Q*NN). Fleet data of 225 629 drives are clustered and balanced, and simulation data from 971 simulations are augmented, before both are combined for training and testing. The Q*NN hyperparameters are optimized using an efficient Bayesian optimization before the Q*NN models are compared with regression and quantile regression models for four prediction horizons. The analysis of point-forecast and quantile-related metrics shows the superior performance of the novel Q*NN models. The median predictions of the best-performing model achieve an average RMSE of 0.66°C and an R² of 0.84, and the predicted 0.99 quantile covers 98.87% of the true values in the test data. In conclusion, this work proposes an extended development and comparison of Q*NN models for accurate battery temperature prediction.
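The Q*NN models output quantiles rather than single point forecasts: a quantile level q is typically learned by minimizing the pinball (quantile) loss, and calibration is judged by how often the true temperature falls at or below the predicted quantile (the 98.87% coverage of the 0.99 quantile quoted above). The following NumPy sketch only illustrates these two quantities; it is not taken from the paper, and the function names and synthetic data are hypothetical.

    import numpy as np

    def pinball_loss(y_true, y_pred, q):
        """Quantile (pinball) loss for a single quantile level q in (0, 1).

        Under-prediction is weighted by q and over-prediction by (1 - q),
        so minimizing this loss drives y_pred toward the q-quantile of the
        conditional distribution of y_true.
        """
        error = y_true - y_pred
        return np.mean(np.maximum(q * error, (q - 1.0) * error))

    def empirical_coverage(y_true, y_pred_quantile):
        """Fraction of true values lying at or below the predicted quantile.

        A well-calibrated 0.99 quantile should give a value close to 0.99.
        """
        return np.mean(y_true <= y_pred_quantile)

    # Toy illustration with synthetic battery-temperature values (degrees C).
    rng = np.random.default_rng(0)
    y_true = 25.0 + rng.normal(0.0, 1.0, size=1000)      # "measured" temperatures
    y_median = y_true + rng.normal(0.0, 0.5, size=1000)  # hypothetical median forecast
    y_q99 = y_median + 1.5                               # hypothetical 0.99-quantile forecast

    print(pinball_loss(y_true, y_median, q=0.5))   # median pinball loss
    print(empirical_coverage(y_true, y_q99))       # share of true values covered

In the paper's setting these quantities would be evaluated on the predicted battery-temperature sequences over each forecast horizon; the sketch above uses scalar samples purely to keep the example self-contained.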

Big Data Mining and Analytics
Pages 512-530
Cite this article:
Billert AM, Yu R, Erschen S, et al. Improved Quantile Convolutional and Recurrent Neural Networks for Electric Vehicle Battery Temperature Prediction. Big Data Mining and Analytics, 2024, 7(2): 512-530. https://doi.org/10.26599/BDMA.2023.9020028

Received: 14 April 2023
Revised: 16 September 2023
Accepted: 09 October 2023
Published: 22 April 2024
© The author(s) 2023.

The articles published in this open access journal are distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/).
