Review | Open Access

Multisensor information fusion: Future of environmental perception in intelligent vehicles

Yongsheng Zhang¹, Chen Tu², Kun Gao³, Liang Wang⁴ (corresponding author)

¹ CRRC Qishuyan Institute Co., Ltd., Changzhou 213011, China
² School of Information Science and Engineering, Southeast University, Nanjing 211189, China
³ Department of Architecture and Civil Engineering, Chalmers University of Technology, Gothenburg SE-41296, Sweden
⁴ School of Vehicle and Mobility, Tsinghua University, Beijing 100084, China

Abstract

As urban transportation increasingly shapes daily life, efficient use of traffic resources and the development of public transportation have become crucial for addressing congestion, frequent accidents, and noise pollution. The rapid advancement of intelligent autonomous driving technologies, particularly environmental perception, offers new directions for solving these problems. This review discusses the application of multisensor information fusion technology to environmental perception in intelligent vehicles, analyzing the composition and performance of various sensors and their specific applications in autonomous driving. Multisensor information fusion enhances the accuracy of environmental perception, strengthening decision support for autonomous driving systems and thereby improving vehicle safety and driving efficiency. The review also examines the challenges facing information fusion technology and its future development trends, providing a reference for further research and application in intelligent transportation systems.
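
To make the fusion principle concrete, the sketch below shows its simplest form: combining independent range readings from two sensors by inverse-variance weighting, the static special case of the Kalman measurement update. This example is illustrative only and is not drawn from the paper; the function name fuse_measurements and the radar and camera noise variances are assumptions chosen for the demonstration.

import numpy as np

def fuse_measurements(estimates, variances):
    # Inverse-variance weighted fusion of independent, unbiased sensor
    # estimates: each reading is weighted by the reciprocal of its noise
    # variance, the static special case of the Kalman measurement update.
    estimates = np.asarray(estimates, dtype=float)
    weights = 1.0 / np.asarray(variances, dtype=float)
    fused = np.sum(weights * estimates) / np.sum(weights)
    fused_variance = 1.0 / np.sum(weights)
    return fused, fused_variance

# Assumed values: a radar and a camera both estimate the range (m) to a
# leading vehicle; the noise variances are illustrative, not measured.
fused_range, fused_var = fuse_measurements(
    estimates=[52.3, 51.1],  # radar reading, camera reading (m)
    variances=[0.25, 1.00],  # assumed sensor noise variances (m^2)
)
print(f"fused range: {fused_range:.2f} m, variance: {fused_var:.3f} m^2")
# Prints 52.06 m with variance 0.200 m^2, lower than either sensor alone,
# which is the sense in which fusion enhances perception accuracy.

A production perception stack would instead fuse multidimensional states (position, velocity, heading) recursively with a Kalman filter or a learned model, but the weighting intuition carries over unchanged.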

Journal of Intelligent and Connected Vehicles
Pages 163–176
Cite this article:
Zhang Y, Tu C, Gao K, et al. Multisensor information fusion: Future of environmental perception in intelligent vehicles. Journal of Intelligent and Connected Vehicles, 2024, 7(3): 163–176. https://doi.org/10.26599/JICV.2023.9210049

Received: 08 May 2024
Revised: 31 May 2024
Accepted: 03 June 2024
Published: 26 September 2024
© The author(s) 2023.

This is an open access article under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/).
