
Development and Investigation of Vision System for a Small-Sized Mobile Humanoid Robot in a Smart Environment

Amer Tahseen Abu-Jassar¹, Hani Attar²,³, Ayman Amer², Vyacheslav Lyashenko⁴, Vladyslav Yevsieiev⁴, Ahmed Solyman⁵
1. Department of Computer Science, College of Information Technology, Amman Arab University, Amman 11937, Jordan
2. Faculty of Engineering, Zarqa University, Zarqa 2000, Jordan
3. College of Engineering, University of Business and Technology, Jeddah 21448, Saudi Arabia
4. Department of Media Systems and Technology, Kharkiv National University of Radio Electronics, Kharkiv 61166, Ukraine
5. Department of Electrical and Electronics Engineering, Faculty of Engineering and Architecture, Nişantaşı University, Istanbul 34398, Türkiye

Abstract

This research develops a computer vision system for a small-sized mobile humanoid robot. The decentralization of the servomotor control and computer vision subsystems is investigated from the hardware point of view, and the software needed for an efficient matched design is developed. The computer vision system, built on an upgraded tiny-You Only Look Once (tiny-YOLO) network model, allows the robot to recognize and identify objects and to decide how to interact with them, which makes it suitable for crowded environments. During the research, a concept of the computer vision system was developed that describes the interaction between its main elements, and hardware modules were selected on this basis to implement the task. A structure of information interaction between the hardware modules is proposed, and a connection scheme is developed, on the basis of which a model of the computer vision system is assembled for experiments, together with the algorithms and software required to solve the problem. To ensure high speed of the computer vision system based on the ESP32-CAM module, the neural network was improved by replacing the Visual Geometry Group 16 (VGG-16) base network used for feature extraction in the Single Shot Detector (SSD) model with the lightweight tiny-YOLO model. This change preserves the multidimensional structure of the network's feature graph, which increases detection accuracy, while the limited set of target objects significantly reduces the amount of computation generated by the network and thereby significantly increases detection speed. Finally, a number of experiments were carried out in both static and dynamic environments, which showed high identification accuracy.
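A tiny-YOLO-style detector emits many candidate boxes per frame, so the detection pipeline described above relies on standard post-processing: confidence filtering followed by per-class non-maximum suppression (NMS). The sketch below illustrates that step in pure Python; it is an illustrative sketch only, and the function names, thresholds, and detection tuple format are assumptions rather than the paper's implementation.

```python
# Illustrative tiny-YOLO post-processing: keep confident boxes, then
# suppress overlapping boxes of the same class (NMS).
# A detection is (box, score, class_id) with box = (x1, y1, x2, y2).

def iou(a, b):
    """Intersection-over-union of two axis-aligned boxes."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter) if inter else 0.0

def nms(detections, conf_thresh=0.5, iou_thresh=0.45):
    """Filter detections by confidence and apply per-class NMS."""
    kept = []
    for cls in {d[2] for d in detections}:
        # Candidates of this class, most confident first.
        cand = sorted((d for d in detections
                       if d[2] == cls and d[1] >= conf_thresh),
                      key=lambda d: d[1], reverse=True)
        while cand:
            best = cand.pop(0)
            kept.append(best)
            # Drop boxes overlapping the kept one too strongly.
            cand = [d for d in cand if iou(best[0], d[0]) < iou_thresh]
    return kept
```

On a microcontroller-class host such as the ESP32-CAM, this filtering is what keeps the per-frame output small enough to act on in real time; the thresholds would be tuned to the robot's limited object set.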

International Journal of Crowd Science
Pages 29-43
Cite this article:
Abu-Jassar AT, Attar H, Amer A, et al. Development and Investigation of Vision System for a Small-Sized Mobile Humanoid Robot in a Smart Environment. International Journal of Crowd Science, 2025, 9(1): 29-43. https://doi.org/10.26599/IJCS.2023.9100018