Open Access

Machine Learning-Based Multi-Modal Information Perception for Soft Robotic Hands

Haiming Huang, Junhao Lin, Linyuan Wu, Bin Fang, Zhenkun Wen, and Fuchun Sun
College of Electronics and Information Engineering, Shenzhen University, Shenzhen 518060, China.
Department of Computer Science and Technology, Tsinghua University, Beijing 100084, China.
College of Computer Science and Software Engineering, Shenzhen University, Shenzhen 518060, China.

Abstract

This paper focuses on multi-modal Information Perception (IP) for Soft Robotic Hands (SRHs) using Machine Learning (ML) algorithms. A flexible Optical Fiber-based Curvature Sensor (OFCS) is fabricated, consisting of a Light-Emitting Diode (LED), a photosensitive detector, and an optical fiber. Bending the roughened optical fiber reduces the transmitted light intensity, which reflects the curvature of the soft finger. Combining the curvature and pressure information, multi-modal IP is performed to improve recognition accuracy. Recognition of gesture, object shape, size, and weight is implemented with multiple ML approaches, including the Supervised Learning Algorithms (SLAs) of K-Nearest Neighbor (KNN), Support Vector Machine (SVM), and Logistic Regression (LR), and the unSupervised Learning Algorithm (un-SLA) of K-Means Clustering (KMC). Moreover, Optical Sensor Information (OSI), Pressure Sensor Information (PSI), and Double-Sensor Information (DSI) are adopted to compare the recognition accuracies. The experimental results demonstrate that the proposed sensors and recognition approaches are feasible and effective. The recognition accuracies obtained using the above ML algorithms and the three modes of sensor information are higher than 85 percent for almost all combinations. Moreover, DSI is more accurate than single-modal sensor information, and the KNN algorithm with DSI outperforms the other combinations in recognition accuracy.
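The single-modal versus double-sensor comparison described above can be illustrated with a short sketch. The snippet below is a minimal, hypothetical example rather than the authors' code: it trains a KNN classifier on optical-only (OSI), pressure-only (PSI), and concatenated double-sensor (DSI) feature vectors and reports the test accuracy of each. The feature dimensions, class count, and synthetic data are assumptions made purely for illustration.

```python
# Minimal sketch (not the authors' code): compare recognition accuracy of a
# KNN classifier on optical-only (OSI), pressure-only (PSI), and combined
# double-sensor (DSI) features. Data shapes and values are illustrative only.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n_samples, n_classes = 600, 4                  # hypothetical object classes
curvature = rng.normal(size=(n_samples, 5))    # OSI: one curvature reading per finger (assumed)
pressure = rng.normal(size=(n_samples, 5))     # PSI: fingertip pressure readings (assumed)
labels = rng.integers(0, n_classes, size=n_samples)

feature_sets = {
    "OSI": curvature,
    "PSI": pressure,
    "DSI": np.hstack([curvature, pressure]),   # concatenate both modalities
}

for name, X in feature_sets.items():
    X_train, X_test, y_train, y_test = train_test_split(
        X, labels, test_size=0.3, random_state=0, stratify=labels)
    clf = KNeighborsClassifier(n_neighbors=5).fit(X_train, y_train)
    print(name, accuracy_score(y_test, clf.predict(X_test)))
```

With real curvature and pressure data, the same feature-concatenation step is what the DSI mode amounts to, and any of the other classifiers mentioned in the abstract (SVM, LR, KMC) could be swapped in for the KNN estimator.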

Tsinghua Science and Technology
Pages 255-269
Cite this article:
Huang H, Lin J, Wu L, et al. Machine Learning-Based Multi-Modal Information Perception for Soft Robotic Hands. Tsinghua Science and Technology, 2020, 25(2): 255-269. https://doi.org/10.26599/TST.2019.9010009

Received: 03 December 2018
Revised: 14 February 2019
Accepted: 11 March 2019
Published: 02 September 2019
© The author(s) 2020

The articles published in this open access journal are distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/).
