An Underwater Human–Robot Interaction Using Hand Gestures for Fuzzy Control

Cited by: 0
Authors
Yu Jiang
Xianglong Peng
Mingzhu Xue
Chong Wang
Hong Qi
Institutions
[1] Jilin University, College of Computer Science and Technology
[2] Jilin University, Key Laboratory of Symbolic Computation and Knowledge Engineering of Ministry of Education
Keywords
Fuzzy control; Gesture recognition; Human–robot communication; AUV;
DOI: not available
Abstract
Autonomous underwater vehicles (AUVs) play an important role in ocean research and exploration. The underwater environment strongly affects both AUV control and human–robot interaction, since it is highly dynamic, with unpredictable fluctuations in water flow, high pressure, and light attenuation. Traditional control models contain a large number of parameters, are often ineffective, and produce errors frequently. Fuzzy control addresses this issue to a certain extent: it applies fuzzy variables to the controller, replacing crisp values with values over an interval. Beyond the controller itself, underwater human–robot interaction is also difficult. Divers cannot speak or show facial expressions underwater, and buttons on the AUV must withstand high water pressure. In this paper, we propose a method to recognize gesture instructions and apply it to the fuzzy control of an AUV. Our contribution is a gesture recognition framework for human–robot interaction, including a gesture detection network and an algorithm for AUV control. Experimental results show the efficiency of the proposed method.
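To make the fuzzy-control idea in the abstract concrete, the following is a minimal Python sketch of a Mamdani-style fuzzy controller with a single input (heading error) and a single output (yaw thrust). The membership functions, rule base, and variable names are illustrative assumptions for exposition only, not the controller actually described in the paper.

```python
# Minimal fuzzy-controller sketch (assumed input: heading error in degrees,
# assumed output: normalized yaw thrust in [-1, 1]). All sets and rules are
# hypothetical examples, not the paper's controller.

def tri(x, a, b, c):
    """Triangular membership function with feet at a, c and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

# Fuzzy sets over the heading error (degrees): these intervals replace crisp
# set-points, as sketched in the abstract.
ERROR_SETS = {
    "negative": (-90.0, -45.0, 0.0),
    "zero":     (-45.0,   0.0, 45.0),
    "positive": (  0.0,  45.0, 90.0),
}

# Rule base: each input set maps to a singleton output thrust level,
# which keeps the defuzzification step to a weighted average.
RULES = {"negative": -1.0, "zero": 0.0, "positive": 1.0}

def fuzzy_yaw_thrust(heading_error_deg):
    """Defuzzify by the weighted average of singleton consequents."""
    num, den = 0.0, 0.0
    for name, (a, b, c) in ERROR_SETS.items():
        mu = tri(heading_error_deg, a, b, c)  # degree of membership
        num += mu * RULES[name]
        den += mu
    return num / den if den > 0 else 0.0

if __name__ == "__main__":
    # e.g. a "turn left" gesture interpreted as a -30 degree heading error
    print(fuzzy_yaw_thrust(-30.0))  # negative value -> corrective yaw thrust
```

In such a scheme, the gesture recognizer would only need to emit a small set of symbolic commands; the fuzzy rules then map them to smooth actuator outputs without tuning a large number of crisp model parameters.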
Pages: 1879–1889
Number of pages: 10
Related papers
50 in total (items [21]–[30] shown below)
  • [21] Human Robot Hand Interaction with Plastic Deformation Control
    Murakami, Kenichi
    Ishimoto, Koki
    Senoo, Taku
    Ishikawa, Masatoshi
    [J]. ROBOTICS, 2020, 9 (03):
  • [22] Human hand detection using evolutionary computation for gestures recognition of a partner robot
    Hashimoto, Setsuo
    Kubota, Naoyuki
    Kojima, Fumio
    [J]. KNOWLEDGE-BASED INTELLIGENT INFORMATION AND ENGINEERING SYSTEMS, PT 3, PROCEEDINGS, 2006, 4253 : 684 - 691
  • [23] Control of a Mobile Robot by Human Gestures
    Pentiuc, Stefan-Gheorghe
    Vultur, Oana Mihaela
    Ciupu, Andrei
    [J]. INTELLIGENT DISTRIBUTED COMPUTING VII, 2014, 511 : 217 - 222
  • [24] Recognizing Hand Gestures for Human Computer Interaction
    Singh, Dushyant Kumar
    [J]. 2015 INTERNATIONAL CONFERENCE ON COMMUNICATIONS AND SIGNAL PROCESSING (ICCSP), 2015, : 379 - 382
  • [25] Human-Robot Interaction Using Three-Dimensional Gestures
    Ponmani, K.
    Sridharan, S.
    [J]. INTELLIGENT EMBEDDED SYSTEMS, ICNETS2, VOL II, 2018, 492 : 67 - 76
  • [26] Pantomimic Gestures for Human-Robot Interaction
    Burke, Michael
    Lasenby, Joan
    [J]. IEEE TRANSACTIONS ON ROBOTICS, 2015, 31 (05) : 1225 - 1237
  • [27] Understanding human motion and gestures for underwater human-robot collaboration
    Islam, Md Jahidul
    Ho, Marc
    Sattar, Junaed
    [J]. JOURNAL OF FIELD ROBOTICS, 2019, 36 (05) : 851 - 873
  • [28] Conversational Gestures in Human-Robot Interaction
    Bremner, Paul
    Pipe, Anthony
    Melhuish, Chris
    Fraser, Mike
    Subramanian, Sriram
    [J]. 2009 IEEE INTERNATIONAL CONFERENCE ON SYSTEMS, MAN AND CYBERNETICS (SMC 2009), VOLS 1-9, 2009, : 1645 - +
  • [29] Human Robot Interaction using Diver Hand Signals
    Codd-Downey, Robert
    Jenkin, Michael
    [J]. HRI '19: 2019 14TH ACM/IEEE INTERNATIONAL CONFERENCE ON HUMAN-ROBOT INTERACTION, 2019, : 550 - 551
  • [30] Grasping control of robot hand using fuzzy neural network
    Chen, Peng
    Hasegawa, Yoshizo
    Yamashita, Mitushi
    [J]. ADVANCES IN NEURAL NETWORKS - ISNN 2006, PT 2, PROCEEDINGS, 2006, 3972 : 1178 - 1187