Feel the Point Clouds: Traversability Prediction and Tactile Terrain Detection Information for an Improved Human-Robot Interaction

Cited by: 0
Authors
Edlinger, Raimund [1 ]
Nuechter, Andreas [2 ]
Affiliations
[1] Univ Appl Sci Upper Austria, A-4600 Wels, Austria
[2] Julius Maximilians Univ, Informat Robot 17, Wurzburg, Germany
Keywords
DOI
10.1109/RO-MAN57019.2023.10309349
CLC classification
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
The field of human-robot interaction has advanced rapidly in recent years as robots are integrated into ever more aspects of human life. For robots to collaborate effectively with humans, however, they need a thorough understanding of the environment in which they operate. In particular, the ability to predict traversability and to detect tactile terrain information is crucial for the safety and efficiency of human-robot interactions. To address this challenge, this paper proposes a method called "Feel the Point Clouds" that uses point clouds to predict traversability and to detect tactile terrain information for a tracked rescue robot. This information can be used to adjust the robot's behavior and movements in real time, allowing it to interact with the environment in a more intuitive and safe manner. The proposed method is evaluated in various scenarios, and the experimental results demonstrate its effectiveness in improving human-robot interaction and in providing visualization for a more accurate and intuitive understanding of the environment.
Pages: 1121-1128
Page count: 8
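
The record does not include the paper's algorithm, but the abstract's core idea of deriving traversability from point clouds can be illustrated with a minimal sketch. The grid resolution, slope and roughness thresholds, and the cost weighting below are illustrative assumptions, not values from the paper; only NumPy is required.

import numpy as np

def traversability_grid(points, cell_size=0.25, max_slope_deg=35.0, max_rough=0.05):
    # Hypothetical sketch: grid-based traversability scoring from an (N, 3)
    # point cloud of x, y, z terrain samples. Cost is in [0, 1]:
    # 0 = flat and smooth, 1 = steep or rough (assumed untraversable).
    xy_min = points[:, :2].min(axis=0)
    idx = np.floor((points[:, :2] - xy_min) / cell_size).astype(int)
    nx, ny = idx.max(axis=0) + 1
    cost = np.full((nx, ny), np.nan)          # NaN marks cells without enough data
    for ix in range(nx):
        for iy in range(ny):
            cell = points[(idx[:, 0] == ix) & (idx[:, 1] == iy)]
            if len(cell) < 5:                 # too sparse to judge
                continue
            # Least-squares plane fit z = a*x + b*y + c for the cell.
            A = np.c_[cell[:, 0], cell[:, 1], np.ones(len(cell))]
            coeffs, *_ = np.linalg.lstsq(A, cell[:, 2], rcond=None)
            slope_deg = np.degrees(np.arctan(np.hypot(coeffs[0], coeffs[1])))
            roughness = np.std(cell[:, 2] - A @ coeffs)   # residual height spread
            # Blend normalized slope and roughness into a single cost.
            cost[ix, iy] = min(1.0, 0.5 * slope_deg / max_slope_deg
                                    + 0.5 * roughness / max_rough)
    return cost, xy_min, cell_size

if __name__ == "__main__":
    pts = np.random.rand(5000, 3) * [5.0, 5.0, 0.2]   # synthetic, mildly uneven patch
    grid, origin, res = traversability_grid(pts)
    print("mean cost of observed cells:", np.nanmean(grid))

In a real system such a cost grid would be recomputed as new scans arrive, which is the kind of real-time behavior adjustment the abstract describes; the paper itself may use a different representation or learning-based estimator.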