Feel the Point Clouds: Traversability Prediction and Tactile Terrain Detection Information for an Improved Human-Robot Interaction

Cited by: 0
Authors:
Edlinger, Raimund [1 ]
Nuechter, Andreas [2 ]
Affiliations:
[1] Univ Appl Sci Upper Austria, A-4600 Wels, Austria
[2] Julius Maximilians Univ, Informat Robot 17, Würzburg, Germany
DOI: 10.1109/RO-MAN57019.2023.10309349
CLC number: TP18 [Artificial intelligence theory]
Discipline codes: 081104; 0812; 0835; 1405
Abstract:
The field of human-robot interaction has advanced rapidly in recent years, as robots are increasingly integrated into many aspects of human life. For robots to collaborate effectively with humans, however, they must have a deep understanding of the environment in which they operate. In particular, the ability to predict traversability and to detect tactile terrain information is crucial for enhancing the safety and efficiency of human-robot interactions. To address this challenge, this paper proposes a method called "Feel the Point Clouds" that uses point clouds to predict traversability and detect tactile terrain information for a tracked rescue robot. This information can be used to adjust the robot's behavior and movements in real time, allowing it to interact with the environment in a more intuitive and safe manner. The proposed method is evaluated in various scenarios, and the experimental results demonstrate its effectiveness in improving human-robot interaction and in providing a more accurate and intuitive visualization of the environment.
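The abstract does not detail the prediction pipeline. As a minimal, generic illustration of geometry-based traversability estimation from point clouds (not the paper's actual method), one common approach fits a plane to each local terrain patch and thresholds its slope; the 30-degree limit below is a placeholder, not a value from the paper:

```python
import numpy as np

def cell_slope_deg(points: np.ndarray) -> float:
    """Estimate the local surface slope (degrees) of a patch of 3-D points
    by PCA plane fitting: the eigenvector of the covariance matrix with the
    smallest eigenvalue approximates the surface normal."""
    centered = points - points.mean(axis=0)
    # np.linalg.eigh returns eigenvalues in ascending order
    _, vecs = np.linalg.eigh(centered.T @ centered)
    normal = vecs[:, 0]  # direction of least variance = plane normal
    cos_tilt = abs(normal[2]) / np.linalg.norm(normal)
    return float(np.degrees(np.arccos(np.clip(cos_tilt, 0.0, 1.0))))

def traversable(points: np.ndarray, max_slope_deg: float = 30.0) -> bool:
    """Label a terrain patch traversable if its slope is under a
    platform-specific limit (placeholder value for illustration)."""
    return cell_slope_deg(points) <= max_slope_deg

# Flat patch in the xy-plane -> slope near 0 degrees
flat = np.array([[x, y, 0.0] for x in range(5) for y in range(5)])
# Ramp rising 1 m in z per 1 m in x -> slope of 45 degrees
steep = np.array([[x, y, float(x)] for x in range(5) for y in range(5)])
print(traversable(flat), traversable(steep))  # True False
```

A full system would run this per grid cell over the robot's local map and could fuse the geometric label with tactile feedback from the tracks, but those details are specific to the paper.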
Pages: 1121-1128
Page count: 8