Object segmentation in cluttered environment based on gaze tracing and gaze blinking

Cited by: 0
Authors
Photchara Ratsamee
Yasushi Mae
Kazuto Kamiyama
Mitsuhiro Horade
Masaru Kojima
Tatsuo Arai
Affiliations
[1] Osaka University, Graduate School of Information Science and Technology
[2] Kansai University, Graduate School of Engineering
[3] Takenaka Corporation, Takenaka Research & Development Institute
[4] National Defense Academy, Department of Mechanical Systems Engineering, School of Systems Engineering
[5] Osaka University, Graduate School of Engineering Science
[6] The University of Electro-Communications
Keywords
Gaze interface; Human–robot interaction; Object segmentation
DOI
Not available
Abstract
People with disabilities, such as patients with motor paralysis, lack independence and cannot move most parts of their bodies except for their eyes. Supportive robot technology is highly beneficial for these patients. We propose gaze-informed, location-based (gaze-based) object segmentation, a core module for successful patient-robot interaction in an object-search task, i.e., a situation in which a robot must search for and deliver a target object to the patient. We introduce the concepts of gaze tracing (GT) and gaze blinking (GB), which are integrated into our proposed object segmentation technique to achieve accurate visual segmentation of unknown objects in a complex scene. Gaze-tracing information serves as a clue to where the target object is located in the scene, and gaze blinking is then used to confirm the target object's position. The effectiveness of our proposed method has been demonstrated with a humanoid robot in experiments on several types of highly cluttered scenes. Using only limited gaze guidance from the user, we achieved an 85% F-score for unknown-object segmentation in an unknown environment.
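To make the GT/GB idea concrete, the following minimal Python sketch shows one way gaze input could seed a segmentation of an unknown object. It is not the authors' implementation: the gaze-sample format, the blink-confirmed point, the segment_from_gaze function, the box_half window size, and the use of OpenCV's GrabCut in place of the paper's segmentation technique are all assumptions made for illustration.

    # Minimal sketch (assumptions, not the authors' implementation): gaze-trace
    # samples seed the foreground, the blink-confirmed point anchors a
    # probable-foreground window, and OpenCV's GrabCut stands in for the
    # paper's segmentation technique.
    import numpy as np
    import cv2

    def segment_from_gaze(image, gaze_trace, blink_point, box_half=60, iterations=5):
        """Segment the gazed-at object in a cluttered scene.

        image       -- HxWx3 BGR image of the scene
        gaze_trace  -- list of (x, y) gaze samples from the gaze-tracing (GT) phase
        blink_point -- (x, y) gaze position at the confirming blink (GB phase)
        box_half    -- half-size of the window around the blink point (assumed value)
        """
        h, w = image.shape[:2]
        mask = np.full((h, w), cv2.GC_PR_BGD, dtype=np.uint8)

        # Probable foreground: a window centred on the blink-confirmed location.
        bx, by = int(blink_point[0]), int(blink_point[1])
        x0, y0 = max(0, bx - box_half), max(0, by - box_half)
        x1, y1 = min(w, bx + box_half), min(h, by + box_half)
        mask[y0:y1, x0:x1] = cv2.GC_PR_FGD

        # Definite foreground: small disks around each gaze-trace sample.
        for gx, gy in gaze_trace:
            if 0 <= gx < w and 0 <= gy < h:
                cv2.circle(mask, (int(gx), int(gy)), 3, int(cv2.GC_FGD), -1)

        bgd_model = np.zeros((1, 65), np.float64)
        fgd_model = np.zeros((1, 65), np.float64)
        cv2.grabCut(image, mask, None, bgd_model, fgd_model,
                    iterations, cv2.GC_INIT_WITH_MASK)

        # Binary mask (255 = object) combining definite and probable foreground.
        fg = (mask == cv2.GC_FGD) | (mask == cv2.GC_PR_FGD)
        return fg.astype(np.uint8) * 255

In such a sketch, the caller would collect (x, y) gaze samples while the user traces the target, record the gaze position at the confirming blink, and pass both to segment_from_gaze together with the scene image.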
Related papers
50 in total (items [21]–[30] shown below)
  • [21] Chi, Yanling; Leung, Maylor K. H. Part-based object retrieval in cluttered environment. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2007, 29(5): 890-895.
  • [22] Evinger, C.; Manning, K. A.; Pellegrini, J. J.; Basso, M. A.; Powers, A. S.; Sibony, P. A. Not looking while leaping: the linkage of blinking and saccadic gaze shifts. Experimental Brain Research, 1994, 100(2): 337-344.
  • [23] Lochbihler, Aidan; Wallace, Bruce; Van Benthem, Kathleen; Herdman, Chris; Sloan, Will; Brightman, Kirsten; Goubran, Rafik; Knoefel, Frank; Marshall, Shawn. Metrics in a Dynamic Gaze Environment. 2024 IEEE International Symposium on Medical Measurements and Applications (MeMeA 2024), 2024.
  • [24] Murray, Norman; Roberts, Dave. Comparison of head gaze and head and eye gaze within an immersive environment. DS-RT 2006: Tenth IEEE International Symposium on Distributed Simulation and Real-Time Applications, Proceedings, 2006: 70+.
  • [25] Huckauf, Anke; Urbina, Mario H. On Object Selection in Gaze Controlled Environments. Journal of Eye Movement Research, 2008, 2(4).
  • [26] de Croon, G. C. H. E.; Postma, E. O.; van den Herik, H. J. Adaptive Gaze Control for Object Detection. Cognitive Computation, 2011, 3: 264-278.
  • [27] Rabinowitz, Nancy Sorkin. Women as Subject and Object of the Gaze in Tragedy. Helios, 2013, 40(1-2): 195-221.
  • [28] Zhang, Yumeng; Meng, Cai. Iris Segmentation Based on Ellipse Detection for Gaze Tracking System. Proceedings of the 2017 2nd International Conference on Materials Science, Machinery and Energy Engineering (MSMEE 2017), 2017, 123: 937-941.
  • [29] Yan Chi; Gao Yunfei; Hu Saisai; Song Fangxing; Wang Yonghui; Zhao Jingjing. The impact and mechanism of gaze cues on object-based attention. Acta Psychologica Sinica, 2022, 54(7): 748-760.
  • [30] de Croon, G. C. H. E.; Postma, E. O.; van den Herik, H. J. Adaptive Gaze Control for Object Detection. Cognitive Computation, 2011, 3(1): 264-278.