Everyday Eye Contact Detection Using Unsupervised Gaze Target Discovery

Cited by: 41
Authors
Zhang, Xucong [1 ]
Sugano, Yusuke [2 ]
Bulling, Andreas [1 ]
Affiliations
[1] Max Planck Inst Informat, Saarland Informat Campus, Saarbrucken, Germany
[2] Osaka Univ, Grad Sch Informat Sci & Technol, Suita, Osaka, Japan
Keywords
Eye Contact; Appearance-Based Gaze Estimation; Attentive User Interfaces; Social Signal Processing; VISUAL FOCUS; ATTENTION; TRACKING;
DOI
10.1145/3126594.3126614
Chinese Library Classification (CLC)
TP31 [Computer Software];
Discipline codes
081202 ; 0835 ;
Abstract
Eye contact is an important non-verbal cue in social signal processing and a promising measure of overt attention in human-object interactions and attentive user interfaces. However, robust detection of eye contact across different users, gaze targets, camera positions, and illumination conditions is notoriously challenging. We present a novel method for eye contact detection that combines a state-of-the-art appearance-based gaze estimator with a novel approach for unsupervised gaze target discovery, i.e., without the need for tedious and time-consuming manual data annotation. We evaluate our method in two real-world scenarios: detecting eye contact at the workplace, including on the main work display, from cameras mounted to target objects, as well as during everyday social interactions with the wearer of a head-mounted egocentric camera. We empirically evaluate the performance of our method in both scenarios and demonstrate its effectiveness for detecting eye contact independent of target object type and size, camera position, and user and recording environment.
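The core idea of unsupervised gaze target discovery can be illustrated with a minimal sketch: if the camera sits on or near the target object, frames with eye contact should form a dense cluster of similar estimated gaze directions, so the target can be found as the densest region of gaze estimates and frames near it labelled as eye contact. The sketch below is an assumption-based simplification, not the authors' actual pipeline; the function name, thresholds, and the simple density-peak clustering are all illustrative.

```python
import numpy as np

def discover_gaze_target(gaze_xy, eps=3.0, min_frac=0.2):
    """Simplified sketch of unsupervised gaze target discovery.

    gaze_xy: (N, 2) array of estimated gaze angles (degrees), one per frame.
    Assumes frames with eye contact cluster tightly around the direction
    of the camera-mounted target.  Returns the cluster centre and a
    boolean eye-contact label per frame (None if no dense cluster exists).
    """
    # Density peak: the sample with the most neighbours within radius eps.
    d = np.linalg.norm(gaze_xy[:, None, :] - gaze_xy[None, :, :], axis=-1)
    neighbours = (d < eps).sum(axis=1)
    centre = gaze_xy[np.argmax(neighbours)]
    # Frames whose gaze falls within eps of the centre count as eye contact.
    labels = np.linalg.norm(gaze_xy - centre, axis=1) < eps
    # Reject spurious peaks that cover too few frames.
    if labels.mean() < min_frac:
        return None, np.zeros(len(gaze_xy), dtype=bool)
    return centre, labels
```

In practice one would replace the toy density peak with a proper clustering step and feed it gaze estimates from an appearance-based estimator; the pairwise-distance computation here is O(N²) and only suitable for short recordings.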
Pages: 193-203
Page count: 11