Estimating 3D spatiotemporal point of regard: a device evaluation

Cited: 0
Authors
Wagner, Peter [1 ,2 ]
Ho, Arthur [1 ,2 ]
Kim, Juno [2 ]
Affiliations
[1] Brien Holden Vis Inst Ltd, Lv 4,RMB North Wing,14 Barker Str, Sydney, NSW 2052, Australia
[2] Univ New South Wales, Sch Optometry & Vis Sci, Lv 3,RMB North Wing,14 Barker Str, Sydney, NSW 2052, Australia
Keywords
PIVOT POINT; EYE; LOCATION; PUPIL; ACCURACY; TRACKING; HUMANS; MODEL;
DOI
10.1364/JOSAA.457663
Chinese Library Classification (CLC)
O43 [Optics];
Discipline Codes
070207; 0803;
Abstract
This paper presents and evaluates a system and method that record spatiotemporal scene information and the location of the center of visual attention, i.e., the spatiotemporal point of regard (PoR), in ecological environments. A primary research application of the proposed system and method is to enhance current 2D visual attention models. Current eye-tracking approaches collapse a scene's depth structure into a 2D image, omitting visual cues that trigger important functions of the human visual system (e.g., accommodation and vergence). We combined head-mounted eye-tracking with a miniature time-of-flight camera to produce a system that could be used to estimate the spatiotemporal location of the PoR (the point of highest visual attention) within 3D scene layouts. Maintaining calibration accuracy is a primary challenge for gaze mapping; hence, we measured accuracy repeatedly by matching the PoR to fixated targets arranged across a range of working distances in depth. Accuracy was estimated as the deviation of the estimated PoR from the known locations of scene targets. We found that estimates of 3D PoR had an overall accuracy of approximately 2 degrees omnidirectional mean average error (OMAE), with variation over a 1 h recording maintained within 3.6 degrees OMAE. This method can be used to determine accommodation and vergence cues of the human visual system continuously within habitual environments, including everyday applications (e.g., use of hand-held devices). (c) 2022 Optica Publishing Group
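The abstract reports accuracy as an angular deviation between the estimated PoR and known target locations, summarized as an omnidirectional mean average error (OMAE). As a rough illustration of that kind of metric, the sketch below computes the mean angular deviation (in degrees) between rays from a viewing origin to estimated versus true 3D fixation points. The function names and the choice of a single cyclopean origin are assumptions for illustration, not the paper's actual implementation.

```python
import numpy as np

def angular_error_deg(origin, estimated, target):
    """Angle in degrees between the rays origin->estimated and origin->target."""
    v1 = estimated - origin
    v2 = target - origin
    cos_angle = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    # Clip guards against floating-point values slightly outside [-1, 1].
    return np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0)))

def omae(origin, estimated_points, target_points):
    """Mean angular error over all fixations (an OMAE-style summary)."""
    errors = [angular_error_deg(origin, e, t)
              for e, t in zip(estimated_points, target_points)]
    return float(np.mean(errors))
```

For example, an estimated PoR displaced vertically by tan(2 deg) at 1 unit of depth from a target straight ahead yields a 2-degree angular error under this definition.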
Pages: 1343-1351
Page count: 9
Related Papers
50 in total (first 10 listed)
  • [1] A General Method for the Point of Regard Estimation in 3D Space
    Pirri, Fiora
    Pizzoli, Matia
    Rudi, Alessandro
    2011 IEEE CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2011, : 921 - 928
  • [2] Estimating 3D motion and position of point target
    Ouyang, GH
    Sun, JX
    Li, H
    Wang, WH
    ULTRAHIGH- AND HIGH-SPEED PHOTOGRAPHY AND IMAGE-BASED MOTION MEASUREMENT, 1997, 3173 : 386 - 394
  • [3] A Microfluidic Probe Integrated Device for Spatiotemporal 3D Chemical Stimulation in Cells
    Shinha, Kenta
    Nihei, Wataru
    Kimura, Hiroshi
    MICROMACHINES, 2020, 11 (07)
  • [4] Spatiotemporal Learning of Dynamic Gestures from 3D Point Cloud Data
    Owoyemi, Joshua
    Hashimoto, Koichi
    2018 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA), 2018, : 5929 - 5934
  • [5] A Novel Method for Estimating Free Space 3D Point-of-Regard Using Pupillary Reflex and Line-of-Sight Convergence Points
    Wan, Zijing
    Wang, Xiangjun
    Zhou, Kai
    Chen, Xiaoyun
    Wang, Xiaoqing
    SENSORS, 2018, 18 (07)
  • [6] 3D Scaner (3D input device)
    Sato, Koki
    Journal of the Institute of Image Electronics Engineers of Japan, 2015, 44 (02) : 282 - 284
  • [7] Estimating 3D Gaze Point on Object Using Stereo Scene Cameras
    Wan, Zhonghua
    Xiong, Caihua
    INTELLIGENT ROBOTICS AND APPLICATIONS, ICIRA 2017, PT I, 2017, 10462 : 323 - 329
  • [8] Estimating Muscle Activity from the Deformation of a Sequential 3D Point Cloud
    Niu, Hui
    Ito, Takahiro
    Desclaux, Damien
    Ayusawa, Ko
    Yoshiyasu, Yusuke
    Sagawa, Ryusuke
    Yoshida, Eiichi
    JOURNAL OF IMAGING, 2022, 8 (06)
  • [9] An evaluation of 3D printing for the manufacture of a binaural recording device
    O'Connor, Daragh
    Kennedy, John
    APPLIED ACOUSTICS, 2021, 171
  • [10] Estimating the 3D center point of an object with Kinect sensor RGB-D images
    Armenio, Gustavo Fardo
    Fabro, Joao Alberto
    Tognella, Renzo de Rosa
    Conter, Felipe Pierre
    de Oliveira, Marlon Vaz
    Silva, Everson de Souza
    2023 LATIN AMERICAN ROBOTICS SYMPOSIUM, LARS, 2023 BRAZILIAN SYMPOSIUM ON ROBOTICS, SBR, AND 2023 WORKSHOP ON ROBOTICS IN EDUCATION, WRE, 2023, : 478 - 483