Inferring Human Knowledgeability from Eye Gaze in Mobile Learning Environments

Cited by: 4
Authors
Celiktutan, Oya [1 ]
Demiris, Yiannis [1 ]
Affiliations
[1] Imperial Coll London, Dept Elect & Elect Engn, Personal Robot Lab, London, England
Funding
EU Horizon 2020;
Keywords
Assistive mobile applications; Noninvasive gaze tracking; Analysis of eye movements; Human knowledgeability prediction;
DOI
10.1007/978-3-030-11024-6_13
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
What people look at during a visual task reflects an interplay between ocular motor functions and cognitive processes. In this paper, we study the links between eye gaze and cognitive states to investigate whether eye gaze reveals information about an individual's knowledgeability. We focus on a mobile learning scenario where a user and a virtual agent play a quiz game on a hand-held mobile device. To the best of our knowledge, this is the first attempt to predict a user's knowledgeability from eye gaze using a noninvasive eye tracking method on mobile devices: we perform gaze estimation with the front-facing camera of the mobile device rather than with specialised eye tracking hardware. First, we define a set of eye movement features that are discriminative for inferring a user's knowledgeability. Next, we train a model to predict the user's knowledgeability in the course of responding to a question. Using eye movement features only, we obtain a classification accuracy of 59.1%, matching human performance, which has implications for (1) adapting the virtual agent's behaviour to the user's needs (e.g., the virtual agent can give hints) and (2) personalising quiz questions to the user's perceived knowledgeability.
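The abstract describes a two-stage pipeline: estimate gaze with the device's front-facing camera, then derive eye-movement features per question and classify the user's knowledgeability. The Python sketch below illustrates the second stage under assumed details: the feature set (fixation count, mean fixation duration, saccade speed, gaze dispersion), the saccade speed threshold, and the SVM classifier are illustrative placeholders, not the paper's exact features or model.

```python
# Illustrative sketch only: a minimal feature-extraction-plus-classifier
# pipeline in the spirit of the abstract. The feature set, the saccade
# speed threshold, and the SVM are assumptions, not the authors' method.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def gaze_features(xy, fps=30.0, saccade_speed=50.0):
    """Summarise one gaze trace (N x 2 screen coordinates in pixels).

    A sample is treated as part of a saccade when point-to-point speed
    exceeds `saccade_speed` px/frame (an assumed threshold); contiguous
    non-saccade runs are counted as fixations.
    """
    speed = np.linalg.norm(np.diff(xy, axis=0), axis=1)   # px per frame
    fix = speed <= saccade_speed                          # fixation samples
    # A fixation starts at sample 0 or at a saccade-to-fixation transition.
    n_fix = max(1, int(fix[0]) + int(np.sum(fix[1:] & ~fix[:-1])))
    return np.array([
        n_fix,                                        # fixation count
        fix.sum() / fps / n_fix,                      # mean fixation duration (s)
        speed[~fix].mean() if (~fix).any() else 0.0,  # mean saccade speed
        xy[:, 0].std(),                               # horizontal gaze dispersion
        xy[:, 1].std(),                               # vertical gaze dispersion
    ])

# Toy stand-in data: 40 synthetic gaze traces with random binary labels
# ("knowledgeable" vs. not), in place of real per-question recordings.
rng = np.random.default_rng(0)
X = np.array([gaze_features(rng.uniform(0, 1080, (150, 2))) for _ in range(40)])
y = rng.integers(0, 2, size=40)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
print("5-fold CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```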
Pages: 193-209
Page count: 17
Related Papers
50 records in total
  • [1] Inferring Search User Language Proficiency from Eye Gaze Data
    Steichen, Ben
    Kosasih, Wilsen
    Becerra, Christian
    CHIIR'22: PROCEEDINGS OF THE 2022 CONFERENCE ON HUMAN INFORMATION INTERACTION AND RETRIEVAL, 2022, : 211 - 220
  • [2] Inferring user action with mobile gaze tracking
    Toivanen, Miika
    Puolamäki, Kai
    Lukander, Kristian
    Häkkinen, Jukka
    Radun, Jenni
    PROCEEDINGS OF THE 18TH INTERNATIONAL CONFERENCE ON HUMAN-COMPUTER INTERACTION WITH MOBILE DEVICES AND SERVICES (MOBILEHCI 2016), 2016, : 1026 - 1028
  • [3] INFERRING TARGETS FROM GAZE
    Lo, Anthony H. P.
    So, Richard H. Y.
    Shi, Bertram E.
    2014 IEEE INTERNATIONAL WORKSHOP ON MACHINE LEARNING FOR SIGNAL PROCESSING (MLSP), 2014,
  • [4] Inferring Cognitive Style from Eye Gaze Behavior During Information Visualization Usage
    Steichen, Ben
    Fu, Bo
    Nguyen, Tho
    UMAP'20: PROCEEDINGS OF THE 28TH ACM CONFERENCE ON USER MODELING, ADAPTATION AND PERSONALIZATION, 2020, : 348 - 352
  • [5] Deep gaze pooling: Inferring and visually decoding search intents from human gaze fixations
    Sattar, Hosnieh
    Fritz, Mario
    Bulling, Andreas
    NEUROCOMPUTING, 2020, 387 : 369 - 382
  • [6] Unsupervised Learning of Eye Gaze Representation from the Web
    Dubey, Neeru
    Ghosh, Shreya
    Dhall, Abhinav
    2019 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2019,
  • [7] Inferring Human Gaze from Appearance via Adaptive Linear Regression
    Lu, Feng
    Sugano, Yusuke
    Okabe, Takahiro
    Sato, Yoichi
    2011 IEEE INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV), 2011, : 153 - 160
  • [8] Learning Attributes from Human Gaze
    Murrugarra-Llerena, Nils
    Kovashka, Adriana
    2017 IEEE WINTER CONFERENCE ON APPLICATIONS OF COMPUTER VISION (WACV 2017), 2017, : 510 - 519
  • [9] Gaze estimation of human eye
    Sun, Xinghua
    Chen, Guoyong
    Zhao, Chunxia
    Yang, Jingyu
    2006 6TH INTERNATIONAL CONFERENCE ON ITS TELECOMMUNICATIONS PROCEEDINGS, 2006, : 310+
  • [10] Gestatten: Estimation of User's Attention in Mobile MOOCs from Eye Gaze and Gaze Gesture Tracking
    Kar, P.
    Chattopadhyay, S.
    Chakraborty, S.
    Proceedings of the ACM on Human-Computer Interaction, 2020, 4 (EICS)