Gestatten: Estimation of User's Attention in Mobile MOOCs from Eye Gaze and Gaze Gesture Tracking

Cited by: 5
Authors
Kar P. [1 ]
Chattopadhyay S. [1 ]
Chakraborty S. [2 ]
Affiliations
[1] Jadavpur University, Salt Lake, Kolkata, West Bengal
[2] Indian Institute of Technology Kharagpur, Kharagpur
Keywords
attention estimation; gaze gesture; MOOC; region of gaze
DOI
10.1145/3394974
Abstract
The rapid proliferation of Massive Open Online Courses (MOOCs) has led to a many-fold increase in sharing global classrooms through customized online platforms, where a student can participate in classes through her personal devices, such as personal computers, smartphones, and tablets. However, in the absence of direct interaction with students during the delivery of lectures, it becomes difficult to judge their involvement in the classroom. In academics, the degree of a student's attention can indicate whether a course is efficacious in terms of clarity and information; automated feedback can hence be generated to enhance the utility of the course. Human attention is hard to gauge precisely through direct observation alone. However, visual patterns indicating the degree of concentration can be deciphered by analyzing where an individual places her visual emphasis and how she visually gesticulates while contemplating the object of interest. In this paper, we develop a methodology called Gestatten, which captures a learner's attentiveness from her visual gesture patterns. In this approach, the learner's visual gestures are tracked along with her region of focus. We consider two aspects in this approach: first, we do not transfer the learner's video outside her device, instead applying in-device computing to protect her privacy; second, considering that a majority of learners use handheld devices such as smartphones to watch MOOC videos, we develop a lightweight approach for in-device computation. A three-level estimation of the learner's attention is performed based on this information. We have implemented and tested Gestatten with 48 participants from different age groups, and we observe that the proposed technique can capture the attention level of a learner with high accuracy (average absolute error rate of 8.68%), which matches her ability to learn a topic as measured through a set of cognitive tests. © 2020 ACM.
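The abstract describes a three-level attention estimate computed on-device from gaze features. The following is a purely illustrative sketch, not the authors' implementation: it maps two hypothetical gaze statistics (fraction of gaze samples on the video region, and a normalized gaze-dispersion measure) to the kind of three-level label the paper reports. All feature names and thresholds here are assumptions for illustration.

```python
# Illustrative sketch only (not Gestatten's actual algorithm):
# classify attention into three levels from simple gaze statistics,
# cheap enough to run on-device as the abstract requires.

def attention_level(on_screen_ratio, gaze_dispersion):
    """Return 'high', 'medium', or 'low' attention.

    on_screen_ratio: fraction of gaze samples falling on the lecture
                     video region (0.0 to 1.0).
    gaze_dispersion: normalized spread of gaze points over a time
                     window; values near 1.0 suggest a wandering gaze.
    """
    # Hypothetical score: attentive viewers look at the video region
    # often (high ratio) with a steady gaze (low dispersion).
    score = on_screen_ratio * (1.0 - min(gaze_dispersion, 1.0))
    if score >= 0.66:
        return "high"
    if score >= 0.33:
        return "medium"
    return "low"

print(attention_level(0.95, 0.10))  # steady, mostly on-screen gaze -> high
print(attention_level(0.40, 0.80))  # scattered, often off-screen -> low
```

In practice the paper derives its estimate from tracked gaze gestures and the region of gaze rather than from two scalar features; this sketch only conveys the shape of a lightweight three-level classifier.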