Deep learning investigation for chess player attention prediction using eye-tracking and game data

Cited by: 11
Authors
Le Louedec, Justin [1 ]
Guntz, Thomas [1 ]
Crowley, James L. [1 ]
Vaufreydaz, Dominique [1 ]
Affiliations
[1] Univ Grenoble Alpes, CNRS, INRIA, Grenoble INP, LIG, F-38000 Grenoble, France
Keywords
Deep neural network; Computer vision; Visual attention; Chess
DOI
10.1145/3314111.3319827
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
This article reports on an investigation of the use of convolutional neural networks to predict the visual attention of chess players. The visual attention model described in this article was created to generate saliency maps that capture hierarchical and spatial features of the chessboard, in order to predict the probability of fixation for individual pixels. Using a skip-layer autoencoder architecture with a unified decoder, we are able to use multiscale features to predict the saliency of parts of the board at different scales, revealing multiple relations between pieces. We used scan-path and fixation data from players engaged in solving chess problems to compute 6600 saliency maps associated with the corresponding chess piece configurations. This corpus is complemented with synthetically generated data from actual games gathered from an online chess platform. Experiments conducted using both scan-paths from chess players and the CAT2000 saliency dataset of natural images highlight several results. Deep features pretrained on natural images were found to be helpful in training visual attention prediction for chess. The proposed neural network architecture is able to generate meaningful saliency maps for unseen chess configurations, with good scores on standard metrics. This work provides a baseline for future work on visual attention prediction in similar contexts.
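The skip-layer idea in the abstract — coarse decoder features merged with finer encoder features before a final per-pixel fixation probability — can be sketched structurally as follows. This is only an illustration, not the authors' trained network: the 12-plane one-hot board encoding, pooling-based "encoder", and random 1x1 weights are all assumptions made for the sketch.

```python
import numpy as np

def avg_pool2(x):
    # 2x2 average pooling over a (channels, H, W) feature tensor
    c, h, w = x.shape
    return x.reshape(c, h // 2, 2, w // 2, 2).mean(axis=(2, 4))

def upsample2(x):
    # Nearest-neighbour 2x upsampling over (channels, H, W)
    return x.repeat(2, axis=1).repeat(2, axis=2)

def saliency_map(board_planes, weights):
    """Toy skip-layer saliency: board_planes is (12, 8, 8),
    weights is (12,) for a 1x1 channel-mixing projection."""
    # Encoder: multiscale views of the 8x8 board
    f1 = board_planes          # (12, 8, 8) fine scale
    f2 = avg_pool2(f1)         # (12, 4, 4)
    f3 = avg_pool2(f2)         # (12, 2, 2) coarse scale
    # Unified decoder with skip connections: upsample coarse
    # features and merge them with the finer encoder features
    d2 = upsample2(f3) + f2
    d1 = upsample2(d2) + f1    # (12, 8, 8)
    # 1x1 "convolution": weighted sum over channels -> one map
    logits = np.tensordot(weights, d1, axes=([0], [0]))  # (8, 8)
    # Softmax over all 64 squares: per-pixel fixation probability
    e = np.exp(logits - logits.max())
    return e / e.sum()

# Usage: a board with a single piece (plane 0) on square e5
planes = np.zeros((12, 8, 8))
planes[0, 4, 4] = 1.0
rng = np.random.default_rng(0)
probs = saliency_map(planes, rng.normal(size=12))
```

In the real model the pooling and upsampling steps would be learned convolutional layers, but the data flow — encode at several scales, then decode while re-injecting finer features — is the same.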
Pages: 9
Related Articles
50 in total
  • [41] Exploring visual attention to erotic stimuli using eye-tracking technology
    Lykins, Amy
    Meana, Marta
    PSYCHOPHYSIOLOGY, 2008, 45 : S3 - S3
  • [42] Esports and Visual Attention: Evaluating In-Game Advertising through Eye-Tracking during the Game Viewing Experience
    Mancini, Marco
    Cherubino, Patrizia
    Cartocci, Giulia
    Martinez, Ana
    Di Flumeri, Gianluca
    Petruzzellis, Luca
    Cimini, Michele
    Arico, Pietro
    Trettel, Arianna
    Babiloni, Fabio
    BRAIN SCIENCES, 2022, 12 (10)
  • [43] Attention-based video reframing: validation using eye-tracking
    Chamaret, Christel
    Le Meur, Olivier
    19TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION, VOLS 1-6, 2008, : 1450 - 1453
  • [44] Using EEG and Eye-Tracking to Identify Student Attention in Distance Education
    Becker, Valdecir
    Feliciano de Sa, Felipe Melo
    Cavalcanti, Daniel de Queiroz
    Alves Macedo, Joao Marcelo
    Silva, Signe
    Serrano, Paulo Henrique
    APPLICATIONS AND USABILITY OF INTERACTIVE TV, JAUTI 2023, 2024, 2140 : 119 - 133
  • [45] WHY GAME ELEMENTS MAKE LEARNING SO ATTRACTIVE? A CASE STUDY USING EYE-TRACKING TECHNOLOGY
    Borys, Magdalena
    Mitaszka, Mateusz
    Pudlo, Przemyslaw
    INTED2016: 10TH INTERNATIONAL TECHNOLOGY, EDUCATION AND DEVELOPMENT CONFERENCE, 2016, : 2774 - 2780
  • [46] Investigation of classroom management skills by using eye-tracking technology
    Coskun, Atakan
    Cagiltay, Kursat
    EDUCATION AND INFORMATION TECHNOLOGIES, 2021, 26 (03) : 2501 - 2522
  • [48] Visual Attention in Objective Image Quality Assessment: Based on Eye-Tracking Data
    Liu, Hantao
    Heynderickx, Ingrid
    IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS FOR VIDEO TECHNOLOGY, 2011, 21 (07) : 971 - 982
  • [49] Deep-SAGA: a deep-learning-based system for automatic gaze annotation from eye-tracking data
    Deane, Oliver
    Toth, Eszter
    Yeo, Sang-Hoon
    BEHAVIOR RESEARCH METHODS, 2023, 55 (03) : 1372 - 1391