Hierarchical CNN and Ensemble Learning for Efficient Eye-Gaze Detection

Cited by: 0
Authors
Kumar, G. R. Karthik [1 ]
Sandhan, Tushar [1 ]
Affiliations
[1] Indian Inst Technol, Dept Elect Engn, Percept & Intelligence Lab, Kanpur, Uttar Pradesh, India
Keywords
Eye gaze; Convolutional Neural Networks; Human intentions and focus; Ensemble Learning; Transfer Learning
DOI
10.1007/978-3-031-58174-8_44
CLC Classification Number
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
The eye, as a sensory organ, not only captures visual information but also provides critical cues about human attention and intentions. Focus, interest, and even truthfulness can be inferred from where one's gaze is directed. Gaze also influences social interactions and human behavior: it can reveal engagement, attraction, and subconscious reactions, contributing to nonverbal communication and shaping interpersonal dynamics. Eye gaze direction estimation is a crucial task in many real-world applications, such as driver drowsiness detection, human-computer interaction, and assistive technologies. Traditional single-stage CNN classifiers often underperform in such applications because some classes are under-represented in the data. Data augmentation helps to some extent, but additional techniques are needed to overcome these limitations. In this work, we propose a hierarchical CNN architecture for eye gaze direction classification that integrates the predictions of three distinct stages into an ensemble-learning-based classifier. The proposed model is trained on a dataset of eye images with ground-truth class labels for each gaze direction. Experimental results demonstrate that our proposed method achieves improved classification results on the eye gaze dataset, indicating its potential for real-world applications.
Pages: 529-541 (13 pages)
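The abstract combines the predictions of three classifier stages into an ensemble. The paper's exact fusion rule is not given in this record; a minimal sketch, assuming a plain soft-voting scheme (averaging the per-class probabilities emitted by each hypothetical stage and taking the argmax), might look like:

```python
# Hedged sketch of ensemble fusion by soft voting. The three "stages" and the
# four gaze classes below are illustrative assumptions, not the authors' model.

def soft_vote(stage_probs):
    """Average per-class probabilities across stages; return (label, averages)."""
    n_stages = len(stage_probs)
    n_classes = len(stage_probs[0])
    avg = [sum(p[c] for p in stage_probs) / n_stages for c in range(n_classes)]
    label = max(range(n_classes), key=lambda c: avg[c])
    return label, avg

# Example: three stages scoring four gaze directions (left, right, up, down).
probs = [
    [0.6, 0.2, 0.1, 0.1],  # stage 1 softmax output
    [0.5, 0.3, 0.1, 0.1],  # stage 2 softmax output
    [0.4, 0.4, 0.1, 0.1],  # stage 3 softmax output
]
label, avg = soft_vote(probs)
```

Soft voting is only one of several common fusion rules (hard majority voting and learned stacking are alternatives); which one the paper uses would need to be checked against the full text.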