Gender and gaze gesture recognition for human-computer interaction

Cited by: 35
Authors
Zhang, Wenhao [1 ]
Smith, Melvyn L. [1 ]
Smith, Lyndon N. [1 ]
Farooq, Abdul [1 ]
Affiliations
[1] Univ W England, Bristol Robot Lab, Ctr Machine Vis, T Block,Frenchay Campus,Coldharbour Lane, Bristol BS16 1QY, Avon, England
Funding
UK Engineering and Physical Sciences Research Council (EPSRC);
Keywords
Assistive HCI; Gender recognition; Eye centre localisation; Gaze analysis; Directed advertising; CLASSIFICATION; IMAGES; SHAPE;
DOI
10.1016/j.cviu.2016.03.014
Chinese Library Classification (CLC) number
TP18 [Theory of artificial intelligence];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The identification of visual cues in facial images has been widely explored in the broad area of computer vision. However, theoretical analyses are often not transformed into widespread assistive Human-Computer Interaction (HCI) systems, due to factors such as inconsistent robustness, low efficiency, large computational expense or strong dependence on complex hardware. We present a novel gender recognition algorithm, a modular eye centre localisation approach and a gaze gesture recognition method, aiming to increase the intelligence, adaptability and interactivity of HCI systems by combining demographic data (gender) and behavioural data (gaze), so as to enable development of a range of real-world assistive-technology applications. The gender recognition algorithm uses Fisher Vectors as facial features, encoded from low-level local features in facial images. We experimented with four types of low-level features: greyscale values, Local Binary Patterns (LBP), LBP histograms and the Scale Invariant Feature Transform (SIFT). The corresponding Fisher Vectors were classified using a linear Support Vector Machine. The algorithm has been tested on the FERET, LFW and FRGCv2 databases, yielding 97.7%, 92.5% and 96.7% accuracy respectively. The eye centre localisation algorithm takes a modular approach, following a coarse-to-fine, global-to-regional scheme and utilising isophote and gradient features. A Selective Oriented Gradient filter has been specifically designed to detect and remove strong gradients from eyebrows, eye corners and self-shadows (which undermine most eye centre localisation methods). The trajectories of the eye centres are then defined as gaze gestures for active HCI. The eye centre localisation algorithm has been compared with 10 other state-of-the-art algorithms with similar functionality and outperformed them in accuracy while maintaining excellent real-time performance.
The above methods have been combined in a data recovery system that can be used to implement advanced assistive-technology tools. The high accuracy, reliability and real-time performance achieved for attention monitoring, gaze gesture control and recovery of demographic data can enable the advanced human-robot interaction needed for systems that provide assistance with everyday actions, thereby improving the quality of life for the elderly and/or disabled. (C) 2016 Elsevier Inc. All rights reserved.
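The abstract names Local Binary Patterns and LBP histograms among the four low-level features that feed the Fisher Vector encoder. As a minimal illustration of that lowest-level step only, a plain 8-neighbour LBP histogram can be sketched as follows. This is an assumption-laden sketch, not the paper's implementation; the function names and the clockwise neighbour ordering are illustrative choices:

```python
import numpy as np

# Clockwise 3x3 neighbour offsets, starting at the top-left corner
# (an arbitrary but fixed ordering; any consistent ordering works).
_NEIGHBOURS = [(0, 0), (0, 1), (0, 2), (1, 2),
               (2, 2), (2, 1), (2, 0), (1, 0)]

def lbp_code(patch):
    """8-bit LBP code of a 3x3 greyscale patch: each neighbour that is
    >= the centre pixel contributes one set bit."""
    centre = patch[1, 1]
    code = 0
    for bit, (r, c) in enumerate(_NEIGHBOURS):
        if patch[r, c] >= centre:
            code |= 1 << bit
    return code

def lbp_histogram(img):
    """256-bin normalised histogram of LBP codes over a 2-D image.
    Border pixels (no full 3x3 neighbourhood) are skipped."""
    h, w = img.shape
    codes = [lbp_code(img[i:i + 3, j:j + 3])
             for i in range(h - 2) for j in range(w - 2)]
    hist = np.bincount(codes, minlength=256).astype(float)
    return hist / hist.sum()

# On a flat patch every neighbour equals the centre, so every pixel
# receives code 255 and the histogram puts all mass in that bin.
hist = lbp_histogram(np.zeros((8, 8)))
print(hist[255])  # -> 1.0
```

In the paper's pipeline such histograms would then be encoded into Fisher Vectors (gradients of descriptor log-likelihood under a Gaussian mixture) before linear-SVM classification; that encoding stage is omitted here.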
Pages: 32 - 50
Number of pages: 19
Related papers
50 records
  • [31] Human-computer interaction using gesture recognition and 3D hand tracking
    Segen, J
    Kumar, S
    1998 INTERNATIONAL CONFERENCE ON IMAGE PROCESSING - PROCEEDINGS, VOL 3, 1998, : 188 - 192
  • [32] Sandwich-structured flexible strain sensors for gesture recognition in human-computer interaction
    Chen, Guanzheng
    Zhang, Xin
    Sun, Zeng
    Luo, Xuanzi
    Fang, Guoqing
    Wu, Huaping
    Cheng, Lin
    Liu, Aiping
    EUROPEAN PHYSICAL JOURNAL-SPECIAL TOPICS, 2025,
  • [33] Emotion recognition for human-computer interaction
    Tao, Jianhua
    VIRTUAL REALITY & INTELLIGENT HARDWARE, 2021, 3 (01): 6 - 7
  • [34] Emotion recognition in human-computer interaction
    Fragopanagos, N
    Taylor, JG
    NEURAL NETWORKS, 2005, 18 (04) : 389 - 405
  • [35] A Dynamic Head Gesture Recognition Method for Real-Time Human-Computer Interaction
    Xie, Jialong
    Zhang, Botao
    Chepinskiy, Sergey A.
    Zhilenkov, Anton A.
    INTELLIGENT ROBOTICS AND APPLICATIONS, ICIRA 2021, PT III, 2021, 13015 : 235 - 245
  • [36] Emotion recognition for human-computer interaction
    Tao, Jianhua
    VIRTUAL REALITY AND INTELLIGENT HARDWARE, 2021, 3 (01):
  • [37] Emotion recognition in human-computer interaction
    Cowie, R
    Douglas-Cowie, E
    Tsapatsoulis, N
    Votsis, G
    Kollias, S
    Fellenz, W
    Taylor, JG
    IEEE SIGNAL PROCESSING MAGAZINE, 2001, 18 (01) : 32 - 80
  • [38] Application of the Support Vector Machine Algorithm based Gesture Recognition in Human-computer Interaction
    Cao, Wangcheng
    INFORMATICA-JOURNAL OF COMPUTING AND INFORMATICS, 2019, 43 (01): : 123 - 127
  • [39] Design of Human-Computer Interaction Control System Based on Hand-Gesture Recognition
    Wang Zhi-heng
    Cao Jiang-tao
    Liu Jin-guo
    Zhao Zi-qi
    2017 32ND YOUTH ACADEMIC ANNUAL CONFERENCE OF CHINESE ASSOCIATION OF AUTOMATION (YAC), 2017, : 143 - 147
  • [40] A dynamic hand gesture recognition dataset for human-computer interfaces
    Fronteddu, Graziano
    Porcu, Simone
    Floris, Alessandro
    Atzori, Luigi
    COMPUTER NETWORKS, 2022, 205