Control of mouse movements using human facial expressions

Cited by: 2
Authors
Mohamed, Abdul Wahid [1 ]
Koggalage, Ravindra [1 ]
Affiliations
[1] Inst Informat Technol, Dept Comp, Colombo, Sri Lanka
Keywords
DOI
10.1109/ICIAFS.2007.4544773
Chinese Library Classification
TP [Automation technology, computer technology]
Discipline code
0812
Abstract
In this paper, a method is proposed for creating an application capable of replacing the traditional input device (the mouse) with human facial features, using real-time video of the user's face captured by an off-the-shelf web camera. It can serve as an alternative input source for people who cannot use their hands due to disability. The proposed technique combines feature-based and image-based approaches. For detection, face candidates are first extracted quickly with a Six-Segmented Rectangular (SSR) filter and then passed to a Support Vector Machine for face verification. For face tracking, the between-the-eyes pattern is tracked with update template matching: a window the size of the feature template is scanned over the Region of Interest (ROI), and the Sum of Squared Differences between the feature template and the current frame is computed at each position. Experiments show that the system behaves satisfactorily 90% of the time with a web camera running at 15 fps and an image resolution of 320 x 240. The system consumes little CPU, allowing other processes to run smoothly.
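The tracking step described in the abstract — scanning a template-sized window over the ROI and minimizing the Sum of Squared Differences (SSD) — can be sketched as follows. This is a minimal NumPy illustration under stated assumptions, not the authors' implementation; the function name `ssd_track` and the `(x0, y0, x1, y1)` ROI convention are hypothetical.

```python
import numpy as np

def ssd_track(frame, template, roi):
    """Locate `template` inside the ROI of `frame` by minimizing the
    Sum of Squared Differences (SSD) over all window positions.

    frame:    2-D grayscale image array
    template: 2-D grayscale patch (e.g. the between-the-eyes pattern)
    roi:      (x0, y0, x1, y1) search region in frame coordinates
    Returns the best-matching top-left corner (x, y) and its SSD score.
    """
    x0, y0, x1, y1 = roi
    region = frame[y0:y1, x0:x1].astype(np.float64)
    t = template.astype(np.float64)
    th, tw = t.shape
    best_score, best_pos = np.inf, (x0, y0)
    # Slide a template-sized window over every position in the ROI.
    for y in range(region.shape[0] - th + 1):
        for x in range(region.shape[1] - tw + 1):
            patch = region[y:y + th, x:x + tw]
            score = np.sum((patch - t) ** 2)  # SSD at this offset
            if score < best_score:
                best_score = score
                best_pos = (x0 + x, y0 + y)  # back to frame coordinates
    return best_pos, best_score
```

In the paper's update-template-matching scheme, the template would be refreshed from the newly matched patch each frame, so the tracker follows gradual appearance changes; an exhaustive scan like this stays cheap because the ROI is small relative to the 320 x 240 frame.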
Pages: 13 - 18
Page count: 6
Related papers (50 total)
  • [1] Mouse Cursor Control System Using Facial Movements
    Tabuse, Masayoshi
    Mizobe, Manase
    Yoshitomi, Yasunari
    Asada, Taro
    [J]. PROCEEDINGS OF THE 2020 INTERNATIONAL CONFERENCE ON ARTIFICIAL LIFE AND ROBOTICS (ICAROB2020), 2020, : 394 - 397
  • [2] Influence of Facial Expressions on the Human Head Movements
    Kocon, Maja
    [J]. 2018 41ST INTERNATIONAL CONFERENCE ON TELECOMMUNICATIONS AND SIGNAL PROCESSING (TSP), 2018, : 442 - 445
  • [3] Eye movements in judgements of facial expressions
    Osada, Y.
    Nagasaka, Y.
    Yamazaki, R.
    [J]. PERCEPTION, 1997, 26 : 97 - 97
  • [4] Ergonomic and Human-Centered Design of Wearable Gaming Controller Using Eye Movements and Facial Expressions
    Wang, Ker-Jiun
    Zhang, Anna
    You, Kaiwen
    Chen, Fangyi
    Liu, Quanbo
    Liu, Yu
    Li, Zaiwang
    Tung, Hsiao-Wei
    Mao, Zhi-Hong
    [J]. 2018 IEEE INTERNATIONAL CONFERENCE ON CONSUMER ELECTRONICS-TAIWAN (ICCE-TW), 2018,
  • [5] Expression training for complex emotions using facial expressions and head movements
    Adams, Andra
    Robinson, Peter
    [J]. 2015 INTERNATIONAL CONFERENCE ON AFFECTIVE COMPUTING AND INTELLIGENT INTERACTION (ACII), 2015, : 784 - 786
  • [6] Altering Facial Movements Abolishes Neural Mirroring of Facial Expressions
    Birch-Hurst, Kayley
    Rychlowska, Magdalena
    Lewis, Michael B.
    Vanderwert, Ross E.
    [J]. COGNITIVE AFFECTIVE & BEHAVIORAL NEUROSCIENCE, 2022, 22 (02) : 316 - 327
  • [7] Caricatured facial movements enhance perception of emotional facial expressions
    Furl, Nicholas
    Begum, Forida
    Ferrarese, Francesca Pizzorni
    Jans, Sarah
    Woolley, Caroline
    Sulik, Justin
    [J]. PERCEPTION, 2022, 51 (05) : 313 - 343
  • [8] Emotional Expressions Reconsidered: Challenges to Inferring Emotion From Human Facial Movements
    Barrett, Lisa Feldman
    Adolphs, Ralph
    Marsella, Stacy
    Martinez, Aleix M.
    Pollak, Seth D.
    [J]. PSYCHOLOGICAL SCIENCE IN THE PUBLIC INTEREST, 2019, 20 (01) : 1 - 68
  • [9] Quantitative Evaluation of Facial Expressions and Movements of Persons While Using Video Phone
    Asada, Taro
    Yoshitomi, Yasunari
    Kato, Ryota
    Tabuse, Masayoshi
    Narumoto, Jin
    [J]. JOURNAL OF ROBOTICS NETWORKING AND ARTIFICIAL LIFE, 2015, 2 (02) : 111 - 114