A classification method of cooking operations based on eye movement patterns

Cited by: 4
Authors
Inoue, Hiroya [1 ]
Hirayama, Takatsugu [1 ]
Doman, Keisuke [2 ]
Kawanishi, Yasutomo [1 ]
Ide, Ichiro [1 ]
Deguchi, Daisuke [1 ]
Murase, Hiroshi [1 ]
Institutions
[1] Nagoya Univ, Nagoya, Aichi, Japan
[2] Chukyo Univ, Nagoya, Aichi, Japan
Keywords
cooking operations; gaze analysis; eye movement pattern; fixation; blink; N-gram; SVR
DOI
10.1145/2857491.2857500
CLC number
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
We are developing a cooking support system that coaches beginners. In this work, we focus on eye movement patterns during cooking, because gaze dynamics carry important information for understanding human behavior. The system first needs to classify typical cooking operations. In this paper, we propose a gaze-based classification method and evaluate whether eye movement patterns have the potential to classify cooking operations. We improve on a conventional N-gram model of eye movement patterns that was designed for recognizing office work. Conventionally, only the relative movement from the previous frame was used as a feature. However, since users pay attention to cooking ingredients and equipment while cooking, we add fixation as a component of the N-gram. We also consider eye blinks, which are related to the cognitive state. Unlike the conventional method, which focuses on statistical features, we consider the ordinal relations of fixations, blinks, and relative movements. The proposed method estimates the likelihood of each cooking operation by Support Vector Regression (SVR), using frequency histograms of N-grams as explanatory variables.
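The feature extraction step described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the event symbols ('F' for fixation, 'B' for blink, and direction letters for relative movement) are assumed encodings chosen here for clarity, and the resulting histogram would serve as the explanatory-variable vector passed to an SVR.

```python
from collections import Counter
from itertools import islice

def ngram_histogram(events, n=2):
    """Count N-grams over a per-frame gaze event sequence.

    `events` is a list of single-character symbols, e.g.
    'F' = fixation, 'B' = blink, 'U'/'D'/'L'/'R' = relative
    movement direction (hypothetical encoding). The returned
    Counter is the frequency histogram of N-grams that the
    abstract describes as input features for SVR.
    """
    # Slide n offset views over the sequence and zip them
    # together to enumerate every length-n window.
    grams = zip(*(islice(events, i, None) for i in range(n)))
    return Counter("".join(g) for g in grams)

# Example: a fixation run, a blink, then rightward movements.
hist = ngram_histogram(list("FFFBRR"), n=2)
# → Counter({'FF': 2, 'FB': 1, 'BR': 1, 'RR': 1})
```

Because the histogram preserves which events follow which (the ordinal relations), it distinguishes, say, a blink during a fixation from a blink during movement, which a purely statistical feature set would conflate.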
Pages: 205-208 (4 pages)