Segmentation and Recognition of Eating Gestures from Wrist Motion using Deep Learning

Cited by: 4
Authors:
Luktuke, Yadnyesh Y. [1 ]
Hoover, Adam [1 ]
Affiliations:
[1] Clemson Univ, Dept Elect & Comp Engn, Clemson, SC 29634 USA
Keywords:
Deep learning; eating gestures; energy intake; IMU sensors; segmentation; GLASSES;
DOI:
10.1109/BigData50022.2020.9378382
CLC classification:
TP18 [Artificial Intelligence Theory];
Discipline codes:
081104; 0812; 0835; 1405;
Abstract:
This paper describes a novel approach to segmenting and classifying eating gestures from wrist motion using a deep learning neural network. It is inspired by the approach of fully-convolutional neural networks in the task of image segmentation. Our idea is to segment 1D gestures the same way 2D image regions are segmented, by treating each inertial measurement unit (IMU) datum like a pixel. The novelty of our approach lies in training a neural network to recognize data points that describe an eating gesture just as it would be trained to recognize pixels describing an image region. The data for this research is known as the Clemson Cafeteria Dataset. It was collected from 276 participants who ate an unscripted meal at the Harcombe Dining Hall at Clemson University. Each meal consisted of 1-4 courses, and 488 such recordings were used for the experiments described in this paper. Sensor readings consist of measurements taken by an accelerometer (x, y, z) and a gyroscope (yaw, pitch, roll). A total of 51,614 unique gestures associated with different activities commonly seen during a meal were identified by 18 trained raters. Our neural network classifier recognized an average of 79.7% of 'bite' and 84.7% of 'drink' gestures correctly per meal. Overall, 77.7% of all gestures were recognized correctly on average per meal. This indicates that a deep learning model can successfully be used to segment eating gestures from a time series recording of IMU data using a technique similar to pixel segmentation within an image.
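The abstract's central idea, labeling each IMU datum the way a fully-convolutional network labels each pixel, can be illustrated with a minimal sketch. The code below is not the authors' model; it is a toy NumPy implementation, with randomly initialized weights and hypothetical class names, showing only the structural point: 'same'-padded 1D convolutions over a 6-channel IMU signal produce one class score vector per timestep, so the output label sequence has the same length as the input recording.

```python
import numpy as np

def conv1d_same(x, w, b):
    """Per-timestep 1D convolution with 'same' zero padding.
    x: (T, C_in) signal; w: (K, C_in, C_out) kernel; b: (C_out,) bias."""
    K = w.shape[0]
    pad = K // 2
    xp = np.pad(x, ((pad, pad), (0, 0)))        # pad only the time axis
    out = np.empty((x.shape[0], w.shape[2]))
    for t in range(x.shape[0]):
        window = xp[t:t + K]                    # (K, C_in) local context
        out[t] = np.tensordot(window, w, axes=([0, 1], [0, 1])) + b
    return out

# Hypothetical tiny fully-convolutional labeler: 6 IMU channels
# (accelerometer x/y/z + gyroscope yaw/pitch/roll) -> per-datum class
# scores over e.g. {bite, drink, other}, analogous to per-pixel scores.
rng = np.random.default_rng(0)
T = 100                                          # timesteps in one recording
x = rng.standard_normal((T, 6))                  # stand-in for real IMU data
h = np.maximum(conv1d_same(x, 0.1 * rng.standard_normal((5, 6, 16)),
                           np.zeros(16)), 0.0)   # hidden conv layer + ReLU
scores = conv1d_same(h, 0.1 * rng.standard_normal((5, 16, 3)),
                     np.zeros(3))                # per-timestep class scores
labels = scores.argmax(axis=1)                   # one label per IMU datum
print(labels.shape)                              # same length as the input
```

Because every layer is convolutional with 'same' padding, the network accepts recordings of any length and emits a dense label per sample, which is exactly the property the paper exploits to segment gestures from continuous wrist motion.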
Pages: 1368-1373
Page count: 6
Related papers
50 records total
  • [1] RECOGNITION OF SIGN LANGUAGE GESTURES USING DEEP LEARNING
    Manoj, R.
    Karthick, R. E.
    Priyadharshini, Indira R.
    Renuka, G.
    Monica
    [J]. INTERNATIONAL JOURNAL OF EARLY CHILDHOOD SPECIAL EDUCATION, 2022, 14 (05) : 508 - 516
  • [2] Wrist Ultrasound Segmentation by Deep Learning
    Zhou, Yuyue
    Rakkunedeth, Abhilash
    Keen, Christopher
    Knight, Jessica
    Jaremko, Jacob L.
    [J]. ARTIFICIAL INTELLIGENCE IN MEDICINE, AIME 2022, 2022, 13263 : 230 - 237
  • [3] Temporal signed gestures segmentation in an image sequence using deep reinforcement learning
    Kalandyk, Dawid
    Kapuscinski, Tomasz
    [J]. ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE, 2024, 131
  • [4] Automatic Shuttlecock Motion Recognition Using Deep Learning
    Zhao, Yongkang
    [J]. IEEE ACCESS, 2023, 11 : 111281 - 111291
  • [5] Segmentation of Motion Objects in Video Frames using Deep Learning
    Jiang, Feng
    Liu, Jiao
    Tian, Jiya
    [J]. INTERNATIONAL JOURNAL OF ADVANCED COMPUTER SCIENCE AND APPLICATIONS, 2023, 14 (09) : 11 - 20
  • [6] 3D Hand Gestures Segmentation and Optimized Classification Using Deep Learning
    Khan, Fawad Salam
    Mohd, Mohd Norzali Haji
    Soomro, Dur Muhammad
    Bagchi, Susama
    Khan, M. Danial
    [J]. IEEE ACCESS, 2021, 9 : 131614 - 131624
  • [7] Tongue crack recognition using segmentation based deep learning
    Yan, Jianjun
    Cai, Jinxing
    Xu, Zi
    Guo, Rui
    Zhou, Wei
    Yan, Haixia
    Xu, Zhaoxia
    Wang, Yiqin
    [J]. SCIENTIFIC REPORTS, 2023, 13 (01)
  • [9] Detecting Eating Episodes From Wrist Motion Using Daily Pattern Analysis
    Tang, Zeyu
    Patyk, Adam
    Jolly, James
    Goldstein, Stephanie P.
    Thomas, J. Graham
    Hoover, Adam
    [J]. IEEE JOURNAL OF BIOMEDICAL AND HEALTH INFORMATICS, 2024, 28 (02) : 1054 - 1065
  • [10] Wrist Motion Recognition by Using Electromyographic Signals
    Luo, Jing
    Yang, Chenguang
    Liu, Chao
    Yuan, Yuxia
    Li, Zhijun
    [J]. 2019 IEEE 4TH INTERNATIONAL CONFERENCE ON ADVANCED ROBOTICS AND MECHATRONICS (ICARM 2019), 2019, : 130 - 135