Multimodal Wearable Sensing for Sport-Related Activity Recognition Using Deep Learning Networks

Cited by: 19
Authors
Mekruksavanich, Sakorn [1]
Jitpattanakul, Anuchit [2,3]
Affiliations
[1] Univ Phayao, Sch Informat & Commun Technol, Dept Comp Engn, Phayao, Thailand
[2] King Mongkuts Univ Technol North Bangkok, Fac Appl Sci, Dept Math, Bangkok, Thailand
[3] King Mongkuts Univ Technol North Bangkok, Intelligent & Nonlinear Dynam Innovat Res Ctr, Sci & Technol Res Inst, Bangkok, Thailand
Keywords
deep learning; multimodal wearable sensor; human activity recognition; CNN; LSTM;
DOI
10.12720/jait.13.2.132-138
CLC Classification
TP [Automation Technology; Computer Technology]
Subject Classification Code
0812
Abstract
Sensor-based Human Activity Recognition (S-HAR) with wearable sensors generally recognizes simple everyday actions (walking, sitting, or standing) well, but struggles to distinguish sophisticated activities such as sports-related movements. Because such activities demand a more comprehensive, contextual, and fine-grained classification of complex human motion, simple-activity recognition systems fall short of growing real-world applications such as remote rehabilitation monitoring and sport performance tracking. This study therefore proposes an S-HAR framework for recognizing sport-related activities using multimodal wearable sensors worn at multiple body positions. The recognition performance of five deep learning networks was investigated on the public UCI-DSADS dataset. The experimental results show that the BiGRU recognition model surpasses the other deep learning networks, reaching a maximum accuracy of 99.62%.
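For context, the lines below give a minimal PyTorch sketch of a bidirectional GRU classifier over windowed multimodal sensor data. The window shape (125 time steps, 45 channels, 19 classes) follows the usual UCI-DSADS segmentation (five body-worn units, nine channels each, 5-second windows at 25 Hz); the layer sizes and dropout are illustrative assumptions, not the authors' published configuration.

    import torch
    import torch.nn as nn

    class BiGRUClassifier(nn.Module):
        """Bidirectional GRU over windowed multimodal sensor signals.

        Layer sizes are hypothetical; the paper's exact architecture is not
        given in the abstract.
        """
        def __init__(self, n_channels=45, n_classes=19, hidden=64, dropout=0.3):
            super().__init__()
            self.bigru = nn.GRU(
                input_size=n_channels,
                hidden_size=hidden,
                num_layers=2,
                batch_first=True,
                bidirectional=True,
                dropout=dropout,
            )
            self.head = nn.Sequential(
                nn.Dropout(dropout),
                nn.Linear(2 * hidden, n_classes),  # 2x for the two GRU directions
            )

        def forward(self, x):
            # x: (batch, time, channels), e.g. 125 x 45 for 5 s windows at 25 Hz
            out, _ = self.bigru(x)
            # Classify from the final time step's features of both directions
            return self.head(out[:, -1, :])

    # Example: a batch of 8 five-second windows (125 samples x 45 channels)
    model = BiGRUClassifier()
    logits = model(torch.randn(8, 125, 45))
    print(logits.shape)  # torch.Size([8, 19])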
Pages: 132-138
Page count: 7