Enhanced Complex Human Activity Recognition System: A Proficient Deep Learning Framework Exploiting Physiological Sensors and Feature Learning

Cited by: 9
Authors
Choudhury, Nurul Amin [1 ]
Soni, Badal [1 ]
Affiliation
[1] Natl Inst Technol Silchar, Dept Comp Sci & Engn, Cachar 788010, India
Keywords
Sensor applications; complex human activity recognition (HAR); deep learning; machine learning; daily living activities; physiological sensors
DOI
10.1109/LSENS.2023.3326126
CLC number
TM [Electrical Engineering]; TN [Electronics and Communications Technology]
Discipline codes
0808; 0809
Abstract
Human activity recognition (HAR) is the process of identifying a person's daily living activities from sensor attributes using intelligent learning algorithms. Recognizing complex human activities is difficult because capturing long-term dependencies and extracting efficient features from raw sensor data is challenging. This letter proposes an efficient and lightweight hybrid deep learning model that recognizes complex human activities using physiological electromyography (EMG) sensors and enhanced feature learning. The proposed convolutional neural network-long short-term memory (CNN-LSTM) model applies multiple 1-D convolution layers for spatial feature extraction and feeds the generated feature maps to recurrent layers that capture long-term temporal dependencies. Using a raw EMG dataset from physiological sensors with minimal preprocessing, we trained and tested the proposed model, achieving a best accuracy of 84.12% and an average accuracy of 83%. The proposed model outperformed the benchmark models by clear margins and generalized the learned patterns in significantly less computational time than other deep learning models.
Pages: 4
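As a rough illustration of the CNN-LSTM architecture summarized in the abstract (a minimal sketch, not the authors' implementation), the PyTorch model below stacks 1-D convolution layers for spatial feature extraction over raw EMG windows and feeds the resulting feature maps to an LSTM for long-term temporal modeling. The EMG channel count, window length, layer widths, and number of activity classes are assumed placeholders.

import torch
import torch.nn as nn

class CNNLSTM(nn.Module):
    # Hypothetical CNN-LSTM: 1-D convolutions extract spatial features
    # from raw EMG windows; an LSTM models long-term temporal dependencies.
    def __init__(self, n_channels=8, n_classes=10, hidden_size=64):
        super().__init__()
        # Stacked 1-D convolutions over the time axis of each EMG window.
        self.conv = nn.Sequential(
            nn.Conv1d(n_channels, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),
        )
        # Recurrent layer consumes the convolutional feature maps as a sequence.
        self.lstm = nn.LSTM(input_size=64, hidden_size=hidden_size, batch_first=True)
        self.fc = nn.Linear(hidden_size, n_classes)

    def forward(self, x):
        # x: (batch, channels, time), one windowed raw EMG segment per sample.
        feats = self.conv(x)            # (batch, 64, time/4)
        feats = feats.permute(0, 2, 1)  # (batch, time/4, 64) for the LSTM
        out, _ = self.lstm(feats)
        return self.fc(out[:, -1, :])   # classify from the last timestep

# Usage: a batch of 16 windows, 8 assumed EMG channels, 200 samples each.
model = CNNLSTM()
logits = model(torch.randn(16, 8, 200))  # -> shape (16, 10)

The permute call reshapes the feature maps into the (batch, time, features) layout that a batch_first LSTM expects; classifying from the final timestep is one common way to summarize the sequence.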