A novel Deep-Learning model for Human Activity Recognition based on Continuous Wavelet Transform

Cited by: 0
Authors
Pavliuk, Olena [1 ,2 ]
Mishchuk, Myroslav [2 ]
Affiliations
[1] Silesian Tech Univ, Ul Akad 2A, PL-44100 Gliwice, Poland
[2] Lviv Polytech Natl Univ, Stepana Bandery St 12, UA-79000 Lvov, Ukraine
Keywords
Human activity recognition; biomedical signal processing; transfer learning; continuous wavelet transform; convolutional neural network;
DOI
Not available
Chinese Library Classification (CLC)
TP [Automation technology; computer technology];
Discipline code
0812;
Abstract
Human Activity Recognition (HAR) has recently come into the spotlight of scientific research due to the development and proliferation of wearable sensors. HAR has found applications in areas such as digital health, mobile medicine, sports, abnormal activity detection and fall prevention. Neural networks have recently become a widespread method for dealing with HAR problems because of their ability to automatically extract and select features from raw sensor data. However, this approach requires extensive training datasets to perform well under diverse circumstances. This study proposes a novel Deep Learning-based model, pre-trained on the KU-HAR dataset. The raw, six-channel sensor data were first processed with the Continuous Wavelet Transform (CWT) to improve performance. Nine popular Convolutional Neural Network (CNN) architectures, as well as different wavelets and scale values, were tested to choose the best-performing combination. The proposed model was then tested on the whole UCI-HAPT dataset and on a subset of it to assess how it performs on new activities and with different amounts of training data. The results show that using the pre-trained model, especially with frozen layers, leads to improved performance, smoother gradient descent and faster training on small datasets. In addition, the model achieved a classification accuracy of 97.48% and an F1-score of 97.52% on the KU-HAR dataset, which is competitive with other state-of-the-art HAR models.
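The abstract outlines two technical steps: converting raw six-channel inertial windows into CWT scalograms, and fine-tuning the KU-HAR-pre-trained CNN on a new dataset with its layers frozen. Below is a minimal Python sketch of both steps; the pywt Morlet wavelet, the 1-64 scale range, the window shape and the adapt_pretrained helper are illustrative assumptions rather than the authors' exact configuration.

# Minimal sketch of the two steps described in the abstract (assumed details,
# not the authors' exact pipeline): CWT scalograms from six-channel IMU windows,
# then transfer learning with the pre-trained convolutional layers frozen.
import numpy as np
import pywt
import tensorflow as tf

def window_to_scalograms(window, scales=np.arange(1, 65), wavelet="morl"):
    """window: (n_samples, 6) accel+gyro window -> (len(scales), n_samples, 6) tensor."""
    channels = []
    for ch in range(window.shape[1]):
        coeffs, _ = pywt.cwt(window[:, ch], scales, wavelet)  # (n_scales, n_samples)
        channels.append(np.abs(coeffs))                       # magnitude scalogram
    return np.stack(channels, axis=-1)

def adapt_pretrained(pretrained, n_classes):
    """Hypothetical helper: freeze a KU-HAR-pre-trained CNN and attach a new head
    for the target dataset (e.g. UCI-HAPT); layer indices depend on the backbone."""
    for layer in pretrained.layers:
        layer.trainable = False                     # keep pre-trained weights fixed
    features = pretrained.layers[-2].output         # output just before the old softmax
    outputs = tf.keras.layers.Dense(n_classes, activation="softmax")(features)
    model = tf.keras.Model(pretrained.input, outputs)
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model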
Pages: 10
Related papers
50 records in total
  • [41] Deep learning for human activity recognition
    Li, Xiaoli
    Zhao, Peilin
    Wu, Min
    Chen, Zhenghua
    Zhang, Le
    Neurocomputing, 2021, 444 : 214 - 216
  • [43] Model of Micro-Leakage Point Recognition of Underground Gas Based on Continuous Wavelet Transform
    Li Hui
    Jiang Jin-bao
    Chen Xu-hui
    Peng Jin-ying
    Qiao Xiao-jun
    Wang Si-jia
    SPECTROSCOPY AND SPECTRAL ANALYSIS, 2019, 39 (12) : 3743 - 3748
  • [44] Polarization Recognition Through Scattering Media Based on Deep-Learning
    Zhuang Qiushi
    He Zewen
    Zhang Chunxu
    Xin Yu
    ACTA OPTICA SINICA, 2021, 41 (22)
  • [45] A Continuous Facial Expression Recognition Model based on Deep Learning Method
    Lin, Szu-Yin
    Tseng, Yi-Wen
    Wu, Chang-Rong
    Kung, Yun-Ching
    Chen, Yi-Zhen
    Wu, Chao-Ming
    2019 INTERNATIONAL SYMPOSIUM ON INTELLIGENT SIGNAL PROCESSING AND COMMUNICATION SYSTEMS (ISPACS), 2019,
  • [46] An Efficient Human Activity Recognition Technique Based on Deep Learning
    Khelalef, A.
    Ababsa, F.
    Benoudjit, N.
    PATTERN RECOGNITION AND IMAGE ANALYSIS, 2019, 29 (04) : 702 - 715
  • [47] A Survey of Deep Learning Based Models for Human Activity Recognition
    Nida Saddaf Khan
    Muhammad Sayeed Ghani
    Wireless Personal Communications, 2021, 120 : 1593 - 1635
  • [49] A Deep Learning Framework for Smartphone Based Human Activity Recognition
    Mallik, Manjarini
    Sarkar, Garga
    Chowdhury, Chandreyee
    MOBILE NETWORKS & APPLICATIONS, 2023, 29 (1) : 29 - 41
  • [50] Robust Human Activity Recognition based on Deep Metric Learning
    Abdu-Aguye, Mubarak G.
    Gomaa, Walid
    ICINCO: PROCEEDINGS OF THE 16TH INTERNATIONAL CONFERENCE ON INFORMATICS IN CONTROL, AUTOMATION AND ROBOTICS, VOL 1, 2019, : 656 - 663