Wearable sensor-based pattern mining for human activity recognition: deep learning approach

Cited by: 49
Authors
Bijalwan, Vishwanath [1 ]
Semwal, Vijay Bhaskar [2 ]
Gupta, Vishal [3 ]
Affiliations
[1] Inst Technol, Gopeshwar, Chamoli, India
[2] Maulana Azad Natl Inst Technol, Bhopal, India
[3] ICFAI Univ Dehradun, Dehra Dun, Uttarakhand, India
Keywords
Deep learning; Gait analysis; Human activity recognition (HAR); IMU sensor; Wearable sensor; FEATURE-SELECTION; PUSH RECOVERY; GAIT; MODEL; ROBUST; TRAJECTORIES; LOCOMOTION; ROBOT;
DOI
10.1108/IR-09-2020-0187
Chinese Library Classification
T [Industrial Technology];
Subject classification code
08;
Abstract
Purpose: This paper deals with human activity recognition using human gait patterns. It considers experimental results for seven different activities: normal walk, jogging, walking on toes, walking on heels, walking upstairs, walking downstairs and sit-ups.

Design/methodology/approach: In this research, data for the different activities are collected using a tri-axial inertial measurement unit (IMU) sensor comprising a three-axis accelerometer to capture spatial data, a three-axis gyroscope to capture orientation about each axis and a three-axis magnetometer. The sensor is wirelessly connected to the receiver and is placed at the centre-of-mass position of each subject. Data are collected from 30 subjects (11 females and 19 males) of different age groups between 10 and 45 years. The captured data are pre-processed using different filters and cubic-spline techniques and then labelled into the seven activities. For data acquisition, a Python-based GUI has been designed to analyse and display the processed data. The data are further classified using four different deep learning models: a deep neural network (DNN), a bidirectional long short-term memory network (BLSTM), a convolutional neural network (CNN) and a CNN-LSTM. The classification accuracies of these classifiers are reported to be 58%, 84%, 86% and 90%, respectively.

Findings: Activity recognition using gait was carried out in an open environment. All data were collected using an IMU sensor equipped with a gyroscope, accelerometer and magnetometer, for both offline and real-time gait-based activity recognition. The sensors demonstrated their capability to capture precise data during all seven activities. An inverse kinematics algorithm is solved to calculate the joint angles from the spatial data for all six joints: the hip, knee and ankle of the left and right legs.

Practical implications: This work helps to recognise walking activity using gait pattern analysis and to understand the different joint-angle patterns during different activities. A standalone real-time system has been designed and realised for the analysis of these seven activities.

Originality/value: The data are collected wirelessly through IMU sensors for seven activities with equal timestamps, without noise or data loss. The setup is useful for data collection in an open environment outside the laboratory for activity recognition. The paper also presents an analysis of the trajectory patterns of all seven activities.
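The abstract reports cubic-spline pre-processing of the tri-axial IMU streams and a CNN-LSTM classifier that reaches 90% accuracy over the seven activities, but it does not specify window length, channel layout or network architecture. The following Python/Keras sketch is therefore only an illustrative reconstruction under those assumptions, not the authors' implementation.

```python
# A minimal, illustrative sketch (not the authors' implementation) of the
# pipeline the abstract describes: cubic-spline resampling of IMU windows
# followed by a CNN-LSTM classifier for the seven activities. Window length,
# channel count and layer sizes are assumptions.
import numpy as np
from scipy.interpolate import CubicSpline
from tensorflow.keras import layers, models

N_CLASSES = 7   # normal walk, jogging, toe walk, heel walk, upstairs, downstairs, sit-ups
WINDOW = 128    # samples per window (assumed)
CHANNELS = 9    # 3-axis accelerometer + 3-axis gyroscope + 3-axis magnetometer

def resample_window(t, x, n=WINDOW):
    """Resample an unevenly sampled IMU window onto a uniform time grid
    with cubic splines (the abstract mentions cubic-spline pre-processing)."""
    t_uniform = np.linspace(t[0], t[-1], n)
    return np.stack([CubicSpline(t, x[:, c])(t_uniform) for c in range(x.shape[1])], axis=1)

def build_cnn_lstm(window=WINDOW, channels=CHANNELS, n_classes=N_CLASSES):
    """1-D convolutions extract short-range motion features, the LSTM models
    their temporal ordering, and a softmax layer outputs the activity class."""
    model = models.Sequential([
        layers.Input(shape=(window, channels)),
        layers.Conv1D(64, kernel_size=5, activation="relu"),
        layers.MaxPooling1D(2),
        layers.Conv1D(64, kernel_size=5, activation="relu"),
        layers.MaxPooling1D(2),
        layers.LSTM(64),
        layers.Dropout(0.5),
        layers.Dense(n_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

if __name__ == "__main__":
    # Cubic-spline resampling demo on one unevenly sampled window.
    t_raw = np.cumsum(np.random.uniform(0.005, 0.015, size=100))
    w_raw = np.random.randn(100, CHANNELS)
    w_uniform = resample_window(t_raw, w_raw)          # shape (WINDOW, CHANNELS)

    # Synthetic stand-in for labelled windows; real training would use the
    # 30-subject dataset described in the abstract.
    X = np.random.randn(256, WINDOW, CHANNELS).astype("float32")
    y = np.random.randint(0, N_CLASSES, size=256)
    model = build_cnn_lstm()
    model.fit(X, y, epochs=1, batch_size=32, verbose=0)
    print(model.predict(X[:1]).argmax(axis=-1))
```

The convolution-then-recurrence layout is the usual motivation for a CNN-LSTM hybrid over a plain CNN, which is consistent with the accuracy ordering the abstract reports (CNN 86%, CNN-LSTM 90%).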
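The findings also state that an inverse kinematics algorithm is solved to obtain hip, knee and ankle angles of both legs from the spatial data, but its formulation is not described in the record. The sketch below assumes a simple sagittal-plane (two-link) geometry with hypothetical joint coordinates, purely to illustrate how joint angles can be recovered from segment positions.

```python
# A hedged sketch of computing sagittal-plane hip and knee angles from joint
# positions. The paper's actual inverse kinematics formulation is not given
# in the abstract, so this planar two-link geometry is an assumption.
import numpy as np

def sagittal_joint_angles(hip, knee, ankle):
    """Return (hip_flexion, knee_flexion) in degrees from 2-D (x, z) positions."""
    thigh = np.asarray(knee) - np.asarray(hip)    # hip -> knee segment
    shank = np.asarray(ankle) - np.asarray(knee)  # knee -> ankle segment
    vertical = np.array([0.0, -1.0])              # downward reference direction

    def angle_between(u, v):
        cosang = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
        return np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))

    hip_flexion = angle_between(thigh, vertical)   # thigh deviation from vertical
    knee_flexion = angle_between(thigh, shank)     # relative angle between thigh and shank
    return hip_flexion, knee_flexion

# Example: a mid-swing posture (hypothetical coordinates in metres, x forward, z up)
print(sagittal_joint_angles(hip=(0.00, 0.90), knee=(0.15, 0.50), ankle=(0.10, 0.10)))
```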
Pages: 21-33
Page count: 13
Related papers
(50 records in total)
  • [31] Liu, Kai-Chun; Yen, Chien-Yi; Chang, Li-Han; Hsieh, Chia-Yeh; Chan, Chia-Tai. Wearable Sensor-Based Activity Recognition for Housekeeping Task. 2017 IEEE 14th International Conference on Wearable and Implantable Body Sensor Networks (BSN), 2017: 67-70.
  • [32] Hao, Yujiao; Zheng, Rong; Wang, Boyu. Invariant Feature Learning for Sensor-Based Human Activity Recognition. IEEE Transactions on Mobile Computing, 2022, 21(11): 4013-4024.
  • [33] Chai, Yidong; Liu, Haoxin; Zhu, Hongyi; Pan, Yue; Zhou, Anqi; Liu, Hongyan; Liu, Jianwei; Qian, Yang. A profile similarity-based personalized federated learning method for wearable sensor-based human activity recognition. Information & Management, 2024, 61(7).
  • [34] Qin, Zhen; Zhang, Yibo; Meng, Shuyu; Qin, Zhiguang; Choo, Kim-Kwang Raymond. Imaging and fusing time series for wearable sensor-based human activity recognition. Information Fusion, 2020, 53: 80-87.
  • [35] Li, Gang; Zhu, Chun; Du, Jianhao; Cheng, Qi; Sheng, Weihua; Chen, Heping. Robot Semantic Mapping through Wearable Sensor-based Human Activity Recognition. 2012 IEEE International Conference on Robotics and Automation (ICRA), 2012: 5228-5233.
  • [36] Zhan, Yi; Kuroda, Tadahiro. Wearable sensor-based human activity recognition from environmental background sounds. Journal of Ambient Intelligence and Humanized Computing, 2014, 5: 77-89.
  • [37] Zhan, Yi; Kuroda, Tadahiro. Wearable sensor-based human activity recognition from environmental background sounds. Journal of Ambient Intelligence and Humanized Computing, 2014, 5(1): 77-89.
  • [38] Xiao, Jianyu; Chen, Linlin; Chen, Haipeng; Hong, Xuemin. Baseline Model Training in Sensor-Based Human Activity Recognition: An Incremental Learning Approach. IEEE Access, 2021, 9: 70261-70272.
  • [39] Shirahama, Kimiaki; Grzegorzek, Marcin. On the Generality of Codebook Approach for Sensor-Based Human Activity Recognition. Electronics, 2017, 6(2).
  • [40] Trabelsi, Imen; Francoise, Jules; Bellik, Yacine. Sensor-based Activity Recognition using Deep Learning: A Comparative Study. Proceedings of 2022 8th International Conference on Movement and Computing (MOCO 2022), 2022.