Human Activity Recognition with IMU and Vital Signs Feature Fusion

Cited by: 0
Authors
Xefteris, Vasileios-Rafail [1 ]
Tsanousa, Athina [1 ]
Mavropoulos, Thanassis [1 ]
Meditskos, Georgios [2 ]
Vrochidis, Stefanos [1 ]
Kompatsiaris, Ioannis [1 ]
Affiliations
[1] Informat Technol Inst, Ctr Res & Technol Hellas, 6th Km Charilaou Thermi, Thessaloniki 57001, Greece
[2] Aristotle Univ Thessaloniki, Sch Informat, Thessaloniki, Greece
Funding
EU Horizon 2020
Keywords
Human activity recognition; Wearable sensors; Vital signals; Sensor fusion; Feature selection;
DOI
10.1007/978-3-030-98358-1_23
CLC Classification Number
TP18 [Theory of Artificial Intelligence]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Combining data from different sources into an integrated view is a recent trend that takes advantage of the evolution of the Internet of Things (IoT) in recent years. The fusion of different modalities has applications in various fields, including healthcare and security systems. Human activity recognition (HAR) is among the most common applications of a healthcare or eldercare system. Inertial measurement unit (IMU) wearable sensors, such as accelerometers and gyroscopes, are often utilized for HAR applications. In this paper, we investigate the performance of wearable IMU sensors along with vital signs sensors for HAR. A massive feature extraction, including both time- and frequency-domain features and transitional features for the vital signs, was performed, followed by a feature selection method. Several classification algorithms and different early and late fusion methods were applied to a public dataset. Experimental results revealed that both IMU and vital signs achieve reasonable HAR accuracy and F1-score across all classes. Feature selection significantly reduced the number of both IMU and vital signs features while also improving the classification accuracy. The early- and late-level fusion methods also performed better than each modality alone, reaching an accuracy of up to 95.32%.
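For illustration, the sketch below contrasts the early and late fusion strategies referred to in the abstract, assuming scikit-learn and randomly generated stand-ins for the per-window feature matrices. The names X_imu, X_vital and y, and the SelectKBest/RandomForest choices, are illustrative assumptions, not the authors' exact pipeline.

```python
# Minimal sketch of early vs. late feature fusion for HAR (illustrative only;
# feature matrices are random placeholders, not the paper's dataset).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, f1_score

rng = np.random.default_rng(0)
X_imu = rng.normal(size=(500, 60))    # IMU time/frequency-domain features (placeholder)
X_vital = rng.normal(size=(500, 20))  # vital-signs features, incl. transitional (placeholder)
y = rng.integers(0, 4, size=500)      # activity labels (placeholder)

idx_train, idx_test = train_test_split(np.arange(len(y)), test_size=0.3, random_state=0)

# Early fusion: concatenate modality features, select a subset, then classify.
X_early = np.hstack([X_imu, X_vital])
selector = SelectKBest(f_classif, k=30).fit(X_early[idx_train], y[idx_train])
clf_early = RandomForestClassifier(random_state=0).fit(
    selector.transform(X_early[idx_train]), y[idx_train])
pred_early = clf_early.predict(selector.transform(X_early[idx_test]))

# Late fusion: train one classifier per modality, then average class probabilities.
clf_imu = RandomForestClassifier(random_state=0).fit(X_imu[idx_train], y[idx_train])
clf_vital = RandomForestClassifier(random_state=0).fit(X_vital[idx_train], y[idx_train])
proba = (clf_imu.predict_proba(X_imu[idx_test]) +
         clf_vital.predict_proba(X_vital[idx_test])) / 2
pred_late = proba.argmax(axis=1)

for name, pred in [("early fusion", pred_early), ("late fusion", pred_late)]:
    print(name, accuracy_score(y[idx_test], pred),
          f1_score(y[idx_test], pred, average="macro"))
```

With real features, the averaging step in the late-fusion branch could be replaced by any other decision-level combination rule (e.g. weighted voting), which is the kind of variation the abstract groups under "late fusion methods".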
Pages: 287-298
Page count: 12