HL-HAR: Hierarchical Learning Based Human Activity Recognition in Wearable Computing

Cited: 0
Authors
Liu, Yan [1 ]
Zhao, Wentao [1 ]
Liu, Qiang [1 ]
Yu, Linyuan [1 ]
Wang, Dongxu [1 ]
Affiliations
[1] Natl Univ Def Technol, Coll Comp, Changsha 410073, Hunan, Peoples R China
Keywords
Activity recognition; Wearable devices; Machine learning; Hierarchical extreme learning machine
DOI
10.1007/978-3-319-68542-7_59
CLC Number
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
In recent years, human activity recognition from wearable-sensor data has seen many successes, and many of these applications rely on data from a smartphone. In practice, however, recognition remains challenging for two reasons: most methods achieve high precision at the cost of increased memory consumption, or they require complex data sources. In this paper we (1) apply Plus-L Minus-R selection to single out the optimal combination of the extracted features, and (2) introduce a fast classification method, the hierarchical extreme learning machine (H-ELM), to address the high memory consumption of the computation. The main benefit of this approach is that it reduces memory usage and improves recognition accuracy with a compact feature vector, so that a wearable device can identify activities entirely on its own, even when it is away from a cellphone. Our results show that the method recognizes target activities with an overall accuracy of 93.7% in a very short time on the Human Activity Recognition Using Smartphones dataset. The selected 25-dimensional feature vector retains nearly all of the discriminative information and achieves consistently high accuracy over repeated tests. Moreover, the method's training speed outperforms the state of the art in the human activity recognition domain.
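The two components named in the abstract can be illustrated briefly. The first sketch below shows Plus-L Minus-R sequential feature selection in Python; it is a sketch under stated assumptions (scikit-learn available, a k-NN classifier standing in as the scoring model, L=3 and R=2), not the authors' implementation, and all function names are hypothetical.

```python
# Minimal sketch of Plus-L Minus-R sequential feature selection.
# Assumptions (not from the paper): scikit-learn is available, a k-NN
# classifier stands in for the paper's scorer, and L=3, R=2.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

def subset_score(X, y, features):
    """Cross-validated accuracy using only the chosen feature columns."""
    clf = KNeighborsClassifier(n_neighbors=5)
    return cross_val_score(clf, X[:, features], y, cv=3).mean()

def plus_l_minus_r(X, y, target_dim=25, L=3, R=2):
    """Repeatedly add the L most helpful features, then drop the R least
    helpful ones, until target_dim features remain (L > R, so the set grows)."""
    selected, remaining = [], list(range(X.shape[1]))
    while len(selected) < target_dim:
        for _ in range(L):   # plus-L: add the best remaining feature
            best = max(remaining,
                       key=lambda f: subset_score(X, y, selected + [f]))
            selected.append(best)
            remaining.remove(best)
        for _ in range(R):   # minus-R: drop the least useful selected feature
            worst = min(selected,
                        key=lambda f: subset_score(
                            X, y, [s for s in selected if s != f]))
            selected.remove(worst)
            remaining.append(worst)
    return sorted(selected[:target_dim])
```

The speed and memory claims rest on the extreme learning machine: hidden weights are drawn at random and only the output weights are computed, in closed form, so there is no iterative training. The second sketch shows one such ELM layer; the paper's H-ELM stacks ELM-based autoencoder layers before a final classifier of this form, which the sketch omits.

```python
import numpy as np

# Minimal single-layer extreme learning machine (ELM), the building
# block that H-ELM stacks hierarchically; a sketch, not the paper's code.
def elm_fit(X, Y, n_hidden=200, seed=0):
    """X: (n, d) features; Y: (n, c) one-hot labels."""
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((X.shape[1], n_hidden))  # input weights, never trained
    b = rng.standard_normal(n_hidden)                # hidden biases, never trained
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))           # sigmoid hidden activations
    beta = np.linalg.pinv(H) @ Y                     # closed-form least-squares fit
    return W, b, beta

def elm_predict(X, W, b, beta):
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return np.argmax(H @ beta, axis=1)               # class with the largest score
```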
Pages: 684 - 693
Page count: 10
Related Papers
50 records in total
  • [21] Wearable Computing for Internet of Things: A Discriminant Approach for Human Activity Recognition
    Lu, Wei
    Fan, Fugui
    Chu, Jinghui
    Jing, Peiguang
    Su, Yuting
    IEEE INTERNET OF THINGS JOURNAL, 2019, 6 (02) : 2749 - 2759
  • [22] Channel-Equalization-HAR: A Light-weight Convolutional Neural Network for Wearable Sensor Based Human Activity Recognition
    Huang, Wenbo
    Zhang, Lei
    Wu, Hao
    Min, Fuhong
    Song, Aiguo
    IEEE TRANSACTIONS ON MOBILE COMPUTING, 2023, 22 (09) : 5064 - 5077
  • [23] Hierarchical Learning of Dependent Concepts for Human Activity Recognition
    Osmani, Aomar
    Hamidi, Massinissa
    Alizadeh, Pegah
    ADVANCES IN KNOWLEDGE DISCOVERY AND DATA MINING, PAKDD 2021, PT II, 2021, 12713 : 79 - 92
  • [24] Recognition of human activity through hierarchical stochastic learning
    Lühr, S
    Bui, HH
    Venkatesh, S
    West, GAW
    PROCEEDINGS OF THE FIRST IEEE INTERNATIONAL CONFERENCE ON PERVASIVE COMPUTING AND COMMUNICATIONS (PERCOM 2003), 2003: 416 - 422
  • [25] Deep learning based multimodal complex human activity recognition using wearable devices
    Chen, Ling
    Liu, Xiaoze
    Peng, Liangying
    Wu, Menghan
    APPLIED INTELLIGENCE, 2021, 51 (06) : 4029 - 4042
  • [26] Wearable Sensor-Based Human Activity Recognition with Hybrid Deep Learning Model
    Luwe, Yee Jia
    Lee, Chin Poo
    Lim, Kian Ming
    INFORMATICS-BASEL, 2022, 9 (03)
  • [28] Wearable Sensor Data Classification for Human Activity Recognition Based on an Iterative Learning Framework
    Davila, Juan Carlos
    Cretu, Ana-Maria
    Zaremba, Marek
    SENSORS, 2017, 17 (06)
  • [29] HAR-DeepConvLG: Hybrid deep learning-based model for human activity recognition in IoT applications
    Ding, Weiping
    Abdel-Basset, Mohamed
    Mohamed, Reda
    INFORMATION SCIENCES, 2023, 646
  • [30] Deep Learning for Human Activity Recognition in Mobile Computing
    Plotz, Thomas
    Guan, Yu
    COMPUTER, 2018, 51 (05) : 50 - 59