Human Activity Recognition Using Multichannel Convolutional Neural Network

Cited by: 0
Authors
Sikder, Niloy [1 ]
Chowdhury, Md Sanaullah [2 ]
Arif, Abu Shamim Mohammad [1 ]
Nahid, Abdullah-Al [2 ]
Affiliations
[1] Khulna Univ, Comp Sci & Engn Discipline, Khulna, Bangladesh
[2] Khulna Univ, Elect & Commun Engn Discipline, Khulna, Bangladesh
Keywords
HAR; human action recognition; human activity classification; multichannel CNN; UCI HAR; feature extraction;
DOI
10.1109/icaee48663.2019.8975649
Chinese Library Classification (CLC)
TM [Electrical Engineering]; TN [Electronics & Communication Technology];
Discipline Classification Code
0808 ; 0809 ;
Abstract
Human Activity Recognition (HAR) refers to the capacity of a machine to perceive human actions. HAR is a prominent application of advanced Machine Learning and Artificial Intelligence techniques that utilize computer vision to understand the semantic meanings of heterogeneous human actions. This paper describes a supervised learning method that can distinguish human actions based on data collected from practical human movements. The primary challenge in HAR is overcoming the difficulties posed by the cyclostationary nature of the activity signals. This study proposes a HAR classification model based on a two-channel Convolutional Neural Network (CNN) that makes use of the frequency and power features of the collected human action signals. The model was tested on the UCI HAR dataset and achieved a classification accuracy of 95.25%. This approach should help others conduct further research on recognizing human activities from their biomedical signals.
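The abstract's core idea is to feed frequency and power representations of each fixed-length signal window into the two channels of a CNN. A minimal sketch of how such a two-channel input could be built is given below; the function name, feature choices (FFT magnitude and a periodogram power estimate), and the 128-sample, 50 Hz windowing are assumptions for illustration, not the authors' exact pipeline:

```python
import numpy as np

def two_channel_features(window, fs=50.0):
    """Turn one fixed-length signal window into a 2-channel feature map.

    Channel 0: FFT magnitude (frequency features).
    Channel 1: periodogram estimate (power features).
    This is an illustrative assumption, not the paper's published pipeline.
    """
    window = np.asarray(window, dtype=float)
    n = window.size
    spectrum = np.fft.rfft(window)             # one-sided DFT of the window
    magnitude = np.abs(spectrum)               # frequency features
    power = (magnitude ** 2) / (fs * n)        # periodogram (power features)
    return np.stack([magnitude, power])        # shape: (2, n // 2 + 1)

# Example: a 128-sample window, matching the UCI HAR framing (50 Hz, 2.56 s)
t = np.arange(128) / 50.0
window = np.sin(2 * np.pi * 1.5 * t)           # a 1.5 Hz walking-like rhythm
features = two_channel_features(window)
print(features.shape)                          # (2, 65)
```

Stacking the two spectra as channels lets a single convolutional front end learn filters over both representations jointly, which is the stated motivation for the multichannel design.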
Pages: 560-565
Number of pages: 6
Related Papers
50 records in total
  • [1] Human Activity Recognition Based on Multichannel Convolutional Neural Network With Data Augmentation
    Shi, Wenbing
    Fang, Xianjin
    Yang, Gaoming
    Huang, Ji
    [J]. IEEE ACCESS, 2022, 10 : 76596 - 76606
  • [2] Human activity recognition using temporal convolutional neural network architecture
    Andrade-Ambriz, Yair A.
    Ledesma, Sergio
    Ibarra-Manzano, Mario-Alberto
    Oros-Flores, Marvella I.
    Almanza-Ojeda, Dora-Luz
    [J]. EXPERT SYSTEMS WITH APPLICATIONS, 2022, 191
  • [3] Convolutional Neural Network for Human Activity Recognition and Identification
    Gamble, Justin A.
    Huang, Jingwei
    [J]. 2020 14TH ANNUAL IEEE INTERNATIONAL SYSTEMS CONFERENCE (SYSCON2020), 2020,
  • [4] Human Activity Recognition Based On Convolutional Neural Network
    Xu, Wenchao
    Pang, Yuxin
    Yang, Yanqin
    Liu, Yanbo
    [J]. 2018 24TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION (ICPR), 2018, : 165 - 170
  • [5] Human Activity Recognition Based on Convolutional Neural Network
    Coelho, Yves
    Rangel, Luara
    dos Santos, Francisco
    Frizera-Neto, Anselmo
    Bastos-Filho, Teodiano
    [J]. XXVI BRAZILIAN CONGRESS ON BIOMEDICAL ENGINEERING, CBEB 2018, VOL. 2, 2019, 70 (02): : 247 - 252
  • [6] Human Activity Recognition Using Robust Spatiotemporal Features and Convolutional Neural Network
    Uddin, Md Zia
    Khaksar, Weria
    Torresen, Jim
    [J]. 2017 IEEE INTERNATIONAL CONFERENCE ON MULTISENSOR FUSION AND INTEGRATION FOR INTELLIGENT SYSTEMS (MFI), 2017, : 144 - 149
  • [7] Abnormal Human Activity Recognition using Bayes Classifier and Convolutional Neural Network
    Liu, Congcong
    Ying, Jie
    Han, Feilong
    Ruan, Ming
    [J]. 2018 IEEE 3RD INTERNATIONAL CONFERENCE ON SIGNAL AND IMAGE PROCESSING (ICSIP), 2018, : 33 - 37
  • [8] Human Activity Recognition From Accelerometer Data Using Convolutional Neural Network
    Lee, Song-Mi
    Yoon, Sang Min
    Cho, Heeryon
    [J]. 2017 IEEE INTERNATIONAL CONFERENCE ON BIG DATA AND SMART COMPUTING (BIGCOMP), 2017, : 131 - 134
  • [9] Deep Convolutional Neural Networks On Multichannel Time Series For Human Activity Recognition
    Yang, Jian Bo
    Minh Nhut Nguyen
    San, Phyo Phyo
    Li, Xiao Li
    Krishnaswamy, Shonali
    [J]. PROCEEDINGS OF THE TWENTY-FOURTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE (IJCAI), 2015, : 3995 - 4001
  • [10] Method on Human Activity Recognition Based on Convolutional Neural Network
    Haibin, Zhang
    Kubota, Naoyuki
    [J]. INTELLIGENT ROBOTICS AND APPLICATIONS, ICIRA 2019, PT III, 2019, 11742 : 63 - 71