Evolutionary Design of Recurrent Neural Network Architecture for Human Activity Recognition

Cited by: 0
Authors
Viswambaran, Ramya Anasseriyil [1 ]
Chen, Gang [1 ]
Xue, Bing [1 ]
Nekooei, Mohammad [1 ]
Affiliations
[1] Victoria Univ Wellington, Sch Engn & Comp Sci, POB 600, Wellington 6140, New Zealand
Keywords
Recurrent Neural Networks; Genetic Algorithm; Long Short-Term Memory; Human Activity Recognition
DOI
10.1109/cec.2019.8790050
Chinese Library Classification
TM [Electrical Engineering]; TN [Electronics and Communication Technology];
Discipline Classification Code
0808; 0809;
Abstract
Recurrent Neural Networks (RNNs) are a major class of artificial neural networks and one of the most effective deep learning models for human activity recognition (HAR). However, designing the architecture of an RNN, together with suitable hyper-parameters, for any learning task is time-consuming and requires expert domain knowledge. This paper explores a Genetic Algorithm (GA) based method to design suitable architectures of Long Short-Term Memory (LSTM)-based RNNs in a fully automated manner. An encoding strategy is proposed to represent the architectures and hyper-parameters of the LSTM so that they can be easily manipulated by the GA. As a variant of the RNN, the LSTM is selected for this work because of its ability to handle long-term dependencies. For verification and evaluation, three real-world benchmark datasets are used. We study three modes of operation for the evolved deep LSTM-RNN: unidirectional, bidirectional and cascaded. The experimental results show that the RNN architectures automatically designed by our GA method outperform state-of-the-art RNN and machine learning systems for HAR.
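To make the idea of a GA-driven architecture search concrete, the following is a minimal sketch of how LSTM design choices (mode of operation, depth, units, dropout, learning rate) might be encoded as a chromosome and evolved. The gene names, value ranges, GA operators, and the placeholder fitness function are illustrative assumptions for this sketch only, not the paper's actual encoding; in the paper's setting the fitness would come from training the decoded LSTM-RNN on a HAR benchmark and measuring its accuracy.

# Illustrative sketch (not the authors' exact encoding): a simple GA that
# searches over LSTM architecture/hyper-parameter choices for HAR.
# Gene layout, value ranges, and the placeholder fitness are assumptions.
import random

# Candidate values for each gene; ranges are assumed for demonstration.
GENE_SPACE = {
    "mode":          ["unidirectional", "bidirectional", "cascaded"],
    "num_layers":    [1, 2, 3],
    "units":         [32, 64, 128, 256],
    "dropout":       [0.0, 0.2, 0.5],
    "learning_rate": [1e-2, 1e-3, 1e-4],
}
GENE_NAMES = list(GENE_SPACE)

def random_chromosome():
    """A chromosome maps each gene name to one allowed value."""
    return {g: random.choice(v) for g, v in GENE_SPACE.items()}

def fitness(chrom):
    """Placeholder fitness. In practice this would decode the chromosome
    into an LSTM-RNN, train it on a HAR dataset, and return validation
    accuracy. A dummy score is returned here so the sketch runs as-is."""
    return random.random()

def crossover(a, b):
    """Uniform crossover: each gene is copied from one parent at random."""
    return {g: random.choice([a[g], b[g]]) for g in GENE_NAMES}

def mutate(chrom, rate=0.1):
    """With probability `rate`, resample a gene from its candidate set."""
    return {g: (random.choice(GENE_SPACE[g]) if random.random() < rate else v)
            for g, v in chrom.items()}

def evolve(pop_size=20, generations=10, elite=2):
    population = [random_chromosome() for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(population, key=fitness, reverse=True)
        parents = scored[:pop_size // 2]           # truncation selection
        children = [mutate(crossover(*random.sample(parents, 2)))
                    for _ in range(pop_size - elite)]
        population = scored[:elite] + children     # elitism + offspring
    return max(population, key=fitness)

if __name__ == "__main__":
    best = evolve()
    print("Best architecture found:", best)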
Pages: 554-561
Page count: 8