Event-based Action Recognition Using Motion Information and Spiking Neural Networks

Cited by: 0
Authors
Liu, Qianhui [1 ,2 ]
Xing, Dong [1 ,2 ]
Tang, Huajin [1 ,2 ]
Ma, De [1 ]
Pan, Gang [1 ,2 ]
Affiliations
[1] Zhejiang Univ, Coll Comp Sci & Technol, Hangzhou, Peoples R China
[2] Zhejiang Lab, Hangzhou, Peoples R China
Keywords
DOI: Not available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Event-based cameras have attracted increasing attention due to their biologically inspired paradigm and low power consumption. Because event-based cameras record visual input as asynchronous discrete events, they are inherently well suited to spiking neural networks (SNNs). Existing SNN work on event processing focuses mainly on object recognition. However, events from an event-based camera are triggered by dynamic changes, which makes such cameras an ideal choice for capturing actions in a visual scene. Inspired by the dorsal stream of the visual cortex, we propose a hierarchical SNN architecture for event-based action recognition using motion information. Motion features are extracted from events and aggregated from local to global perception for action recognition. To the best of the authors' knowledge, this is the first attempt to apply motion information in an SNN for event-based action recognition. We evaluate the proposed SNN on three event-based action recognition datasets, including our newly published DailyAction-DVS dataset, which comprises 12 actions collected under diverse recording conditions. Extensive experimental results demonstrate the effectiveness of motion information and of our proposed SNN architecture for event-based action recognition.
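The abstract describes spiking neurons integrating asynchronous camera events from local to global perception. As a minimal sketch of how an SNN layer can consume such an event stream, the Python snippet below implements per-pixel leaky integrate-and-fire (LIF) neurons; the event tuple layout (timestamp_us, x, y, polarity), the function name lif_layer, and all parameter values are illustrative assumptions, not the architecture proposed in the paper.

import numpy as np

# Minimal sketch, assuming a DVS-style event stream of tuples
# (timestamp_us, x, y, polarity). The function name, parameter
# values, and single-layer structure are illustrative assumptions,
# not the paper's actual method.

def lif_layer(events, width=128, height=128, tau=20e3,
              threshold=1.0, weight=0.5):
    """Per-pixel leaky integrate-and-fire neurons driven by
    asynchronous events; emits an output spike whenever a
    pixel's membrane potential crosses the threshold."""
    v = np.zeros((height, width))       # membrane potentials
    last_t = np.zeros((height, width))  # last event time per pixel
    out_spikes = []
    for t, x, y, p in events:
        # Exponential leak over the time since this pixel's last event.
        v[y, x] *= np.exp(-(t - last_t[y, x]) / tau)
        last_t[y, x] = t
        # Integrate the event; ON events excite, OFF events inhibit.
        v[y, x] += weight if p else -weight
        if v[y, x] >= threshold:        # fire and reset
            out_spikes.append((t, x, y))
            v[y, x] = 0.0
    return out_spikes

# Usage: a short synthetic burst of ON events at pixel (10, 20)
# triggers one output spike once enough charge accumulates.
events = [(i * 1000, 10, 20, 1) for i in range(5)]
print(lif_layer(events))  # -> [(2000, 10, 20)]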
Pages: 1743-1749
Page count: 7