Personalised Human Activity Recognition Using Matching Networks

Cited by: 3
Authors
Sani, Sadiq [1 ]
Wiratunga, Nirmalie [1 ]
Massie, Stewart [1 ]
Cooper, Kay [2 ]
Affiliations
[1] Robert Gordon Univ, Sch Comp Sci & Digital Media, Aberdeen AB10 7GJ, Scotland
[2] Robert Gordon Univ, Sch Hlth Sci, Aberdeen AB10 7GJ, Scotland
Keywords
DEEP
DOI
10.1007/978-3-030-01081-2_23
Chinese Library Classification
TP18 [Theory of Artificial Intelligence]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Human Activity Recognition (HAR) is typically modelled as a classification task where sensor data associated with activity labels are used to train a classifier to recognise future occurrences of these activities. An important consideration when training HAR models is whether to use training data from a general population (subject-independent) or personalised training data from the target user (subject-dependent). Previous evaluations have shown personalised training to be more accurate because the resulting models better capture individual users' activity patterns. From a practical perspective, however, collecting sufficient training data from end users may not be feasible. This has made subject-independent training far more common in real-world HAR systems. In this paper, we introduce a novel approach to personalised HAR using a neural network architecture called a matching network. Matching networks perform nearest-neighbour classification by reusing the class labels of the most similar instances in a provided support set, which makes them very relevant to case-based reasoning. A key advantage of matching networks is that they use metric learning to produce feature embeddings or representations that maximise classification accuracy, given a chosen similarity metric. Evaluations show that our approach substantially outperforms general subject-independent models by at least 6% in macro-averaged F1 score.
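The classification step the abstract describes (soft nearest-neighbour voting over an embedded support set) can be illustrated with a minimal sketch. This is not the authors' implementation: the placeholder embed function, the cosine-similarity metric, and all names and shapes below are assumptions for illustration only; in the paper the embedding is a neural network trained with metric learning.

import numpy as np

# Placeholder embedding: stands in for the learned network that maps raw
# sensor windows to a feature space tuned for the chosen similarity metric.
def embed(x):
    return x / (np.linalg.norm(x, axis=-1, keepdims=True) + 1e-8)

def matching_network_predict(query, support_x, support_y, n_classes):
    # query: (d,) one unlabelled sensor window
    # support_x: (k, d) labelled support instances; support_y: (k,) labels
    q = embed(query)
    s = embed(support_x)
    sims = s @ q                                # cosine similarity (unit-norm embeddings)
    attn = np.exp(sims) / np.exp(sims).sum()    # softmax attention over the support set
    probs = np.zeros(n_classes)
    for a, y in zip(attn, support_y):           # similarity-weighted class vote
        probs[y] += a
    return int(probs.argmax()), probs

# Hypothetical usage: 3 activity classes, 6 support instances, 64-d features.
rng = np.random.default_rng(0)
support_x = rng.normal(size=(6, 64))
support_y = np.array([0, 0, 1, 1, 2, 2])
label, probs = matching_network_predict(rng.normal(size=64), support_x, support_y, 3)

In the personalised setting the abstract outlines, the support set would hold a few labelled instances from the target user, so predictions adapt to that user without retraining the embedding.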
Pages: 339 - 353
Page count: 15
Related Papers
50 items in total
  • [31] Human Activity Recognition Using 2D Convolutional Neural Networks
    Gholamrezaii, Marjan
    Almodarresi, Seyed Mohammad Taghi
    2019 27TH IRANIAN CONFERENCE ON ELECTRICAL ENGINEERING (ICEE 2019), 2019 : 1682 - 1686
  • [32] Artificial neural networks for human activity recognition using sensor based dataset
    Geravesh, Shahab
    Rupapara, Vaibhav
    MULTIMEDIA TOOLS AND APPLICATIONS, 2023, 82 (10) : 14815 - 14835
  • [34] Deformable Convolutional Networks for Multimodal Human Activity Recognition Using Wearable Sensors
    Xu, Shige
    Zhang, Lei
    Huang, Wenbo
    Wu, Hao
    Song, Aiguo
    IEEE TRANSACTIONS ON INSTRUMENTATION AND MEASUREMENT, 2022, 71
  • [35] Self-Attention Networks for Human Activity Recognition Using Wearable Devices
    Betancourt, Carlos
    Chen, Wen-Hui
    Kuan, Chi-Wei
    2020 IEEE INTERNATIONAL CONFERENCE ON SYSTEMS, MAN, AND CYBERNETICS (SMC), 2020 : 1194 - 1199
  • [36] Human Activity Recognition using Wearable Sensors by Deep Convolutional Neural Networks
    Jiang, Wenchao
    Yin, Zhaozheng
    MM'15: PROCEEDINGS OF THE 2015 ACM MULTIMEDIA CONFERENCE, 2015 : 1307 - 1310
  • [37] Human Activity Detection and Action Recognition in Videos Using Convolutional Neural Networks
    Basavaiah, Jagadeesh
    Patil, Chandrashekar Mohan
    JOURNAL OF INFORMATION AND COMMUNICATION TECHNOLOGY-MALAYSIA, 2020, 19 (02) : 157 - 183
  • [38] Human activity recognition with smartphone sensors using deep learning neural networks
    Ronao, Charissa Ann
    Cho, Sung-Bae
    EXPERT SYSTEMS WITH APPLICATIONS, 2016, 59 : 235 - 244
  • [39] Transfer Learning for Human Activity Recognition Using Representational Analysis of Neural Networks
    An S.
    Bhat G.
    Gumussoy S.
    Ogras U.
    ACM TRANSACTIONS ON COMPUTING FOR HEALTHCARE, 2023, 4 (01)
  • [40] Human locomotion activity recognition using spectral analysis and convolutional neural networks
    Amer, Ahmad
    Ji, Ze
    INTERNATIONAL JOURNAL OF MANUFACTURING RESEARCH, 2021, 16 (04) : 350 - 364