A holistic multi-source transfer learning approach using wearable sensors for personalized daily activity recognition

Citations: 1
Authors
Jia, Qi [2 ,3 ]
Guo, Jing [1 ,2 ]
Yang, Po [1 ,2 ,4 ]
Yang, Yun [1 ,2 ]
Affiliations
[1] Yunnan Univ, Sch Software, Kunming 650500, Yunnan, Peoples R China
[2] Yunnan Univ, Yunnan Key Lab Software Engn, Kunming 650500, Yunnan, Peoples R China
[3] Yunnan Univ, Sch Informat Sci & Engn, Kunming 650500, Yunnan, Peoples R China
[4] Univ Sheffield, Dept Comp Sci, Sheffield S1 4DP, S Yorkshire, England
Keywords
Human activity recognition; Personalized service; Transfer learning; Domain adaptation
DOI
10.1007/s40747-023-01218-w
Chinese Library Classification (CLC)
TP18 [Theory of artificial intelligence]
Discipline classification codes
081104; 0812; 0835; 1405
Abstract
Human activity recognition (HAR) aims to precisely identify specific actions from time series collected by wearable devices. However, traditional HAR methods ignore the activity variance among individuals, which causes poor generalization when a model is applied to a new individual and, in turn, makes personalized HAR services harder to provide. In this paper, we fully account for activity divergence among individuals and develop an end-to-end model, the multi-source unsupervised co-transfer network (MUCT), to provide personalized activity recognition for new individuals. We treat the data collected from different individuals as multiple domains and apply deep domain adaptation to align each pair of source and target domains. In addition, we propose a consistent filter that uses two heterogeneous classifiers to automatically select high-confidence instances from the target domain, jointly enhancing performance on the target task. The effectiveness and performance of our model are evaluated through comprehensive experiments on two activity recognition benchmarks and a private activity recognition data set (collected by our signal sensors), where our model outperforms traditional transfer learning methods at HAR.
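The "consistent filter" described in the abstract is, in essence, a co-training-style pseudo-labelling step: two heterogeneous classifiers label the unlabeled target domain, and only instances on which both agree with high confidence are kept as extra training data. The paper does not specify the classifier pair, so the sketch below stands in a hypothetical nearest-centroid model and a k-nearest-neighbour model; the function names, threshold, and toy data are illustrative assumptions, not the authors' implementation:

```python
import math
from collections import Counter

def centroid_proba(X_train, y_train, x):
    """Classifier A: nearest centroid. A softmax over negative
    centroid distances serves as a per-class confidence score."""
    classes = sorted(set(y_train))
    scores = {}
    for c in classes:
        pts = [p for p, lab in zip(X_train, y_train) if lab == c]
        centroid = [sum(col) / len(pts) for col in zip(*pts)]
        scores[c] = math.exp(-math.dist(x, centroid))
    total = sum(scores.values())
    return {c: s / total for c, s in scores.items()}

def knn_proba(X_train, y_train, x, k=3):
    """Classifier B: k-nearest neighbours. The vote fraction of each
    class among the k neighbours serves as its confidence score."""
    neighbours = sorted(zip(X_train, y_train),
                        key=lambda p: math.dist(x, p[0]))[:k]
    votes = Counter(lab for _, lab in neighbours)
    return {c: votes.get(c, 0) / k for c in set(y_train)}

def consistent_filter(X_src, y_src, X_tgt, threshold=0.8):
    """Keep target instances on which both heterogeneous classifiers
    agree with confidence >= threshold; return them with pseudo-labels."""
    selected, pseudo = [], []
    for x in X_tgt:
        pa = centroid_proba(X_src, y_src, x)
        pb = knn_proba(X_src, y_src, x)
        la = max(pa, key=pa.get)
        lb = max(pb, key=pb.get)
        if la == lb and pa[la] >= threshold and pb[lb] >= threshold:
            selected.append(x)
            pseudo.append(la)
    return selected, pseudo

# Toy 2-D feature clusters standing in for source-subject and
# target-subject sensor windows (two activity classes).
X_src = [(0.1, 0.0), (0.0, 0.2), (-0.1, 0.1),
         (4.0, 4.1), (4.2, 3.9), (3.9, 4.0)]
y_src = [0, 0, 0, 1, 1, 1]
X_tgt = [(0.2, 0.1), (3.8, 4.2), (2.0, 2.0)]  # last point is ambiguous
X_sel, y_pseudo = consistent_filter(X_src, y_src, X_tgt, threshold=0.8)
# The ambiguous midpoint fails the confidence check and is filtered out.
```

The selected `(X_sel, y_pseudo)` pairs would then be merged into training for the next round, which is how disagreement between heterogeneous views keeps noisy pseudo-labels out of the target task.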
Pages: 1459-1471
Number of pages: 13