Machine Learning Model Comparisons of User Independent & Dependent Intent Recognition Systems for Powered Prostheses

Cited by: 31
Authors
Bhakta, Krishan [1 ,2 ]
Camargo, Jonathan [1 ,2 ,3 ]
Donovan, Luke [1 ,4 ]
Herrin, Kinsey [1 ,2 ,3 ]
Young, Aaron [1 ,2 ,3 ]
Affiliations
[1] Exoskeleton & Intelligent Controls EPIC Lab, Atlanta, GA 30332 USA
[2] Georgia Inst Technol, Woodruff Sch Mech Engn, Atlanta, GA 30332 USA
[3] Georgia Inst Technol, Inst Robot & Intelligent Machines, Atlanta, GA 30332 USA
[4] Georgia Inst Technol, Sch Elect & Comp Engn, Atlanta, GA 30332 USA
Keywords
Prosthetics and exoskeletons; wearable robots; human performance augmentation; mode classification; transfemoral amputation; TRANSFEMORAL AMPUTEES; CLASSIFICATION METHOD; FEATURE-EXTRACTION; INTACT LIMB; AMBULATION; WALKING; GAIT; LEG; DESIGN
DOI
10.1109/LRA.2020.3007480
Chinese Library Classification (CLC)
TP24 [Robotics];
Discipline Codes
080202; 1405
Abstract
Developing intelligent prosthetic controllers that recognize user intent across users is a challenge. Machine learning algorithms present an opportunity to develop methods for predicting a user's locomotion mode. Currently, linear discriminant analysis (LDA) is the standard state-of-the-art solution for subject dependent models and has also been used in the development of subject independent applications. However, the performance of subject independent models differs radically from that of their subject dependent counterparts. Furthermore, most studies limit their evaluation to a fixed terrain with a single stair height and ramp inclination. In this study, we investigated the use of the XGBoost algorithm for developing a subject independent model across 8 individuals with transfemoral amputation. We evaluated the performance of XGBoost across different stair heights and ramp inclination angles and found that it generalizes well across the preset conditions. Our findings suggest that XGBoost offers a potential benefit for both subject independent and subject dependent algorithms, outperforming LDA and NN (DEP SS error: 2.93% +/- 0.49%, DEP TS error: 7.03% +/- 0.74%, IND SS error: 10.12% +/- 3.16%, and IND TS error: 15.78% +/- 2.39%) (p < 0.05). We also showed that, with the inclusion of extra sensors, model performance could be further improved in both user dependent and user independent models (p < 0.05). Our study provides valuable information for making future intent recognition systems more reliable across different users and common community ambulation modes.
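The abstract describes a subject-independent evaluation, which is typically implemented as leave-one-subject-out cross-validation: the model is trained on all subjects except one and tested on the held-out subject. The sketch below illustrates that protocol for an XGBoost classifier against an LDA baseline; the synthetic features, labels, hyperparameters, and error reporting are illustrative assumptions, not the authors' code or data.

```python
# Minimal sketch (not the authors' implementation): subject-independent (IND)
# evaluation of XGBoost vs. LDA for locomotion-mode classification via
# leave-one-subject-out cross-validation. All data shapes, feature contents,
# and hyperparameters are hypothetical placeholders.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import LeaveOneGroupOut
from xgboost import XGBClassifier

# Hypothetical dataset: one feature vector per gait event, a locomotion-mode
# label, and the ID of the subject each sample came from.
rng = np.random.default_rng(0)
n_samples, n_features, n_modes, n_subjects = 800, 40, 5, 8
X = rng.normal(size=(n_samples, n_features))             # mechanical-sensor features
y = rng.integers(0, n_modes, size=n_samples)             # mode labels (walk, ramp, stair, ...)
subjects = rng.integers(0, n_subjects, size=n_samples)   # subject ID per sample

models = {
    "LDA": LinearDiscriminantAnalysis(),
    "XGBoost": XGBClassifier(n_estimators=200, max_depth=4,
                             learning_rate=0.1, eval_metric="mlogloss"),
}

# Leave-one-subject-out: train on 7 subjects, test on the held-out subject,
# approximating how a subject-independent model sees a novel user.
logo = LeaveOneGroupOut()
for name, model in models.items():
    errors = []
    for train_idx, test_idx in logo.split(X, y, groups=subjects):
        model.fit(X[train_idx], y[train_idx])
        acc = model.score(X[test_idx], y[test_idx])
        errors.append(100.0 * (1.0 - acc))
    print(f"{name}: subject-independent error {np.mean(errors):.2f}% "
          f"+/- {np.std(errors):.2f}%")
```

A subject-dependent (DEP) analysis would instead split each subject's own data into train and test folds; the paper's SS/TS distinction (steady-state vs. transitional steps) would additionally require labeling each gait event by whether the mode changes at that step.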
Pages: 5393-5400
Page count: 8
Related Papers (50 total)
  • [1] A Classification Method for User-Independent Intent Recognition for Transfemoral Amputees Using Powered Lower Limb Prostheses
    Young, Aaron J.
    Hargrove, Levi J.
    IEEE TRANSACTIONS ON NEURAL SYSTEMS AND REHABILITATION ENGINEERING, 2016, 24 (02) : 217 - 225
  • [2] Preliminary Study of the Effect of User Intent Recognition Errors on Volitional Control of Powered Lower Limb Prostheses
    Zhang, Fan
    Liu, Ming
    Huang, He
    2012 ANNUAL INTERNATIONAL CONFERENCE OF THE IEEE ENGINEERING IN MEDICINE AND BIOLOGY SOCIETY (EMBC), 2012, : 2768 - 2771
  • [3] An Improved Extreme Learning Machine (ELM) Algorithm for Intent Recognition of Transfemoral Amputees With Powered Knee Prosthesis
    Zhang, Yao
    Wang, Xu
    Xiu, Haohua
    Chen, Wei
    Ma, Yongxin
    Wei, Guowu
    Ren, Lei
    Ren, Luquan
    IEEE TRANSACTIONS ON NEURAL SYSTEMS AND REHABILITATION ENGINEERING, 2024, 32 : 1757 - 1766
  • [4] User-Independent Intent Recognition for Lower Limb Prostheses Using Depth Sensing
    Massalin, Yerzhan
    Abdrakhmanova, Madina
    Varol, Huseyin Atakan
    IEEE TRANSACTIONS ON BIOMEDICAL ENGINEERING, 2018, 65 (08) : 1759 - 1770
  • [5] An Advanced User Intent Model Based On User Learning Process
    Zhang, Bo
    Qi, Xiaoxuan
    Han, Xiaowei
    INTERNATIONAL JOURNAL OF PATTERN RECOGNITION AND ARTIFICIAL INTELLIGENCE, 2020, 34 (09)
  • [6] Analysis of using EMG and mechanical sensors to enhance intent recognition in powered lower limb prostheses
    Young, A. J.
    Kuiken, T. A.
    Hargrove, L. J.
    JOURNAL OF NEURAL ENGINEERING, 2014, 11 (05)
  • [7] Machine Learning and Recognition of User Tasks for Malware Detection
    Alagrash, Yasamin
    Mohan, Nithasha
    Gollapalli, Sandhya Rani
    Rrushi, Julian
    2019 FIRST IEEE INTERNATIONAL CONFERENCE ON TRUST, PRIVACY AND SECURITY IN INTELLIGENT SYSTEMS AND APPLICATIONS (TPS-ISA 2019), 2019, : 73 - 81
  • [8] User-centric Activity Recognition and Prediction Model using Machine Learning Algorithms
    Roy, Namrata
    Ahmed, Rafiul
    Huq, Mohammad Rezwanul
    Shahriar, Mohammad Munem
    INTERNATIONAL JOURNAL OF ADVANCED COMPUTER SCIENCE AND APPLICATIONS, 2021, 12 (12) : 501 - 510
  • [9] Activity Recognition on Subject Independent Using Machine Learning
    Kee, Y. J.
    Zainudin, M. N. Shah
    Idris, M. I.
    Ramlee, R. H.
    Kamarudin, M. R.
    CYBERNETICS AND INFORMATION TECHNOLOGIES, 2020, 20 (03) : 64 - 74
  • [10] Neuromechanical and Environment Aware Machine Learning Tool for Human Locomotion Intent Recognition
    Carvalho, Simao
    Figueiredo, Joana
    Santos, Cristina P.
    XV MEDITERRANEAN CONFERENCE ON MEDICAL AND BIOLOGICAL ENGINEERING AND COMPUTING - MEDICON 2019, 2020, 76 : 2028 - 2042