Thumbs up, thumbs down: non-verbal human-robot interaction through real-time EMG classification via inductive and supervised transductive transfer learning

Cited by: 0
Authors
Jhonatan Kobylarz
Jordan J. Bird
Diego R. Faria
Eduardo Parente Ribeiro
Anikó Ekárt
Institutions
[1] Federal University of Parana,Department of Electrical Engineering
[2] Aston University,School of Engineering and Applied Science
Keywords
Gesture classification; Human-robot interaction; Electromyography; Machine learning; Transfer learning; Inductive transfer learning; Supervised transductive transfer learning; Myo armband; Pepper robot
DOI
None available
Abstract
In this study, we present a transfer learning method for gesture classification via an inductive and supervised transductive approach with an electromyographic dataset gathered via the Myo armband. A ternary gesture classification problem is presented by the states 'thumbs up', 'thumbs down', and 'relax', in order to communicate in the affirmative or negative in a non-verbal fashion to a machine. Of the nine statistical learning paradigms benchmarked over 10-fold cross-validation (with three methods of feature selection), an ensemble of Random Forest and Support Vector Machine through voting achieves the best score of 91.74% with a rule-based feature selection method. When new subjects are considered, this machine learning approach fails to generalise to new data, and thus the processes of Inductive and Supervised Transductive Transfer Learning are introduced with a short calibration exercise (15 s). Without transfer, 5 s of data per class proves strongest for classification (compared with one to seven seconds), yet achieves only 55% accuracy; when a short 5 s per-class calibration task is introduced via the suggested transfer method, a Random Forest can then classify unseen data from the calibrated subject at an accuracy of around 97%, outperforming the 83% accuracy of the proprietary Myo system. Finally, a preliminary application is presented through social interaction with a humanoid Pepper robot, where the use of our approach and a most-common-class metaclassifier achieves 100% accuracy for all trials of a '20 Questions' game.
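The best-scoring model in the abstract is a voting ensemble of a Random Forest and a Support Vector Machine evaluated under 10-fold cross-validation. A minimal sketch of that ensemble in scikit-learn is shown below; this is not the authors' exact pipeline, and the synthetic three-class dataset is a placeholder for the statistical features the study extracts from Myo armband EMG signals.

```python
# Minimal sketch (assumed details, not the authors' pipeline): a soft-voting
# ensemble of Random Forest and SVM over 10-fold cross-validation, on
# synthetic three-class data standing in for EMG-derived features.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

# Synthetic stand-in for per-window statistical features of 8-channel EMG.
X, y = make_classification(n_samples=600, n_features=40, n_informative=20,
                           n_classes=3, random_state=0)

ensemble = VotingClassifier(
    estimators=[
        ("rf", RandomForestClassifier(n_estimators=100, random_state=0)),
        ("svm", SVC(probability=True, random_state=0)),  # probability=True enables soft voting
    ],
    voting="soft",  # average class probabilities across the two models
)

scores = cross_val_score(ensemble, X, y, cv=10)  # 10-fold CV, as in the paper
print(f"Mean 10-fold accuracy: {scores.mean():.3f}")
```

The paper does not state whether hard or soft voting was used; soft voting is shown here because it lets the SVM's calibrated probabilities contribute rather than a bare class label.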
Pages: 6021-6031
Page count: 10
Related papers
4 papers in total
  • [1] Thumbs up, thumbs down: non-verbal human-robot interaction through real-time EMG classification via inductive and supervised transductive transfer learning
    Kobylarz, Jhonatan
    Bird, Jordan J.
    Faria, Diego R.
    Ribeiro, Eduardo Parente
    Ekart, Aniko
    JOURNAL OF AMBIENT INTELLIGENCE AND HUMANIZED COMPUTING, 2020, 11 (12) : 6021 - 6031
  • [2] Human-robot non-verbal interaction empowered by real-time auditory and visual multiple-talker tracking
    Okuno, HG
    Nakadai, K
    Hidai, KI
    Mizoguchi, H
    Kitano, H
    ADVANCED ROBOTICS, 2003, 17 (02) : 115 - 130
  • [3] Federated Learning-Enhanced Edge Deep Learning Model for EMG-Based Gesture Recognition in Real-Time Human-Robot Interaction
    Zafar, Muhammad Hamza
    Moosavi, Syed Kumayl Raza
    Sanfilippo, Filippo
    IEEE SENSORS JOURNAL, 2025, 25 (05) : 9139 - 9151
  • [4] Real-time, automatic digi-tailor mannequin robot adjustment based on human body classification through supervised learning
    Daneshmand, Morteza
    Abels, Artur
    Anbarjafari, Gholamreza
    INTERNATIONAL JOURNAL OF ADVANCED ROBOTIC SYSTEMS, 2017, 14 (03):