Analysing Action and Intention Recognition in Human-Robot Interaction with ANEMONE

Cited by: 2
Authors
Alenljung, Beatrice [1 ]
Lindblom, Jessica [1 ]
Affiliations
[1] Univ Skovde, Skovde, Sweden
Keywords
Human-Robot Interaction; Human-Robot Collaboration; User-centered; Evaluation; Action Recognition; Intention Recognition; Activity Theory; Seven Stages of Action Model; User Experience (UX);
DOI
10.1007/978-3-030-78465-2_14
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
ANEMONE is a methodological approach for user experience (UX) evaluation of action and intention recognition in human-robot interaction. It uses activity theory as its theoretical lens, combined with the seven stages of action model and UX evaluation methodology. ANEMONE has been applied in a case study evaluating a prototype: an assembly workstation in manufacturing consisting of a collaborative robot, a pallet, a tablet, and a workbench, where one operator works in the same physical space as one robot. The purpose of this paper is to provide guidance on how to use ANEMONE, with a particular focus on the data analysis, by describing a real example together with lessons learned and recommendations.
Pages: 181-200
Page count: 20
Related Papers
50 records in total
  • [1] The ANEMONE: Theoretical Foundations for UX Evaluation of Action and Intention Recognition in Human-Robot Interaction
    Lindblom, Jessica
    Alenljung, Beatrice
    [J]. SENSORS, 2020, 20 (15) : 1 - 49
  • [2] Interaction Intention Recognition via Human Emotion for Human-Robot Natural Interaction
    Yang, Shengtian
    Guan, Yisheng
    Li, Yihui
    Shi, Wenjing
    [J]. 2022 IEEE/ASME INTERNATIONAL CONFERENCE ON ADVANCED INTELLIGENT MECHATRONICS (AIM), 2022, : 380 - 385
  • [3] Learning Multimodal Confidence for Intention Recognition in Human-Robot Interaction
    Zhao, Xiyuan
    Li, Huijun
    Miao, Tianyuan
    Zhu, Xianyi
    Wei, Zhikai
    Tan, Lifen
    Song, Aiguo
    [J]. IEEE ROBOTICS AND AUTOMATION LETTERS, 2024, 9 (09) : 7819 - 7826
  • [4] Multimodal Uncertainty Reduction for Intention Recognition in Human-Robot Interaction
    Trick, Susanne
    Koert, Dorothea
    Peters, Jan
    Rothkopf, Constantin A.
    [J]. 2019 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS), 2019, : 7009 - 7016
  • [5] MULTIMODAL HUMAN ACTION RECOGNITION IN ASSISTIVE HUMAN-ROBOT INTERACTION
    Rodomagoulakis, I.
    Kardaris, N.
    Pitsikalis, V.
    Mavroudi, E.
    Katsamanis, A.
    Tsiami, A.
    Maragos, P.
    [J]. 2016 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING PROCEEDINGS, 2016, : 2702 - 2706
  • [6] Human-robot collaborative interaction with human perception and action recognition
    Yu, Xinyi
    Zhang, Xin
    Xu, Chengjun
    Ou, Linlin
    [J]. NEUROCOMPUTING, 2024, 563
  • [7] Natural Grasp Intention Recognition Based on Gaze in Human-Robot Interaction
    Yang, Bo
    Huang, Jian
    Chen, Xinxing
    Li, Xiaolong
    Hasegawa, Yasuhisa
    [J]. IEEE JOURNAL OF BIOMEDICAL AND HEALTH INFORMATICS, 2023, 27 (04) : 2059 - 2070
  • [8] Human-robot Interaction Method Combining Human Pose Estimation and Motion Intention Recognition
    Cheng, Yalin
    Yi, Pengfei
    Liu, Rui
    Dong, Jing
    Zhou, Dongsheng
    Zhang, Qiang
    [J]. PROCEEDINGS OF THE 2021 IEEE 24TH INTERNATIONAL CONFERENCE ON COMPUTER SUPPORTED COOPERATIVE WORK IN DESIGN (CSCWD), 2021, : 958 - 963
  • [9] A Method of Intention Estimation for Human-Robot Interaction
    Luo, Jing
    Liu, Chao
    Wang, Ning
    Yang, Chenguang
    [J]. ADVANCES IN COMPUTATIONAL INTELLIGENCE SYSTEMS (UKCI 2019), 2020, 1043 : 69 - 80
  • [10] Human-Robot Interaction in an Unknown Human Intention scenario
    Awais, Muhammad
    Henrich, Dominik
    [J]. 2013 11TH INTERNATIONAL CONFERENCE ON FRONTIERS OF INFORMATION TECHNOLOGY (FIT), 2013, : 89 - 94