Single robot - Multiple human interaction via intelligent user interfaces

Cited by: 6
Authors
Kumar, Santosh [2 ]
Sekmen, Ali [1 ]
Affiliations
[1] Tennessee State Univ, Dept Comp Sci, Nashville, TN 37209 USA
[2] Tennessee State Univ, Dept Elect & Comp Engn, Nashville, TN 37209 USA
Keywords
human-robot interaction; mobile robots; navigation; intelligent user interfaces;
DOI
10.1016/j.knosys.2008.03.008
CLC number
TP18 [Artificial intelligence theory]
Discipline classification codes
081104; 0812; 0835; 1405
Abstract
This project addresses research issues in the design of intelligent user interfaces for improving human-robot interaction. In some critical applications, users interact with robots via Graphical User Interfaces (GUIs), which usually contain standard components intended to serve a large number of users. Some of these interface components may be redundant, and sometimes confusing, for particular users depending on their preferences, capabilities, and the context in which the robots are used. This paper describes an adaptive system that enables a mobile robot to learn its users' preferences and capabilities so that it can offer each user a dynamic and efficient GUI rather than a standard GUI for all users. The system predicts users' future actions by generating models based on their previous interactions with the robot. The system was implemented and evaluated on a Pioneer 3-AT mobile robot. About 20 participants, who were assessed on spatial ability, directed the robot in simple spatial navigation tasks to evaluate the effectiveness of the adaptive interface. Time to complete the task, the number of steps, and the number of errors were collected. The results showed that although spatial reasoning ability plays an important role in mobile robot navigation, it matters less when the robot is controlled through the adaptive interface than through the non-adaptive one. (c) 2008 Elsevier B.V. All rights reserved.
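As a rough illustration of the adaptive-interface idea summarized in the abstract (a minimal sketch only, not the authors' actual implementation; the class name, command set, and min_share threshold are assumptions for this example), the Python snippet below builds a per-user frequency/bigram model from logged interactions, predicts the user's likely next command, and hides GUI components the user rarely invokes.

# Minimal sketch (hypothetical) of a per-user adaptive GUI model:
# learn command frequencies from past interactions, predict the next
# action, and prune rarely used interface components for that user.

from collections import Counter, defaultdict

class AdaptiveInterfaceModel:
    """Per-user model built from previous interactions with the robot."""

    def __init__(self, all_commands, min_share=0.05):
        self.all_commands = list(all_commands)
        self.min_share = min_share              # hide commands used less often than this share
        self.unigram = defaultdict(Counter)     # user -> command counts
        self.bigram = defaultdict(Counter)      # user -> (prev, next) transition counts

    def record(self, user, prev_command, command):
        """Log one interaction step for the given user."""
        self.unigram[user][command] += 1
        if prev_command is not None:
            self.bigram[user][(prev_command, command)] += 1

    def predict_next(self, user, prev_command):
        """Return the most likely next command after prev_command."""
        candidates = {c: n for (p, c), n in self.bigram[user].items() if p == prev_command}
        if candidates:
            return max(candidates, key=candidates.get)
        counts = self.unigram[user]
        return max(counts, key=counts.get) if counts else None

    def visible_commands(self, user):
        """Commands frequent enough for this user to keep on the GUI."""
        counts = self.unigram[user]
        total = sum(counts.values())
        if total == 0:
            return self.all_commands            # no history yet: show the standard GUI
        return [c for c in self.all_commands if counts[c] / total >= self.min_share]

if __name__ == "__main__":
    model = AdaptiveInterfaceModel(["forward", "back", "left", "right", "stop"])
    history = [None, "forward", "left", "forward", "forward", "stop"]
    for prev, cur in zip(history, history[1:]):
        model.record("user1", prev, cur)
    print(model.predict_next("user1", "forward"))   # most frequent follower of "forward"
    print(model.visible_commands("user1"))          # rarely used commands are pruned

In this toy setup the GUI would be regenerated for each user from visible_commands, while predict_next could pre-select or highlight the expected next control; the paper's actual prediction models are not specified here.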
Pages: 458-465
Page count: 8
Related papers
50 records in total
  • [1] Intelligent User Interface for Human-Robot Interaction
    Song, T. H.
    Park, J. H.
    Chung, S. M.
    Kwon, K. H.
    Jeon, J. W.
    2008 6TH IEEE INTERNATIONAL CONFERENCE ON INDUSTRIAL INFORMATICS, VOLS 1-3, 2008, : 1393 - 1398
  • [2] Human-robot interaction via voice-controllable intelligent user interface
    Medicherla, Harsha
    Sekmen, Ali
    ROBOTICA, 2007, 25 (05) : 521 - 527
  • [3] USER INTERFACES FOR LEARNING REDUNDANCY RESOLUTION IN HUMAN ROBOT INTERACTION
    Liu, Weihui
    Emmerich, Christian
    Zhang, Lizhi
    Steil, Jochen
    FOURTH IAA CONFERENCE ON DYNAMICS AND CONTROL OF SPACE SYSTEMS 2018, PTS I-III, 2018, 165 : 797 - 810
  • [4] Interfaces for human and robot interaction
    Bejczy, AK
    PROCEEDINGS OF THE 11TH INTERNATIONAL CONFERENCE ON ADVANCED ROBOTICS 2003, VOL 1-3, 2003, : 1568 - 1575
  • [5] Comprehensive architecture for intelligent adaptive interface in the field of single-human multiple-robot interaction
    Ilbeygi, Mahdi
    Kangavari, Mohammad Reza
    ETRI JOURNAL, 2018, 40 (04) : 483 - 498
  • [6] Exploring the Use of Tangible User Interfaces for Human-Robot Interaction: A Comparative Study
    Guo, Cheng
    Sharlin, Ehud
    CHI 2008: 26TH ANNUAL CHI CONFERENCE ON HUMAN FACTORS IN COMPUTING SYSTEMS VOLS 1 AND 2, CONFERENCE PROCEEDINGS, 2008, : 121 - 130
  • [7] Intelligent user interfaces
    Puerta, AR
    KNOWLEDGE-BASED SYSTEMS, 1998, 10 (05) : 263 - 264
  • [8] Intelligent Drumming Robot for Human Interaction
    Sui, Linghui
    Su, Yukun
    Yi, Yang
    Li, Zecong
    Zhu, Jinhui
    2020 INTERNATIONAL SYMPOSIUM ON AUTONOMOUS SYSTEMS (ISAS), 2020, : 168 - 173
  • [9] Multiple robot single human interaction: effects on perceived workload
    Adams, J. A.
    BEHAVIOUR & INFORMATION TECHNOLOGY, 2009, 28 (02) : 183 - 198
  • [10] Emotionally Intelligent Conversational User Interfaces: Bridging Empathy and Technology in Human-Computer Interaction
    Votintseva, Anjelika
    Johnson, Rebecca
    Villa, Iva
    HUMAN-COMPUTER INTERACTION, PT I, HCI 2024, 2024, 14684 : 404 - 422