A Multimodal Interface for Real-Time Soldier-Robot Teaming

Cited by: 6
Authors
Barber, Daniel J. [1 ]
Howard, Thomas M. [2 ]
Walter, Matthew R. [3 ]
Affiliations
[1] Univ Cent Florida, Orlando, FL 32816 USA
[2] Univ Rochester, Rochester, NY USA
[3] Toyota Technol Inst, Chicago, IL USA
Source
Keywords
Human-Robot Interaction; Multimodal Communication; Automated Speech Recognition; Natural Language Understanding
DOI
10.1117/12.2224401
CLC Classification Numbers
TM [Electrical Engineering]; TN [Electronic Technology, Communication Technology]
Subject Classification Codes
0808; 0809
Abstract
Recent research and advances in robotics have led to the development of novel platforms leveraging new sensing capabilities for semantic navigation. As these systems become increasingly robust, they support highly complex commands beyond direct teleoperation and waypoint finding, facilitating a transition away from robots as tools toward robots as teammates. Supporting future Soldier-Robot teaming requires communication capabilities on par with human-human teams for successful integration of robots. Therefore, as robots increase in functionality, it is equally important that the interface between the Soldier and robot advances as well. Multimodal communication (MMC) enables human-robot teaming through redundancy and levels of communication more robust than single-mode interaction. Commercial-off-the-shelf (COTS) technologies released in recent years for smartphones and gaming provide tools for the creation of portable interfaces incorporating MMC through the use of speech, gestures, and visual displays. However, for multimodal interfaces to be successfully used in the military domain, they must classify speech and gestures and process natural language in real time with high accuracy. For the present study, a prototype multimodal interface supporting real-time interactions with an autonomous robot was developed. This device integrated COTS Automated Speech Recognition (ASR), a custom gesture recognition glove, and natural language understanding on a tablet. This paper presents performance results (e.g., response times, accuracy) of the integrated device when commanding an autonomous robot to perform reconnaissance and surveillance activities in an unknown outdoor environment.
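
The abstract does not describe implementation details, so the following Python sketch is illustrative only: it shows one plausible way a multimodal interface of this kind could fuse redundant speech and gesture classifications into a single robot command, reflecting the redundancy benefit of MMC mentioned above. All names here (ModalityInput, Command, fuse_inputs, the agreement bonus, and the confidence threshold) are hypothetical assumptions for illustration, not the authors' implementation.

    # Hypothetical sketch: confidence-weighted fusion of speech and gesture
    # inputs into a single robot command. Not taken from the paper.
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class ModalityInput:
        label: str         # recognized command label, e.g. "halt" or "move_to"
        confidence: float  # classifier confidence in [0, 1]

    @dataclass
    class Command:
        label: str
        confidence: float
        source: str        # "speech", "gesture", or "fused"

    def fuse_inputs(speech: Optional[ModalityInput],
                    gesture: Optional[ModalityInput],
                    agreement_bonus: float = 0.15,
                    threshold: float = 0.6) -> Optional[Command]:
        """Return a command if either modality, or their agreement, is confident enough."""
        if speech and gesture and speech.label == gesture.label:
            # Redundant modalities agree: boost confidence and accept the command.
            conf = min(1.0, max(speech.confidence, gesture.confidence) + agreement_bonus)
            return Command(speech.label, conf, "fused")
        # Otherwise fall back to the single most confident modality.
        best = max((m for m in (speech, gesture) if m is not None),
                   key=lambda m: m.confidence, default=None)
        if best and best.confidence >= threshold:
            source = "speech" if best is speech else "gesture"
            return Command(best.label, best.confidence, source)
        return None  # neither modality is confident: ask the Soldier to repeat

    if __name__ == "__main__":
        # Speech and the gesture glove agree on "halt": accepted with boosted confidence.
        print(fuse_inputs(ModalityInput("halt", 0.72), ModalityInput("halt", 0.65)))

In this sketch, agreement between modalities raises confidence, while disagreement or low confidence falls back to the stronger single modality or to a request for repetition; the actual fusion strategy used in the study may differ.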
Pages: 12
Related Papers
50 records
  • [1] Design Approach for Investigating Multimodal Communication in Dismounted Soldier-Robot Interaction
    Barber, Daniel
    Bendell, Rhyse
    [J]. HCI INTERNATIONAL 2019 - LATE BREAKING PAPERS, HCII 2019, 2019, 11786 : 3 - 14
  • [2] eEVA as a Real-Time Multimodal Agent Human-Robot Interface
    Pena, P.
    Polceanu, M.
    Lisetti, C.
    Visser, U.
    [J]. ROBOT WORLD CUP XXII, ROBOCUP 2018, 2019, 11374 : 262 - 274
  • [3] A soldier-robot ad hoc network
    Luu, Brian B.
    O'Brien, Barry J.
    Baran, David G.
    Hardy, Rommie L.
    [J]. Fifth Annual IEEE International Conference on Pervasive Computing and Communications Workshops, Proceedings, 2007, : 558 - 563
  • [4] Real-time Framework for Multimodal Human-Robot Interaction
    Gast, Juergen
    Bannat, Alexander
    Rehrl, Tobias
    Wallhoff, Frank
    Rigoll, Gerhard
    Wendt, Cornelia
    Schmidt, Sabrina
    Popp, Michael
    Faerber, Berthold
    [J]. HSI: 2009 2ND CONFERENCE ON HUMAN SYSTEM INTERACTIONS, 2009, : 273 - 280
  • [5] Real-Time Hand Gesture Recognition for Robot Hand Interface
    Lv, Xiaomeng
    Xu, Yulin
    Wang, Ming
    [J]. LIFE SYSTEM MODELING AND SIMULATION, 2014, 461 : 209 - 214
  • [6] Real-time teaming of multiple reconfigurable manufacturing systems
    Li, Xingyu
    Bayrak, Alparslan Emrah
    Epureanu, Bogdan, I
    Koren, Yoram
    [J]. CIRP ANNALS-MANUFACTURING TECHNOLOGY, 2018, 67 (01) : 437 - 440
  • [7] Technological Evaluation of Gesture and Speech Interfaces for Enabling Dismounted Soldier-Robot Dialogue
    Kattoju, Ravi Kiran
    Barber, Daniel J.
    Abich, Julian
    Harris, Jonathan
    [J]. UNMANNED SYSTEMS TECHNOLOGY XVIII, 2016, 9837
  • [8] Real-time arm motion imitation for human–robot tangible interface
    Yukyung Choi
    SyungKwon Ra
    Soowhan Kim
    Sung-Kee Park
    [J]. Intelligent Service Robotics, 2009, 2 : 61 - 69
  • [9] Multimodal Chemosensor-Based, Real-Time Biomaterial/Cell Interface Monitoring
    Kubon, Massimo
    Hartmann, Hanna
    Moschallski, Meike
    Burkhardt, Claus
    Link, Gorden
    Werner, Simon
    Lavalle, Philippe
    Urban, Gerald
    Vrana, Nihal Engin
    Stelzle, Martin
    [J]. ADVANCED BIOSYSTEMS, 2018, 2 (06)
  • [10] An EEG/EMG/EOG-Based Multimodal Human-Machine Interface to Real-Time Control of a Soft Robot Hand
    Zhang, Jinhua
    Wang, Baozeng
    Zhang, Cheng
    Xiao, Yanqing
    Wang, Michael Yu
    [J]. FRONTIERS IN NEUROROBOTICS, 2019, 13