A role of multi-modal rhythms in physical interaction and cooperation

Cited by: 0
Authors
Kenta Yonekura
Chyon Hae Kim
Kazuhiro Nakadai
Hiroshi Tsujino
Shigeki Sugano
Affiliations
[1] Tsukuba Univ., Dept. of Intelligent Interaction Technologies
[2] Honda Research Institute Japan Co., Ltd.
[3] Waseda Univ., School of Creative Science and Engineering
Keywords
Test Subject; Motion Capture System; Practice Phase; Remote Controller; Pitch Direction
DOI: not available
Abstract
As fundamental research for human-robot interaction, this paper investigates the rhythmic reference a human uses while turning a rope with another human. We hypothesized that, when interpreting rhythm cues to form a rhythm reference, humans rely on auditory and force rhythms more than on visual ones. Test subjects were 21-23 years old. We masked each subject's perception using three kinds of masks: an eye mask, headphones, and a force mask. The force mask consists of a robot arm and a remote controller, which allow a subject to turn a rope without feeling the force from it. In the first experiment, each subject interacted with an operator who turned a rope at a constant rhythm; eight trials were conducted per subject, one for each combination of the three masks. We measured the force between each subject/the operator and the rope, computed the angular velocities of the force directions, and evaluated the error between them. In the second experiment, two subjects interacted with each other while an auditory rhythm of 1.6-2.4 Hz was presented through headphones to indicate the target turning frequency; in addition, the subjects wore eye masks. The first experiment showed that visual rhythm has little influence on rope-turning cooperation between humans. The second experiment provided firmer evidence for the same hypothesis, as the subjects neglected visual rhythms even when they were available.
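The error measure described above, comparing the angular velocities of the two force directions, can be sketched in a few lines. The following is a minimal illustration, not the authors' actual analysis code: it assumes planar force samples at a fixed sampling rate, and the function name, synthetic signals, and RMS summary are all hypothetical.

    import numpy as np

    def angular_velocity(fx, fy, dt):
        # Force-direction angle in the rope's rotation plane, unwrapped so it
        # grows continuously across full turns, then differentiated in time.
        theta = np.unwrap(np.arctan2(fy, fx))
        return np.gradient(theta, dt)  # rad/s

    # Hypothetical synthetic data: two 2 Hz rotations sampled at 100 Hz,
    # the second lagging the first by a constant phase offset.
    dt = 0.01
    t = np.arange(0.0, 5.0, dt)
    subj_fx, subj_fy = np.cos(2 * np.pi * 2.0 * t), np.sin(2 * np.pi * 2.0 * t)
    oper_fx, oper_fy = np.cos(2 * np.pi * 2.0 * t + 0.3), np.sin(2 * np.pi * 2.0 * t + 0.3)

    err = angular_velocity(subj_fx, subj_fy, dt) - angular_velocity(oper_fx, oper_fy, dt)
    print("RMS angular-velocity error [rad/s]:", np.sqrt(np.mean(err ** 2)))

With a constant phase offset the two angular velocities coincide, so the RMS error stays near zero; a drifting or jittering rhythm would show up directly in this measure.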
Related Papers (50 in total; 10 shown below)
  • [1] A role of multi-modal rhythms in physical interaction and cooperation
    Yonekura, Kenta
    Kim, Chyon Hae
    Nakadai, Kazuhiro
    Tsujino, Hiroshi
    Sugano, Shigeki
    EURASIP JOURNAL ON AUDIO SPEECH AND MUSIC PROCESSING, 2012: 1 - 8
  • [2] Prevention of accomplishing synchronous multi-modal human-robot cooperation by using visual rhythms
    Yonekura, Kenta
    Kim, Chyon Hae
    Nakadai, Kazuhiro
    Tsujino, Hiroshi
    Yokoi, Kazuhito
    ADVANCED ROBOTICS, 2015, 29 (14) : 901 - 912
  • [3] Multi-Modal Interaction Device
    Kim, Yul Hee
    Byeon, Sang-Kyu
    Kim, Yu-Joon
    Choi, Dong-Soo
    Kim, Sang-Youn
    INTERNATIONAL CONFERENCE ON MECHANICAL DESIGN, MANUFACTURE AND AUTOMATION ENGINEERING (MDMAE 2014), 2014, : 327 - 330
  • [4] Multi-modal interaction in biomedicine
    Zudilova, EV
    Sloot, PMA
    AMBIENT INTELLIGENCE FOR SCIENTIFIC DISCOVERY: FOUNDATIONS, THEORIES, AND SYSTEMS, 2005, 3345 : 184 - 201
  • [5] Pedestrian Detection Based on Multi-modal Cooperation
    Zhang, Yan-ning
    Tong, Xiao-min
    Zhang, Xiu-wei
    Zheng, Jiang-bin
    Zhou, Jun
    You, Si-wei
    2008 IEEE 10TH WORKSHOP ON MULTIMEDIA SIGNAL PROCESSING, VOLS 1 AND 2, 2008: 151+
  • [6] Physical Querying with Multi-Modal Sensing
    Baek, Iljoo
    Stine, Taylor
    Dash, Denver
    Xiao, Fanyi
    Sheikh, Yaser
    Movshovitz-Attias, Yair
    Chen, Mei
    Hebert, Martial
    Kanade, Takeo
    2014 IEEE WINTER CONFERENCE ON APPLICATIONS OF COMPUTER VISION (WACV), 2014, : 183 - 190
  • [7] Implementation of ActiveCube for multi-modal interaction
    Itoh, Y
    Kitamura, Y
    Kawai, M
    Kishino, F
    HUMAN-COMPUTER INTERACTION - INTERACT'01, 2001, : 682 - 683
  • [8] Multi-Modal Interaction for Robotics Mules
    Taylor, Glenn
    Quist, Michael
    Lanting, Matthew
    Dunham, Cory
    Muench, Paul
    UNMANNED SYSTEMS TECHNOLOGY XIX, 2017, 10195
  • [9] QUALITY OF EXPERIENCING MULTI-MODAL INTERACTION
    Weiss, Benjamin
    Moeller, Sebastian
    Wechsung, Ina
    Kuehnel, Christine
    SPOKEN DIALOGUE SYSTEMS: TECHNOLOGY AND DESIGN, 2011, : 213 - 230
  • [10] Multi-modal interaction for UAS control
    Taylor, Glenn
    Purman, Ben
    Schermerhorn, Paul
    Garcia-Sampedro, Guillermo
    Hubal, Robert
    Crabtree, Kathleen
    Rowe, Allen
    Spriggs, Sarah
    UNMANNED SYSTEMS TECHNOLOGY XVII, 2015, 9468