Robot-Human Hand-Overs in Non-Anthropomorphic Robots

Cited: 0
|
Authors
Sivakumar, Prasanna Kumar [1 ]
Srinivas, Chittaranjan S. [1 ]
Kiselev, Andrey [2 ]
Loutfi, Amy [2 ]
Affiliations
[1] SASTRA Univ, Sch Mech Engn, Tanjore, India
[2] Univ Orebro, Ctr Appl Autonomous Sensor Syst, Orebro, Sweden
Keywords
Robot-human hand-over; human-robot interaction; spatial contrast; temporal contrast;
DOI
None available
Chinese Library Classification
TM [Electrical Technology]; TN [Electronics and Communication Technology];
Discipline Codes
0808 ; 0809 ;
Abstract
Robots that assist and interact with humans will inevitably need to hand over objects successfully. Whether delivering desired objects to elderly people living in their homes or handing tools to a worker in a factory, the process of robot hand-overs is one worthy of study within the human-robot interaction community. While object hand-overs have been examined in previous works [1], those works have mainly considered anthropomorphic robots, that is, robots that appear and move similarly to humans. However, recent trends in robotics, and in domestic robotics in particular, have witnessed an increase in non-anthropomorphic robotic platforms such as moving tables [2], teleconferencing robots [3], and vacuum cleaners. The study of robot hand-overs for non-anthropomorphic robots, and in particular of what constitutes a successful hand-over, is the focus of this paper. For the purpose of investigation, the TurtleBot(1), a moving-table-like device, is used in a home environment.
Pages: 227 / +
Page count: 2
Related Papers
50 records total
  • [1] Using Spatial and Temporal Contrast for Fluent Robot-Human Hand-overs
    Cakmak, Maya
    Srinivasa, Siddhartha S.
    Lee, Min Kyung
    Kiesler, Sara
    Forlizzi, Jodi
    PROCEEDINGS OF THE 6TH ACM/IEEE INTERNATIONAL CONFERENCE ON HUMAN-ROBOT INTERACTIONS (HRI 2011), 2011, : 489 - 496
  • [2] Goldfinger: a non-anthropomorphic, dextrous robot hand
    Ramos, Ann M.
    Gravagne, Ian A.
    Walker, Ian D.
    Proceedings - IEEE International Conference on Robotics and Automation, 1999, 2 : 913 - 919
  • [4] A virtual circle method for kinematic mapping from human hand to a non-anthropomorphic robot hand
    Wang, H
    Low, KH
    Gong, F
    Wang, MY
    2004 8TH INTERNATIONAL CONFERENCE ON CONTROL, AUTOMATION, ROBOTICS AND VISION, VOLS 1-3, 2004, : 1297 - 1302
  • [5] DELTA BASED NON-ANTHROPOMORPHIC HAND
    Kumar, Rajesh
    Kansal, Sachin
    Mukherjee, Sudipto
    PROCEEDINGS OF ASME 2021 INTERNATIONAL DESIGN ENGINEERING TECHNICAL CONFERENCES AND COMPUTERS AND INFORMATION IN ENGINEERING CONFERENCE, IDETC-CIE2021, VOL 8B, 2021,
  • [6] The Interplay of Context and Emotion for Non-Anthropomorphic Robots
    Wiltgen, Bryan
    Beer, Jenay M.
    McGreggor, Keith
    Jiang, Karl
    Thomaz, Andrea
    2010 IEEE RO-MAN, 2010, : 658 - 663
  • [7] Robo Toons: Testing the Use of Animation Principles in Non-anthropomorphic Robots to Improve Human-robot Interaction
    Schmitt, B. J.
    Prahl, A.
    Ho, A. L.
    2021 IEEE INTERNATIONAL CONFERENCE ON INDUSTRIAL ENGINEERING AND ENGINEERING MANAGEMENT (IEEE IEEM21), 2021, : 1313 - 1317
  • [8] Non-anthropomorphic robots as social entities on a neurophysiological level
    Hoenen, Matthias
    Luebke, Katrin T.
    Pause, Bettina M.
    COMPUTERS IN HUMAN BEHAVIOR, 2016, 57 : 182 - 186
  • [9] Learning Grasping Strategies for a Soft Non-Anthropomorphic Hand from Human Demonstrations
    Turco, Enrico
    Bo, Valerio
    Tavassoli, Mehrdad
    Pozzi, Maria
    Prattichizzo, Domenico
    2022 31ST IEEE INTERNATIONAL CONFERENCE ON ROBOT AND HUMAN INTERACTIVE COMMUNICATION (IEEE RO-MAN 2022), 2022, : 934 - 941
  • [10] Survey of Factors for the Prediction of Human Comfort with a Non-anthropomorphic Robot in Public Spaces
    May, David C.
    Holler, Kristie J.
    Bethel, Cindy L.
    Strawderman, Lesley
    Carruth, Daniel W.
    Usher, John M.
    INTERNATIONAL JOURNAL OF SOCIAL ROBOTICS, 2017, 9 : 165 - 180