Touch and hand gesture-based interactions for directly manipulating 3D virtual objects in mobile augmented reality

Cited by: 41
Authors
Kim, Minseok [1 ]
Lee, Jae Yeol [1 ]
Affiliations
[1] Chonnam Natl Univ, Dept Ind Engn, 300 Yongbong Dong, Kwangju 500757, South Korea
Keywords
Natural interaction; Hand gesture; Mobile augmented reality; 3D AR object manipulation;
DOI
10.1007/s11042-016-3355-9
Chinese Library Classification
TP [Automation technology, computer technology];
Discipline Classification Code
0812;
Abstract
Mobile augmented reality (AR) has been widely used in smart and mobile device-based applications such as entertainment, games, visual experience, and information visualization. However, most mobile AR applications have limitations in natural user interaction and do not fully support the direct manipulation of 3D AR objects. This paper proposes a new method for naturally and directly manipulating 3D AR objects through touch and hand gesture-based interactions on handheld devices. The touch gesture is used for AR object selection, and the natural hand gesture is used for direct and interactive manipulation of the selected objects. This hybrid interaction enables the user to interact with and manipulate AR objects more accurately in the real 3D space rather than in the 2D space. In particular, natural hand gestures are detected by a Leap Motion sensor attached to the front or back of the mobile device, so the user can easily interact with 3D AR objects and apply 3D transformations, enhancing usability and usefulness. In this research, comprehensive comparative analyses were performed among the proposed approach, the widely used screen touch-based approach, and a vision-based approach, in both quantitative and qualitative terms. Quantitative analysis was conducted by measuring task completion time and failure rate for given tasks such as 3D object matching and a grasp-hang-release operation; both tasks require simultaneous 3D translation and 3D rotation. In addition, we compared gesture performance depending on whether the gesture sensor is located on the front or the back of the mobile device. Furthermore, to support other complex operations, an assembly task was also evaluated; it consists of a sequence of steps combining parts into a sub-assembly. Qualitative analysis was performed through a post-experiment questionnaire that examines factors such as ease of use and ease of natural interaction. Both analyses showed that the proposed approach can provide more natural and intuitive interaction with and manipulation of mobile AR objects. Several implementation results are also given to show the advantages and effectiveness of the proposed approach.
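As a rough illustration of the hybrid interaction described in the abstract, the sketch below shows how a screen touch might select an AR object and how subsequent palm-pose deltas from a gesture sensor (such as Leap Motion) could drive its 3D translation and rotation. This is a minimal sketch under stated assumptions, not the authors' implementation; the class and function names, the distance-based selection heuristic, and the pose-delta mapping are all illustrative.

```python
# Minimal sketch of the hybrid touch + hand-gesture pipeline: a screen touch
# selects a 3D AR object, and frame-to-frame palm pose deltas (as a gesture
# sensor might report them) drive its 3D translation and rotation.
# All names and heuristics here are illustrative assumptions.
from dataclasses import dataclass
from typing import List, Optional, Tuple

Vec3 = Tuple[float, float, float]


@dataclass
class ARObject:
    name: str
    position: Vec3 = (0.0, 0.0, 0.0)
    rotation_euler: Vec3 = (0.0, 0.0, 0.0)          # degrees, XYZ order
    screen_xy: Tuple[float, float] = (0.0, 0.0)     # projected 2D position (assumed given)


def select_object(touch_xy: Tuple[float, float],
                  objects: List[ARObject],
                  radius_px: float = 60.0) -> Optional[ARObject]:
    """Touch-based selection: pick the object whose screen projection is
    closest to the touch point, within a pixel radius (hypothetical heuristic)."""
    best, best_d = None, radius_px
    for obj in objects:
        d = ((obj.screen_xy[0] - touch_xy[0]) ** 2 +
             (obj.screen_xy[1] - touch_xy[1]) ** 2) ** 0.5
        if d < best_d:
            best, best_d = obj, d
    return best


def apply_hand_gesture(obj: ARObject,
                       palm_delta_pos: Vec3,
                       palm_delta_rot: Vec3) -> None:
    """Hand-gesture manipulation: map the change in palm position and
    orientation to the selected object's 3D translation and rotation,
    allowing simultaneous transform updates."""
    obj.position = tuple(p + d for p, d in zip(obj.position, palm_delta_pos))
    obj.rotation_euler = tuple(r + d for r, d in zip(obj.rotation_euler, palm_delta_rot))


if __name__ == "__main__":
    scene = [ARObject("cube", screen_xy=(120, 200)),
             ARObject("gear", screen_xy=(400, 310))]
    selected = select_object((410, 300), scene)      # user touches near the gear
    if selected:
        # one frame of hand motion: move slightly and rotate 15 degrees about Y
        apply_hand_gesture(selected, (0.02, 0.0, -0.05), (0.0, 15.0, 0.0))
        print(selected)
```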
Pages: 16529-16550
Page count: 22
Related Papers
50 records in total
  • [21] Vision based 3D Gesture Tracking using Augmented Reality and Virtual Reality for Improved Learning Applications
    Mahayuddin, Zainal Rasyid
    Saif, A. F. M. Saifuddin
    [J]. INTERNATIONAL JOURNAL OF ADVANCED COMPUTER SCIENCE AND APPLICATIONS, 2021, 12 (12) : 631 - 638
  • [22] Hand Gesture User Interface for Transforming Objects in 3D Virtual Space
    Jeong, Ji-Seong
    Park, Chan
    Yoo, Kwan-Hee
    [J]. MULTIMEDIA, COMPUTER GRAPHICS AND BROADCASTING, PT I, 2011, 262 : 172 - +
  • [23] Mobile Augmented Reality based 3D Snapshots
    Keitler, Peter
    Pankratz, Frieder
    Schwerdtfeger, Bjoern
    Pustka, Daniel
    Roediger, Wolf
    Klinker, Gudrun
    Rauch, Christian
    Chathoth, Anup
    Collomosse, John
    Song, Yi-Zhe
    [J]. 2009 8TH IEEE INTERNATIONAL SYMPOSIUM ON MIXED AND AUGMENTED REALITY - SCIENCE AND TECHNOLOGY, 2009, : 199 - +
  • [24] Virtual Object Manipulation by Combining Touch and Head Interactions for Mobile Augmented Reality
    Oh, Ju Young
    Park, Hyung
    Park, Jung-Min
    [J]. APPLIED SCIENCES-BASEL, 2019, 9 (14):
  • [25] Hand Gesture-Based Virtual Reality Training Simulator for Collaboration Rescue of a Railway Accident
    Xu, Jianxi
    Tang, Zhao
    Zhao, Huiwen
    Zhang, Jianjun
    [J]. INTERACTING WITH COMPUTERS, 2019, 31 (06) : 577 - 588
  • [26] CAD-based 3D objects recognition in monocular images for mobile augmented reality
    Han, Pengfei
    Zhao, Gang
    [J]. COMPUTERS & GRAPHICS-UK, 2015, 50 : 36 - 46
  • [27] MoSART: Mobile Spatial Augmented Reality for 3D Interaction With Tangible Objects
    Cortes, Guillaume
    Marchand, Eric
    Brincin, Guillaume
    Lecuyer, Anatole
    [J]. FRONTIERS IN ROBOTICS AND AI, 2018, 5
  • [28] [Poster] Touch Gestures for Improved 3D Object Manipulation in Mobile Augmented Reality
    Tiefenbacher, Philipp
    Pflaum, Andreas
    Rigoll, Gerhard
    [J]. 2014 IEEE INTERNATIONAL SYMPOSIUM ON MIXED AND AUGMENTED REALITY (ISMAR) - SCIENCE AND TECHNOLOGY, 2014, : 315 - 316
  • [29] Studying gesture-based interaction on a mobile augmented reality application for co-design activity
    Gul, Leman Figen
    [J]. JOURNAL ON MULTIMODAL USER INTERFACES, 2018, 12 (02) : 109 - 124
  • [30] COMPARISON BETWEEN 2D AND 3D HAND GESTURE INTERACTION FOR AUGMENTED REALITY APPLICATIONS
    Radkowski, Rafael
    Stritzke, Christian
    [J]. PROCEEDINGS OF THE ASME INTERNATIONAL DESIGN ENGINEERING TECHNICAL CONFERENCES AND COMPUTERS AND INFORMATION IN ENGINEERING CONFERENCE, 2011, VOL 2, PTS A AND B, 2012, : 1533 - 1543