Intelligent Human-Computer Interaction for Building Information Models Using Gesture Recognition

Cited: 0
|
Authors
Zhang, Tianyi [1 ]
Wang, Yukang [1 ]
Zhou, Xiaoping [1 ]
Liu, Deli [1 ]
Ji, Jingyi [1 ]
Feng, Junfu [1 ]
Affiliation
[1] Beijing Univ Civil Engn & Architecture, Beijing Key Lab Intelligent Proc Bldg Big Data, Beijing 102616, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Human-computer interaction; Building Information Model; gesture interaction; computer vision; gesture understanding; TECHNOLOGY;
DOI
10.3390/inventions10010005
CLC classification
T [Industrial Technology];
Discipline classification code
08 ;
Abstract
Human-computer interaction (HCI) with three-dimensional (3D) Building Information Models (BIMs) is a crucial ingredient in enhancing the user experience and fostering the value of BIM. Current BIM tools mostly rely on the keyboard, mouse, or touchscreen as HCI media. Using these hardware devices for HCI with BIM can lead to space constraints and a lack of visual intuitiveness. Somatosensory interaction, such as gesture interaction, is an emergent interaction modality that requires no handheld equipment or direct touch and thus offers a potential solution to these problems. This paper proposes a computer-vision-based gesture interaction system for BIM. Firstly, a set of gestures for BIM model manipulation was designed, grounded in human ergonomics; these gestures cover selection, translation, scaling, rotation, and restoration of the 3D model. Secondly, a gesture understanding algorithm dedicated to 3D model manipulation is introduced. Then, an interaction system for 3D models based on machine vision and gesture recognition was developed. A series of systematic experiments was conducted to confirm the effectiveness of the proposed system. In various environments, including pure white backgrounds, offices, and conference rooms, and even when the user wears gloves, the system achieves an accuracy rate of over 97% with a frame rate maintained between 26 and 30 frames per second. The final experimental results show that the method performs well, confirming its feasibility, accuracy, and fluidity. Somatosensory interaction with 3D models enhances the interaction experience and operational efficiency between the user and the model, further expanding the application scenarios of BIM.
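The abstract names five model-manipulation gestures: selection, translation, scaling, rotation, and restoration. A minimal sketch of how recognized gesture labels might be dispatched to 3D model operations is shown below. The gesture names, the `ModelState` fields, and the `apply_gesture` function are illustrative assumptions for this sketch, not the authors' actual API or algorithm.

```python
from dataclasses import dataclass, field

@dataclass
class ModelState:
    """Illustrative 3D model view state: position, uniform scale, yaw."""
    position: list = field(default_factory=lambda: [0.0, 0.0, 0.0])
    scale: float = 1.0
    rotation_deg: float = 0.0  # rotation about the vertical axis
    selected: bool = False

def apply_gesture(state: ModelState, gesture: str, *args) -> ModelState:
    """Map a recognized gesture label to a model operation."""
    if gesture == "select":
        state.selected = True
    elif gesture == "translate":      # args: (dx, dy, dz)
        state.position = [p + d for p, d in zip(state.position, args)]
    elif gesture == "scale":          # args: (factor,)
        state.scale *= args[0]
    elif gesture == "rotate":         # args: (degrees,)
        state.rotation_deg = (state.rotation_deg + args[0]) % 360.0
    elif gesture == "restore":        # reset view, keep the selection
        state = ModelState(selected=state.selected)
    return state

# Example: a short gesture sequence ending in a restore
m = ModelState()
m = apply_gesture(m, "select")
m = apply_gesture(m, "translate", 1.0, 0.0, 2.0)
m = apply_gesture(m, "scale", 1.5)
m = apply_gesture(m, "rotate", 90.0)
m = apply_gesture(m, "restore")
```

In a full pipeline, the `gesture` label would come from a vision-based hand-gesture classifier running on camera frames; the dispatch step itself stays this simple regardless of the recognizer used.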
Pages: 23
Related Papers
50 items total
  • [41] Vision-based Hand Gesture Recognition for Human-Computer Interaction using MobileNetV2
    Baumgartl, Hermann
    Sauter, Daniel
    Schenk, Christian
    Atik, Cem
    Buettner, Ricardo
    2021 IEEE 45TH ANNUAL COMPUTERS, SOFTWARE, AND APPLICATIONS CONFERENCE (COMPSAC 2021), 2021, : 1667 - 1674
  • [42] Integration of audio/visual information for use in human-computer intelligent interaction
    Pavlovic, VI
    Berry, GA
    Huang, TS
    INTERNATIONAL CONFERENCE ON IMAGE PROCESSING - PROCEEDINGS, VOL I, 1997, : 121 - 124
  • [43] HMM-based Gesture Recognition System Using Kinect Sensor for Improvised Human-Computer Interaction
    Saha, Sriparna
    Lahiri, Rimita
    Konar, Amit
    Banerjee, Bonny
    Nagar, Atulya K.
    2017 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2017, : 2776 - 2783
  • [44] Pen-based gesture recognition in multi-modal human-computer interaction
    Wang, Y.J.
    Yuan, B.Z.
    Beifang Jiaotong Daxue Xuebao/Journal of Northern Jiaotong University, 2001, 25 (02):
  • [45] Sandwich-structured flexible strain sensors for gesture recognition in human-computer interaction
    Chen, Guanzheng
    Zhang, Xin
    Sun, Zeng
    Luo, Xuanzi
    Fang, Guoqing
    Wu, Huaping
    Cheng, Lin
    Liu, Aiping
    EUROPEAN PHYSICAL JOURNAL-SPECIAL TOPICS, 2025,
  • [46] Emotion recognition for human-computer interaction
    Jianhua TAO
Virtual Reality and Intelligent Hardware, 2021, 3 (01) : 6 - 7
  • [47] Emotion recognition in human-computer interaction
    Fragopanagos, N
    Taylor, JG
    NEURAL NETWORKS, 2005, 18 (04) : 389 - 405
  • [48] Extending human-computer interaction by using computer vision and colour recognition
    de Oliveira, TB
    Schnitman, L
    Greve, FGP
    de Souza, JAMF
    Proceedings of the Eighth IASTED International Conference on Intelligent Systems and Control, 2005, : 339 - 344
  • [49] A Dynamic Head Gesture Recognition Method for Real-Time Human-Computer Interaction
    Xie, Jialong
    Zhang, Botao
    Chepinskiy, Sergey A.
    Zhilenkov, Anton A.
    INTELLIGENT ROBOTICS AND APPLICATIONS, ICIRA 2021, PT III, 2021, 13015 : 235 - 245