Intelligent Human-Computer Interaction for Building Information Models Using Gesture Recognition

Cited by: 0
Authors
Zhang, Tianyi [1 ]
Wang, Yukang [1 ]
Zhou, Xiaoping [1 ]
Liu, Deli [1 ]
Ji, Jingyi [1 ]
Feng, Junfu [1 ]
Affiliations
[1] Beijing Univ Civil Engn & Architecture, Beijing Key Lab Intelligent Proc Bldg Big Data, Beijing 102616, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Human-computer interaction; Building Information Model; gesture interaction; computer vision; gesture understanding; TECHNOLOGY;
DOI
10.3390/inventions10010005
Chinese Library Classification
T [Industrial Technology];
Discipline Code
08;
Abstract
Human-computer interaction (HCI) with three-dimensional (3D) Building Information Modelling/Models (BIM) is crucial to enhancing the user experience and fostering the value of BIM. Current BIM tools mostly rely on the keyboard, mouse, or touchscreen as HCI media; using these hardware devices for HCI with BIM can impose space constraints and lacks visual intuitiveness. Somatosensory interaction, e.g., gesture interaction, is an emergent interaction modality that requires neither dedicated equipment nor direct touch and offers a potential solution to these problems. This paper proposes a computer-vision-based gesture interaction system for BIM. Firstly, a set of gestures for BIM model manipulation, covering selection, translation, scaling, rotation, and restoration of the 3D model, was designed on the basis of human ergonomics. Secondly, a gesture understanding algorithm dedicated to 3D model manipulation is introduced. An interaction system for 3D models based on machine vision and gesture recognition was then developed, and a series of systematic experiments was conducted to confirm its effectiveness. In various environments, including pure white backgrounds, offices, and conference rooms, and even when the user wears gloves, the system achieves an accuracy rate of over 97% and maintains a frame rate between 26 and 30 frames per second. The experimental results confirm the method's feasibility, accuracy, and fluidity. Somatosensory interaction with 3D models enhances the interaction experience and operational efficiency between the user and the model, further expanding the application scenarios of BIM.
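For readers wanting a concrete starting point, the sketch below illustrates the general idea of vision-based gesture-to-command mapping described in the abstract. It is not the authors' algorithm: the hand-landmark detector (MediaPipe Hands), the thresholds, and the gesture heuristics are assumptions introduced here purely for illustration, and a real BIM viewer would replace the printed command with calls to its own model-manipulation API.

```python
# Minimal illustrative sketch (not the paper's implementation): map webcam hand
# landmarks to BIM-style commands such as "select", "scale", and "rotate".
# Assumes OpenCV and MediaPipe are installed; gesture set and thresholds are
# illustrative assumptions, not the algorithm proposed in the paper.
import math
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands


def classify(landmarks):
    """Rough gesture classifier using normalized MediaPipe hand landmarks."""
    thumb_tip, index_tip = landmarks[4], landmarks[8]
    pinch = math.hypot(thumb_tip.x - index_tip.x, thumb_tip.y - index_tip.y)
    if pinch < 0.05:          # thumb and index touching -> pinch gesture
        return "scale"
    # Count extended fingers: fingertip above (smaller y than) its PIP joint.
    extended = sum(landmarks[tip].y < landmarks[tip - 2].y for tip in (8, 12, 16, 20))
    if extended == 4:
        return "rotate"       # open palm
    if extended == 1:
        return "select"       # pointing with the index finger
    return "idle"


cap = cv2.VideoCapture(0)
with mp_hands.Hands(max_num_hands=1, min_detection_confidence=0.7) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        result = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if result.multi_hand_landmarks:
            command = classify(result.multi_hand_landmarks[0].landmark)
            print(command)    # a BIM viewer would apply the command to the 3D model here
        if cv2.waitKey(1) & 0xFF == 27:  # Esc to quit
            break
cap.release()
```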
Pages: 23
Related Papers
50 records in total
  • [21] Continuous Body and Hand Gesture Recognition for Natural Human-Computer Interaction
    Song, Yale
    Davis, Randall
    PROCEEDINGS OF THE TWENTY-FOURTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE (IJCAI), 2015, : 4212 - 4216
  • [22] Eye center localization and gaze gesture recognition for human-computer interaction
    Zhang, Wenhao
    Smith, Melvyn L.
    Smith, Lyndon N.
    Farooq, Abdul
    JOURNAL OF THE OPTICAL SOCIETY OF AMERICA A-OPTICS IMAGE SCIENCE AND VISION, 2016, 33 (03) : 314 - 325
  • [23] Motion recognition based on Kinect for human-computer intelligent interaction
    Pang, Xun
    Liang, Bin
    2018 INTERNATIONAL SYMPOSIUM ON POWER ELECTRONICS AND CONTROL ENGINEERING (ISPECE 2018), 2019, 1187
  • [24] A natural hand gesture system for intelligent human-computer interaction and medical assistance
    Zeng, Jinhua
    Sun, Yaoru
    Wang, Fang
    2012 THIRD GLOBAL CONGRESS ON INTELLIGENT SYSTEMS (GCIS 2012), 2012, : 382 - 385
  • [25] Dynamic hand gesture recognition using vision-based approach for human-computer interaction
    Singha, Joyeeta
    Roy, Amarjit
    Laskar, Rabul Hussain
NEURAL COMPUTING & APPLICATIONS, 2018, 29 (04) : 1129 - 1141
  • [26] Review of constraints on vision-based gesture recognition for human-computer interaction
    Chakraborty, Biplab Ketan
    Sarma, Debajit
    Bhuyan, M. K.
    MacDorman, Karl F.
    IET COMPUTER VISION, 2018, 12 (01) : 3 - 15
  • [27] Interferometric Radar For Spatially-Persistent Gesture Recognition in Human-Computer Interaction
    Klinefelter, Eric
    Nanzer, Jeffrey A.
    2019 IEEE RADAR CONFERENCE (RADARCONF), 2019,
  • [28] Generic System for Human-Computer Gesture Interaction
    Trigueiros, Paulo
    Ribeiro, Fernando
    Reis, Luis Paulo
    2014 IEEE INTERNATIONAL CONFERENCE ON AUTONOMOUS ROBOT SYSTEMS AND COMPETITIONS (ICARSC), 2014, : 175 - 180
  • [29] Real-Time Continuous Gesture Recognition for Natural Human-Computer Interaction
    Yin, Ying
    Davis, Randall
    2014 IEEE SYMPOSIUM ON VISUAL LANGUAGES AND HUMAN-CENTRIC COMPUTING (VL/HCC 2014), 2014, : 113 - 120
  • [30] Convolutional neural network for gesture recognition human-computer interaction system design
    Niu, Peixin
    PLOS ONE, 2025, 20 (02):