Near Real-Time 3D Reconstruction and Quality 3D Point Cloud for Time-Critical Construction Monitoring

Cited by: 4
Authors
Liu, Zuguang [1 ,2 ]
Kim, Daeho [3 ]
Lee, Sanghyun [4 ]
Zhou, Li [1 ,2 ]
An, Xuehui [1 ,2 ]
Liu, Meiyin [5 ]
Affiliations
[1] Tsinghua Univ, Dept Hydraul Engn, Beijing 100084, Peoples R China
[2] Tsinghua Univ, State Key Lab Hydrosci & Engn, Beijing 100084, Peoples R China
[3] Univ Toronto, Dept Civil & Mineral Engn, 35 St George St, Toronto, ON M5S1A4, Canada
[4] Univ Michigan, Dept Civil & Environm Engn, 2350 Hayward St, Ann Arbor, MI 48109 USA
[5] Rutgers State Univ, Dept Civil & Environm Engn, 57 US Highway 1, New Brunswick, NJ 08901 USA
Keywords
near real-time 3D reconstruction; direct sparse odometry with loop closure; progress monitoring;
DOI
10.3390/buildings13020464
Chinese Library Classification: TU [Building Science]
Discipline Classification Code: 0813
Abstract
Improving the rapidity of 3D reconstruction is vital for time-critical construction tasks such as progress monitoring and hazard detection, yet most construction studies in this area have focused on improving its quality. We applied a Direct Sparse Odometry with Loop Closure (LDSO)-based 3D reconstruction method, improving the existing algorithm and tuning its hyper-parameter settings, to achieve near real-time operation and a quality 3D point cloud simultaneously. When validated on a benchmark dataset, the proposed method showed notable improvements in 3D point cloud density and loop closure robustness over the original LDSO. In addition, we conducted a real field test to validate the tuned LDSO's accuracy and speed at both object and site scales, demonstrating near real-time operation and the capability to produce a quality 3D point cloud comparable to that of the existing method. The proposed method improves the accessibility of 3D reconstruction, which in turn helps construction professionals monitor jobsite safety and progress more efficiently.
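The abstract reports improvements in 3D point cloud density. As a minimal illustrative sketch (not the paper's evaluation code), one simple density metric is points per unit volume of the cloud's axis-aligned bounding box; the function name and toy data below are assumptions for illustration only:

```python
def point_cloud_density(points):
    """Points per unit volume of the cloud's axis-aligned bounding box."""
    mins = [min(axis) for axis in zip(*points)]
    maxs = [max(axis) for axis in zip(*points)]
    volume = 1.0
    for lo, hi in zip(mins, maxs):
        volume *= hi - lo
    return len(points) / volume

# Two toy clouds spanning the same 1 x 1 x 1 box:
corners = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, 1),
           (1, 1, 0), (1, 0, 1), (0, 1, 1), (1, 1, 1)]
denser = corners + [(0.5, 0.5, 0.5), (0.25, 0.25, 0.25)]
print(point_cloud_density(corners))  # 8.0
print(point_cloud_density(denser))   # 10.0
```

Under this metric, a reconstruction that adds points without enlarging the covered region scores higher; published evaluations typically use more robust local-density or voxel-occupancy measures.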
Pages: 19
Related Papers
50 records
  • [41] GridNet-3D: A Novel Real-Time 3D Object Detection Algorithm Based on Point Cloud
    Yue, Yuanchen
    Cai, Yunfei
    Wang, Dongsheng
    CHINESE JOURNAL OF ELECTRONICS, 2021, 30 (05) : 931 - 939
  • [42] Real-time 3D multi-pedestrian detection and tracking using 3D LiDAR point cloud for mobile robot
    Na, Ki-In
    Park, Byungjae
    ETRI JOURNAL, 2023, 45 (05) : 836 - 846
  • [44] High speed 3D scanner with real-time 3D processing
    Lavelle, JP
    Schuet, SR
    Schuet, DJ
    2004 IEEE INTERNATIONAL WORKSHOP ON IMAGING SYSTEMS AND TECHNIQUES (IST), 2004, : 13 - 17
  • [45] High speed 3D scanner with real-time 3D processing
    Lavelle, JP
    Schuet, SR
    Schuet, DJ
    PROCEEDINGS OF THE ISA/IEEE SENSORS FOR INDUSTRY CONFERENCE, 2004, : 102 - 108
  • [46] Quality Evaluation for Real-Time 3D Video Services
    Hewage, Chaminda T. E. R.
    Martini, Maria G.
    2011 IEEE INTERNATIONAL CONFERENCE ON MULTIMEDIA AND EXPO (ICME), 2011,
  • [47] Real-Time Point Cloud Alignment for Vehicle Localization in a High Resolution 3D Map
    Nagy, Balazs
    Benedek, Csaba
    COMPUTER VISION - ECCV 2018 WORKSHOPS, PT I, 2019, 11129 : 226 - 239
  • [48] Real-time 3D Point Cloud Segmentation using Growing Neural Gas with Utility
    Toda, Yuichiro
    Yu, Hui
    Ju, Zhaojie
    Takesue, Naoyuki
    Wada, Kazuyoshi
    Kubota, Naoyuki
    2016 9TH INTERNATIONAL CONFERENCE ON HUMAN SYSTEM INTERACTIONS (HSI), 2016, : 418 - 422
  • [49] Real-Time 3D Object Detection From Point Cloud Through Foreground Segmentation
    Wang, Bo
    Zhu, Ming
    Lu, Ying
    Wang, Jiarong
    Gao, Wen
    Wei, Hua
    IEEE ACCESS, 2021, 9 : 84886 - 84898
  • [50] Real-time point cloud registration for flexible hand-held 3D scanning
    Yang, Yang
    Chen, HaiLong
    Liu, Xiaoli
    Tang, Qijian
    Cai, Zewei
    Peng, Xiang
    SIXTH SYMPOSIUM ON NOVEL OPTOELECTRONIC DETECTION TECHNOLOGY AND APPLICATIONS, 2020, 11455