Automatic Walking Method of Construction Machinery Based on Binocular Camera Environment Perception

Cited by: 3
Authors
Fang, Zhen [1 ,2 ]
Lin, Tianliang [1 ,2 ]
Li, Zhongshen [1 ,2 ]
Yao, Yu [1 ,2 ]
Zhang, Chunhui [1 ,2 ]
Ma, Ronghua [1 ,2 ]
Chen, Qihuai [1 ,2 ]
Fu, Shengjie [1 ,2 ]
Ren, Haoling [1 ,2 ]
Affiliations
[1] Huaqiao Univ, Coll Mech Engn & Automat, Xiamen 361021, Peoples R China
[2] Fujian Key Lab Green Intelligent Drive & Transmis, Xiamen 361021, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
construction machinery; unmanned driving; end-to-end; binocular detection; ranging; SYSTEM;
DOI
10.3390/mi13050671
CLC number
O65 [Analytical Chemistry]
Subject classification codes
070302; 081704
Abstract
In this paper, we propose an end-to-end automatic walking system for construction machinery. The system uses binocular cameras to capture images for environmental perception, detects targets in the binocular images, estimates the distance between each target and the cameras, and predicts the real-time control signals of the construction machinery. It consists of two parts: a binocular recognition and ranging model and a control model. The recognition and ranging model quickly detects objects within 5 m and simultaneously measures their distances accurately, ensuring full perception of the machinery's surroundings. The object distances, the feature information extracted from the binocular images, and the control signal of the previous stage are fed to the control model, which then outputs the predicted control signal for the next stage. With this system, an automatic walking experiment of the construction machinery in a specific scenario was completed, demonstrating that the model can control the machinery to finish the walking task smoothly and safely.
Pages: 16
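The abstract outlines a two-stage pipeline: a binocular recognition and ranging model that detects objects within 5 m and estimates their distances, followed by a control model fed with those distances, binocular image features, and the previous control signal. The sketch below only illustrates that data flow under stated assumptions; it is not the authors' implementation, the names (Detection, stereo_range, recognition_ranging_model, control_model) are hypothetical, and the hand-written rules stand in for the learned networks described in the paper. The ranging step assumes the standard pinhole stereo relation Z = f·B/d for calibrated, rectified cameras.

```python
# Minimal sketch of the two-stage pipeline described in the abstract.
# All names and rules here are hypothetical stand-ins, not the paper's models.
from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class Detection:
    label: str           # detected object class from the recognition model
    disparity_px: float  # horizontal disparity between left/right views, in pixels


def stereo_range(disparity_px: float, focal_px: float, baseline_m: float) -> float:
    """Pinhole stereo ranging: depth Z = f * B / d (calibration values assumed known)."""
    return focal_px * baseline_m / disparity_px


def recognition_ranging_model(detections: List[Detection],
                              focal_px: float,
                              baseline_m: float,
                              max_range_m: float = 5.0) -> List[Tuple[str, float]]:
    """Range each detection and keep objects within max_range_m (the abstract's 5 m envelope)."""
    ranged = [(d.label, stereo_range(d.disparity_px, focal_px, baseline_m))
              for d in detections if d.disparity_px > 0]
    return [(label, z) for label, z in ranged if z <= max_range_m]


def control_model(distances: List[Tuple[str, float]],
                  image_features: List[float],
                  prev_control: Tuple[float, float]) -> Tuple[float, float]:
    """Stand-in for the learned controller: takes object distances, binocular image
    features, and the previous (throttle, steering) command, and returns the next one.
    Here it simply slows down as the nearest object gets closer and keeps the steering."""
    nearest = min((z for _, z in distances), default=float("inf"))
    throttle = 0.0 if nearest < 1.0 else min(1.0, nearest / 5.0)
    return throttle, prev_control[1]


if __name__ == "__main__":
    dets = [Detection("worker", disparity_px=120.0), Detection("cone", disparity_px=30.0)]
    near = recognition_ranging_model(dets, focal_px=700.0, baseline_m=0.12)
    cmd = control_model(near, image_features=[], prev_control=(0.8, 0.0))
    print(near, cmd)
```

In the paper the control signal is predicted end-to-end from the perception outputs; the rule-based control_model above only illustrates the inputs and outputs that the abstract lists.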