Depth Estimation of Stereo Matching Based on Microarray Camera

Cited: 0
Authors
Chen, Xiaoguang [1 ]
Li, Dan [1 ]
Zou, Jiancheng [1 ]
Affiliations
[1] North China Univ of Technol, Inst Image Proc & Pattern Recognit, Beijing, Peoples R China
Funding
National Natural Science Foundation of China;
关键词
microarray cameras; camera calibration; markov random field; depth estimation; stereo matching;
DOI
Not available
CLC Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Depth information greatly affects the accuracy of image measurement, 3D reconstruction and image recognition. It is generally obtained from 3D laser scanners, structured light, or depth cameras. The traditional method of recovering depth with a binocular camera relies on the disparity between the left and right views, but it suffers from problems such as occlusion areas and mismatched points. To improve accuracy, we propose a novel depth-estimation method based on stereo matching with a microarray camera. First, each lens in the camera is calibrated to compute its intrinsic and extrinsic parameters, which are used to rectify the captured images. Then stereo matching between images is modeled as a Markov random field (MRF); an energy cost function for the MRF is built and minimized with the Graph-Cuts algorithm. The matching result is thereby obtained together with the depth information. In the minimization procedure, the depth range is divided into discrete levels, and the gradient information of the reference image is used to refine the depth layers of the corresponding image. Experimental results demonstrate the efficiency and accuracy of the proposed algorithm.
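The MRF formulation in the abstract (a per-pixel data cost over discrete disparity levels plus a smoothness penalty, minimized over the label field) can be illustrated with a toy sketch. The snippet below is not the paper's method: it works on a single 1-D scanline, uses a squared-difference data term, a Potts smoothness term, and winner-take-all label selection in place of the Graph-Cuts solver; all function names and parameters are illustrative assumptions.

```python
import numpy as np

def data_cost(left, right, max_disp):
    """Per-pixel matching cost over disparity levels (squared difference).

    left/right are rectified 1-D scanlines; cost[x, d] compares left[x]
    with right[x - d]; positions with x < d are invalid (set to inf)."""
    n = left.size
    cost = np.full((n, max_disp + 1), np.inf)
    for d in range(max_disp + 1):
        cost[d:, d] = (left[d:] - right[:n - d]) ** 2
    return cost

def mrf_energy(labels, cost, lam=1.0):
    """MRF energy: data term plus a Potts smoothness term that charges
    lam for every pair of neighboring pixels with different labels."""
    data = cost[np.arange(labels.size), labels].sum()
    smooth = lam * np.count_nonzero(np.diff(labels))
    return data + smooth

# Toy scanlines: the left view is the right view shifted by 2 pixels,
# so the true disparity is 2 wherever it is observable.
right = np.arange(1.0, 9.0)        # [1, 2, ..., 8]
left = np.empty_like(right)
left[2:] = right[:-2]
left[:2] = right[0]

cost = data_cost(left, right, max_disp=3)
labels = cost.argmin(axis=1)       # winner-take-all stand-in for Graph-Cuts
print(labels)                      # recovers disparity 2 where observable
print(mrf_energy(labels, cost))    # low energy for the correct labeling
```

A Graph-Cuts solver (as used in the paper) would instead search over whole label fields, trading data cost against the smoothness term, which is what removes isolated mismatched points that winner-take-all leaves behind.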
Pages: 108 - 112
Number of Pages: 5
Related Papers
50 records in total
  • [1] Depth Estimation via Light Field Camera with a Hybrid Stereo Matching Method
    Ren, Shaojie
    Wu, Chunhong
    Sun, Mingxin
    Fu, Dongmei
    [J]. TENTH INTERNATIONAL CONFERENCE ON GRAPHICS AND IMAGE PROCESSING (ICGIP 2018), 2019, 11069
  • [2] DEPTH ESTIMATION IN LIGHT FIELD CAMERA ARRAYS BASED ON MULTI-STEREO MATCHING AND BELIEF PROPAGATION
    Rogge, Segolene
    Munteanu, Adrian
    [J]. 2018 - 3DTV-CONFERENCE: THE TRUE VISION - CAPTURE, TRANSMISSION AND DISPLAY OF 3D VIDEO (3DTV-CON), 2018,
  • [3] LiDAR - Stereo Camera Fusion for Accurate Depth Estimation
    Cholakkal, Hafeez Husain
    Mentasti, Simone
    Bersani, Mattia
    Arrigoni, Stefano
    Matteucci, Matteo
    Cheli, Federico
    [J]. 2020 AEIT INTERNATIONAL CONFERENCE OF ELECTRICAL AND ELECTRONIC TECHNOLOGIES FOR AUTOMOTIVE (AEIT AUTOMOTIVE), 2020,
  • [4] Energy-Based Iterative Cost Aggregation in Depth Estimation with a Stereo Camera
    Nguyen Xuan Truong
    Lee, Huyk-Jae
    [J]. 2016 INTERNATIONAL SOC DESIGN CONFERENCE (ISOCC), 2016, : 319 - 320
  • [5] Depth Estimation by Combining Stereo Matching and Coded Aperture
    Wang, Chun
    Sahin, Erdem
    Suominen, Olli
    Gotchev, Atanas
    [J]. 2014 IEEE VISUAL COMMUNICATIONS AND IMAGE PROCESSING CONFERENCE, 2014, : 291 - 294
  • [6] A novel depth estimation approach based on bidirectional matching for stereo vision systems
    Okae, J.
    Du, J.
    Huang, T.
    [J]. ADVANCED ROBOTICS, 2020, 34 (15) : 998 - 1011
  • [7] Weakly supervised monocular depth estimation method based on stereo matching labels
    Zhang, Zhimin
    Qiao, Jianzhong
    Lin, Shukuan
    Liu, Han
    [J]. JOURNAL OF ELECTRONIC IMAGING, 2020, 29 (05)
  • [8] SDGE: Stereo Guided Depth Estimation for 360° Camera Sets
    Xu, Jialei
    Yin, Wei
    Gong, Dong
    Jiang, Junjun
    Liu, Xianming
    [J]. IEEE International Conference on Intelligent Robots and Systems, 2024, : 11179 - 11186
  • [9] Object-based Stereo Matching Using Adjustable-cross for Depth Estimation
    Wang, Li-Hung
    Tsai, Kai-Lung
    Wu, Chung-Bin
    [J]. 2015 IEEE INTERNATIONAL CONFERENCE ON CONSUMER ELECTRONICS - TAIWAN (ICCE-TW), 2015, : 194 - 195
  • [10] An Improved Depth Estimation using Stereo Matching and Disparity Refinement Based on Deep Learning
    Deepa
    Jyothi, K.
    Udupa, Abhishek A.
    [J]. INTERNATIONAL JOURNAL OF ADVANCED COMPUTER SCIENCE AND APPLICATIONS, 2023, 14 (11) : 552 - 559