Deep Depth from Defocus: How Can Defocus Blur Improve 3D Estimation Using Dense Neural Networks?

Cited by: 14
Authors
Carvalho, Marcela [1 ]
Le Saux, Bertrand [1 ]
Trouve-Peloux, Pauline [1 ]
Almansa, Andres [2 ]
Champagnat, Frederic [2 ]
Affiliations
[1] Univ Paris Saclay, DTIS, ONERA, F-91123 Palaiseau, France
[2] Univ Paris 05, F-75006 Paris, France
Keywords
Depth from defocus; Domain adaptation; Depth estimation; Single-image depth prediction; Blind deconvolution
DOI
10.1007/978-3-030-11009-3_18
CLC Classification
TP18 [Artificial intelligence theory]
Subject Classification
081104; 0812; 0835; 1405
Abstract
Depth estimation is of critical interest for scene understanding and accurate 3D reconstruction. Most recent deep-learning approaches exploit the geometric structure of standard sharp images to predict depth maps. However, cameras can also produce images with defocus blur depending on the depth of the objects and the camera settings, so this blur may be an important cue for learning to predict depth. In this paper, we propose a full system for single-image depth prediction in the wild using depth-from-defocus and neural networks. We carry out thorough experiments on real and simulated defocused images, using a realistic model of blur variation with respect to depth. We also investigate the influence of blur on depth prediction by observing model uncertainty with a Bayesian neural network approach. From these studies, we show that out-of-focus blur greatly improves the performance of the depth-prediction network. Furthermore, we transfer the ability learned on a synthetic indoor dataset to real indoor and outdoor images. For this purpose, we present a new dataset with real all-focus and defocused images from a DSLR camera, paired with ground-truth depth maps obtained with an active 3D sensor for indoor scenes. The proposed approach is successfully validated on both this new dataset and standard ones such as NYUv2 or Depth-in-the-Wild. Code and new datasets are available at https://github.com/marcelampc/d3net_depth_estimation.
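The record does not detail the "realistic model of blur variation with respect to depth" mentioned in the abstract. As a rough illustration only, the sketch below uses the standard thin-lens circle-of-confusion relation that underlies depth-from-defocus; the focal length, f-number, focus distance, and pixel pitch are arbitrary placeholder values, not parameters taken from the paper.

```python
# Illustrative sketch (not the authors' exact calibrated model): the standard
# thin-lens circle-of-confusion relation linking defocus blur to depth.
# Symbols (assumed, in metres): f = focal length, N = f-number,
# s = in-focus distance, d = object depth, p = sensor pixel pitch.

def coc_diameter(d, f=0.05, N=2.8, s=2.0):
    """Circle-of-confusion diameter (m) for an object at depth d (m)."""
    A = f / N                            # aperture diameter
    return A * (f / (s - f)) * abs(d - s) / d

def blur_radius_px(d, p=5.6e-6, **kwargs):
    """Approximate defocus blur radius in pixels at depth d."""
    return 0.5 * coc_diameter(d, **kwargs) / p

if __name__ == "__main__":
    for depth in (0.5, 1.0, 2.0, 4.0, 8.0):
        print(f"depth {depth:4.1f} m -> blur radius ~{blur_radius_px(depth):5.1f} px")
```

Note that under this relation the blur vanishes at the focus distance and grows on both sides of it, so the same blur radius can correspond to two depths; resolving such ambiguities is part of what the learned network must handle.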
Pages: 307-323
Page count: 17