Learning Single Camera Depth Estimation using Dual-Pixels

Cited by: 76
Authors
Garg, Rahul [1 ]
Wadhwa, Neal [1 ]
Ansari, Sameer [1 ]
Barron, Jonathan T. [1 ]
Affiliations
[1] Google Res, Mountain View, CA 94043 USA
Source
2019 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV 2019) | 2019
Keywords
DOI
10.1109/ICCV.2019.00772
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Deep learning techniques have enabled rapid progress in monocular depth estimation, but their quality is limited by the ill-posed nature of the problem and the scarcity of high-quality datasets. We estimate depth from a single camera by leveraging the dual-pixel auto-focus hardware that is increasingly common on modern camera sensors. Classic stereo algorithms and prior learning-based depth estimation techniques underperform when applied to this dual-pixel data, the former due to too-strong assumptions about RGB image matching, and the latter due to not leveraging an understanding of the optics of dual-pixel image formation. To allow learning-based methods to work well on dual-pixel imagery, we identify an inherent ambiguity in the depth estimated from dual-pixel cues, and develop an approach to estimate depth up to this ambiguity. Using our approach, existing monocular depth estimation techniques can be effectively applied to dual-pixel data, and much smaller models can be constructed that still infer high-quality depth. To demonstrate this, we capture a large dataset of in-the-wild 5-viewpoint RGB images paired with corresponding dual-pixel data, and show how view supervision with this data can be used to learn depth up to the unknown ambiguity. On our new task, our model is 30% more accurate than any prior work on learning-based monocular or stereoscopic depth estimation.
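The "inherent ambiguity" in the abstract refers to an unknown affine transform between dual-pixel disparity and inverse depth, so predictions can only be compared to ground truth after the best affine fit is removed. The snippet below is a minimal illustrative sketch of such an affine-invariant error on inverse-depth maps given as NumPy arrays; the function name and details are assumptions for illustration, not the authors' released evaluation code.

```python
import numpy as np

def affine_invariant_mae(pred_inv_depth, gt_inv_depth):
    """Mean absolute error after removing an unknown gain/offset between
    prediction and ground truth, i.e. a comparison "up to" the affine
    ambiguity described in the abstract. Illustrative sketch only."""
    p = np.asarray(pred_inv_depth, dtype=np.float64).reshape(-1)
    g = np.asarray(gt_inv_depth, dtype=np.float64).reshape(-1)
    # Least-squares fit of (a, b) so that a * p + b best matches g.
    A = np.stack([p, np.ones_like(p)], axis=1)
    (a, b), *_ = np.linalg.lstsq(A, g, rcond=None)
    return float(np.mean(np.abs(a * p + b - g)))

# Example: a prediction that differs from ground truth only by an unknown
# scale and offset scores (near) zero, since that ambiguity is factored out.
gt = np.random.rand(64, 64)
pred = 3.0 * gt + 0.5
print(affine_invariant_mae(pred, gt))  # ~0.0
```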
Pages: 7627 - 7636
Number of pages: 10
Related Papers
50 items in total
  • [41] Unsupervised Learning of Depth Estimation and Camera Pose With Multi-Scale GANs
    Xu, Yufan
    Wang, Yan
    Huang, Rui
    Lei, Zeyu
    Yang, Junyao
    Li, Zijian
    IEEE TRANSACTIONS ON INTELLIGENT TRANSPORTATION SYSTEMS, 2022, 23 (10) : 17039 - 17047
  • [42] DNN Based Camera Attitude Estimation Using Aggregated Information from Camera and Depth Images
    Kawai, Hibiki
    Kuroda, Yoji
    2023 IEEE/SICE INTERNATIONAL SYMPOSIUM ON SYSTEM INTEGRATION, SII, 2023,
  • [43] Distance estimation using a single computational camera with dual off-axis color filtered apertures
    Lee, Seungwon
    Hayes, Monson H.
    Paik, Joonki
    OPTICS EXPRESS, 2013, 21 (20) : 23116 - 23129
  • [44] Depth Estimation in Still Images and Videos Using a Motionless Monocular Camera
    Diamantas, Sotirios
    Astaras, Stefanos
    Pnevmatikakis, Aristodemos
    2016 IEEE INTERNATIONAL CONFERENCE ON IMAGING SYSTEMS AND TECHNIQUES (IST), 2016, : 129 - 134
  • [45] Deflection Estimation Methods of Structure Using Active Stereo Depth Camera
    Shin, Soojung
    Lee, Donghwan
    Cha, Gichun
    Yu, Byoung Joon
    Park, Seunghee
    JOURNAL OF THE KOREAN SOCIETY FOR NONDESTRUCTIVE TESTING, 2020, 40 (02) : 103 - 111
  • [46] FAST RESPONSE AGGREGATION FOR DEPTH ESTIMATION USING LIGHT FIELD CAMERA
    Yang, Cao
    Kang, Kai
    Zhang, Jing
    Wang, Zengfu
    2016 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING PROCEEDINGS, 2016, : 1636 - 1640
  • [47] Optimization of Camera Arrangement Using Correspondence Field to Improve Depth Estimation
    Fu, Shichao
    Safaei, Farzad
    Li, Wanqing
    IEEE TRANSACTIONS ON IMAGE PROCESSING, 2017, 26 (06) : 3038 - 3050
  • [48] Camera Motion Estimation Method using Depth-Normalized Criterion
    Lee, Seok
    JOURNAL OF IMAGING SCIENCE AND TECHNOLOGY, 2023, 67 (06)
  • [49] Accurate Depth Estimation Using Spatiotemporal Consistency in Arbitrary Camera Arrays
    Jang, Woo-Seok
    Ho, Yo-Sung
    STEREOSCOPIC DISPLAYS AND APPLICATIONS XXIV, 2013, 8648
  • [50] Monocular Fisheye Camera Depth Estimation Using Sparse LiDAR Supervision
    Kumar, Varun Ravi
    Milz, Stefan
    Witt, Christian
    Simon, Martin
    Amende, Karl
    Petzold, Johannes
    Yogamani, Senthil
    Pech, Timo
    2018 21ST INTERNATIONAL CONFERENCE ON INTELLIGENT TRANSPORTATION SYSTEMS (ITSC), 2018, : 2853 - 2858