Fast Depth Densification for Occlusion-aware Augmented Reality

Cited by: 2
Authors
Holynski, Aleksander [1 ]
Kopf, Johannes [2 ]
Affiliations
[1] Univ Washington, Seattle, WA 98195 USA
[2] Facebook, Cambridge, MA USA
Source
ACM TRANSACTIONS ON GRAPHICS, 2018, Vol. 37, No. 6
Keywords
Augmented Reality; 3D Reconstruction; Video Analysis; Depth Estimation; Simultaneous Localization and Mapping;
DOI
Not available
Chinese Library Classification (CLC)
TP31 [Computer Software]
Discipline Codes
081202; 0835
Abstract
Current AR systems track only sparse geometric features and do not compute depth for every pixel. As a result, most AR effects are pure overlays that can never be occluded by real objects. We present a novel algorithm that propagates sparse depth to every pixel in near real-time. The produced depth maps are spatio-temporally smooth but exhibit sharp discontinuities at depth edges, which enables AR effects that can fully interact with, and be occluded by, the real scene. Our algorithm takes a video and a sparse SLAM reconstruction as input. It starts by estimating soft depth edges from the gradient of optical flow fields. Because optical flow is unreliable near occlusions, we compute forward and backward flow fields and fuse the resulting depth edges using a novel reliability measure. We then localize the depth edges by thinning them and aligning them with image edges. Finally, we densify the depth by optimizing it to be smooth everywhere except across the recovered depth edges, where discontinuities are encouraged. We present results for numerous real-world examples and demonstrate the effectiveness of our method for several occlusion-aware AR video effects. To quantitatively evaluate our algorithm, we characterize the properties that make depth maps desirable for AR applications and present novel evaluation metrics that capture how well these properties are satisfied. In this context, our results compare favorably to a set of competitive baseline algorithms.
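A minimal sketch of the densification idea described in the abstract is shown below. It is not the authors' implementation: the function name densify_depth, the parameter lam, the max-based edge weighting, and the use of a generic sparse least-squares solver are all illustrative assumptions. The sketch fits a dense depth map to the sparse SLAM samples while penalizing depth differences between neighboring pixels, and weakens that penalty wherever a soft depth-edge map signals a likely discontinuity.

import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

def densify_depth(sparse_depth, sparse_mask, edge_map, lam=1.0):
    """sparse_depth: HxW array, valid only where sparse_mask is True.
    edge_map: HxW soft depth-edge strength in [0, 1] (1 = strong edge).
    Returns a dense HxW depth map (hypothetical interface, for illustration)."""
    H, W = sparse_depth.shape
    n = H * W
    idx = np.arange(n).reshape(H, W)

    rows, cols, vals, rhs = [], [], [], []
    eq = 0

    # Data term: fit the solution to the sparse SLAM depth samples.
    ys, xs = np.nonzero(sparse_mask)
    for y, x in zip(ys, xs):
        rows.append(eq); cols.append(idx[y, x]); vals.append(1.0)
        rhs.append(sparse_depth[y, x]); eq += 1

    # Smoothness term: penalize depth differences between horizontal and
    # vertical neighbors, downweighted where the edge map indicates a
    # likely depth edge so that discontinuities can appear there.
    for dy, dx in ((0, 1), (1, 0)):
        y0, x0 = np.mgrid[0:H - dy, 0:W - dx]
        y1, x1 = y0 + dy, x0 + dx
        w = lam * (1.0 - np.maximum(edge_map[y0, x0], edge_map[y1, x1]))
        for a, b, wi in zip(idx[y0, x0].ravel(), idx[y1, x1].ravel(), w.ravel()):
            rows += [eq, eq]; cols += [a, b]; vals += [wi, -wi]
            rhs.append(0.0); eq += 1

    # Solve the resulting sparse least-squares system for the dense depth.
    A = sp.csr_matrix((vals, (rows, cols)), shape=(eq, n))
    d = spla.lsqr(A, np.asarray(rhs))[0]
    return d.reshape(H, W)

The paper additionally keeps the result temporally smooth across frames (the abstract notes the output is spatio-temporally smooth); the single-frame spatial term above is only meant to convey the edge-aware structure of the optimization.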
Pages: 11
Related Papers
50 records in total
  • [1] Fast Depth Densification for Occlusion-aware Augmented Reality
    Holynski, Aleksander
    Kopf, Johannes
    SIGGRAPH ASIA'18: SIGGRAPH ASIA 2018 TECHNICAL PAPERS, 2018,
  • [2] Bare-hand gesture occlusion-aware interactive augmented reality assembly
    Fang, Wei
    Hong, Jianhao
    JOURNAL OF MANUFACTURING SYSTEMS, 2022, 65 : 169 - 179
  • [3] Depth from Defocus with Learned Optics for Imaging and Occlusion-aware Depth Estimation
    Ikoma, Hayato
    Nguyen, Cindy M.
    Metzler, Christopher A.
    Peng, Yifan
    Wetzstein, Gordon
    2021 IEEE INTERNATIONAL CONFERENCE ON COMPUTATIONAL PHOTOGRAPHY (ICCP), 2021,
  • [4] Occlusion-aware light field depth estimation with view attention
    Wang, Xucheng
    Tao, Chenning
    Zheng, Zhenrong
    OPTICS AND LASERS IN ENGINEERING, 2023, 160
  • [5] Occlusion-Aware Cost Constructor for Light Field Depth Estimation
    Wang, Yingqian
    Wang, Longguang
    Liang, Zhengyu
    Yang, Jungang
    An, Wei
    Guo, Yulan
    2022 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR 2022), 2022, : 19777 - 19786
  • [6] Occlusion-Aware Interfaces
    Vogel, Daniel
    Balakrishnan, Ravin
    CHI2010: PROCEEDINGS OF THE 28TH ANNUAL CHI CONFERENCE ON HUMAN FACTORS IN COMPUTING SYSTEMS, VOLS 1-4, 2010, : 263 - 272
  • [7] ACCURATE LIGHT FIELD DEPTH ESTIMATION VIA AN OCCLUSION-AWARE NETWORK
    Guo, Chunle
    Jin, Jing
    Hou, Junhui
    Chen, Jie
    2020 IEEE INTERNATIONAL CONFERENCE ON MULTIMEDIA AND EXPO (ICME), 2020,
  • [8] Occlusion-Aware View Interpolation
    Ince, Serdar
    Konrad, Janusz
    EURASIP JOURNAL ON IMAGE AND VIDEO PROCESSING, 2008
  • [9] Occlusion-Aware Depth Map Coding Optimization Using Allowable Depth Map Distortions
    Gao, Pan
    Smolic, Aljosa
    IEEE TRANSACTIONS ON IMAGE PROCESSING, 2019, 28 (11) : 5266 - 5280