Depth Estimation From Light Field Using Graph-Based Structure-Aware Analysis

Cited by: 25
Authors
Zhang, Yuchen [1 ]
Dai, Wenrui [2 ]
Xu, Mingxing [1 ]
Zou, Junni [2 ]
Zhang, Xiaopeng [3 ]
Xiong, Hongkai [1 ]
Affiliations
[1] Shanghai Jiao Tong Univ, Dept Elect Engn, Shanghai 200240, Peoples R China
[2] Shanghai Jiao Tong Univ, Dept Comp Sci & Engn, Shanghai 200240, Peoples R China
[3] Huawei Technol Co Ltd, Noah's Ark Lab, Shanghai 201206, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Light field; depth map; graph spectral analysis; graph Laplacian matrix; Fourier transform;
DOI
10.1109/TCSVT.2019.2954948
Chinese Library Classification
TM [Electrical Engineering]; TN [Electronics and Communication Technology];
Discipline Classification Code
0808; 0809;
Abstract
Existing light field depth estimation approaches utilize only partial angular views in occluded regions and only local spatial dependencies in the optimization. This paper proposes a novel two-stage light field depth estimation method based on graph spectral analysis that exploits the complete correlations and dependencies within angular patches and spatial images. The initial depth map estimation leverages an undirected graph to jointly consider occluded and unoccluded views within each angular patch; the estimated depth minimizes the structural incoherence between its corresponding angular patch and the focused one, evaluated through the highest graph frequency component. Subsequently, depth map refinement optimizes the initial depth map under color-consistency and smoothness constraints formulated via a weighted adjacency matrix. These structural constraints are applied efficiently using low-pass graph filtering with a Chebyshev polynomial approximation. Experimental results demonstrate that the proposed method improves depth map estimation, especially in edge regions.
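Low-pass graph filtering with a Chebyshev polynomial approximation, as mentioned in the abstract, is a standard graph-signal-processing technique: it applies a spectral filter h(λ) of the graph Laplacian to a signal using only matrix–vector products, avoiding an explicit eigendecomposition. The sketch below is not the authors' implementation; it is a minimal, generic example assuming a heat-kernel low-pass response h(λ) = exp(−τλ) and a (normalized) Laplacian whose spectrum lies in [0, λ_max]. The function name `chebyshev_lowpass` and all parameters are illustrative.

```python
import numpy as np

def chebyshev_lowpass(L, x, tau=5.0, order=20, lam_max=2.0):
    """Apply the low-pass graph filter h(lam) = exp(-tau * lam) to signal x
    using a Chebyshev polynomial expansion of the filter (no eigendecomposition).

    L       : (n, n) graph Laplacian with eigenvalues in [0, lam_max]
    x       : (n,) graph signal
    order   : degree of the Chebyshev expansion
    """
    n = L.shape[0]
    # Rescale the Laplacian so its spectrum maps into [-1, 1],
    # the natural domain of Chebyshev polynomials.
    Lt = (2.0 / lam_max) * L - np.eye(n)
    # Chebyshev coefficients of h evaluated at the Chebyshev nodes of [0, lam_max].
    K = order + 1
    theta = np.pi * (np.arange(K) + 0.5) / K
    lam = 0.5 * lam_max * (np.cos(theta) + 1.0)
    h = np.exp(-tau * lam)
    c = np.array([2.0 / K * np.sum(h * np.cos(k * theta)) for k in range(K)])
    # Three-term recurrence: T_0(Lt) x = x, T_1(Lt) x = Lt x,
    # T_k(Lt) x = 2 Lt T_{k-1}(Lt) x - T_{k-2}(Lt) x.
    t_prev, t_curr = x, Lt @ x
    y = 0.5 * c[0] * t_prev + c[1] * t_curr
    for k in range(2, K):
        t_prev, t_curr = t_curr, 2.0 * (Lt @ t_curr) - t_prev
        y = y + c[k] * t_curr
    return y
```

Each added polynomial order costs one sparse matrix–vector product, so the filter runs in time proportional to the number of graph edges, which is why such approximations are the usual way to impose spectral smoothness constraints at scale.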
Pages: 4269-4283
Page count: 15