Real-time dense-view imaging for three-dimensional light-field display based on image color calibration and self-supervised view synthesis

Cited by: 2
Authors
Guo, Xiao [1 ]
Sang, Xinzhu [1 ]
Yan, Binbin [1 ]
Wang, Huachun [1 ]
Ye, Xiaoqian [1 ]
Chen, Shuo [1 ]
Wan, Huaming [1 ]
Li, Ningchi [1 ]
Zeng, Zhehao [1 ]
Chen, Duo [1 ]
Wang, Peng [1 ]
Xing, Shujun [1 ]
Affiliations
[1] Beijing Univ Posts & Telecommun, State Key Lab Informat Photon & Opt Commun, Beijing 100876, Peoples R China
Funding
National Natural Science Foundation of China;
DOI
10.1364/OE.461789
Chinese Library Classification
O43 [Optics];
Discipline codes
070207; 0803;
Abstract
Three-dimensional (3D) light-field display has improved markedly in recent years. However, because dense-view images cannot be captured quickly in real-world 3D scenes, real-time 3D light-field display of real scenes remains challenging to achieve, especially at high display resolutions. Here, a real-time dense-view 3D light-field display method is proposed based on image color correction and self-supervised optical flow estimation, realizing high-quality and high-frame-rate 3D light-field display simultaneously. In the proposed method, a sparse camera array first captures sparse-view images. To eliminate the color deviation among the sparse views, the imaging process of the camera is analyzed, and a practical multi-layer perceptron (MLP) network is proposed to perform color calibration. Given sparse views with consistent color, the optical flow is estimated at high speed by a lightweight convolutional neural network (CNN), which learns the optical flow from the input image pairs in a self-supervised manner. Dense-view images are then synthesized with an inverse warp operation. Quantitative and qualitative experiments are performed to evaluate the feasibility of the proposed method. Experimental results show that over 60 dense-view images at a resolution of 1024 x 512 can be generated from 11 input views at a frame rate above 20 fps, which is 4x faster than the previous optical flow estimation methods PWC-Net and LiteFlowNet3. Finally, large viewing angles and high-quality 3D light-field display at 3840 x 2160 resolution are achieved in real time. (C) 2022 Optica Publishing Group under the terms of the Optica Open Access Publishing Agreement
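The dense-view synthesis step described above — scaling an estimated optical flow and backward-warping a source view with bilinear sampling — can be sketched as follows. This is a simplified NumPy illustration, not the authors' implementation; the function names and the linear flow-scaling scheme for intermediate views are assumptions for clarity.

```python
import numpy as np

def inverse_warp(image, flow):
    """Backward-warp `image` by `flow` with bilinear sampling.

    image: (H, W, C) float array.
    flow:  (H, W, 2) array of per-pixel (dx, dy) sampling offsets.
    Each output pixel is sampled from the source at (x + dx, y + dy).
    """
    H, W = image.shape[:2]
    ys, xs = np.meshgrid(np.arange(H), np.arange(W), indexing="ij")
    # Source sampling coordinates, clamped to the image border.
    sx = np.clip(xs + flow[..., 0], 0, W - 1)
    sy = np.clip(ys + flow[..., 1], 0, H - 1)
    x0 = np.floor(sx).astype(int)
    x1 = np.clip(x0 + 1, 0, W - 1)
    y0 = np.floor(sy).astype(int)
    y1 = np.clip(y0 + 1, 0, H - 1)
    # Bilinear interpolation weights, broadcast over channels.
    wx = (sx - x0)[..., None]
    wy = (sy - y0)[..., None]
    top = image[y0, x0] * (1 - wx) + image[y0, x1] * wx
    bot = image[y1, x0] * (1 - wx) + image[y1, x1] * wx
    return top * (1 - wy) + bot * wy

def synthesize_views(img_src, flow_full, num_views):
    """Generate `num_views` intermediate views by scaling the flow
    between a sparse view pair from 0 to 1 (a common simplification)."""
    alphas = np.linspace(0.0, 1.0, num_views)
    return [inverse_warp(img_src, a * flow_full) for a in alphas]
```

In a self-supervised setting, the photometric difference between the fully warped source view and the target view would serve as the training loss for the flow network; no ground-truth flow is required.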
Pages: 22260 - 22276
Page count: 17
Related papers
28 records
  • [1] Dense-view synthesis for three-dimensional light-field display based on unsupervised learning
    Chen, Duo
    Sang, Xinzhu
    Wang, Peng
    Yu, Xunbo
    Yan, Binbin
    Wang, Huachun
    Ning, Mengyang
    Qi, Shuai
    Ye, Mown
    [J]. OPTICS EXPRESS, 2019, 27 (17) : 24624 - 24641
  • [2] Dense view synthesis for three-dimensional light-field display based on scene geometric reconstruction
    Qi, Shuai
    Sang, Xinzhu
    Yan, Binbin
    Wang, Peng
    Chen, Duo
    Wang, Huachun
    Ye, Xiaoqian
    Wan, Huaming
    [J]. OPTICS COMMUNICATIONS, 2022, 522
  • [3] A crosstalk-suppressed dense multi-view light-field display based on real-time light-field pickup and reconstruction
    Yang, Le
    Sang, Xinzhu
    Yu, Xunbo
    Liu, Boyang
    Yan, Binbin
    Wang, Kuiru
    Yu, Chongxiu
[J]. OPTICS EXPRESS, 2018, 26 (26) : 34412 - 34427
  • [4] Large field of view real-time three-dimensional imaging for ports
    Gao Meijing
    Wu Weilong
    Gu Haihua
    Bi Weihong
    [J]. INTERNATIONAL SYMPOSIUM ON PHOTOELECTRONIC DETECTION AND IMAGING 2011: LASER SENSING AND IMAGING AND BIOLOGICAL AND MEDICAL APPLICATIONS OF PHOTONICS SENSING AND IMAGING, 2011, 8192
  • [5] Dense view synthesis for three-dimensional light-field displays based on position-guiding convolutional neural network
    Wang, Huachun
    Yan, Binbin
    Sang, Xinzhu
    Chen, Duo
    Wang, Peng
    Qi, Shuai
    Ye, Xiaoqian
    Guo, Xiao
[J]. OPTICS AND LASERS IN ENGINEERING, 2022, 153
  • [7] Dual-dimensional microscopy: real-time in vivo three-dimensional observation method using high-resolution light-field microscopy and light-field display
    Kim, Jonghyun
    Moon, Seokil
    Jeong, Youngmo
    Jang, Changwon
    Kim, Youngmin
    Lee, Byoungho
    [J]. JOURNAL OF BIOMEDICAL OPTICS, 2018, 23 (06)
  • [8] Almost real-time three-dimensional anatomical imaging: a surgeon's view for the electrophysiologist?
    Knackstedt, C
    Franke, A
    Mischke, K
    Schummers, G
    Galuschky, C
    Stellbrink, C
    Hanrath, P
    Schauerte, P
    [J]. EUROPEAN HEART JOURNAL, 2004, 25 : 262 - 262
  • [9] Real-time optical reconstruction for a three-dimensional light-field display based on path-tracing and CNN super-resolution
    Guo, Xiao
    Sang, Xinzhu
    Chen, Duo
    Wang, Peng
    Wang, Huachun
    Liu, Xue
    Li, Yuanhang
    Xing, Shujun
    Yan, Binbin
    [J]. OPTICS EXPRESS, 2021, 29 (23) : 37862 - 37876
  • [10] Color three-dimensional display with omnidirectional view based on a light-emitting diode projector
    Yan, Caijie
    Liu, Xu
    Li, Haifeng
    Xia, Xinxing
    Lu, Haixia
    Zheng, Wenting
    [J]. APPLIED OPTICS, 2009, 48 (22) : 4490 - 4495