Burst photography for high dynamic range and low-light imaging on mobile cameras

Cited by: 291
Authors
Hasinoff, Samuel W. [1 ]
Sharlet, Dillon [1 ]
Geiss, Ryan [1 ]
Adams, Andrew [1 ]
Barron, Jonathan T. [1 ]
Kainz, Florian [1 ]
Chen, Jiawen [1 ]
Levoy, Marc [1 ]
Affiliations
[1] Google Research, Mountain View, CA 94043, USA
Source
ACM TRANSACTIONS ON GRAPHICS | 2016, Vol. 35, Iss. 6
Keywords
computational photography; high dynamic range
DOI
10.1145/2980179.2980254
Chinese Library Classification (CLC)
TP31 [Computer Software]
Discipline Classification Codes
081202; 0835
Abstract
Cell phone cameras have small apertures, which limits the number of photons they can gather, leading to noisy images in low light. They also have small sensor pixels, which limits the number of electrons each pixel can store, leading to limited dynamic range. We describe a computational photography pipeline that captures, aligns, and merges a burst of frames to reduce noise and increase dynamic range. Our system has several key features that help make it robust and efficient. First, we do not use bracketed exposures. Instead, we capture frames of constant exposure, which makes alignment more robust, and we set this exposure low enough to avoid blowing out highlights. The resulting merged image has clean shadows and high bit depth, allowing us to apply standard HDR tone mapping methods. Second, we begin from Bayer raw frames rather than the demosaicked RGB (or YUV) frames produced by hardware Image Signal Processors (ISPs) common on mobile platforms. This gives us more bits per pixel and allows us to circumvent the ISP's unwanted tone mapping and spatial denoising. Third, we use a novel FFT-based alignment algorithm and a hybrid 2D/3D Wiener filter to denoise and merge the frames in a burst. Our implementation is built atop Android's Camera2 API, which provides per-frame camera control and access to raw imagery, and is written in the Halide domain-specific language (DSL). It runs in 4 seconds on device (for a 12 Mpix image), requires no user intervention, and ships on several mass-produced cell phones.
Pages: 12
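
The merge step described in the abstract (FFT-based tile alignment followed by a hybrid 2D/3D Wiener filter) can be sketched compactly. The snippet below is not the authors' Halide implementation; it is a minimal NumPy illustration of the pairwise frequency-domain merge idea, in which each aligned alternate tile is blended toward the reference tile per frequency, and the blend falls back to the reference wherever the tiles disagree by more than the expected noise. The tile size, the scalar noise model, and the constant c are illustrative assumptions, not values from the paper.

```python
import numpy as np

def merge_burst_tile(tiles, noise_variance, c=8.0):
    """Pairwise frequency-domain merge of co-located, pre-aligned burst tiles.

    tiles: (N, T, T) array of aligned tiles; tiles[0] is the reference frame.
    noise_variance: expected per-pixel noise variance (a scalar here for
        simplicity; in practice it is signal-dependent).
    c: tuning constant trading noise reduction against ghosting
        (an illustrative value, not a shipped setting).
    """
    tiles = np.asarray(tiles, dtype=float)
    t = tiles.shape[-1]
    # Variance of one unnormalized FFT coefficient of a TxT white-noise tile.
    noise_freq = (t * t) * noise_variance
    ref = np.fft.fft2(tiles[0])
    merged = ref.copy()
    for alt_tile in tiles[1:]:
        alt = np.fft.fft2(alt_tile)
        diff = ref - alt
        # Per-frequency shrinkage: near 1 (fall back to the reference) where
        # the frames disagree by far more than the expected noise, near 0
        # (average the alternate frame in) where they agree up to noise.
        shrink = np.abs(diff) ** 2 / (np.abs(diff) ** 2 + c * noise_freq)
        merged += alt + shrink * diff
    merged /= len(tiles)
    return np.real(np.fft.ifft2(merged))

# Toy usage: eight noisy, already-aligned 16x16 observations of one patch.
rng = np.random.default_rng(0)
scene = rng.uniform(0.0, 1.0, (16, 16))
sigma2 = 0.01
burst = scene + rng.normal(0.0, np.sqrt(sigma2), (8, 16, 16))
denoised = merge_burst_tile(burst, sigma2)
print(np.var(burst[0] - scene), np.var(denoised - scene))
```

In the full pipeline the merged tiles would be accumulated with overlapping windows, and the result demosaicked and tone mapped; none of that is shown here.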
Related Papers
50 records in total
  • [31] Nicolas, J. C. Applications of Low-Light Imaging to Life Sciences. Journal of Bioluminescence and Chemiluminescence, 1994, 9(3): 139-144.
  • [32] Liba, Orly; Murthy, Kiran; Tsai, Yun-Ta; Brooks, Tim; Xue, Tianfan; Karnad, Nikhil; He, Qiurui; Barron, Jonathan T.; Sharlet, Dillon; Geiss, Ryan; Hasinoff, Samuel W.; Pritch, Yael; Levoy, Marc. Handheld Mobile Photography in Very Low Light. ACM Transactions on Graphics, 2019, 38(6).
  • [33] Wei, Kaixuan; Fu, Ying; Zheng, Yinqiang; Yang, Jiaolong. Physics-Based Noise Modeling for Extreme Low-Light Photography. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2022, 44(11): 8520-8537.
  • [34] Snyder, D. L.; Schulz, T. J. High-Resolution Imaging at Low-Light Levels Through Weak Turbulence. Journal of the Optical Society of America A: Optics, Image Science, and Vision, 1990, 7(7): 1251-1265.
  • [35] Zhang, Shansi; Lam, Edmund Y. Learning to Restore Light Fields Under Low-Light Imaging. Neurocomputing, 2021, 456: 76-87.
  • [36] Banterle, Francesco; Debattista, Kurt; Artusi, Alessandro; Pattanaik, Sumanta; Myszkowski, Karol; Ledda, Patrick; Chalmers, Alan. High Dynamic Range Imaging and Low Dynamic Range Expansion for Generating HDR Content. Computer Graphics Forum, 2009, 28(8): 2343-2367.
  • [37] Abu Awwad, Yaser; Rana, Omer; Perera, Charith. Anomaly Detection on the Edge Using Smart Cameras under Low-Light Conditions. Sensors, 2024, 24(3).
  • [38] Morawski, Igor. Enabling Effective Low-Light Perception Using Ubiquitous Low-Cost Visible-Light Cameras. Proceedings of the 30th ACM International Conference on Multimedia (MM 2022), 2022: 6915-6919.
  • [39] Bae, Kyung-Hoon; Park, Byung Kwan. Compact Approach for High Dynamic Range Imaging in Mobile Digital Camera. 2015 IEEE International Conference on Consumer Electronics (ICCE), 2015: 339-342.
  • [40] Buchin, Michael P. ICCD, EMCCD, and sCMOS Compete in Low-Light Imaging. Laser Focus World, 2011, 47(7): 51-56.