Fast and robust multiple ColorChecker detection using deep convolutional neural networks

Cited by: 9
Authors
Marrero Fernandez, Pedro D. [1]
Guerrero Pena, Fidel A. [1]
Ren, Tsang Ing [1]
Leandro, Jorge J. G. [2]
Affiliations
[1] Univ Fed Pernambuco UFPE, Ctr Informat, Recife, PE, Brazil
[2] Motorola Mobil LLC, Sao Paulo, Brazil
Keywords
ColorChecker detection; Photograph; Image quality; Color science; Color balance; Segmentation; Convolutional neural network
DOI
10.1016/j.imavis.2018.11.001
CLC Number
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
ColorCheckers are reference standards that professional photographers and filmmakers use to ensure predictable results under every lighting condition. The objective of this work is to propose a new fast and robust method for automatic ColorChecker detection. The process is divided into two steps: (1) ColorChecker localization and (2) ColorChecker patch recognition. For ColorChecker localization, we trained a detection convolutional neural network on synthetic images, created by compositing 3D models of the ColorChecker onto different background images. The output of the network is a bounding box for each possible ColorChecker candidate in the input image. Each bounding box defines a cropped image that is evaluated by a recognition system after being canonized with regard to color and dimensions. Subsequently, all possible color patches are extracted and grouped according to the distances between their centers. Each group is evaluated as a candidate for part of a ColorChecker, and its position in the scene is estimated. Finally, a cost function is applied to evaluate the accuracy of the estimation. The method is tested using real and synthetic images. The proposed method is fast, robust to overlaps, and invariant to affine projections. The algorithm also performs well in the case of multiple ColorChecker detection. (C) 2018 Elsevier B.V. All rights reserved.
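As a rough illustration of the two-stage flow the abstract describes (CNN-based localization, then patch grouping and cost-based verification), the minimal Python sketch below shows how the pieces could fit together. All names here (run_detector, extract_patch_centers, grid_cost) are hypothetical placeholders invented for this example, not the authors' networks or their actual cost function.

# Minimal sketch of the two-stage flow described above; placeholders only.
import numpy as np


def run_detector(image):
    """Stand-in for the detection CNN trained on synthetic images;
    returns candidate ColorChecker bounding boxes as (x, y, w, h)."""
    return [(40, 60, 200, 140)]  # dummy candidate for illustration


def extract_patch_centers(crop):
    """Stand-in for patch segmentation inside a canonized crop;
    returns an (N, 2) array of patch-center coordinates."""
    xs, ys = np.meshgrid(np.arange(6), np.arange(4))  # classic 6x4 layout
    return np.stack([xs.ravel() * 30 + 15, ys.ravel() * 30 + 15], axis=1).astype(float)


def group_by_center_distance(centers, max_gap=45.0):
    """Greedily group patch centers whose nearest distance to a group
    stays below max_gap ('grouped with respect to center distance')."""
    groups, unassigned = [], list(range(len(centers)))
    while unassigned:
        group = [unassigned.pop(0)]
        grew = True
        while grew:
            grew = False
            for idx in list(unassigned):
                if np.linalg.norm(centers[idx] - centers[group], axis=1).min() < max_gap:
                    group.append(idx)
                    unassigned.remove(idx)
                    grew = True
        groups.append(centers[group])
    return groups


def grid_cost(group, rows=4, cols=6):
    """Toy cost: missing-patch count plus the spread of nearest-neighbor
    spacings (a regular chart grid has near-constant spacing)."""
    if len(group) < 2:
        return float("inf")
    d = np.linalg.norm(group[:, None, :] - group[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    return abs(rows * cols - len(group)) + d.min(axis=1).std()


def detect_colorcheckers(image):
    candidates = []
    for (x, y, w, h) in run_detector(image):
        crop = image[y:y + h, x:x + w]                # cropped candidate region
        for group in group_by_center_distance(extract_patch_centers(crop)):
            cost = grid_cost(group)                   # lower cost = better candidate
            if np.isfinite(cost):
                candidates.append(((x, y, w, h), cost))
    return sorted(candidates, key=lambda c: c[1])


if __name__ == "__main__":
    dummy = np.zeros((480, 640, 3), dtype=np.uint8)   # placeholder image
    print(detect_colorcheckers(dummy))

In the actual system, the stand-ins would be replaced by the trained detector, the color and dimension canonization step, and the paper's own cost function.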
Pages: 15-24 (10 pages)