Underwater image enhancement based on conditional generative adversarial network
Cited by: 92
Authors:
Yang, Miao [1,2,3,4]; Hu, Ke [1]; Du, Yixiang [1]; Wei, Zhiqiang [3]; Sheng, Zhibin [1]; Hu, Jintong [1]
Affiliations:
[1] Jiangsu Ocean Univ, Sch Elect Engn, Lianyungang 222005, Peoples R China
[2] Jiangsu Univ Sci & Technol, Marine Equipment & Technol Inst, Zhenjiang 212000, Jiangsu, Peoples R China
[3] Qingdao Natl Lab Marine Sci & Technol, Qingdao 266100, Peoples R China
[4] Univ Virginia, Dept Elect & Comp Engn, Dept Biomed Engn, Charlottesville, VA 22903 USA
Keywords:
Underwater image enhancement;
Conditional generative adversarial networks;
Adversarial learning;
Deep learning;
CO-SALIENCY DETECTION;
CONSTRAINT;
WATER
DOI:
10.1016/j.image.2019.115723
CLC classification (Chinese Library Classification):
TM [Electrical Engineering];
TN [Electronics and Communication Technology];
Discipline codes:
0808; 0809
Abstract:
Underwater images play an essential role in acquiring and understanding underwater information, and high-quality underwater images can guarantee the reliability of underwater intelligent systems. Unfortunately, underwater images are characterized by low contrast, color casts, blurring, low light, and uneven illumination, which severely affect the perception and processing of underwater information. To improve the quality of acquired underwater images, numerous methods have been proposed, particularly with the emergence of deep learning technologies. However, the performance of underwater image enhancement methods remains unsatisfactory due to the lack of sufficient training data and effective network structures. In this paper, we address this problem with a conditional generative adversarial network (cGAN), in which a clear underwater image is produced by a multi-scale generator. In addition, we employ a dual discriminator to capture both local and global semantic information, which forces the results generated by the multi-scale generator to be realistic and natural. Experiments on real-world and synthetic underwater images demonstrate that the proposed method performs favorably against state-of-the-art underwater image enhancement methods.
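The dual-discriminator idea from the abstract can be sketched as follows: one discriminator scores the whole image (global) while another scores image patches (local), and the generator's adversarial loss mixes both terms. This is a toy numpy sketch only; the sigmoid "discriminators", the function names (`global_disc_score`, `local_disc_scores`, `generator_adv_loss`), and the mixing weight `lam` are illustrative assumptions, not the paper's actual CNN architecture or code.

```python
import numpy as np

def global_disc_score(img):
    # Stand-in for a CNN discriminator that scores the whole image
    # as real (close to 1) or fake (close to 0).
    return 1.0 / (1.0 + np.exp(-img.mean()))

def local_disc_scores(img, patch=8):
    # Stand-in for a patch-based (local) discriminator:
    # one real/fake score per non-overlapping patch.
    h, w = img.shape[:2]
    scores = []
    for i in range(0, h - patch + 1, patch):
        for j in range(0, w - patch + 1, patch):
            p = img[i:i + patch, j:j + patch]
            scores.append(1.0 / (1.0 + np.exp(-p.mean())))
    return np.array(scores)

def generator_adv_loss(fake_img, lam=0.5):
    # Non-saturating adversarial loss for the generator,
    # mixing the global score and the averaged local scores.
    g = global_disc_score(fake_img)
    l = local_disc_scores(fake_img).mean()
    return -(lam * np.log(g + 1e-8) + (1.0 - lam) * np.log(l + 1e-8))

# Demo on a random "enhanced" image (values near 0 give scores near 0.5).
rng = np.random.default_rng(0)
fake = rng.normal(size=(32, 32, 3))
print(generator_adv_loss(fake))
```

In a real training loop both discriminators would also be trained to separate ground-truth clear images from generated ones; the point of the sketch is only how local and global feedback combine into a single generator objective.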
Pages: 9