Underwater Image Super-resolution Using SRCNN

Cited by: 1
Authors
Ooyama, Shinnosuke [1]
Lu, Huimin [1]
Kamiya, Tohru [1]
Serikawa, Seiichi [1]
Affiliations
[1] Kyushu Inst Technol, Sch Engn, Kitakyushu, Fukuoka 8048550, Japan
Keywords
Super-resolution; Underwater image; Loss function; Deep convolutional neural network; Enhancement
DOI
10.1117/12.2603761
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
In recent years, energy and mineral resources have become increasingly important due to rapid industrialization worldwide. This industrialization has led to a shortage of mineral resources and a growing reliance on alternative energy sources, so the exploration of the abundant resources of the ocean is being promoted. However, it is dangerous and impractical for humans to dive and search for marine resources by hand; underwater exploration can instead proceed safely by having robots do the work. Robots have therefore become the mainstream search tool in underwater environments, where various hazardous conditions exist. Nevertheless, controlling robots underwater involves several problems, one of which is poor visibility in the water. To improve visibility, we attempt to increase the resolution of underwater images using super-resolution technology. In this paper, we conduct experiments on underwater images using SRCNN, a basic super-resolution technique. In addition, we investigate the effectiveness on SRCNN of "Mish", an activation function that has attracted attention in recent years for its potential to surpass "ReLU", the typical activation function of neural networks.
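To make the comparison described in the abstract concrete, the following is a minimal sketch (in PyTorch) of an SRCNN whose activation can be switched between ReLU and Mish, where Mish(x) = x * tanh(softplus(x)). The 9-5-5 kernel sizes, 64/32 channel widths, and three-channel input are assumptions based on the standard SRCNN design of Dong et al., not necessarily the exact configuration or training setup used in this paper.

```python
# Minimal SRCNN sketch with a switchable activation (ReLU vs. Mish).
# Layer sizes follow a common SRCNN configuration and are illustrative only.
import torch
import torch.nn as nn


class SRCNN(nn.Module):
    def __init__(self, channels: int = 3, activation: str = "relu"):
        super().__init__()
        # Mish(x) = x * tanh(softplus(x)); nn.Mish is available in PyTorch >= 1.9.
        act = nn.Mish() if activation == "mish" else nn.ReLU()
        self.body = nn.Sequential(
            nn.Conv2d(channels, 64, kernel_size=9, padding=4),  # patch extraction
            act,
            nn.Conv2d(64, 32, kernel_size=5, padding=2),        # non-linear mapping
            act,
            nn.Conv2d(32, channels, kernel_size=5, padding=2),  # reconstruction
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # SRCNN operates on an image already upscaled to the target size
        # (e.g. by bicubic interpolation), so input and output shapes match.
        return self.body(x)


if __name__ == "__main__":
    lr_upscaled = torch.rand(1, 3, 128, 128)  # bicubically upscaled underwater image
    model = SRCNN(activation="mish")
    sr = model(lr_upscaled)
    print(sr.shape)  # torch.Size([1, 3, 128, 128])
```

In this sketch only the activation changes between the two variants, so any difference in reconstruction quality (e.g. PSNR on underwater test images) can be attributed to ReLU versus Mish rather than to architectural changes.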
Pages: 6