Locating Waterfowl Farms from Satellite Images with Parallel Residual U-Net Architecture

Cited by: 0
Authors
Chang, Keng-Chih [1 ,2 ]
Liu, Tsung-Jung [1 ,2 ]
Liu, Kuan-Hsien [3 ]
Chao, Day-Yu [4 ]
Affiliations
[1] Natl Chung Hsing Univ, Dept Elect Engn, Taichung 40227, Taiwan
[2] Natl Chung Hsing Univ, Grad Inst Commun Engn, Taichung 40227, Taiwan
[3] Natl Taichung Univ Sci & Technol, Dept Comp Sci & Informat Engn, Taichung 40401, Taiwan
[4] Natl Chung Hsing Univ, Grad Inst Microbiol & Publ Hlth, Taichung 40227, Taiwan
Keywords
satellite images; semantic segmentation; convolutional neural network (CNN); loss function; waterfowl; QUALITY ASSESSMENT; SEGMENTATION
DOI
10.1109/smc42975.2020.9283182
CLC number
TP3 [computing technology, computer technology]
Discipline code
0812
Abstract
In avian influenza epidemic prevention, there is a large gap between the ideal and the actual situation, which is one reason outbreaks often go uncontrolled: many illegal waterfowl farms are built without government registration. In this work, we propose a new method that directly locates waterfowl farms, both registered and unregistered, without the need for human labeling. This not only saves human labor but also, given the speed of automated computation, allows the location and size information of waterfowl farms to be updated regularly. We also propose a new satellite image augmentation method. Although our model is no deeper than other deep neural networks, we show that a simple U-Net combined with residual blocks outperforms deeper models on this task.
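The record gives no architectural details beyond "U-Net combined with residual blocks." As a rough illustration of that idea only, the following minimal PyTorch sketch builds a two-level U-Net whose convolutional stages are residual blocks; the class names, channel widths, and depth here are illustrative assumptions, not the paper's actual Parallel Residual U-Net.

import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    """Two 3x3 convolutions with a skip connection (hypothetical stage)."""
    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(in_ch, out_ch, 3, padding=1),
            nn.BatchNorm2d(out_ch),
            nn.ReLU(inplace=True),
            nn.Conv2d(out_ch, out_ch, 3, padding=1),
            nn.BatchNorm2d(out_ch),
        )
        # 1x1 projection so the skip connection matches the channel count
        self.skip = nn.Conv2d(in_ch, out_ch, 1) if in_ch != out_ch else nn.Identity()
        self.act = nn.ReLU(inplace=True)

    def forward(self, x):
        return self.act(self.body(x) + self.skip(x))

class ResUNet(nn.Module):
    """Illustrative U-Net with residual encoder/decoder stages.
    Depth and widths are assumptions, not the paper's configuration."""
    def __init__(self, in_ch=3, n_classes=1, base=32):
        super().__init__()
        self.enc1 = ResidualBlock(in_ch, base)
        self.enc2 = ResidualBlock(base, base * 2)
        self.bottleneck = ResidualBlock(base * 2, base * 4)
        self.pool = nn.MaxPool2d(2)
        self.up2 = nn.ConvTranspose2d(base * 4, base * 2, 2, stride=2)
        self.dec2 = ResidualBlock(base * 4, base * 2)   # input: upsampled + skip
        self.up1 = nn.ConvTranspose2d(base * 2, base, 2, stride=2)
        self.dec1 = ResidualBlock(base * 2, base)
        self.head = nn.Conv2d(base, n_classes, 1)       # per-pixel logits

    def forward(self, x):
        e1 = self.enc1(x)
        e2 = self.enc2(self.pool(e1))
        b = self.bottleneck(self.pool(e2))
        d2 = self.dec2(torch.cat([self.up2(b), e2], dim=1))
        d1 = self.dec1(torch.cat([self.up1(d2), e1], dim=1))
        return self.head(d1)

# Example: one 256x256 RGB satellite tile -> per-pixel farm/background logits
model = ResUNet()
logits = model(torch.randn(1, 3, 256, 256))  # shape (1, 1, 256, 256)

Residual connections let each stage refine its features without degrading gradient flow, which is a plausible reason a relatively shallow residual U-Net could match deeper models, as the abstract claims.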
Pages: 114 - 119
Page count: 6
Related Papers
(50 related records in total)
  • [1] Identifying Poultry Farms from Satellite Images with Residual Dense U-Net
    Wen, Kai-Yu
    Liu, Tsung-Jung
    Liu, Kuan-Hsien
    Chao, Day-Yu
    [J]. 2020 IEEE INTERNATIONAL CONFERENCE ON SYSTEMS, MAN, AND CYBERNETICS (SMC), 2020, : 102 - 107
  • [2] SAR U-Net: Spatial attention residual U-Net structure for water body segmentation from remote sensing satellite images
    Jonnala, Naga Surekha
    Gupta, Neha
    [J]. MULTIMEDIA TOOLS AND APPLICATIONS, 2024, 83 (15) : 44425 - 44454
  • [3] A Lumen Segmentation Method in Ureteroscopy Images based on a Deep Residual U-Net architecture
    Lazo, Jorge E.
    Marzullo, Aldo
    Moccia, Sara
    Catellani, Michele
    Rosa, Benoit
    Calimeri, Francesco
    de Mathelin, Michel
    De Momi, Elena
    [J]. 2020 25TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION (ICPR), 2021, : 9203 - 9210
  • [4] GARP - A Hybrid Preprocessing Technique for Semantic Segmentation of Satellite Images with U-Net Architecture
    Sardar Patel Institute of Technology, India
    [J]. Int. Conf. Futur. Technol., INCOFT,
  • [5] Detecting Buildings and Nonbuildings from Satellite Images Using U-Net
    Alsabhan, Waleed
    Alotaiby, Turky
    Dudin, Basil
    [J]. COMPUTATIONAL INTELLIGENCE AND NEUROSCIENCE, 2022, 2022
  • [6] Residual dense U-Net for abnormal exposure restoration from single images
    Que, Yue
    Lee, Hyo Jong
    [J]. IET IMAGE PROCESSING, 2021, 15 (01) : 115 - 126
  • [7] An Efficient U-Net Model for Improved Landslide Detection from Satellite Images
    Chandra, Naveen
    Sawant, Suraj
    Vaidya, Himadri
    [J]. PFG-JOURNAL OF PHOTOGRAMMETRY REMOTE SENSING AND GEOINFORMATION SCIENCE, 2023, 91 (01) : 13 - 28
  • [8] Road Detection from Satellite Images by Improving U-Net with Difference of Features
    Kamiya, Ryosuke
    Hotta, Kazuhiro
    Oda, Kazuo
    Kakuta, Satomi
    [J]. PROCEEDINGS OF THE 7TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION APPLICATIONS AND METHODS (ICPRAM 2018), 2018, : 603 - 607