On the Sufficient Condition for Solving the Gap-Filling Problem Using Deep Convolutional Neural Networks

Cited by: 3
Authors
Peppert, Felix [1 ]
von Kleist, Max [2 ]
Schutte, Christof [3 ,4 ]
Sunkara, Vikram [1 ]
Affiliations
[1] Zuse Inst Berlin, Explainable AI Biol, D-14195 Berlin, Germany
[2] Robert Koch Inst, Syst Med Infect Dis P5, D-13353 Berlin, Germany
[3] Zuse Inst Berlin, Div Math Life & Mat Sci, D-14195 Berlin, Germany
[4] Free Univ Berlin, Biocomp Grp, D-14195 Berlin, Germany
Keywords
Biomedical imaging; computational biology; deep convolutional neural networks (DCNNs); image inpainting; image segmentation; machine learning; IMAGE SEGMENTATION; FRAMEWORK;
DOI
10.1109/TNNLS.2021.3072746
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Deep convolutional neural networks (DCNNs) are routinely used for image segmentation of biomedical data sets to obtain quantitative measurements of cellular structures such as tissues. These structures often contain gaps in their boundaries, which leads to poor segmentation performance by DCNNs such as the U-Net. The gaps can usually be corrected by post hoc computer vision (CV) steps, which are specific to the data set and require a disproportionate amount of work. Since DCNNs are universal function approximators, it is conceivable that such corrections could be rendered obsolete by choosing an appropriate architecture. In this article, we present a novel theoretical framework for the gap-filling problem in DCNNs that allows the architecture to be selected so that the CV steps can be circumvented. Combining information-theoretic measures of the data set with a fundamental property of DCNNs, the size of their receptive field, allows us to make statements about the solvability of the gap-filling problem independent of the specifics of model training. In particular, we give a mathematical proof that the maximum proficiency in filling a gap is achieved by a DCNN if its receptive field is larger than the gap length. We then demonstrate the consequences of this result in numerical experiments on a synthetic and a real data set, comparing the gap-filling ability of the ubiquitous U-Net architecture at varying depths. Our code is available at https://github.com/ai-biology/dcnn-gap-filling.
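The condition stated in the abstract (receptive field larger than the gap length) can be checked for a candidate architecture before any training. The following minimal Python sketch is not taken from the authors' repository: it estimates the receptive field of a hypothetical U-Net-style encoder (two 3x3 convolutions plus one 2x2 max-pooling per level, an assumed layout used only for illustration) with the standard receptive-field recurrence and compares it with an illustrative gap length.

# Minimal sketch (not the authors' code): estimate the receptive field of a
# U-Net-style encoder and compare it with a gap length, illustrating the
# condition that the receptive field must exceed the gap length. The layer
# layout (two 3x3 convs + one 2x2 max-pool per level) is an assumption.

def receptive_field(layers):
    """Receptive field (in input pixels) of a chain of (kernel, stride) layers,
    via the standard recurrence r <- r + (k - 1) * j, j <- j * s."""
    r, j = 1, 1
    for k, s in layers:
        r += (k - 1) * j
        j *= s
    return r

def unet_encoder_layers(depth):
    """Hypothetical U-Net encoder: per level, two 3x3 convolutions (stride 1),
    followed by 2x2 max-pooling (stride 2) on all but the deepest level."""
    layers = []
    for level in range(depth):
        layers += [(3, 1), (3, 1)]
        if level < depth - 1:
            layers.append((2, 2))
    return layers

if __name__ == "__main__":
    gap_length = 50  # pixels; an illustrative value, not taken from the paper
    for depth in range(1, 6):
        rf = receptive_field(unet_encoder_layers(depth))
        verdict = "sufficient" if rf > gap_length else "too small"
        print(f"depth {depth}: receptive field {rf:3d} px -> {verdict} for a {gap_length}-px gap")

Under these assumptions the receptive field grows from 5 px at depth 1 to 140 px at depth 5, so only the deeper variants exceed a 50-px gap; this mirrors the depth comparison described in the abstract, although the exact layer configuration and gap lengths used in the paper may differ.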
Pages: 6194-6205
Number of pages: 12
Related Papers (50 in total)
  • [1] Latendresse, Mario. Efficiently gap-filling reaction networks. BMC BIOINFORMATICS, 2014, 15.
  • [2] Cappelletti, Luca; Fontana, Tommaso; Di Donato, Guido Walter; Di Tucci, Lorenzo; Casiraghi, Elena; Valentini, Giorgio. Complex Data Imputation by Auto-Encoders and Convolutional Neural Networks - A Case Study on Genome Gap-Filling. COMPUTERS, 2020, 9 (02).
  • [3] Liguori, Antonio; Markovic, Romana; Frisch, Jerome; Wagner, Andreas; Causone, Francesco; van Treeck, Christoph. A gap-filling method for room temperature data based on autoencoder neural networks. PROCEEDINGS OF BUILDING SIMULATION 2021: 17TH CONFERENCE OF IBPSA, 2022, 17: 2427-2434.
  • [4] Silva, Murilo T.; Gill, Eric W.; Huang, Weimin. An Improved Estimation and Gap-Filling Technique for Sea Surface Wind Speeds Using NARX Neural Networks. JOURNAL OF ATMOSPHERIC AND OCEANIC TECHNOLOGY, 2018, 35 (07): 1521-1532.
  • [5] Vlah, Domagoj; Sepetanc, Karlo; Pandzic, Hrvoje. Solving Bilevel Optimal Bidding Problems Using Deep Convolutional Neural Networks. IEEE SYSTEMS JOURNAL, 2023, 17 (02): 2767-2778.
  • [6] Wang, Yantong; Friderikos, Vasilis. Caching as an Image Characterization Problem using Deep Convolutional Neural Networks. ICC 2020 - 2020 IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS (ICC), 2020.
  • [7] Tsardanidis, Iason; Koukos, Alkiviadis; Sitokonstantinou, Vasileios; Drivas, Thanassis; Kontoes, Charalampos. Cloud gap-filling with deep learning for improved grassland monitoring. COMPUTERS AND ELECTRONICS IN AGRICULTURE, 2025, 230.
  • [8] Ding, Ruizhou; Liu, Zeye; Shi, Rongye; Marculescu, Diana; Blanton, R. D. LightNN: Filling the Gap between Conventional Deep Neural Networks and Binarized Networks. PROCEEDINGS OF THE GREAT LAKES SYMPOSIUM ON VLSI 2017 (GLSVLSI '17), 2017: 35-40.
  • [9] Rodrigues, Rui. Filling in the Gap: a General Method Using Neural Networks. COMPUTING IN CARDIOLOGY 2010, VOL 37, 2010, 37: 453-456.