ATTENTION-BASED DEEP MULTIPLE INSTANCE LEARNING WITH ADAPTIVE INSTANCE SAMPLING

Cited by: 1
Authors
Tarkhan, Aliasghar [1 ]
Trung Kien Nguyen [2 ]
Simon, Noah [1 ]
Bengtsson, Thomas [2 ]
Ocampo, Paolo [2 ]
Dai, Jian [2 ]
Affiliations
[1] Univ Washington, Dept Biostat, Seattle, WA 98195 USA
[2] Genentech Inc, PHC Imaging Grp, San Francisco, CA 94080 USA
Keywords
Attention; computational pathology; deep learning; multiple instance learning; prostate cancer; transfer learning; weakly supervised learning; PREDICTION
DOI
10.1109/ISBI52829.2022.9761661
Chinese Library Classification (CLC)
R318 [Biomedical Engineering]
Discipline Code
0831
Abstract
One challenge of training deep neural networks on gigapixel whole-slide images (WSIs) in computational pathology is the lack of pixel- or region-level annotation, owing to the high cost and time-consuming nature of labeling. Multiple instance learning (MIL) and its attention-based variants are typical weakly supervised learning methods: they use slide-level labels directly, without pixel or region labels, and thus reduce annotation cost. However, training a deep neural network on thousands of image regions (patches) per slide is computationally expensive and requires a long time to converge. This paper proposes a fast, adaptive, attention-based deep MIL approach that adaptively selects image regions highly predictive of the outcome and ignores regions carrying little or no information. We empirically show that the proposed approach outperforms random sampling while being faster than the standard attention-based MIL method, which uses all image regions for training.
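The abstract's two ingredients can be sketched compactly: attention-based MIL pooling (in the style of Ilse et al., reference [1] below) aggregates patch embeddings into a slide-level representation via learned softmax weights, and adaptive sampling then keeps only the top-scoring patches for subsequent training passes. The following minimal NumPy illustration is a hedged sketch, not the paper's implementation; the function names, the two-layer tanh attention, and the top-k selection rule are assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_pool(H, V, w):
    """Attention-based MIL pooling (Ilse et al.-style sketch).
    H: (n_instances, d) patch embeddings for one slide (bag).
    V: (d, h), w: (h,) -- parameters of the attention scorer.
    Returns the bag embedding z and the attention weights a."""
    a = softmax(np.tanh(H @ V) @ w)   # (n,) weights, sum to 1
    z = a @ H                         # (d,) weighted sum of instances
    return z, a

def adaptive_sample(H, a, k):
    """Hypothetical adaptive sampler: keep the k patches with the
    highest attention, discarding low-information regions."""
    idx = np.argsort(a)[::-1][:k]
    return H[idx], idx

# Toy slide: n patches with d-dimensional embeddings.
d, h, n = 8, 4, 100
H = rng.normal(size=(n, d))
V = rng.normal(size=(d, h))
w = rng.normal(size=h)

z, a = attention_pool(H, V, w)       # slide-level representation
H_top, idx = adaptive_sample(H, a, 10)  # retain 10 most-attended patches
```

In a full training loop, `z` would feed a slide-level classifier trained against the slide label only, and `adaptive_sample` would shrink the per-slide patch set between epochs, which is where the claimed speedup over all-patch attention MIL comes from.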
Pages: 5
Related work cited above
  • [1] Attention-based Deep Multiple Instance Learning
    Ilse, Maximilian
    Tomczak, Jakub M.
    Welling, Max
    [J]. INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 80, 2018, 80
  • [2] MILL: Channel Attention-based Deep Multiple Instance Learning for Landslide Recognition
    Tang, Xiaochuan
    Liu, Mingzhe
    Zhong, Hao
    Ju, Yuanzhen
    Li, Weile
    Xu, Qiang
    [J]. ACM TRANSACTIONS ON MULTIMEDIA COMPUTING COMMUNICATIONS AND APPLICATIONS, 2021, 17 (02)
  • [3] Generalized attention-based deep multi-instance learning
    Zhao, Lu
    Yuan, Liming
    Hao, Kun
    Wen, Xianbin
    [J]. MULTIMEDIA SYSTEMS, 2023, 29 (01) : 275 - 287
  • [5] Attention-Based Target Localization Using Multiple Instance Learning
    Sankaranarayanan, Karthik
    Davis, James W.
    [J]. ADVANCES IN VISUAL COMPUTING, PT I, 2010, 6453 : 381 - 392
  • [6] Lung cancer diagnosis using deep attention-based multiple instance learning and radiomics
    Chen, Junhua
    Zeng, Haiyan
    Zhang, Chong
    Shi, Zhenwei
    Dekker, Andre
    Wee, Leonard
    Bermejo, Inigo
    [J]. MEDICAL PHYSICS, 2022, 49 (05) : 3134 - 3143
  • [7] OutfitNet: Fashion Outfit Recommendation with Attention-Based Multiple Instance Learning
    Lin, Yusan
    Moosaei, Maryam
    Yang, Hao
    [J]. WEB CONFERENCE 2020: PROCEEDINGS OF THE WORLD WIDE WEB CONFERENCE (WWW 2020), 2020, : 77 - 87
  • [8] Ensemble Learning With Attention-Based Multiple Instance Pooling for Classification of SPT
    Zhou, Qinghua
    Zhang, Xin
    Zhang, Yu-Dong
    [J]. IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS II-EXPRESS BRIEFS, 2022, 69 (03) : 1927 - 1931
  • [9] Loss-Based Attention for Deep Multiple Instance Learning
    Shi, Xiaoshuang
    Xing, Fuyong
    Xie, Yuanpu
    Zhang, Zizhao
    Cui, Lei
    Yang, Lin
    [J]. THIRTY-FOURTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THE THIRTY-SECOND INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE AND THE TENTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2020, 34 : 5742 - 5749
  • [10] Probabilistic Attention Based on Gaussian Processes for Deep Multiple Instance Learning
    Schmidt, Arne
    Morales-Alvarez, Pablo
    Molina, Rafael
    [J]. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024, 35 (08) : 10909 - 10922