Generalized attention-based deep multi-instance learning

Cited: 4
Authors:
Zhao, Lu [1 ]
Yuan, Liming [2 ]
Hao, Kun [1 ]
Wen, Xianbin [2 ]
Affiliations:
[1] Tianjin Chengjian Univ, Sch Comp & Informat Engn, 26 Jinjing Rd, Tianjin 300384, Peoples R China
[2] Tianjin Univ Technol, Sch Comp Sci & Engn, 391 Bin Shui Xi Dao Rd, Tianjin 300384, Peoples R China
Funding:
National Natural Science Foundation of China
Keywords:
Multi-instance learning; Deep learning; Similarity-based loss; Attention-based pooling; Interpretability; Neural networks
DOI:
10.1007/s00530-022-00992-w
Chinese Library Classification (CLC):
TP [Automation technology, computer technology]
Discipline code:
0812
Abstract:
Attention-based deep multi-instance learning (MIL) is an effective and interpretable model. Its interpretability stems from the learnability of its inner attention-based MIL pooling. Its main limitation is that it learns only a single instance-level target concept for weighting instances. A further implicit assumption is that the bag and instance concepts lie in the same semantic space. In this paper, we relax these constraints as follows: (i) there exist multiple instance concepts; (ii) the bag and instance concepts live in different semantic spaces. Based on the two relaxed constraints, we propose a two-level attention-based MIL pooling that first learns several instance concepts in a low-level semantic space and then captures the bag concept in a high-level semantic space. To capture different types of instance concepts effectively, we also present a new similarity-based loss. Experimental results show that our method achieves higher or comparable performance relative to state-of-the-art methods on benchmark data sets and surpasses them in both performance and interpretability on a synthetic data set.
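The abstract describes the proposed pooling only at a high level. The sketch below is a minimal PyTorch-style illustration of a two-level attention MIL pooling in that spirit: a first attention stage scores instances against several learned instance concepts in a low-level space, and a second stage attends over the resulting concept embeddings to form the bag representation in a high-level space. It is reconstructed from the abstract alone and is not the authors' implementation; all names and dimensions (e.g. TwoLevelAttentionPooling, n_concepts) are hypothetical, and the similarity-based loss is omitted.

```python
# Minimal sketch (PyTorch), reconstructed from the abstract only; not the authors' code.
import torch
import torch.nn as nn

class TwoLevelAttentionPooling(nn.Module):  # hypothetical name
    def __init__(self, in_dim=512, low_dim=128, high_dim=128, n_concepts=4):
        super().__init__()
        # Level 1: attention scores of each instance against K assumed instance concepts.
        self.inst_proj = nn.Linear(in_dim, low_dim)
        self.inst_att = nn.Linear(low_dim, n_concepts)
        # Level 2: attention over the K concept-level embeddings in a separate space.
        self.bag_proj = nn.Linear(in_dim, high_dim)
        self.bag_att = nn.Linear(high_dim, 1)

    def forward(self, H):  # H: (n_instances, in_dim) features of one bag
        # Level 1: instance-level attention, normalized over instances per concept.
        A = self.inst_att(torch.tanh(self.inst_proj(H)))  # (n, K)
        A = torch.softmax(A, dim=0)
        C = A.t() @ H                                      # (K, in_dim) concept embeddings
        # Level 2: bag-level attention over the K concept embeddings.
        B = self.bag_att(torch.tanh(self.bag_proj(C)))     # (K, 1)
        B = torch.softmax(B, dim=0)
        z = (B * C).sum(dim=0)                             # (in_dim,) bag representation
        return z, A, B  # attention maps kept for interpretability

# Usage on a toy bag of 20 instances with 512-d features:
# pool = TwoLevelAttentionPooling()
# z, inst_att, concept_att = pool(torch.randn(20, 512))
```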
Pages: 275-287
Page count: 13
Related papers (50 in total)
  • [31] Liu, Chanjuan; Chen, Tongtong; Ding, Xinmiao; Zou, Hailin; Tong, Yan. A multi-instance multi-label learning algorithm based on instance correlations. Multimedia Tools and Applications, 2016, 75: 12263-12284.
  • [32] Xu, Huilin; Wu, Aoshen; Ren, He; Yu, Chenghang; Liu, Gang; Liu, Lei. Classification of colorectal cancer consensus molecular subtypes using attention-based multi-instance learning network on whole-slide images. Acta Histochemica, 2023, 125(6).
  • [33] Zhang, Ya-Lin; Zhou, Zhi-Hua. Multi-instance learning with key instance shift. Proceedings of the Twenty-Sixth International Joint Conference on Artificial Intelligence, 2017: 3441-3447.
  • [34] Pham, Anh T.; Raich, Raviv; Fern, Xiaoli Z. Efficient instance annotation in multi-instance learning. 2014 IEEE Workshop on Statistical Signal Processing (SSP), 2014: 137-140.
  • [35] Tang, Peng; Wang, Xinggang; Feng, Bin; Liu, Wenyu. Learning multi-instance deep discriminative patterns for image classification. IEEE Transactions on Image Processing, 2017, 26(7): 3385-3396.
  • [36] Tang, Wei; Zhang, Weijia; Zhang, Min-Ling. Disambiguated attention embedding for multi-instance partial-label learning. Advances in Neural Information Processing Systems 36 (NeurIPS 2023), 2023.
  • [37] Yang, Mei; Tang, Wen-Tao; Min, Fan. Multi-instance multi-label learning based on parallel attention and local label manifold correlation. 2022 IEEE 9th International Conference on Data Science and Advanced Analytics (DSAA), 2022: 11-20.
  • [38] Qi, Zhiquan; Tian, Yingjie; Yu, Xiaodan; Shi, Yong. A multi-instance learning algorithm based on nonparallel classifier. Applied Mathematics and Computation, 2014, 241: 233-241.
  • [39] Feng, Songhe; Bao, Hong; Lang, Congyan; Xu, De. Combining visual attention model with multi-instance learning for tag ranking. Neurocomputing, 2011, 74(17): 3619-3627.
  • [40] Weidmann, N.; Frank, E.; Pfahringer, B. A two-level learning method for generalized multi-instance problems. Machine Learning: ECML 2003, 2003, 2837: 468-479.