MICCLLR: Multiple-Instance Learning Using Class Conditional Log Likelihood Ratio

Cited by: 0
Authors:
El-Manzalawy, Yasser [1]
Honavar, Vasant [1]
Affiliations:
[1] Iowa State Univ, Dept Comp Sci, Ames, IA 50011 USA
Keywords: multiple-instance learning; image retrieval; drug activity prediction; ensemble of multiple-instance learning classifiers; boosted multiple-instance learning; regression
DOI: none available
Chinese Library Classification: TP18 [Artificial Intelligence Theory]
Subject Classification Codes: 081104; 0812; 0835; 1405
Abstract
Multiple-instance learning (MIL) is a generalization of supervised learning in which each training observation is a labeled bag of unlabeled instances. Several supervised learning algorithms have been successfully adapted to the multiple-instance learning setting. We explore the adaptation of the Naive Bayes (NB) classifier and the use of its sufficient statistics to develop novel multiple-instance learning methods. Specifically, we introduce MICCLLR (multiple-instance class conditional log likelihood ratio), a method that maps each bag of instances to a single meta-instance using class conditional log likelihood ratio statistics, so that any supervised base classifier can be applied to the resulting meta-data. The results of our experiments with MICCLLR using different base classifiers suggest that no single base classifier consistently outperforms the others on all data sets. We show that a substantial improvement in performance is obtained using an ensemble of MICCLLR classifiers trained with different base learners. We also show that a further gain in classification accuracy is obtained by applying AdaBoost.M1 to weak MICCLLR classifiers. Overall, our results suggest that the predictive performance of the three proposed variants of MICCLLR is competitive with some of the state-of-the-art MIL methods.
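The core idea in the abstract (each bag mapped to one meta-instance of per-feature class conditional log likelihood ratios) can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: it assumes Gaussian per-feature class-conditional densities, assumes each instance inherits its bag's label when estimating those densities, and uses a simple per-bag average as the aggregation; all function names and the toy data are invented for this sketch.

```python
import numpy as np

def fit_gaussian_stats(bags, labels):
    # Estimate per-feature Gaussian class-conditional statistics (the NB
    # sufficient statistics). Assumption (illustrative, not necessarily the
    # paper's choice): each instance inherits its bag's label.
    X = np.vstack(bags)
    y = np.concatenate([[lab] * len(b) for b, lab in zip(bags, labels)])
    return {c: (X[y == c].mean(axis=0), X[y == c].var(axis=0) + 1e-6)
            for c in (0, 1)}

def gaussian_log_lik(X, mu, var):
    # Per-feature Gaussian log density for every instance in X.
    return -0.5 * (np.log(2 * np.pi * var) + (X - mu) ** 2 / var)

def bag_to_meta_instance(bag, stats):
    # Map a bag (n_instances x n_features) to a single meta-instance:
    # per-feature class-conditional log likelihood ratios, averaged
    # over the bag's instances.
    llr = gaussian_log_lik(bag, *stats[1]) - gaussian_log_lik(bag, *stats[0])
    return llr.mean(axis=0)

# Toy data: 5 negative and 5 positive bags of varying size; positive
# instances are shifted by +1 in every feature.
rng = np.random.default_rng(0)
labels = np.array([0] * 5 + [1] * 5)
bags = [rng.normal(loc=lab, size=(rng.integers(3, 8), 4)) for lab in labels]

stats = fit_gaussian_stats(bags, labels)
meta = np.array([bag_to_meta_instance(b, stats) for b in bags])
print(meta.shape)  # one 4-dimensional meta-instance per bag: (10, 4)
```

Any supervised base classifier (e.g., logistic regression or an SVM) could then be trained on `meta` against `labels`; the ensemble and boosted variants described in the abstract would repeat this with several base learners or with AdaBoost.M1 on weak MICCLLR classifiers.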
Pages: 80-91 (12 pages)