Joint Gaussian Based Measures for Multiple-Instance Learning

Cited by: 1
Authors
Zhou, Linfei [1 ]
Plant, Claudia [2 ]
Boehm, Christian [1 ]
Affiliations
[1] Ludwig Maximilians Univ Munchen, Inst Comp Sci, Munich, Germany
[2] Univ Vienna, Dept Comp Sci, Vienna, Austria
Keywords
DISTANCE;
DOI
10.1109/ICDE.2017.75
CLC Number
TP [Automation Technology, Computer Technology];
Discipline Classification Code
0812 ;
Abstract
As an actively investigated topic in machine learning, Multiple-Instance Learning (MIL) has many proposed solutions, including both supervised and unsupervised methods. Most of these solutions are restricted to the original assumption that comes with the notion of MIL: the label of a multiple-instance object is directly determined by the labels of its instances. However, this assumption breaks down when there is no clear relation between the overall label and the labels of the instances. Most previous approaches avoid this problem in practice by treating each multiple-instance object as a whole instead of starting with learning in instance spaces, but they either lose information or are time consuming. In this paper, we introduce two joint Gaussian based measures for MIL, Joint Gaussian Similarity (JGS) and Joint Gaussian Distance (JGD), which require no prior knowledge of the relation between the labels of multiple-instance objects and the labels of their instances. JGS is a similarity measure, while JGD is a metric whose properties are required by many techniques such as clustering and embedding. Both measures take all of the available instance information into account, allowing many traditional machine learning methods to be applied to MIL. Extensive experimental evaluations on various real-world data demonstrate the effectiveness of both measures and show better performance than state-of-the-art MIL algorithms on benchmark tasks.
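For intuition only, the Python sketch below illustrates the general idea described in the abstract: comparing two bags of instances through Gaussians fitted to each bag, without assuming any relation between bag labels and instance labels. It is not the paper's JGS/JGD definitions; as stand-ins it uses the standard expected-likelihood kernel between Gaussians as a bag similarity and the 2-Wasserstein distance between Gaussians as a bag metric, and the function names (fit_gaussian, gaussian_similarity, gaussian_distance) are hypothetical.

```python
# Illustrative sketch only: compares bags via fitted Gaussians.
# NOT the JGS/JGD measures from the paper; the kernel and metric below are
# standard substitutes used to show the instance-free style of bag comparison.
import numpy as np
from scipy.linalg import sqrtm
from scipy.stats import multivariate_normal


def fit_gaussian(bag, reg=1e-6):
    """Fit a Gaussian to one bag (rows = instances); regularize for tiny bags."""
    mu = bag.mean(axis=0)
    cov = np.cov(bag, rowvar=False) + reg * np.eye(bag.shape[1])
    return mu, cov


def gaussian_similarity(bag_a, bag_b):
    """Expected-likelihood kernel: integral of the product of the two densities."""
    mu_a, cov_a = fit_gaussian(bag_a)
    mu_b, cov_b = fit_gaussian(bag_b)
    # Closed form: N(mu_a; mu_b, cov_a + cov_b)
    return multivariate_normal.pdf(mu_a, mean=mu_b, cov=cov_a + cov_b)


def gaussian_distance(bag_a, bag_b):
    """2-Wasserstein distance between the fitted Gaussians (a proper metric)."""
    mu_a, cov_a = fit_gaussian(bag_a)
    mu_b, cov_b = fit_gaussian(bag_b)
    sqrt_b = sqrtm(cov_b).real
    cross = sqrtm(sqrt_b @ cov_a @ sqrt_b).real
    w2_sq = np.sum((mu_a - mu_b) ** 2) + np.trace(cov_a + cov_b - 2.0 * cross)
    return float(np.sqrt(max(w2_sq, 0.0)))


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    bag1 = rng.normal(0.0, 1.0, size=(12, 3))  # bag of 12 instances in R^3
    bag2 = rng.normal(0.5, 1.2, size=(8, 3))   # bag of 8 instances in R^3
    print("similarity:", gaussian_similarity(bag1, bag2))
    print("distance  :", gaussian_distance(bag1, bag2))
```

In this sketch the distance satisfies the metric axioms, which is the property the abstract highlights as necessary for clustering and embedding techniques; the actual JGS/JGD constructions are given in the paper itself.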
Pages: 203 - 206
Number of Pages: 4
Related Papers
50 records
  • [1] An Iterative Instance Selection Based Framework for Multiple-Instance Learning
    Yuan, Liming
    Wen, Xianbin
    Zhao, Lu
    Xu, Haixia
    [J]. 2018 IEEE 30TH INTERNATIONAL CONFERENCE ON TOOLS WITH ARTIFICIAL INTELLIGENCE (ICTAI), 2018, : 772 - 779
  • [2] Compact Multiple-Instance Learning
    Chai, Jing
    Liu, Weiwei
    Tsang, Ivor W.
    Shen, Xiaobo
    [J]. CIKM'17: PROCEEDINGS OF THE 2017 ACM CONFERENCE ON INFORMATION AND KNOWLEDGE MANAGEMENT, 2017, : 2007 - 2010
  • [3] ON GENERALIZED MULTIPLE-INSTANCE LEARNING
    Scott, Stephen
    Zhang, Jun
    Brown, Joshua
    [J]. INTERNATIONAL JOURNAL OF COMPUTATIONAL INTELLIGENCE AND APPLICATIONS, 2005, 5 (01) : 21 - 35
  • [4] A framework for multiple-instance learning
    Maron, O
    Lozano-Perez, T
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 10, 1998, 10 : 570 - 576
  • [5] On multiple-instance learning of halfspaces
    Diochnos, D. I.
    Sloan, R. H.
    Turan, Gy
    [J]. INFORMATION PROCESSING LETTERS, 2012, 112 (23) : 933 - 936
  • [6] UNSUPERVISED MULTIPLE-INSTANCE LEARNING FOR INSTANCE SEARCH
    Wang, Zhenzhen
    Yuan, Junsong
    [J]. 2018 IEEE INTERNATIONAL CONFERENCE ON MULTIMEDIA AND EXPO (ICME), 2018,
  • [7] MULTIPLE-INSTANCE LEARNING WITH PAIRWISE INSTANCE SIMILARITY
    Yuan, Liming
    Liu, Jiafeng
    Tang, Xianglong
    [J]. INTERNATIONAL JOURNAL OF APPLIED MATHEMATICS AND COMPUTER SCIENCE, 2014, 24 (03) : 567 - 577
  • [8] Salient Instance Selection for Multiple-Instance Learning
    Yuan, Liming
    Liu, Songbo
    Huang, Qingcheng
    Liu, Jiafeng
    Tang, Xianglong
    [J]. NEURAL INFORMATION PROCESSING, ICONIP 2012, PT III, 2012, 7665 : 58 - 67
  • [9] ENSEMBLE-BASED INSTANCE RELEVANCE ESTIMATION IN MULTIPLE-INSTANCE LEARNING
    Waqas, Muhammad
    Tahir, Muhammad Atif
    Qureshi, Rizwan
    [J]. PROCEEDINGS OF THE 2021 9TH EUROPEAN WORKSHOP ON VISUAL INFORMATION PROCESSING (EUVIP), 2021,
  • [10] Automatic image annotation based on the multiple-instance learning
    Wang, Keping
    Wang, Xiaojie
    [J]. Journal of Information and Computational Science, 2010, 7 (13) : 2781 - 2788