Multi-instance clustering with applications to multi-instance prediction

Cited by: 1
Authors
Min-Ling Zhang
Zhi-Hua Zhou
Affiliations
[1] Nanjing University, National Key Laboratory for Novel Software Technology
[2] Hohai University, College of Computer and Information Engineering
Source
Applied Intelligence | 2009, Vol. 31
Keywords
Machine learning; Multi-instance learning; Clustering; Representation transformation
Abstract
In multi-instance learning, each object is represented by a bag composed of multiple instances, instead of by a single instance as in traditional learning settings. Previous work in this area concerns only multi-instance prediction problems, where each bag is associated with a binary (classification) or real-valued (regression) label; unsupervised multi-instance learning, where bags carry no labels, has not been studied. This paper addresses unsupervised multi-instance learning by proposing a multi-instance clustering algorithm named Bamic. Briefly, by regarding bags as atomic data items and using some form of distance metric to measure distances between bags, Bamic adapts the popular k-Medoids algorithm to partition the unlabeled training bags into k disjoint groups. Furthermore, based on the clustering results, a novel multi-instance prediction algorithm named Bartmip is developed. First, each bag is re-represented by a k-dimensional feature vector, where the value of the i-th feature is the distance between the bag and the medoid of the i-th group. With bags thus transformed into feature vectors, common supervised learners can be trained on the transformed vectors, each associated with its original bag's label. Extensive experiments show that Bamic effectively discovers the underlying structure of the data set and that Bartmip works quite well on various kinds of multi-instance prediction problems.
Pages: 47 - 68
Number of pages: 21
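
For illustration, the sketch below walks through the two steps the abstract describes: bag-level k-Medoids clustering over a precomputed bag-distance matrix (Bamic-style), followed by re-representing each bag as its vector of distances to the k cluster medoids (Bartmip-style). The average Hausdorff bag distance, the plain k-Medoids update loop, the random toy bags, and all function names are illustrative assumptions; the abstract only specifies "some form of distance metric" and an adaptation of k-Medoids, not this exact implementation.

```python
# Minimal sketch of Bamic-style bag clustering and Bartmip-style
# re-representation. The concrete bag distance, initialization, and toy
# data are assumptions for illustration, not the authors' implementation.
import numpy as np

def avg_hausdorff(bag_a, bag_b):
    """Average Hausdorff distance between two bags (2-D arrays of instances)."""
    d = np.linalg.norm(bag_a[:, None, :] - bag_b[None, :, :], axis=-1)
    return (d.min(axis=1).sum() + d.min(axis=0).sum()) / (len(bag_a) + len(bag_b))

def bag_distance_matrix(bags):
    """Pairwise distances between bags, treating each bag as an atomic item."""
    n = len(bags)
    D = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            D[i, j] = D[j, i] = avg_hausdorff(bags[i], bags[j])
    return D

def k_medoids(D, k, n_iter=100, seed=0):
    """Plain k-Medoids on a precomputed distance matrix."""
    rng = np.random.default_rng(seed)
    medoids = rng.choice(len(D), size=k, replace=False)
    for _ in range(n_iter):
        labels = D[:, medoids].argmin(axis=1)        # assign each bag to its nearest medoid
        new_medoids = medoids.copy()
        for c in range(k):
            members = np.flatnonzero(labels == c)
            if members.size:                         # member minimizing total within-cluster distance
                new_medoids[c] = members[D[np.ix_(members, members)].sum(axis=1).argmin()]
        if np.array_equal(new_medoids, medoids):
            break
        medoids = new_medoids
    return medoids, labels

def transform(bags, medoid_bags):
    """Bartmip-style re-representation: i-th feature = distance to the i-th medoid."""
    return np.array([[avg_hausdorff(b, m) for m in medoid_bags] for b in bags])

# Usage with random toy bags: cluster the unlabeled bags, then build the
# k-dimensional feature vectors that a common supervised learner can consume.
bags = [np.random.rand(np.random.randint(2, 6), 3) for _ in range(20)]
D = bag_distance_matrix(bags)
medoid_idx, cluster_labels = k_medoids(D, k=4)
X = transform(bags, [bags[i] for i in medoid_idx])   # shape: (n_bags, k)
```

The transformed matrix X has one row per bag and one column per cluster, so when bag labels are available any off-the-shelf supervised learner (e.g., an SVM) can be trained on the (X, label) pairs, as in the prediction setting the abstract describes.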