Information cut for clustering using a gradient descent approach

Cited by: 20
Authors
Jenssen, Robert [1 ]
Erdogmus, Deniz
Hild, Kenneth E., II
Principe, Jose C.
Eltoft, Torbjorn
Affiliations
[1] Univ Tromso, Dept Phys & Technol, N-9037 Tromso, Norway
[2] Oregon Grad Inst, OHSU, Beaverton, OR 97006 USA
[3] Univ Calif San Francisco, Biomagnet Image Lab, San Francisco, CA 94143 USA
[4] Univ Florida, Dept Elect & Comp Engn, Computat NeuroEngn Lab, Gainesville, FL 32611 USA
Funding
US National Science Foundation;
Keywords
graph theoretic cut; information theory; Parzen window density estimation; clustering; gradient descent optimization; annealing;
DOI
10.1016/j.patcog.2006.06.028
CLC classification
TP18 [Artificial Intelligence Theory];
Discipline codes
081104; 0812; 0835; 1405;
Abstract
We introduce a new graph cut for clustering, which we call the Information Cut. It is derived using Parzen windowing to estimate an information-theoretic distance measure between probability density functions. We propose to optimize the Information Cut using a gradient descent-based approach. Our algorithm has several advantages over many other graph-based methods in terms of determining an appropriate affinity measure, computational complexity, memory requirements, and coping with different data scales. We show that our method may produce clustering and image segmentation results comparable to, or better than, state-of-the-art graph-based methods. (c) 2006 Pattern Recognition Society. Published by Elsevier Ltd. All rights reserved.
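As a rough illustration of the idea in the abstract, the sketch below builds Parzen-window (Gaussian kernel) affinities, evaluates an Information-Cut-style criterion (cross-cluster affinity mass normalized by within-cluster volumes) for a soft two-way partition, and minimizes it by gradient descent on membership logits. This is my own toy reconstruction, not the paper's exact algorithm: the function names and the sigmoid-logit parameterization are illustrative assumptions, and the kernel-width annealing the paper mentions is omitted.

```python
import numpy as np

def gaussian_affinity(X, sigma):
    """Parzen-window (Gaussian kernel) affinity matrix for points X (n, d)."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def information_cut(K, m):
    """Information-Cut-style criterion for a soft two-way partition.

    m[i] in [0, 1] is point i's membership in cluster 1; (1 - m[i]) is its
    membership in cluster 2.  Low values mean little cross-cluster affinity
    relative to the two within-cluster volumes.
    """
    cut = m @ K @ (1.0 - m)            # cross-cluster affinity mass
    vol1 = m @ K @ m                   # within-cluster mass, cluster 1
    vol2 = (1.0 - m) @ K @ (1.0 - m)   # within-cluster mass, cluster 2
    return cut / np.sqrt(vol1 * vol2)

def minimize_information_cut(K, n_iter=400, lr=0.2, seed=0):
    """Gradient descent on membership logits; returns the best (m, value) seen."""
    rng = np.random.default_rng(seed)
    z = 0.1 * rng.standard_normal(K.shape[0])      # membership logits
    best_m, best_ic = None, np.inf
    for _ in range(n_iter):
        m = 1.0 / (1.0 + np.exp(-z))               # sigmoid keeps m in (0, 1)
        Km, K1m = K @ m, K @ (1.0 - m)
        cut, vol1, vol2 = m @ K1m, m @ Km, (1.0 - m) @ K1m
        v = np.sqrt(vol1 * vol2)
        ic = cut / v
        if ic < best_ic:
            best_ic, best_m = ic, m.copy()
        # Analytic gradient of IC = cut / sqrt(vol1 * vol2) w.r.t. m:
        dcut = K1m - Km                            # d(cut)/dm
        dv = (vol2 * 2.0 * Km - vol1 * 2.0 * K1m) / (2.0 * v)
        dm = (dcut * v - cut * dv) / v ** 2        # quotient rule
        z -= lr * dm * m * (1.0 - m)               # chain rule through sigmoid
    return best_m, best_ic
```

On well-separated data the criterion is near zero for the correct partition (the cross-cluster kernel terms vanish) and large for a mixed one, which is what the descent exploits.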
Pages: 796-806 (11 pages)
Related papers
50 records in total
  • [1] An Effective Partitional Crisp Clustering Method Using Gradient Descent Approach
    Shalileh, Soroosh
    [J]. MATHEMATICS, 2023, 11 (12)
  • [2] A hybrid clustering and gradient descent approach for fuzzy modeling
    Wong, CC
    Chen, CC
    [J]. IEEE TRANSACTIONS ON SYSTEMS MAN AND CYBERNETICS PART B-CYBERNETICS, 1999, 29 (06): : 686 - 693
  • [3] IMPROVED CLUSTERING USING DETERMINISTIC ANNEALING WITH A GRADIENT DESCENT TECHNIQUE
    QIU, G
    VARLEY, MR
    TERRELL, TJ
    [J]. PATTERN RECOGNITION LETTERS, 1994, 15 (06) : 607 - 610
  • [4] Tangent-cut optimizer on gradient descent: an approach towards Hybrid Heuristics
    Biswas, Saptarshi
    Nath, Subhrapratim
    Dey, Sumagna
    Majumdar, Utsha
    [J]. ARTIFICIAL INTELLIGENCE REVIEW, 2022, 55 (02) : 1121 - 1147
  • [6] Information cut and information forces for clustering
    Jenssen, R
    Principe, JC
    Eltoft, T
    [J]. 2003 IEEE XIII WORKSHOP ON NEURAL NETWORKS FOR SIGNAL PROCESSING - NNSP'03, 2003, : 459 - 468
  • [7] Stochastic Gradient Descent Support Vector Clustering
    Tung Pham
    Hang Dang
    Trung Le
    Hoang-Thai Le
    [J]. PROCEEDINGS OF 2015 2ND NATIONAL FOUNDATION FOR SCIENCE AND TECHNOLOGY DEVELOPMENT CONFERENCE ON INFORMATION AND COMPUTER SCIENCE NICS 2015, 2015, : 88 - 93
  • [8] GRADIENT DESCENT BATCH CLUSTERING FOR IMAGE CLASSIFICATION
    Park, Jae Sam
    [J]. IMAGE ANALYSIS & STEREOLOGY, 2023, 42 (02): : 133 - 144
  • [9] Text-based information retrieval using exponentiated gradient descent
    Papka, R
    Callan, JP
    Barto, AG
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 9: PROCEEDINGS OF THE 1996 CONFERENCE, 1997, 9 : 3 - 9
  • [10] A new stochastic gradient descent possibilistic clustering algorithm
    Koutsimpela, Angeliki
    Koutroumbas, Konstantinos D.
    [J]. AI COMMUNICATIONS, 2022, 35 (02) : 47 - 64