Weighted fuzzy clustering approach with adaptive spatial information and Kullback-Leibler divergence for skin lesion segmentation

Cited: 0
Authors
Kumari, Pinki [1 ]
Agrawal, R. K. [1 ]
Priya, Aditi [1 ]
Affiliations
[1] Jawaharlal Nehru Univ, Sch Comp & Syst Sci, New Mehrauli Rd, New Delhi 110067, India
Keywords
Fuzzy-based clustering; Kullback-Leibler divergence; Skin lesion segmentation; REGION; NET; ALGORITHM; DIAGNOSIS; NETWORK; IMAGES; FCM
DOI
10.1007/s13042-025-02575-3
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Segmentation of dermoscopic images is an important phase in the computer-aided diagnosis and classification of skin cancers. The varied shapes, sizes, and colors of lesions, differences in pixel intensity within a lesion, the presence of hair, and fuzziness at the lesion boundary make precise segmentation difficult. To address these problems, we propose a novel weighted fuzzy-based clustering method named WFC_AS_KL, whose objective function comprises three terms. The first term is a weighted combination of Fuzzy C-Means and Fuzzy k-Plane Clustering, which is suitable for handling asymmetrical data distributions and color variations in dermoscopic images. To mitigate the effect of noise, weighted region-level and local-level spatial information is used as the second term. The third term is a Kullback-Leibler divergence term, which reduces the fuzziness of the fuzzy partition matrix and also addresses the cluster-center overlapping and unbalanced cluster-size problems. An adaptive constraint is used to balance the original and spatial information of the image. We demonstrate the superior performance of the proposed WFC_AS_KL method on two publicly available medical image datasets against 13 fuzzy-based clustering methods in terms of sensitivity, specificity, precision, false positive rate, accuracy, Dice similarity coefficient, and Jaccard index. We also compare the proposed method against twelve traditional skin lesion segmentation methods and twelve existing deep learning methods, and we perform a non-parametric statistical test to show that the proposed WFC_AS_KL method is statistically superior to the 13 fuzzy-based segmentation methods.
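The abstract does not state the objective function explicitly. As a hedged illustration only, a generic three-term objective of the kind described might take the following form, where u_ik are the fuzzy memberships, v_k the cluster centers, P_k the k-th cluster plane, x̄_i a region/locally filtered pixel value, π_k the cluster proportions, and ω, β, λ weighting parameters; all notation here is an assumption, not the paper's exact formulation.

J = \sum_{i=1}^{N}\sum_{k=1}^{K} u_{ik}^{m}\Big[\,\omega\,\lVert x_i - v_k\rVert^{2} + (1-\omega)\, d_{\mathrm{plane}}^{2}(x_i, P_k)\Big]
    + \beta \sum_{i=1}^{N}\sum_{k=1}^{K} u_{ik}^{m}\,\lVert \bar{x}_i - v_k\rVert^{2}
    + \lambda \sum_{i=1}^{N}\sum_{k=1}^{K} u_{ik}\,\ln\frac{u_{ik}}{\pi_k},
    \qquad \text{subject to } \sum_{k=1}^{K} u_{ik} = 1.

In such a sketch, the first bracket corresponds to the weighted combination of Fuzzy C-Means and fuzzy k-plane distances, the second term injects spatial information through the filtered value x̄_i (with β playing the role of the adaptive constraint), and the Kullback-Leibler term u_ik ln(u_ik/π_k) drives memberships toward the cluster proportions π_k, which is how KL regularizers of this type reduce partition fuzziness and counteract unbalanced cluster sizes. The actual WFC_AS_KL objective in the paper may differ in its details.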
Pages: 21