Feature Contrastive Learning for No-Reference Segmentation Quality Evaluation

Cited by: 0
Authors
Li, Xiaofan [1 ]
Peng, Bo [1 ]
Xie, Zhuyang [1 ]
Affiliations
[1] Southwest Jiaotong Univ, Sch Comp & Artificial Intelligence, Chengdu 610031, Peoples R China
Keywords
segmentation quality evaluation; contrastive learning; meta-measure; IMAGE; NETWORKS
DOI
10.3390/electronics12102339
Chinese Library Classification (CLC)
TP [Automation and Computer Technology]
Discipline Code
0812
Abstract
No-reference segmentation quality evaluation aims to assess the quality of an image segmentation without any reference image at application time. It typically relies on quality criteria that describe, from prior knowledge, what a good segmentation should look like. This calls for a precise description of the objects in the segmentation and for integrating that representation into the evaluation process. In this paper, taking the perspective of understanding the semantic relationship between the original image and the segmentation result, we propose a feature contrastive learning method. The method improves no-reference segmentation quality evaluation and can be applied in semantic segmentation scenarios. A contrastive learning step is performed in the feature space by learning the pixel-level similarity between the original image and the segmentation result. In addition, a class activation map (CAM) is used to guide the evaluation, making the score more consistent with human visual judgement. Experiments were conducted on the PASCAL VOC2012 dataset, with segmentation results produced by state-of-the-art (SoA) segmentation methods, and two meta-measure criteria were adopted to validate the effectiveness of the proposed method. Compared with other no-reference evaluation methods, our method achieves higher accuracy, which is comparable to, and in some cases exceeds, that of supervised evaluation methods.
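The paper's implementation is not part of this record. The following is a minimal sketch, assuming a PyTorch setup, of the kind of pixel-level contrastive step the abstract describes: spatially aligned feature vectors of the original image and its segmentation result are treated as positive pairs, the same locations in other batch samples as negatives, and an optional CAM weights the per-pixel loss. The function name, tensor shapes, and the InfoNCE-style formulation are assumptions for illustration, not the authors' code.

import torch
import torch.nn.functional as F

def pixelwise_contrastive_loss(img_feat, seg_feat, cam=None, temperature=0.07):
    # img_feat, seg_feat: (B, C, H, W) feature maps extracted from the original
    # image and from the segmentation result (encoders are assumed, not shown).
    # cam: optional (B, 1, H, W) class activation map used to weight pixels.
    B, C, H, W = img_feat.shape
    # L2-normalize channels and flatten the spatial grid to (B, H*W, C).
    img = F.normalize(img_feat, dim=1).permute(0, 2, 3, 1).reshape(B, H * W, C)
    seg = F.normalize(seg_feat, dim=1).permute(0, 2, 3, 1).reshape(B, H * W, C)
    # logits[b, n, k]: similarity between image pixel n of sample b and the
    # segmentation pixel at the same location in sample k; k == b is the positive.
    logits = torch.einsum('bnc,knc->bnk', img, seg) / temperature
    target = torch.arange(B, device=img_feat.device).view(B, 1).expand(B, H * W)
    loss = F.cross_entropy(logits.reshape(-1, B), target.reshape(-1), reduction='none')
    loss = loss.reshape(B, H * W)
    if cam is not None:
        # Emphasize pixels the CAM marks as salient (weights normalized per sample).
        w = cam.reshape(B, H * W)
        w = w / (w.sum(dim=1, keepdim=True) + 1e-8)
        return (loss * w).sum(dim=1).mean()
    return loss.mean()

if __name__ == "__main__":
    img_feat = torch.randn(4, 64, 32, 32)   # toy features of the original images
    seg_feat = torch.randn(4, 64, 32, 32)   # toy features of the segmentation results
    cam = torch.rand(4, 1, 32, 32)          # toy CAM weights
    print(pixelwise_contrastive_loss(img_feat, seg_feat, cam).item())

A no-reference quality score could then be derived from the learned representation, for example by aggregating the per-pixel similarity between matched image and segmentation features, but the exact scoring function is not specified in this record.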
Pages: 18