Towards Model-Agnostic Dataset Condensation by Heterogeneous Models

Cited by: 0
Authors
Moon, Jun-Yeong [1]
Kim, Jung Uk [1]
Park, Gyeong-Moon [1]
Affiliation
[1] Kyung Hee Univ, Yongin, South Korea
Keywords
Dataset condensation; Model agnostic; Heterogeneous
DOI
10.1007/978-3-031-73397-0_14
CLC number
TP18 [Artificial intelligence theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
The advancement of deep learning has coincided with the proliferation of both models and available data. The surge in dataset sizes and the resulting computational requirements have led to the development of Dataset Condensation (DC). While prior studies have explored generating synthetic images through methods such as distribution alignment and training-trajectory matching for more efficient model training, a significant challenge arises when these condensed images are used in practice. Notably, condensed images tend to be specific to particular models, constraining their versatility and practicality. In response to this limitation, we introduce a novel method, Heterogeneous Model Dataset Condensation (HMDC), designed to produce universally applicable condensed images through cross-model interactions. To address the issues of gradient magnitude difference and semantic distance that arise when using heterogeneous models, we propose the Gradient Balance Module (GBM) and Mutual Distillation (MD) with the Spatial-Semantic Decomposition method. By balancing the contribution of each model and keeping their semantic representations close, our approach overcomes the limitations of model-specific condensed images and enhances their broader utility. The source code is available at https://github.com/KHU-AGI/HMDC.
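To illustrate the gradient-magnitude problem the abstract mentions: when two heterogeneous models (say, a CNN and a ViT) both backpropagate into the same synthetic images, one architecture's gradients can be orders of magnitude larger and dominate the update. The sketch below shows the general idea of balancing per-model contributions by normalizing each gradient before averaging. This is a minimal, hypothetical illustration of the concept, not the paper's Gradient Balance Module; the function name and the normalization scheme are invented for this example.

```python
import numpy as np

def balanced_gradient(grads):
    """Scale each model's gradient to unit norm before averaging,
    so no single architecture dominates the synthetic-image update."""
    normed = []
    for g in grads:
        norm = np.linalg.norm(g)
        normed.append(g / norm if norm > 0 else g)
    return np.mean(normed, axis=0)

# Two models produce gradients of very different magnitude
# for the same synthetic image (e.g., a CNN vs. a ViT).
g_cnn = np.array([3.0, 4.0])    # norm 5.0
g_vit = np.array([0.03, 0.04])  # norm 0.05

g = balanced_gradient([g_cnn, g_vit])
# Both gradients point the same way here, so after per-model
# normalization each contributes equally: g == [0.6, 0.8]
```

Without balancing, a naive average of `g_cnn` and `g_vit` would be driven almost entirely by the CNN's larger gradients; after normalization, each model steers the condensed image equally.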
Pages: 234-250
Page count: 17