Towards Model-Agnostic Dataset Condensation by Heterogeneous Models

Cited by: 0
Authors
Moon, Jun-Yeong [1 ]
Kim, Jung Uk [1 ]
Park, Gyeong-Moon [1 ]
Affiliations
[1] Kyung Hee Univ, Yongin, South Korea
Keywords
Dataset condensation; Model agnostic; Heterogeneous
DOI
10.1007/978-3-031-73397-0_14
CLC number
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
The advancement of deep learning has coincided with a proliferation of both models and available data. Growing dataset sizes, and the computational cost of training on them, have motivated Dataset Condensation (DC). While prior studies have generated synthetic images through methods such as distribution alignment and training-trajectory matching for more efficient model training, a significant challenge arises when these condensed images are used in practice: they tend to be specific to the model used to condense them, limiting their versatility and practicality. To address this limitation, we introduce Heterogeneous Model Dataset Condensation (HMDC), a novel method that produces universally applicable condensed images through cross-model interactions. To handle the differences in gradient magnitude and semantic distance that arise between heterogeneous models, we propose a Gradient Balance Module (GBM) and Mutual Distillation (MD) with a Spatial-Semantic Decomposition method. By balancing the contribution of each model and keeping their semantic representations closely aligned, our approach overcomes the model specificity of condensed images and enhances their broader utility. The source code is available at https://github.com/KHU-AGI/HMDC.
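The abstract's core idea of balancing gradient contributions can be illustrated with a minimal sketch. This is an assumption-laden illustration of the general gradient-balancing concept, not the authors' Gradient Balance Module: here each model's gradient with respect to the synthetic images is simply normalized to unit L2 norm before averaging, so a model with larger raw gradients (e.g. a ViT versus a small ConvNet) cannot dominate the update to the condensed images. The function and variable names are hypothetical.

```python
import numpy as np

def balanced_gradient(per_model_grads):
    """Combine gradients from heterogeneous models on the synthetic images.

    Illustrative sketch only: normalize each model's gradient to unit
    L2 norm before averaging, so every model contributes equally to the
    update regardless of its raw gradient magnitude.
    """
    normed = [g / (np.linalg.norm(g) + 1e-8) for g in per_model_grads]
    return np.mean(normed, axis=0)

# Toy usage: two hypothetical models yield gradients of very different scale.
g_convnet = np.ones((3, 8, 8)) * 1e-3  # small-magnitude gradient
g_vit = np.ones((3, 8, 8)) * 10.0      # large-magnitude gradient
g = balanced_gradient([g_convnet, g_vit])
```

After normalization both toy gradients point in the same direction with unit norm, so the combined update keeps that scale; without balancing, the large-magnitude gradient would dominate by four orders of magnitude.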
Pages: 234-250 (17 pages)