Language Semantic Graph Guided Data-Efficient Learning

Cited by: 0
|
Authors
Ma, Wenxuan [1 ]
Li, Shuang [1 ]
Cai, Lincan [1 ]
Kang, Jingxuan [2 ]
Affiliations
[1] Beijing Inst Technol, Beijing, Peoples R China
[2] Univ Liverpool, Liverpool, Merseyside, England
Funding
National Natural Science Foundation of China; National Key R&D Program of China
Keywords
DOI
Not available
CLC number
TP18 [Artificial Intelligence Theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
Developing generalizable models that learn effectively from limited data with minimal reliance on human supervision is a central objective of the machine learning community, particularly in the era of deep neural networks. To achieve data-efficient learning, researchers typically explore approaches that leverage related or unlabeled data without requiring additional manual labeling, such as Semi-Supervised Learning (SSL), Transfer Learning (TL), and Data Augmentation (DA). SSL incorporates unlabeled data into the training process, TL transfers expertise from related data distributions, and DA broadens the dataset by synthesizing new data from existing examples. However, the additional knowledge contained within labels has been largely overlooked. In this paper, we propose a novel perspective on data efficiency: exploiting the semantic information contained in the labels of the available data. Specifically, we introduce a Language Semantic Graph (LSG), constructed from labels expressed as natural language descriptions. On this graph, an auxiliary graph neural network is trained to extract high-level semantic relations and then guides the training of the primary model, enabling fuller use of label knowledge. Across image, video, and audio modalities, we apply the LSG method in both TL and SSL scenarios and show that it significantly enhances performance compared to other data-efficient learning approaches. Our in-depth analysis further shows that LSG also speeds up the training process.
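The pipeline the abstract describes can be illustrated with a minimal numpy sketch. This is not the paper's implementation: the random vectors stand in for pretrained language-model embeddings of label descriptions, the similarity threshold and the single untrained graph-convolution layer are hypothetical simplifications, and the real method trains the GNN jointly with the primary model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-ins for sentence embeddings of the label descriptions
# (the paper obtains these from a pretrained language model).
num_labels, dim = 5, 16
label_emb = rng.normal(size=(num_labels, dim))
label_emb /= np.linalg.norm(label_emb, axis=1, keepdims=True)

# Build the Language Semantic Graph: connect labels whose
# description embeddings are sufficiently similar.
sim = label_emb @ label_emb.T
adj = (sim > 0.2).astype(float)   # hypothetical threshold
np.fill_diagonal(adj, 1.0)        # self-loops

# One graph-convolution step with symmetric normalization
# (D^{-1/2} A D^{-1/2} X W, GCN-style).
deg = adj.sum(axis=1)
d_inv_sqrt = np.diag(deg ** -0.5)
a_norm = d_inv_sqrt @ adj @ d_inv_sqrt
w = rng.normal(scale=0.1, size=(dim, dim))        # GNN weight (untrained here)
refined = np.maximum(a_norm @ label_emb @ w, 0.0)  # ReLU activation

# The refined label representations could then serve as semantic
# targets that regularize the primary model's feature space.
print(refined.shape)  # (5, 16)
```

Propagating over the graph mixes each label's representation with those of semantically related labels, which is the high-level relation extraction the abstract refers to.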
Pages: 15