FEATURE ADVERSARIAL DISTILLATION FOR POINT CLOUD CLASSIFICATION

Cited: 0
Authors
Lee, YuXing [1 ]
Wu, Wei [1 ]
Affiliations
[1] Inner Mongolia Univ, Dept Comp Sci, Hohhot, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
point cloud classification; knowledge distillation; feature adversarial;
DOI
10.1109/ICIP49359.2023.10222554
CLC Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Due to the irregular and unordered geometric structure of point clouds, conventional knowledge distillation techniques lose a substantial amount of information when applied directly to point cloud tasks. In this paper, we propose Feature Adversarial Distillation (FAD), a generic adversarial loss function for point cloud distillation that reduces information loss during knowledge transfer. In the feature extraction stage, the features extracted by the teacher serve as the discriminator, while the student continuously generates new features during training. The student's features are obtained by attacking the teacher's feedback, and a score judges whether the student has learned the knowledge well. In experiments on standard point cloud classification on the ModelNet40 and ScanObjectNN datasets, our method reduces the information loss of knowledge transfer under 40x model compression while maintaining competitive performance.
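The adversarial scoring idea in the abstract can be sketched as follows. This is a minimal illustration under assumed names, not the authors' implementation: a hypothetical linear discriminator is fitted to tell frozen teacher features apart from student features, and a single gradient step then moves the student features to raise their "teacher-like" score, mirroring how FAD trains the student against the teacher's feedback.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 8

# Stand-ins for real network activations: frozen teacher features and
# student features that start off-distribution (shifted by +2 per dimension).
teacher_feat = rng.normal(size=(64, dim))
student_feat = rng.normal(size=(64, dim)) + 2.0

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# 1) Fit a linear discriminator by logistic regression:
#    teacher features -> label 1 ("teacher-like"), student features -> label 0.
w, b = np.zeros(dim), 0.0
for _ in range(500):
    for feats, label in ((teacher_feat, 1.0), (student_feat, 0.0)):
        p = sigmoid(feats @ w + b)
        w += 0.1 * feats.T @ (label - p) / len(feats)
        b += 0.1 * float(np.mean(label - p))

score_before = float(sigmoid(student_feat @ w + b).mean())

# 2) One adversarial step for the student: ascend the gradient of
#    log p("teacher-like") with respect to the features themselves,
#    so the discriminator's score of the student rises.
p = sigmoid(student_feat @ w + b)
student_feat = student_feat + 0.5 * np.outer(1.0 - p, w)
score_after = float(sigmoid(student_feat @ w + b).mean())

print(f"student score before: {score_before:.3f}, after: {score_after:.3f}")
```

In the actual FAD setting the adversarial gradient would flow back into the student network's weights rather than the features directly, but the scoring mechanism, a discriminator grounded in the teacher's features judging how well the student has absorbed the knowledge, is the same.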
Pages: 970-974
Number of pages: 5