TOWARDS GENERALIZABLE DEEPFAKE FACE FORGERY DETECTION WITH SEMI-SUPERVISED LEARNING AND KNOWLEDGE DISTILLATION

Cited by: 5
Authors
Lin, Yuzhen [1 ]
Chen, Han [1 ]
Li, Bin [1 ]
Wu, Junqiang [1 ]
Affiliations
[1] Shenzhen Univ, Shenzhen Inst Artificial Intelligence & Robot Soc, Guangdong Key Lab Intelligent Informat Proc, Shenzhen Key Lab Media Secur, Shenzhen 518060, Peoples R China
Keywords
Deepfake detection; self-supervised contrastive learning; knowledge distillation;
DOI
10.1109/ICIP46576.2022.9897792
CLC number
TP18 [Artificial Intelligence Theory];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Existing methods for deepfake face forgery detection have already achieved tremendous progress under well-controlled laboratory conditions. However, in wild scenarios where the training and testing forgeries are synthesized by different algorithms, and when labeled data are insufficient, performance drops considerably. In this work, we present a Semi-supervised Contrastive Learning and Knowledge Distillation-based framework (SCL-KD) for deepfake detection to reduce this performance gap. Our proposed framework comprises three stages: self-supervised pre-training, supervised training, and knowledge distillation. Specifically, a feature encoder is first trained in a self-supervised manner on a large number of unlabeled samples through a momentum contrastive mechanism. Second, a fully-connected classifier on top of the feature encoder is trained in a supervised manner on a small amount of labeled samples to build a teacher model. Finally, a compact student model is trained with the help of the teacher model via knowledge distillation, in order to avoid overfitting the labeled data and to generalize better to mismatched datasets. Evaluations on several benchmark datasets corroborate the strong performance of our approach in cross-dataset settings and scenarios with few labeled samples. This reveals the potential of the proposed method for real-world deepfake detection.
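The abstract names two generic building blocks without detailing them: a momentum (EMA) encoder update, as used in momentum contrastive learning, and a temperature-softened teacher-to-student distillation loss. As a rough sketch only (the paper's exact encoders, losses, and hyperparameters are not given in this record; the function names and the values `m=0.999` and `T=4.0` below are illustrative assumptions), these two components could look like:

```python
import numpy as np

def momentum_update(key_w, query_w, m=0.999):
    """MoCo-style EMA update: the key (momentum) encoder's weights slowly
    track the query encoder's. Weight arrays here are hypothetical."""
    return m * key_w + (1.0 - m) * query_w

def softmax(z, T=1.0):
    """Numerically stable softmax with temperature T."""
    e = np.exp(z / T - np.max(z / T))
    return e / e.sum()

def distillation_loss(student_logits, teacher_logits, T=4.0):
    """Hinton-style KD objective: KL divergence between the
    temperature-softened teacher and student distributions,
    scaled by T^2 as is conventional."""
    p = softmax(teacher_logits, T)  # soft teacher targets
    q = softmax(student_logits, T)  # soft student predictions
    return T * T * np.sum(p * (np.log(p) - np.log(q)))
```

With a high momentum coefficient the key encoder evolves smoothly, which is what makes the contrastive dictionary consistent; the distillation term is zero when the student exactly matches the teacher's logits and positive otherwise.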
Pages: 576-580
Number of pages: 5
Related Papers
50 total
  • [1] Semi-supervised Learning for Generalizable Intracranial Hemorrhage Detection and Segmentation
    Lin, Emily
    Yuh, Esther L.
    RADIOLOGY-ARTIFICIAL INTELLIGENCE, 2024, 6 (03)
  • [2] A Semi-Supervised Federated Learning Scheme via Knowledge Distillation for Intrusion Detection
    Zhao, Ruijie
    Yang, Linbo
    Wang, Yijun
    Xue, Zhi
    Gui, Guan
    Ohtsuki, Tomoaki
    IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS (ICC 2022), 2022, : 2688 - 2693
  • [3] Semi-supervised Deep Domain Adaptation for Deepfake Detection
    Seraj, Md Shamim
    Singh, Ankita
    Chakraborty, Shayok
    2024 IEEE WINTER CONFERENCE ON APPLICATIONS OF COMPUTER VISION WORKSHOPS, WACVW 2024, 2024, : 1061 - 1071
  • [4] Knowledge Distillation Meets Open-Set Semi-supervised Learning
    Yang, Jing
    Zhu, Xiatian
    Bulat, Adrian
    Martinez, Brais
    Tzimiropoulos, Georgios
    INTERNATIONAL JOURNAL OF COMPUTER VISION, 2025, 133 (01) : 315 - 334
  • [5] Knowledge Distillation for Semi-supervised Domain Adaptation
    Orbes-Arteaga, Mauricio
    Cardoso, Jorge
    Sorensen, Lauge
    Igel, Christian
    Ourselin, Sebastien
    Modat, Marc
    Nielsen, Mads
    Pai, Akshay
    OR 2.0 CONTEXT-AWARE OPERATING THEATERS AND MACHINE LEARNING IN CLINICAL NEUROIMAGING, 2019, 11796 : 68 - 76
  • [6] Semi-supervised Campus Network Intrusion Detection Based on Knowledge Distillation
    Chen, Junjun
    Guo, Qiang
    Fu, Zhongnan
    Shang, Qun
    Ma, Hao
    Wang, Nai
    2023 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, IJCNN, 2023,
  • [7] A Survey on Face Forgery Detection of Deepfake
    Zhang, Ying
    Gao, Feng
    Zhou, Zichen
    Guo, Hong
    THIRTEENTH INTERNATIONAL CONFERENCE ON DIGITAL IMAGE PROCESSING (ICDIP 2021), 2021, 11878
  • [8] Learning Semi-Supervised Representation Towards a Unified Optimization Framework for Semi-Supervised Learning
    Li, Chun-Guang
    Lin, Zhouchen
    Zhang, Honggang
    Guo, Jun
    2015 IEEE INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV), 2015, : 2767 - 2775
  • [9] LiteGrasp: A Light Robotic Grasp Detection via Semi-Supervised Knowledge Distillation
    Peng, Linpeng
    Cai, Rongyao
    Xiang, Jingyang
    Zhu, Junyu
    Liu, Weiwei
    Gao, Wang
    Liu, Yong
    IEEE ROBOTICS AND AUTOMATION LETTERS, 2024, 9 (09): : 7995 - 8002
  • [10] Collaborative deep semi-supervised learning with knowledge distillation for surface defect classification
    Manivannan, Siyamalan
    COMPUTERS & INDUSTRIAL ENGINEERING, 2023, 186