Device adaptation free-KDA based on multi-teacher knowledge distillation

Cited by: 0
Authors
Yafang Yang
Bin Guo
Yunji Liang
Kaixing Zhao
Zhiwen Yu
Affiliations
[1] Northwestern Polytechnical University, School of Computer Science
Keywords
Device adaptation; Free-KDA; Multi-teacher knowledge distillation; One-class classification; Adaptive confidence gate;
DOI
10.1007/s12652-024-04836-5
Abstract
The keyboard, a major means of interaction between humans and internet devices, should be properly set up to perform well during authentication tasks. To guarantee that a legitimate user can interleave or simultaneously interact with two or more devices while protecting user privacy, it is essential to build a device adaptation free-text keystroke dynamics authentication (free-KDA) model based on multi-teacher knowledge distillation. Several multi-teacher knowledge distillation methods have proven effective for C-way classification tasks. However, they are ill-suited to free-KDA, because free-KDA is a one-class classification task. Instead of using soft labels to transfer useful knowledge from source devices to the target device, we propose a device adaptation free-KDA model. When a user builds the authentication model for a target device with limited training samples, we introduce a novel optimization objective that reduces the distance discrepancy, measured by Euclidean distance and cosine similarity, between source and target devices. We then adopt an adaptive confidence gate strategy to handle the fact that, for each user, different source devices correlate differently with the target device. The model is verified on two keystroke datasets collected with different types of keyboards, and its performance is compared with existing dominant multi-teacher knowledge distillation methods. Extensive experimental results demonstrate that the AUC on the target device reaches up to 95.17%, which is 15.28% higher than state-of-the-art multi-teacher knowledge distillation methods.
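The two ideas summarized in the abstract can be illustrated with a minimal sketch, assuming PyTorch-style code: a distance-discrepancy loss combining Euclidean distance and cosine similarity between target-device (student) and source-device (teacher) embeddings, and an adaptive confidence gate that weights each source device. The feature dimensions, the softmax gate, and the temperature below are hypothetical assumptions, not the authors' implementation.

# Illustrative sketch only; shapes, gating scheme, and temperature are assumptions.
import torch
import torch.nn.functional as F

def distance_discrepancy(student_feat, teacher_feat):
    # Per-sample Euclidean distance plus cosine dissimilarity between embeddings.
    euclidean = torch.norm(student_feat - teacher_feat, dim=1)                 # (B,)
    cosine_gap = 1.0 - F.cosine_similarity(student_feat, teacher_feat, dim=1)  # (B,)
    return euclidean + cosine_gap

def gated_multi_teacher_loss(student_feat, teacher_feats, temperature=1.0):
    # student_feat: (B, D) target-device embeddings; teacher_feats: list of (B, D)
    # source-device embeddings. A (hypothetical) softmax gate assigns higher weight
    # to source devices whose embeddings lie closer to the target device.
    per_teacher = torch.stack(
        [distance_discrepancy(student_feat, t).mean() for t in teacher_feats]
    )                                                                           # (T,)
    gates = F.softmax(-per_teacher / temperature, dim=0)                        # (T,)
    return (gates * per_teacher).sum()

# Toy usage: 3 source-device teachers, batch of 8 samples, 64-dim features.
student = torch.randn(8, 64, requires_grad=True)
teachers = [torch.randn(8, 64) for _ in range(3)]
loss = gated_multi_teacher_loss(student, teachers)
loss.backward()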
Pages: 3603 - 3615
Page count: 12
Related Papers
50 records in total
  • [1] Decoupled Multi-teacher Knowledge Distillation based on Entropy
    Cheng, Xin
    Tang, Jialiang
    Zhang, Zhiqiang
    Yu, Wenxin
    Jiang, Ning
    Zhou, Jinjia
    [J]. 2024 IEEE INTERNATIONAL SYMPOSIUM ON CIRCUITS AND SYSTEMS, ISCAS 2024, 2024,
  • [2] Anomaly detection based on multi-teacher knowledge distillation
    Ma, Ye
    Jiang, Xu
    Guan, Nan
    Yi, Wang
    [J]. JOURNAL OF SYSTEMS ARCHITECTURE, 2023, 138
  • [3] Correlation Guided Multi-teacher Knowledge Distillation
    Shi, Luyao
    Jiang, Ning
    Tang, Jialiang
    Huang, Xinlei
    [J]. NEURAL INFORMATION PROCESSING, ICONIP 2023, PT IV, 2024, 14450 : 562 - 574
  • [4] Reinforced Multi-Teacher Selection for Knowledge Distillation
    Yuan, Fei
    Shou, Linjun
    Pei, Jian
    Lin, Wutao
    Gong, Ming
    Fu, Yan
    Jiang, Daxin
    [J]. THIRTY-FIFTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THIRTY-THIRD CONFERENCE ON INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE AND THE ELEVENTH SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2021, 35 : 14284 - 14291
  • [5] Knowledge Distillation via Multi-Teacher Feature Ensemble
    Ye, Xin
    Jiang, Rongxin
    Tian, Xiang
    Zhang, Rui
    Chen, Yaowu
    [J]. IEEE SIGNAL PROCESSING LETTERS, 2024, 31 : 566 - 570
  • [6] CONFIDENCE-AWARE MULTI-TEACHER KNOWLEDGE DISTILLATION
    Zhang, Hailin
    Chen, Defang
    Wang, Can
    [J]. 2022 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2022, : 4498 - 4502
  • [7] Adaptive multi-teacher multi-level knowledge distillation
    Liu, Yuang
    Zhang, Wei
    Wang, Jun
    [J]. NEUROCOMPUTING, 2020, 415 : 106 - 113
  • [8] Robust Semantic Segmentation With Multi-Teacher Knowledge Distillation
    Amirkhani, Abdollah
    Khosravian, Amir
    Masih-Tehrani, Masoud
    Kashiani, Hossein
    [J]. IEEE ACCESS, 2021, 9 : 119049 - 119066
  • [9] Adaptive Multi-Teacher Knowledge Distillation with Meta-Learning
    Zhang, Hailin
    Chen, Defang
    Wang, Can
    [J]. 2023 IEEE INTERNATIONAL CONFERENCE ON MULTIMEDIA AND EXPO, ICME, 2023, : 1943 - 1948