Test-time adaptation via self-training with future information

Cited by: 0
Authors
Wen, Xin [1 ]
Shen, Hao [1 ]
Zhao, Zhongqiu [1 ,2 ,3 ]
Affiliations
[1] Hefei University of Technology, School of Computer Science and Information Engineering, Hefei, People's Republic of China
[2] Hefei University of Technology, Intelligent Interconnected Systems Laboratory of Anhui Province, Hefei, People's Republic of China
[3] Guangxi Academy of Sciences, Nanning, People's Republic of China
Funding
National Natural Science Foundation of China
Keywords
domain adaptation; test-time adaptation; self-training; distribution shift; image classification;
DOI
10.1117/1.JEI.33.3.033012
CLC Classification
TM [Electrical Engineering]; TN [Electronic Technology, Communication Technology]
Discipline Codes
0808; 0809
Abstract
Test-time adaptation (TTA) aims to address potential differences in data distribution between the training and testing phases by modifying a pretrained model based on each specific test sample. This process is especially crucial for deep learning models, as they often encounter frequent changes in the testing environment. Popular TTA methods currently rely primarily on pseudo-labels (PLs) as supervision signals and fine-tune the model through backpropagation, so the success of the adaptation depends directly on the quality of the PLs: high-quality PLs can enhance the model's performance, whereas low-quality ones may lead to poor adaptation results. Intuitively, if the PLs predicted by the model for a given sample remain consistent in both the current and future states, this suggests higher confidence in that prediction, and using such consistent PLs as supervision signals can greatly benefit long-term adaptation. Nevertheless, this approach may induce overconfidence in the model's predictions, so we introduce a regularization term that penalizes overly confident predictions. Our proposed method is highly versatile and can be seamlessly integrated with various TTA strategies, making it highly practical. We evaluate different TTA methods on three widely used benchmarks (CIFAR10C, CIFAR100C, and ImageNetC) under different scenarios and show that our method achieves competitive or state-of-the-art accuracy on all of them.
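The abstract describes self-training on pseudo-labels that stay consistent between the model's current state and a "future" state, combined with a regularizer against overconfident predictions. The following is a minimal PyTorch sketch of that general idea under our own assumptions (a one-step look-ahead update as the "future" state, an entropy-based confidence penalty, and illustrative names such as adapt_step and lambda_conf); it is not the authors' implementation.

```python
# Sketch of self-training TTA with future-consistent pseudo-labels and a
# confidence penalty. All hyperparameters and names here are illustrative.
import copy
import torch
import torch.nn.functional as F

def adapt_step(model, optimizer, x, lambda_conf=0.1):
    """Run one test-time adaptation step on a test batch x."""
    # 1) Pseudo-labels from the current model state.
    with torch.no_grad():
        current_pl = model(x).argmax(dim=1)

    # 2) "Future" state: tentatively update a copy of the model on its own
    #    pseudo-labels, then predict again.
    future_model = copy.deepcopy(model)
    future_opt = torch.optim.SGD(future_model.parameters(), lr=1e-3)
    F.cross_entropy(future_model(x), current_pl).backward()
    future_opt.step()
    with torch.no_grad():
        future_pl = future_model(x).argmax(dim=1)

    # 3) Keep only samples whose pseudo-labels agree now and in the future.
    consistent = current_pl.eq(future_pl)

    # 4) Self-training loss on consistent pseudo-labels plus a confidence
    #    penalty: subtracting the entropy discourages overconfident outputs.
    logits = model(x)
    probs = logits.softmax(dim=1)
    entropy = -(probs * probs.clamp_min(1e-8).log()).sum(dim=1)
    if consistent.any():
        ce = F.cross_entropy(logits[consistent], current_pl[consistent])
    else:
        ce = logits.sum() * 0.0  # no consistent samples; skip the CE term
    loss = ce - lambda_conf * entropy.mean()

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

In this sketch the look-ahead is a single SGD step on a deep copy of the model; how the "future" state is defined, and how the confidence penalty is weighted, are design choices that the paper itself specifies.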
Pages: 18
Related Papers
50 records in total
  • [1] TeST: Test-time Self-Training under Distribution Shift
    Sinha, Samarth
    Gehler, Peter
    Locatello, Francesco
    Schiele, Bernt
    2023 IEEE/CVF WINTER CONFERENCE ON APPLICATIONS OF COMPUTER VISION (WACV), 2023, : 2758 - 2768
  • [2] Revisiting Realistic Test-Time Training: Sequential Inference and Adaptation by Anchored Clustering Regularized Self-Training
    Su, Yongyi
    Xu, Xun
    Li, Tianrui
    Jia, Kui
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2024, 46 (08) : 5524 - 5540
  • [3] Towards Real-World Test-Time Adaptation: Tri-net Self-Training with Balanced Normalization
    Su, Yongyi
    Xu, Xun
    Jia, Kui
    THIRTY-EIGHTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 38 NO 13, 2024, : 15126 - 15135
  • [4] On the Robustness of Open-World Test-Time Training: Self-Training with Dynamic Prototype Expansion
    Li, Yushu
    Xu, Xun
    Su, Yongyi
    Jia, Kui
    2023 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV 2023), 2023, : 11802 - 11812
  • [5] Adversarial Domain Adaptation Enhanced via Self-training
    Altinel, Fazil
    Akkaya, Ibrahim Batuhan
    29TH IEEE CONFERENCE ON SIGNAL PROCESSING AND COMMUNICATIONS APPLICATIONS (SIU 2021), 2021,
  • [6] MT3: Meta Test-Time Training for Self-Supervised Test-Time Adaption
    Bartler, Alexander
    Buehler, Andre
    Wiewel, Felix
    Doebler, Mario
    Yang, Bin
    INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 151, 2022, 151
  • [7] Contrastive Test-Time Adaptation
    Chen, Dian
    Wang, Dequan
    Darrell, Trevor
    Ebrahimi, Sayna
    2022 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR 2022), 2022, : 295 - 305
  • [8] Self-supervised Test-time Adaptation on Video Data
    Azimi, Fatemeh
    Palacio, Sebastian
    Raue, Federico
    Hees, Joern
    Bertinetto, Luca
    Dengel, Andreas
    2022 IEEE WINTER CONFERENCE ON APPLICATIONS OF COMPUTER VISION (WACV 2022), 2022, : 2603 - 2612
  • [9] Test-Time Adaptation via Conjugate Pseudo-labels
    Goyal, Sachin
    Sun, Mingjie
    Raghunathan, Aditi
    Kolter, Zico
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35 (NEURIPS 2022), 2022,
  • [10] ClusT3: Information Invariant Test-Time Training
    Hakim, Gustavo A. Vargas
    Osowiechi, David
    Noori, Mehrdad
    Cheraghalikhani, Milad
    Bahri, Ali
    Ben Ayed, Ismail
    Desrosiers, Christian
    2023 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION, ICCV, 2023, : 6113 - 6122