Test-time Domain Adaptation for Monocular Depth Estimation

Cited by: 2
Authors
Li, Zhi [1 ,2 ]
Shi, Shaoshuai [1]
Schiele, Bernt [1 ]
Dai, Dengxin [1 ]
Affiliations
[1] Max Planck Inst Informat, Saarbrücken, Germany
[2] Saarland Univ Campus, Saarbrücken, Germany
DOI
10.1109/ICRA48891.2023.10161304
Chinese Library Classification
TP [Automation Technology, Computer Technology]
Subject Classification Code
0812
Abstract
Test-time domain adaptation, i.e. adapting source-pretrained models to the test data on the fly in a source-free, unsupervised manner, is a highly practical yet very challenging task. Due to the domain gap between source and target data, inference quality on the target domain can drop drastically, especially in terms of the absolute scale of depth. In addition, unsupervised adaptation can degrade model performance due to inaccurate pseudo labels. Furthermore, the model can suffer from catastrophic forgetting as errors accumulate over time. We propose a test-time domain adaptation framework for monocular depth estimation that achieves both stability and adaptation performance by combining self-training of the supervised branch with pseudo labels from the self-supervised branch, and that tackles the above problems: our scale alignment scheme aligns the input features between source and target data, correcting the absolute-scale inference on the target domain; a pseudo-label consistency check selects confident pixels, improving pseudo-label quality; and regularisation and self-training schemes help avoid catastrophic forgetting. Without requiring further supervision on the target domain, our method adapts source-trained models to the test data with significant improvements over direct inference, producing scale-aware depth map outputs that outperform the state of the art. Code is available at https://github.com/Malefikus/ada-depth.
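To make the abstract's three ingredients concrete, below is a minimal PyTorch sketch of one online adaptation step. It is not the authors' implementation (see the linked repository for that): all names (`adaptation_step`, `scale_align`, `consistency_mask`, `source_median`, `pseudo_depth`, `reg_weight`) are hypothetical, and a simple output-median rescaling stands in for the paper's feature-level scale alignment.

```python
# Illustrative sketch only, NOT the method from the paper; all names and
# hyperparameters here are assumptions for the sake of a runnable example.
import torch
import torch.nn.functional as F

def scale_align(pred, source_median):
    # Hypothetical scale alignment: rescale the target-domain depth map so
    # its median matches a statistic remembered from the source domain
    # (a stand-in for the paper's feature-level alignment).
    return pred * (source_median / pred.median().clamp(min=1e-6))

def consistency_mask(pred, pseudo, rel_tol=0.1):
    # Pseudo-label consistency check: keep only pixels where the supervised
    # prediction and the self-supervised pseudo label agree within rel_tol.
    rel_err = (pred - pseudo).abs() / pseudo.clamp(min=1e-6)
    return rel_err < rel_tol

def adaptation_step(model, source_model, image, pseudo_depth,
                    optimizer, source_median, reg_weight=1.0):
    # One unsupervised adaptation step on a single test batch.
    # source_model is a frozen copy of the pre-adaptation weights.
    pred = scale_align(model(image), source_median)   # (B, 1, H, W)
    mask = consistency_mask(pred.detach(), pseudo_depth)

    # Self-training loss on the confident pixels only.
    loss = pred.new_zeros(())
    if mask.any():
        loss = F.l1_loss(pred[mask], pseudo_depth[mask])

    # L2 pull-back toward the frozen source weights: a common guard
    # against catastrophic forgetting during online adaptation.
    for p, p_src in zip(model.parameters(), source_model.parameters()):
        loss = loss + reg_weight * (p - p_src.detach()).pow(2).sum()

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return pred.detach()
```

In a streaming setting one would call `adaptation_step` once per incoming frame, with `pseudo_depth` produced by the self-supervised branch; how that branch generates pseudo labels is an assumption here, not a detail taken from the abstract.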
Pages: 4873-4879
Page count: 7
Related Papers (50 in total)
  • [1] Continual Test-Time Domain Adaptation
    Wang, Qin
    Fink, Olga
    Van Gool, Luc
    Dai, Dengxin
    2022 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2022, : 7191 - 7201
  • [2] Improved Test-Time Adaptation for Domain Generalization
    Chen, Liang
    Zhang, Yong
    Song, Yibing
    Shan, Ying
    Liu, Lingqiao
    2023 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2023, : 24172 - 24182
  • [3] Domain Alignment Meets Fully Test-Time Adaptation
    Thopalli, Kowshik
    Turaga, Pavan
    Thiagarajan, Jayaraman J.
PROCEEDINGS OF MACHINE LEARNING RESEARCH, 2022, 189 : 1006 - 1021
  • [4] Category-Aware Test-Time Training Domain Adaptation
    Feng, Yangqin
    Xu, Xinxing
    Fu, Huazhu
    Wang, Yan
    Wang, Zizhou
    Zhen, Liangli
    Goh, Rick Siow Mong
    Liu, Yong
    2024 IEEE CONFERENCE ON ARTIFICIAL INTELLIGENCE, CAI 2024, 2024, : 300 - 306
  • [5] Multiple Teacher Model for Continual Test-Time Domain Adaptation
    Wang, Ran
    Zuo, Hua
    Fang, Zhen
    Lu, Jie
    ADVANCES IN ARTIFICIAL INTELLIGENCE, AI 2023, PT I, 2024, 14471 : 304 - 314
  • [6] Exploring Safety Supervision for Continual Test-time Domain Adaptation
    Yang, Xu
    Gu, Yanan
    Wei, Kun
    Deng, Cheng
    PROCEEDINGS OF THE THIRTY-SECOND INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, IJCAI 2023, 2023, : 1649 - 1657
  • [7] Online Adaptive Fault Diagnosis With Test-Time Domain Adaptation
    Wu, Kangkai
    Li, Jingjing
    Meng, Lichao
    Li, Fengling
    Lu, Ke
    IEEE TRANSACTIONS ON INDUSTRIAL INFORMATICS, 2025, 21 (01) : 107 - 117
  • [8] Noise-Robust Continual Test-Time Domain Adaptation
    Yu, Zhiqi
    Li, Jingjing
    Du, Zhekai
    Li, Fengling
    Zhu, Lei
    Yang, Yang
    PROCEEDINGS OF THE 31ST ACM INTERNATIONAL CONFERENCE ON MULTIMEDIA, MM 2023, 2023, : 2654 - 2662
  • [9] Test-Time Optimization for Video Depth Estimation Using Pseudo Reference Depth
    Zeng, Libing
    Kalantari, Nima Khademi
    COMPUTER GRAPHICS FORUM, 2023, 42 (01) : 195 - 205
  • [10] Test-Time Synthetic-to-Real Adaptive Depth Estimation
    Yi, Eojindl
    Kim, Junmo
    2023 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION, ICRA, 2023, : 4938 - 4944