TTA-COPE: Test-Time Adaptation for Category-Level Object Pose Estimation

Cited by: 14
Authors
Lee, Taeyeop [1 ]
Tremblay, Jonathan [2 ]
Blukis, Valts [2 ]
Wen, Bowen [2 ]
Lee, Byeong-Uk [1 ]
Shin, Inkyu [1 ]
Birchfield, Stan [2 ]
Kweon, In So [1 ]
Yoon, Kuk-Jin [1 ]
Institutions
[1] Korea Adv Inst Sci & Technol, Daejeon, South Korea
[2] NVIDIA, San Francisco, CA USA
Keywords
DOI
10.1109/CVPR52729.2023.02039
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Test-time adaptation methods have been gaining attention recently as a practical solution for addressing source-to-target domain gaps by gradually updating the model without requiring labels on the target data. In this paper, we propose a method of test-time adaptation for category-level object pose estimation called TTA-COPE. We design a pose ensemble approach with a self-training loss using pose-aware confidence. Unlike previous unsupervised domain adaptation methods for category-level object pose estimation, our approach processes the test data in a sequential, online manner, and it does not require access to the source domain at runtime. Extensive experimental results demonstrate that the proposed pose ensemble and the self-training loss improve category-level object pose performance during test time under both semi-supervised and unsupervised settings.
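The abstract describes an online loop: for each unlabeled target sample, a pose ensemble produces a pseudo-label, a pose-aware confidence gates which pseudo-labels are trusted, and the model is updated by a self-training loss. The toy sketch below illustrates that generic pattern only; the 1-D "pose", the linear model, the agreement-based confidence proxy, and all names (`tta_step`, `tau`, the EMA teacher) are illustrative assumptions, not the authors' actual architecture or loss.

```python
import random

class LinearPoseModel:
    """Toy stand-in for a pose estimator: predicts a 1-D 'pose' from a scalar."""
    def __init__(self, w):
        self.w = w
    def predict(self, x):
        return self.w * x

def confidence(pred_a, pred_b, scale=1.0):
    """Confidence proxy: agreement between the two ensemble members."""
    from math import exp
    return exp(-abs(pred_a - pred_b) / scale)

def tta_step(student, teacher, x, lr=0.05, tau=0.5, ema=0.99):
    """One online adaptation step on a single unlabeled target sample."""
    s_pred = student.predict(x)
    t_pred = teacher.predict(x)
    conf = confidence(s_pred, t_pred)
    if conf >= tau:                        # keep only confident pseudo-labels
        pseudo = 0.5 * (s_pred + t_pred)   # ensemble average as pseudo-label
        # gradient of 0.5*conf*(s_pred - pseudo)**2 w.r.t. student.w,
        # treating the pseudo-label as a constant
        student.w -= lr * conf * (s_pred - pseudo) * x
    # slow exponential-moving-average update keeps the teacher stable
    teacher.w = ema * teacher.w + (1 - ema) * student.w
    return conf

random.seed(0)
student = LinearPoseModel(w=1.5)   # two ensemble members after source training
teacher = LinearPoseModel(w=1.2)
for _ in range(200):                # sequential, unlabeled target stream
    x = random.uniform(0.5, 2.0)
    tta_step(student, teacher, x)
```

After the stream is processed the two members have converged toward a consensus, mirroring the idea that confident ensemble agreement, not source labels, drives adaptation at test time.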
Pages: 21285-21295
Page count: 11
Related Papers
50 items in total
  • [31] Category-Level Articulated Object 9D Pose Estimation via Reinforcement Learning
    Liu, Liu
    Du, Jianming
    Wu, Hao
    Yang, Xun
    Liu, Zhenguang
    Hong, Richang
    Wang, Meng
    PROCEEDINGS OF THE 31ST ACM INTERNATIONAL CONFERENCE ON MULTIMEDIA, MM 2023, 2023, : 728 - 736
  • [32] Dual-COPE: A novel prior-based category-level object pose estimation network with dual Sim2Real unsupervised domain adaptation module
    Ren, Xi
    Guo, Nan
    Zhu, Zichen
    Jiang, Xinbei
    COMPUTERS & GRAPHICS-UK, 2024, 124
  • [33] Generative Category-Level Shape and Pose Estimation with Semantic Primitives
    Li, Guanglin
    Li, Yifeng
    Ye, Zhichao
    Zhang, Qihang
    Kong, Tao
    Cui, Zhaopeng
    Zhang, Guofeng
    CONFERENCE ON ROBOT LEARNING, VOL 205, 2022, 205 : 1390 - 1400
  • [34] Synthetic Depth Image-Based Category-Level Object Pose Estimation With Effective Pose Decoupling and Shape Optimization
    Yu, Sheng
    Zhai, Di-Hua
    Xia, Yuanqing
    IEEE TRANSACTIONS ON INSTRUMENTATION AND MEASUREMENT, 2024, 73 : 1 - 1
  • [35] SSP-Pose: Symmetry-Aware Shape Prior Deformation for Direct Category-Level Object Pose Estimation
    Zhang, Ruida
    Di, Yan
    Manhardt, Fabian
    Tombari, Federico
    Ji, Xiangyang
    2022 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS), 2022, : 7452 - 7459
  • [36] GSNet: Model Reconstruction Network for Category-level 6D Object Pose and Size Estimation
    Liu, Penglei
    Zhang, Qieshi
    Cheng, Jun
    2023 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION, ICRA, 2023, : 2898 - 2904
  • [37] LaPose: Laplacian Mixture Shape Modeling for RGB-Based Category-Level Object Pose Estimation
    Zhang, Ruida
    Huang, Ziqin
    Wang, Gu
    Zhang, Chenyangguang
    Di, Yan
    Zuo, Xingxing
    Tang, Jiwen
    Ji, Xiangyang
    COMPUTER VISION - ECCV 2024, PT XXV, 2025, 15083 : 467 - 484
  • [38] CLIPose: Category-Level Object Pose Estimation With Pre-Trained Vision-Language Knowledge
    Lin, Xiao
    Zhu, Minghao
    Dang, Ronghao
    Zhou, Guangliang
    Shu, Shaolong
    Lin, Feng
    Liu, Chengju
    Chen, Qijun
    IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS FOR VIDEO TECHNOLOGY, 2024, 34 (10) : 9125 - 9138
  • [39] Category-Level 6-D Object Pose Estimation With Shape Deformation for Robotic Grasp Detection
    Yu, Sheng
    Zhai, Di-Hua
    Guan, Yuyin
    Xia, Yuanqing
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2025, 36 (01) : 1857 - 1871
  • [40] Keypoint-Based Category-Level Object Pose Tracking from an RGB Sequence with Uncertainty Estimation
    Lin, Yunzhi
    Tremblay, Jonathan
    Tyree, Stephen
    Vela, Patricio A.
    Birchfield, Stan
    2022 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA 2022), 2022,