Deep Learning-Based 6-DoF Object Pose Estimation Considering Synthetic Dataset

Citations: 0
Authors
Zheng, Tianyu [1 ]
Zhang, Chunyan [1 ]
Zhang, Shengwen [1 ]
Wang, Yanyan [1 ]
Affiliations
[1] Jiangsu Univ Sci & Technol, Sch Mech Engineer, Zhenjiang 212100, Peoples R China
Keywords
6-DoF object pose estimation; synthetic dataset; deep learning; bilateral filtering; CBAM-CDAE;
DOI
10.3390/s23249854
Chinese Library Classification
O65 [Analytical Chemistry];
Discipline Codes
070302 ; 081704 ;
Abstract
Due to the difficulty of generating 6-Degree-of-Freedom (6-DoF) object pose estimation datasets, and the domain gap between synthetic and real data, existing pose estimation methods face challenges in improving accuracy and generalization. This paper proposes a methodology that employs higher-quality datasets and deep learning-based methods to reduce the domain gap between synthetic and real data and enhance the accuracy of pose estimation. The high-quality dataset is obtained from BlenderProc and is processed with bilateral filtering to narrow the domain gap. A novel attention-based mask region-based convolutional neural network (R-CNN) is proposed to reduce computation cost and improve detection accuracy. Meanwhile, an improved feature pyramid network (iFPN) is obtained by adding a bottom-up path that propagates low-level features upward. A novel convolutional block attention module-convolutional denoising autoencoder (CBAM-CDAE) network is then proposed, introducing channel- and spatial-attention mechanisms to improve the autoencoder's ability to extract image features. Finally, an accurate 6-DoF object pose is obtained through pose refinement. The proposed approach is compared to other models on the T-LESS and LineMOD datasets, and the results demonstrate that it outperforms the other estimation models.
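The abstract's dataset-processing step applies bilateral filtering to the rendered BlenderProc images so they look statistically closer to real camera output (smoothing noise while keeping object edges sharp). The paper's own implementation and parameters are not given here; as a rough illustration only, a naive NumPy bilateral filter over a grayscale image could look like the sketch below, where `radius`, `sigma_s`, and `sigma_r` are arbitrary example values, not the authors' settings.

```python
import numpy as np

def bilateral_filter(img, radius=2, sigma_s=2.0, sigma_r=25.0):
    """Naive bilateral filter (illustrative, not the paper's implementation).

    Each output pixel is a weighted mean of its neighbourhood, weighted by
    BOTH spatial distance (sigma_s) and intensity difference (sigma_r), so
    flat regions are smoothed while strong edges are preserved.
    """
    h, w = img.shape
    out = np.zeros((h, w), dtype=np.float64)
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    # Spatial (domain) kernel: fixed Gaussian over pixel offsets.
    spatial = np.exp(-(xs**2 + ys**2) / (2.0 * sigma_s**2))
    padded = np.pad(img.astype(np.float64), radius, mode="edge")
    for i in range(h):
        for j in range(w):
            patch = padded[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
            # Range kernel: down-weights neighbours whose intensity differs
            # strongly from the centre pixel -- this is what preserves edges.
            rng = np.exp(-((patch - img[i, j]) ** 2) / (2.0 * sigma_r**2))
            weights = spatial * rng
            out[i, j] = np.sum(weights * patch) / np.sum(weights)
    return out
```

In practice one would use an optimized implementation (e.g. OpenCV's `cv2.bilateralFilter`) per channel of the RGB renders; the loop version above just makes the two-kernel weighting explicit.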
Pages: 24
Related Papers
50 records total
  • [41] Deep Learning-based Mobile Robot Target Object Localization and Pose Estimation Research
    He, Caixia
    He, Laiyun
    INTERNATIONAL JOURNAL OF ADVANCED COMPUTER SCIENCE AND APPLICATIONS, 2023, 14 (06) : 1325 - 1333
  • [42] Multi-Modal Pose Representations for 6-DOF Object Tracking
    Majcher, Mateusz
    Kwolek, Bogdan
    JOURNAL OF INTELLIGENT & ROBOTIC SYSTEMS, 2024, 110 (04)
  • [43] Review of Deep Learning-Based Human Pose Estimation
    Lu Jian
    Yang Tengfei
    Zhao Bo
    Wang Hangying
    Luo Maoxin
    Zhou Yanran
    Li Zhe
    LASER & OPTOELECTRONICS PROGRESS, 2021, 58 (24)
  • [44] Deep Learning-based Human Pose Estimation: A Survey
    Zheng, Ce
    Wu, Wenhan
    Chen, Chen
    Yang, Taojiannan
    Zhu, Sijie
    Shen, Ju
    Kehtarnavaz, Nasser
    Shah, Mubarak
    ACM COMPUTING SURVEYS, 2024, 56 (01)
  • [45] Enhanced vision-based 6-DoF pose estimation for robotic rebar tying
    Liu, Mi
    Guo, Jingjing
    Deng, Lu
    Wang, Songyue
    Wang, Huiguang
    AUTOMATION IN CONSTRUCTION, 2025, 171
  • [46] Accurate estimation of 6-DoF tooth pose in 3D intraoral scans for dental applications using deep learning
    Ding, Wanghui
    Sun, Kaiwei
    Yu, Mengfei
    Lin, Hangzheng
    Feng, Yang
    Li, Jianhua
    Liu, Zuozhu
    FRONTIERS OF INFORMATION TECHNOLOGY & ELECTRONIC ENGINEERING, 2024, 25 (09) : 1240 - 1249
  • [47] 6-DOF Pose Estimation of a Portable Navigation Aid for the Visually Impaired
    Tamjidi, Amirhossein
    Ye, Cang
    Hong, Soonhac
    2013 IEEE INTERNATIONAL SYMPOSIUM ON ROBOTIC AND SENSORS ENVIRONMENTS (ROSE 2013), 2013,
  • [48] 3D pose estimation dataset and deep learning-based ergonomic risk assessment in construction
    Fan, Chao
    Mei, Qipei
    Li, Xinming
    AUTOMATION IN CONSTRUCTION, 2024, 164
  • [49] Robust Ego and Object 6-DoF Motion Estimation and Tracking
    Zhang, Jun
    Henein, Mina
    Mahony, Robert
    Ila, Viorela
    2020 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS), 2020, : 5017 - 5023
  • [50] Pose-guided Auto-Encoder and Feature-Based Refinement for 6-DoF Object Pose Regression
    Li, Zhigang
    Ji, Xiangyang
    2020 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA), 2020, : 8397 - 8403