A 3D Estimation Method Using an Omnidirectional Camera and a Spherical Mirror

Cited by: 1
Authors
Hiruta, Yuya [1 ]
Xie, Chun [2 ]
Shishido, Hidehiko [2 ]
Kitahara, Itaru [2 ]
Affiliations
[1] Univ Tsukuba, Doctoral Program Empowerment Informat, Tsukuba 3058577, Japan
[2] Univ Tsukuba, Ctr Computat Sci, Tsukuba 3058577, Japan
Source
APPLIED SCIENCES-BASEL | 2023, Vol. 13, No. 14
Keywords
3D estimation; catadioptric imaging system; omnidirectional camera; spherical mirror; epipolar geometry;
DOI
10.3390/app13148348
CLC Classification
O6 [Chemistry];
Subject Classification
0703;
Abstract
As the demand for 3D information grows across many fields, technologies for acquiring it are developing rapidly. Laser-based estimation and multi-view imaging are popular methods for sensing 3D information, and deep-learning techniques are also being developed. However, the former require precise sensing equipment or large observation systems, while the latter depend on substantial prior information in the form of extensive training datasets. Given these limitations, our research aims to develop a learning-free method that can capture a wide range of 3D information with a compact device. This paper introduces a novel approach for estimating the 3D information of an observed scene from a monocular image, based on a catadioptric imaging system that combines an omnidirectional camera and a spherical mirror. The curved mirror makes it possible to capture a large area in a single observation, while the omnidirectional camera enables a simplified imaging system. The proposed method focuses on a spherical or spherical-cap-shaped mirror in the scene and estimates the mirror's position from the captured images, allowing the scene to be estimated with great flexibility. Simulation evaluations are conducted to validate the characteristics and effectiveness of the proposed method.
Pages: 17
Related Papers
50 items total
  • [21] PRECISE 3D POSITIONING OF A ROBOTIC ARM USING A SINGLE CAMERA AND A FLAT MIRROR
    Kulpate, Chaiyapol
    Paranjape, Raman
    Mehrandezh, Mehran
    [J]. INTERNATIONAL JOURNAL OF OPTOMECHATRONICS, 2008, 2 (03) : 205 - 232
  • [22] An Accurate User Position Estimation Method Using a Single Camera for 3D Display without Glasses
    Kim, Byeoung-su
    Lee, Cho-il
    Ju, Seong-hwan
    Kim, Whoi-Yul
    [J]. IEICE TRANSACTIONS ON INFORMATION AND SYSTEMS, 2013, E96D (06): : 1344 - 1350
  • [23] Accuracy estimation of a new omnidirectional 3D vision sensor
    Orghidan, R
    Salvi, J
    Mouaddib, EM
    [J]. 2005 INTERNATIONAL CONFERENCE ON IMAGE PROCESSING (ICIP), VOLS 1-5, 2005, : 2745 - 2748
  • [24] 3D Model Based Pose Estimation For Omnidirectional Stereovision
    Caron, Guillaume
    Marchand, Eric
    Mouaddib, El Mustapha
    [J]. 2009 IEEE-RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS, 2009, : 5228 - +
  • [25] 3D measurement and camera attitude estimation method based on Trifocal Tensor
    Chen, Shengyi
    Liu, Haibo
    Yao, Linshen
    Yu, Qifeng
    [J]. OPTICAL METROLOGY AND INSPECTION FOR INDUSTRIAL APPLICATIONS IV, 2016, 10023
  • [26] 3D global optimization of calibration parameters of deflectometry system by using a spherical mirror
    Gao, Yiqian
    Tian, Ziyang
    Wei, Haoyun
    Li, Yan
    [J]. MEASUREMENT, 2023, 219
  • [28] 3D Head pose estimation and camera mouse implementation using a monocular video camera
    Nabati, Masoomeh
    Behrad, Alireza
    [J]. SIGNAL IMAGE AND VIDEO PROCESSING, 2015, 9 (01) : 39 - 44
  • [29] Estimation of camera pose using 2D to 3D corner correspondence
    Shi, FH
    Liu, YC
    [J]. ITCC 2004: INTERNATIONAL CONFERENCE ON INFORMATION TECHNOLOGY: CODING AND COMPUTING, VOL 2, PROCEEDINGS, 2004, : 805 - 809
  • [30] 3D Pose Estimation of Daily Objects Using an RGB-D Camera
    Choi, Changhyun
    Christensen, Henrik I.
    [J]. 2012 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS), 2012, : 3342 - 3349