First-Person Indoor Navigation via Vision-Inertial Data Fusion

Cited by: 0
Authors
Farnoosh, Amirreza [1 ]
Nabian, Mohsen [1 ]
Closas, Pau [1 ]
Ostadabbas, Sarah [1 ]
Affiliation
[1] Northeastern Univ, Elect & Comp Engn Dept, Boston, MA 02115 USA
Keywords
Computer vision; Data fusion; Expectation maximization algorithm; Indoor navigation; Simultaneous localization and mapping (SLAM)
DOI
Not available
Chinese Library Classification
TM (Electrical Engineering); TN (Electronics and Communication Technology)
Discipline Codes
0808; 0809
Abstract
In this paper, we aim to enhance the first-person indoor navigation and scene understanding experience by fusing inertial data collected from a smartphone carried by the user with the vision information obtained through the phone's camera. We employed the concept of vanishing directions together with the orthogonality constraints of man-made environments in an expectation maximization framework to estimate the person's orientation with respect to the known indoor coordinates from video frames. This framework allows the inclusion of prior information about the camera rotation axis for better estimation, as well as the selection of candidate edge-lines for estimating hallways' depth and width from monocular video frames and for 3D modeling of the scene. Our proposed algorithm concurrently combines the vision-based estimated orientation with the inertial data using a Kalman filter in order to refine estimations and remove the substantial measurement drift of inertial sensors. We evaluated the performance of our vision-inertial data fusion method on an IMU-augmented video recorded from a rotary hallway in which a participant completed a full lap. We demonstrated that this fusion provides virtually drift-free instantaneous information about the person's relative orientation. We were able to estimate the hallways' depth and width and generate a closed-path map of the rotary hallway over a roughly 60-meter lap.
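The fusion step described in the abstract can be illustrated with a short, self-contained sketch: a scalar Kalman filter whose prediction integrates the gyroscope yaw rate (which drifts) and whose update applies the vision-based orientation obtained from vanishing directions (drift-free but noisy and not available for every frame). This is a minimal illustration under assumed noise parameters and data layout, not the authors' implementation; the function fuse_heading and all constants are hypothetical.

# A minimal sketch (not the paper's implementation) of the Kalman-filter fusion
# described above: gyroscope yaw rate drives the prediction, vision-based yaw
# from vanishing directions drives the correction. Noise variances, the data
# layout, and the function name are illustrative assumptions.
import numpy as np

def fuse_heading(gyro_rates, dt, vision_yaw, q_gyro=1e-4, r_vision=1e-2):
    """Fuse integrated gyroscope heading with per-frame vision yaw estimates.

    gyro_rates : yaw rates from the IMU (rad/s), one per time step
    dt         : sampling interval (s)
    vision_yaw : vision-based yaw measurements (rad); np.nan where no reliable
                 vanishing-direction estimate is available for that frame
    q_gyro     : assumed process-noise variance of the gyro integration
    r_vision   : assumed measurement-noise variance of the vision estimate
    """
    yaw, p = 0.0, 1.0                      # initial heading and its variance
    fused = np.empty(len(gyro_rates))
    for k, (omega, z) in enumerate(zip(gyro_rates, vision_yaw)):
        # Predict: integrate the gyro yaw rate; uncertainty grows over time.
        yaw += omega * dt
        p += q_gyro
        # Update: correct with the vision yaw whenever a frame provides one.
        if not np.isnan(z):
            innovation = np.arctan2(np.sin(z - yaw), np.cos(z - yaw))  # angle wrap
            gain = p / (p + r_vision)
            yaw += gain * innovation
            p *= 1.0 - gain
        fused[k] = yaw
    return fused

# Synthetic usage example: a biased gyro drifts, vision pulls the estimate back.
if __name__ == "__main__":
    dt, n = 0.01, 1000
    true_rate = np.full(n, 0.1)                    # constant 0.1 rad/s turn
    true_yaw = np.cumsum(true_rate) * dt
    gyro = true_rate + 0.02                        # gyro bias -> drift if integrated
    vision = true_yaw + np.random.normal(0.0, 0.05, n)
    vision[::5] = np.nan                           # vision unavailable on some frames
    est = fuse_heading(gyro, dt, vision)
    print("final heading error (rad):", est[-1] - true_yaw[-1])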
Pages: 1213-1222
Number of pages: 10
Related Papers
50 records in total
  • [21] An immersive first-person navigation task for abstract knowledge acquisition
    Kuhrt, Doerte
    St John, Natalie R.
    Bellmund, Jacob L. S.
    Kaplan, Raphael
    Doeller, Christian F.
    SCIENTIFIC REPORTS, 2021, 11 (01)
  • [22] Integrated vision/inertial navigation method of UAVs in indoor environment
    Wang T.
    Cai Z.
    Wang Y.
    Beijing Hangkong Hangtian Daxue Xuebao/Journal of Beijing University of Aeronautics and Astronautics, 2018, 44 (01): 176 - 186
  • [23] Using First-Person Data About Consciousness
    Spener, Maja
    JOURNAL OF CONSCIOUSNESS STUDIES, 2011, 18 (01) : 165 - 179
  • [24] A study on dual quaternion based cooperative relative navigation of multiple UAVs with monocular vision-inertial integration
    Byungjin LEE
    Sangkyung SUNG
    Chinese Journal of Aeronautics, 2024, 37 (11) : 335 - 354
  • [25] USABILITY OF FIRST-PERSON VISION AMONG OLDER ADULTS AND THEIR SPOUSES
    Matthews, J. T.
    Hebert, M.
    Devyver, M.
    Dodds, A.
    Mecca, L. Person
    Beach, S.
    Schulz, R.
    GERONTOLOGIST, 2011, 51 : 60 - 60
  • [26] Hand-Writing Motion Tracking with Vision-Inertial Sensor Fusion: Calibration and Error Correction
    Zhou, Shengli
    Fei, Fei
    Zhang, Guanglie
    Liu, Yunhui
    Li, Wen J.
    SENSORS, 2014, 14 (09): 15641 - 15657
  • [28] An improved inertial/wifi/magnetic fusion structure for indoor navigation
    Li, You
    Zhuang, Yuan
    Zhang, Peng
    Lan, Haiyu
    Niu, Xiaoji
    El-Sheimy, Naser
    INFORMATION FUSION, 2017, 34 : 101 - 119
  • [29] Indoor location method based on UWB and inertial navigation fusion
    Liang, Yan
    Zhang, Qingdong
    Zhao, Ning
    Li, Chuanmiao
    Hongwai yu Jiguang Gongcheng/Infrared and Laser Engineering, 2021, 50 (09)
  • [30] Indoor Navigation with a Smartphone Fusing Inertial and WiFi Data via Factor Graph Optimization
    Nowicki, Michal
    Skrzypczynski, Piotr
    MOBILE COMPUTING, APPLICATIONS, AND SERVICES (MOBICASE 2015), 2015, 162 : 280 - 298