HeadTrack: Real-Time Human-Computer Interaction via Wireless Earphones

Cited by: 1
Authors
Hu, Jingyang [1 ,2 ]
Jiang, Hongbo [1 ,2 ]
Xiao, Zhu [1 ,2 ]
Chen, Siyu [1 ,2 ]
Dustdar, Schahram [3 ]
Liu, Jiangchuan [4 ,5 ]
Affiliations
[1] Hunan Univ, Coll Comp Sci & Elect Engn, Changsha 410082, Peoples R China
[2] Hunan Univ, Shenzhen Res Inst, Shenzhen 518055, Peoples R China
[3] TU Wien, Res Div Distributed Syst, A-1040 Vienna, Austria
[4] Simon Fraser Univ, Sch Comp Sci, Burnaby, BC V5A 1S6, Canada
[5] Jiangxing Intelligent Res & Dev Dept Inc, Nanjing 210000, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Human-computer interaction; acoustic sensing; acoustic ranging; head motion tracking; HEAD POSE ESTIMATION;
DOI
10.1109/JSAC.2023.3345381
Chinese Library Classification (CLC)
TM [Electrical Engineering]; TN [Electronics and Communication Technology];
Subject classification code
0808; 0809;
Abstract
Accurate head-movement tracking is crucial for virtual reality and Metaverse applications in ubiquitous human-computer interaction (HCI). Existing head-tracking approaches based on wearable VR kits or wireless signals require expensive devices and heavy algorithmic processing. To address this problem, we propose HeadTrack, a low-cost, high-precision head motion tracking system that uses commercially available wireless earphones to capture the user's head motion in real time. HeadTrack uses a smartphone as a 'sound anchor' that emits inaudible chirps picked up by the user's wireless earphones. By measuring the time-of-flight of these signals from the smartphone to each microphone on the earphones, we can deduce the user's face orientation and distance relative to the smartphone, enabling accurate tracking of the user's head movement. To realize HeadTrack, we use a cross-correlation method to optimize Frequency-Modulated Continuous Wave (FMCW) based acoustic ranging, which overcomes the limited bandwidth of wireless earphones. Moreover, we address the asynchronous startup times of the devices and the sampling frequency offset between them. We conduct extensive experiments in real-world scenarios, and the results show that HeadTrack can continuously track the user's head orientation with an average error under 6.3° in pitch and 4.9° in yaw.
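To illustrate the core ranging idea described in the abstract, the sketch below generates an inaudible linear FMCW chirp and estimates its time-of-flight at a microphone by cross-correlating the recording with the known transmitted chirp. This is a minimal sketch, not the authors' implementation: the sample rate, chirp band, chirp duration, and the simulated delay are assumed values, and the paper's handling of device asynchrony and sampling frequency offset is not modeled here.

```python
# Minimal sketch (assumed parameters, not taken from the paper):
# estimate the time-of-flight of a known inaudible chirp via cross-correlation.
import numpy as np
from scipy.signal import chirp, correlate

FS = 48_000                  # assumed sample rate (Hz)
T = 0.02                     # assumed chirp duration (s)
F0, F1 = 18_000, 22_000      # assumed inaudible sweep band (Hz)

t = np.arange(int(FS * T)) / FS
tx = chirp(t, f0=F0, f1=F1, t1=T, method="linear")   # known transmitted chirp

def estimate_tof(rx, template=tx, fs=FS):
    """Return the delay (seconds) that maximizes the cross-correlation
    between the received signal rx and the known chirp template."""
    corr = correlate(rx, template, mode="full")
    lag = np.argmax(np.abs(corr)) - (len(template) - 1)  # delay in samples
    return lag / fs

# Toy usage: simulate a ~1.5 ms propagation delay (~0.5 m at 343 m/s) plus noise.
delay = int(0.0015 * FS)
rx = np.concatenate([np.zeros(delay), tx, np.zeros(200)])
rx = rx + 0.01 * np.random.randn(len(rx))
print(f"estimated ToF: {estimate_tof(rx) * 1e3:.2f} ms")
```

With one such estimate per earphone microphone, the difference between the two acoustic path lengths is what allows the face orientation and distance relative to the smartphone to be inferred, as described in the abstract.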
Pages: 990 - 1002
Page count: 13
Related papers
50 records in total
  • [1] Real-Time Human-Computer Interaction Using Eye Gazes
    Chen, Haodong
    Zendehdel, Niloofar
    Leu, Ming C.
    Yin, Zhaozheng
    MANUFACTURING LETTERS, 2023, 35 : 883 - 894
  • [2] A Novel Real-Time Eye Detection in Human-Computer Interaction
    Yan, Chao
    Wang, Yuanqing
    Zhang, Zhaoyang
    INNOVATIVE COMPUTING AND INFORMATION, PT II, 2011, 232 : 530 - +
  • [3] A Novel Real-Time Eye Detection in Human-Computer Interaction
    Yan, Chao
    Wang, Yuanqing
    Zhang, Zhaoyang
    2010 SECOND INTERNATIONAL CONFERENCE ON E-LEARNING, E-BUSINESS, ENTERPRISE INFORMATION SYSTEMS, AND E-GOVERNMENT (EEEE 2010), VOL I, 2010, : 57 - 62
  • [4] Real-Time Continuous Gesture Recognition for Natural Human-Computer Interaction
    Yin, Ying
    Davis, Randall
    2014 IEEE SYMPOSIUM ON VISUAL LANGUAGES AND HUMAN-CENTRIC COMPUTING (VL/HCC 2014), 2014, : 113 - 120
  • [5] Real-time visual recognition of facial gestures for human-computer interaction
    Zelinsky, A
    Heinzmann, J
    PROCEEDINGS OF THE SECOND INTERNATIONAL CONFERENCE ON AUTOMATIC FACE AND GESTURE RECOGNITION, 1996, : 351 - 356
  • [6] Cross-Topic Opinion Mining for Real-Time Human-Computer Interaction
    Balahur, Alexandra
    Boldrini, Ester
    Montoyo, Andres
    Martinez-Barco, Patricio
    NATURAL LANGUAGE PROCESSING AND COGNITIVE SCIENCE, PROCEEDINGS, 2009, : 13 - 22
  • [7] A Dynamic Head Gesture Recognition Method for Real-Time Human-Computer Interaction
    Xie, Jialong
    Zhang, Botao
    Chepinskiy, Sergey A.
    Zhilenkov, Anton A.
    INTELLIGENT ROBOTICS AND APPLICATIONS, ICIRA 2021, PT III, 2021, 13015 : 235 - 245
  • [8] A Real-time Database that Supports Human-computer Interaction System of Larger and Complex Environment
    Wang, Kui-Sheng
    Zhuang, Jie
    Wang, Zhi-Xiong
    INTERNATIONAL CONFERENCE ON COMPUTER SCIENCE AND COMMUNICATION ENGINEERING (CSCE 2015), 2015, : 590 - 596
  • [9] A Real-Time 3D Hair Animation System for Human-Computer Interaction
    Yu, Jun
    Jiang, Chen
    Luo, Chang-wei
    Li, Rui
    Wang, Zeng-fu
    2015 12TH INTERNATIONAL CONFERENCE ON FUZZY SYSTEMS AND KNOWLEDGE DISCOVERY (FSKD), 2015, : 2303 - 2307