TeethFa: Real-Time, Hand-Free Teeth Gestures Interaction Using Fabric Sensors

Cited: 0
Authors
Wu, Yuan [1 ,2 ]
Bai, Shoudu [1 ,2 ]
Fu, Meiqin [1 ,2 ]
Hu, Xinrong [1 ,2 ]
Zhong, Weibing [3 ]
Ding, Lei [1 ,2 ]
Chen, Yanjiao [4 ]
Affiliations
[1] Wuhan Text Univ, Sch Comp Sci & Artificial Intelligence, Wuhan 430200, Peoples R China
[2] Wuhan Text Univ, Engn Res Ctr Hubei Prov Clothing Informat, Wuhan 430200, Peoples R China
[3] Wuhan Text Univ, Key Lab Text Fiber & Prod, Minist Educ, Wuhan 430200, Peoples R China
[4] Zhejiang Univ, Coll Elect Engn, Hangzhou 310027, Peoples R China
Source
IEEE INTERNET OF THINGS JOURNAL | 2024, Vol. 11, No. 21
Keywords
Human-machine interaction; teeth gestures recognition; wearable fabric sensors;
DOI
10.1109/JIOT.2024.3434657
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology];
Discipline Code
0812 ;
Abstract
The interaction mode of smart eyewear has garnered significant research attention. Most smart eyewear relies on touchpads for user interaction, which can be obtrusive and unfriendly to users. In this article, we propose TeethFa, a novel fabric sensor-based system for recognizing teeth gestures that serves as a hands-free interaction method for smart eyewear. TeethFa uses fabric sensors embedded in the glasses frame to capture the pressure changes induced by the facial muscle movements that accompany teeth movements, enabling the identification of subtle teeth gestures. To detect teeth gestures, TeethFa introduces a novel template-based signal segmentation method that determines the boundaries of teeth gestures in the fabric-sensor signals, even in the presence of motion interference. To improve TeethFa's generalization, we employ a meta-learning technique based on generalization adjustment to extend the model to new users. We conduct extensive experiments to assess TeethFa's performance on 30 volunteers. The results demonstrate that our system identifies five different teeth gestures with an average accuracy of 93.57%, and even for new users the accuracy reaches 89.58%. TeethFa shows promise in offering a new interaction paradigm for smart eyewear.
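The template-based segmentation idea described in the abstract can be illustrated with a minimal sketch (not the authors' implementation, whose details are not given here): slide a recorded gesture template across the incoming pressure signal and mark windows whose normalized cross-correlation with the template exceeds a threshold. The function names and the 0.8 threshold are illustrative assumptions.

```python
from math import sqrt

def _ncc(window, template):
    """Normalized cross-correlation between two equal-length sequences."""
    n = len(template)
    mw = sum(window) / n
    mt = sum(template) / n
    num = sum((w - mw) * (t - mt) for w, t in zip(window, template))
    den = sqrt(sum((w - mw) ** 2 for w in window) *
               sum((t - mt) ** 2 for t in template))
    return num / den if den else 0.0

def segment_gestures(signal, template, threshold=0.8):
    """Return (start, end) index pairs where the signal matches the template."""
    n = len(template)
    matches = []
    i = 0
    while i + n <= len(signal):
        if _ncc(signal[i:i + n], template) >= threshold:
            matches.append((i, i + n))
            i += n  # skip past the detected gesture to avoid overlapping hits
        else:
            i += 1
    return matches
```

A real system would additionally have to handle motion interference and per-user amplitude differences, which is where the paper's segmentation method and meta-learning-based generalization adjustment go beyond this sketch.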
Pages: 35223-35237
Page count: 15
Related Papers
50 records in total
  • [31] Real-Time 3D Hand Gestures Recognition for Manipulation of Industrial Robots
    Cerlinca, T.
    Pentiuc, S. G.
    Vlad, V.
    ELEKTRONIKA IR ELEKTROTECHNIKA, 2013, 19 (02) : 3 - 8
  • [32] Real-time estimation of hand gestures based on manifold learning from monocular videos
    Wang, Yi
    Luo, ZhongXuan
    Liu, JunCheng
    Fan, Xin
    Li, HaoJie
    Wu, Yunzhen
    Multimedia Tools and Applications, 2014, 71 : 555 - 574
  • [33] Toward natural interaction through visual recognition of body gestures in real-time
    Varona, Javier
    Jaume-i-Capo, Antoni
    Gonzalez, Jordi
    Perales, Francisco J.
    INTERACTING WITH COMPUTERS, 2009, 21 (1-2) : 3 - 10
  • [34] Real-time visual recognition of facial gestures for human-computer interaction
    Zelinsky, A
    Heinzmann, J
    PROCEEDINGS OF THE SECOND INTERNATIONAL CONFERENCE ON AUTOMATIC FACE AND GESTURE RECOGNITION, 1996, : 351 - 356
  • [35] Real-time Motion-based Hand Gestures Recognition from Time-of-Flight Video
    Molina, Javier
    Antonio Pajuelo, Jose
    Martinez, Jose M.
    JOURNAL OF SIGNAL PROCESSING SYSTEMS FOR SIGNAL IMAGE AND VIDEO TECHNOLOGY, 2017, 86 (01): : 17 - 25
  • [37] TRAINWEAR: a Real-Time Assisted Training Feedback System with Fabric Wearable Sensors
    Zhou, Bo
    Bahle, Gernot
    Fuerg, Lorenzo
    Singh, Monit Shah
    Cruz, Heber Zurian
    Lukowicz, Paul
    2017 IEEE INTERNATIONAL CONFERENCE ON PERVASIVE COMPUTING AND COMMUNICATIONS WORKSHOPS (PERCOM WORKSHOPS), 2017,
  • [38] Approach to tracking deformable hand gesture for real-time interaction
    Laboratory of Human-Computer Interaction and Intelligent Information Processing, Institute of Software, Chinese Academy of Sciences, Beijing 100080, China
    Ruan Jian Xue Bao (Journal of Software), 2007, 10: 2423-2433
  • [39] Real-Time Hand Gesture Recognition for Human Robot Interaction
    Correa, Mauricio
    Ruiz-del-Solar, Javier
    Verschae, Rodrigo
    Lee-Ferny, Jong
    Castillo, Nelson
    ROBOCUP 2009: ROBOT SOCCER WORLD CUP XIII, 2010, 5949 : 46 - 57
  • [40] Barehanded Music: Real-time Hand Interaction for Virtual Piano
    Liang, Hui
    Wang, Jin
    Sun, Qian
    Liu, Yong-Jin
    Yuan, Junsong
    Luo, Jun
    He, Ying
    PROCEEDINGS I3D 2016: 20TH ACM SIGGRAPH SYMPOSIUM ON INTERACTIVE 3D GRAPHICS AND GAMES, 2016, : 87 - 94