Decoupled Temperature-Pressure Sensing System for Deep Learning Assisted Human-Machine Interaction

Cited by: 23
Authors
Chen, Zhaoyang [1]
Liu, Shun [1]
Kang, Pengyuan [1]
Wang, Yalong [1]
Liu, Hu [1]
Liu, Chuntai [1]
Shen, Changyu [1]
Affiliations
[1] Zhengzhou Univ, Natl Engn Res Ctr Adv Polymer Proc Technol, State Key Lab Struct Anal Optimizat & CAE Software, Zhengzhou 450002, Henan, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
decoupled temperature-pressure sensing; deep learning algorithm; dual-mode sensor; human-machine interface; thermoelectric effects; SENSORS;
DOI
10.1002/adfm.202411688
Chinese Library Classification
O6 [Chemistry];
Discipline Classification Code
0703;
Abstract
With the rapid development of intelligent wearable technology, multimodal tactile sensors capable of data acquisition, decoupling of intermixed signals, and information processing have attracted increasing attention. Herein, a decoupled temperature-pressure dual-mode sensor is developed based on porous melamine foam (MF) decorated with single-walled carbon nanotubes (SWCNT) and poly(3,4-ethylenedioxythiophene):poly(styrenesulfonate) (PEDOT:PSS), and integrated with a deep learning algorithm to form a multimodal input terminal. Importantly, the synergistic effect of PEDOT:PSS and SWCNT endows the sensor with ideal decoupling capability and high sensitivity toward both temperature (38.2 μV K-1) and pressure (10.8% kPa-1), based on the thermoelectric and piezoresistive effects, respectively. In addition, the low thermal conductivity and excellent compressibility of the MF provide a low temperature detection limit (0.03 K), a fast pressure response (120 ms), and long-term stability. Benefiting from these sensing characteristics, the assembled sensor array shows a good capacity for identifying the spatial distribution of temperature and pressure signals. With the assistance of a deep learning algorithm, the system achieves recognition accuracies of 99% and 98% for "touch" and "press" actions, respectively, and realizes encrypted information transmission and accurate identification of random input sequences, providing a promising strategy for designing high-accuracy multimodal sensing platforms for human-machine interaction.

[Graphical abstract] The SWCNT/PEDOT:PSS@melamine foam (SPMF) sensor, with its 3D porous structure, enables fully decoupled temperature-pressure dual-mode sensing with high sensitivity, an ultralow temperature detection limit, fast response times, and excellent fatigue resistance. With the assistance of a deep learning model, the multimodal input terminal built from SPMF sensor arrays achieves encrypted information transmission and accurate identification of random input sequences.
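For a concrete picture of the decoupling described above, the sketch below converts one raw (thermovoltage, resistance) reading into a (temperature difference, pressure) pair using only the sensitivities reported in the abstract. It is a minimal illustration under stated assumptions, not the authors' read-out pipeline: it assumes linear responses over the working range, a thermoelectric voltage that depends only on the temperature difference, and a resistance that decreases under compression (typical for conductive porous foams); all function and variable names are hypothetical.

```python
# Minimal decoupling sketch based on the reported sensitivities.
# Assumptions (not from the paper): linear responses, voltage set only by the
# temperature difference, resistance decreasing under compression.

SEEBECK_UV_PER_K = 38.2        # reported temperature sensitivity, uV K^-1
PRESSURE_SENS_PER_KPA = 0.108  # reported pressure sensitivity, 10.8% kPa^-1


def decouple(voltage_uv: float, resistance_ohm: float, r0_ohm: float):
    """Convert one (thermovoltage, resistance) sample into (dT, pressure).

    voltage_uv     : measured thermoelectric voltage in microvolts
    resistance_ohm : measured resistance under load
    r0_ohm         : unloaded baseline resistance
    """
    delta_t_k = voltage_uv / SEEBECK_UV_PER_K            # Seebeck relation: V = S * dT
    delta_r_rel = (r0_ohm - resistance_ohm) / r0_ohm     # relative resistance change
    pressure_kpa = delta_r_rel / PRESSURE_SENS_PER_KPA   # piezoresistive: dR/R0 = s * P
    return delta_t_k, pressure_kpa


if __name__ == "__main__":
    # Example: 191 uV thermovoltage and a 10% resistance drop from a 1 kOhm baseline
    dT, p = decouple(voltage_uv=191.0, resistance_ohm=900.0, r0_ohm=1000.0)
    print(f"dT = {dT:.1f} K, pressure = {p:.2f} kPa")  # ~5.0 K, ~0.93 kPa
```

Because the voltage and resistance channels respond to different stimuli, each reading maps to temperature and pressure independently, which is what "fully decoupled" dual-mode sensing refers to here.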
Pages: 12