Federated Closed-Loop Learning for Cross-Modal Generation of Tactile Friction in Tactile Internet

Cited: 0
Authors
Zhang, Liping [1 ]
Wang, Haoming [1 ]
Yang, Lijing [1 ]
Liu, Guohong [1 ]
Wang, Cong [1 ]
Lv, Liheng [1 ]
Affiliation
[1] Jilin Univ, Coll Commun Engn, Changchun 130025, Peoples R China
Source
IEEE INTERNET OF THINGS JOURNAL | 2025, Vol. 12, No. 6
Keywords
Friction; Accuracy; Servers; Data models; Tactile Internet; Training; Visualization; Federated learning; Computational modeling; Force measurement; Closed-loop learning (CLL); cross-modal generation; federated learning (FL); tactile friction; tactile Internet; FRAMEWORK;
DOI
10.1109/JIOT.2024.3492274
Chinese Library Classification
TP [Automation and computer technology];
Discipline code
0812 ;
Abstract
Tactile Internet, as a novel industrial network, allows fully immersive multisensory remote exploration of real or virtual environments. An important technological aspect of tactile Internet is the acquisition, compression, transmission, and display of haptic information. This article focuses on the cross-modal acquisition of fingertip tactile friction from visual measurements. In tactile Internet applications, these tactile friction data are transmitted to surface haptic devices for high-fidelity haptic rendering of shapes and textures on touchscreens. To ensure the reliability and low latency of such tactile friction acquisition, we develop a federated closed-loop learning (FedCLL) method based on optimized federated learning and closed-loop learning. The former builds the global model at the central server, using deep reinforcement learning to determine the aggregation weights of local tactile devices, which improves acquisition accuracy; the latter generates tactile friction for local devices, exploiting a feedback mechanism to achieve improved accuracy and reduced complexity. The proposed FedCLL is numerically evaluated on the HapTex dataset. The results show that FedCLL outperforms existing methods in both acquisition accuracy and computational complexity.
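The aggregation step described above can be illustrated with a minimal sketch of weighted federated averaging. Note this is a generic illustration, not the paper's actual FedCLL implementation: here the per-device aggregation weights are supplied as fixed numbers, whereas the paper learns them with deep reinforcement learning, and the helper name `weighted_fedavg` is hypothetical.

```python
import numpy as np

def weighted_fedavg(local_models, weights):
    """Aggregate per-device model parameters into a global model.

    local_models: one entry per device, each a list of layer arrays.
    weights: per-device aggregation weights (fixed here; the paper
             determines them via deep reinforcement learning).
    """
    weights = np.asarray(weights, dtype=float)
    weights = weights / weights.sum()  # normalize so weights sum to 1
    # Combine corresponding layers across devices as a weighted sum.
    return [
        sum(w * layer for w, layer in zip(weights, layers))
        for layers in zip(*local_models)
    ]

# Three hypothetical devices, each holding one 2-parameter "layer".
locals_ = [
    [np.array([1.0, 2.0])],
    [np.array([3.0, 4.0])],
    [np.array([5.0, 6.0])],
]
global_model = weighted_fedavg(locals_, weights=[1, 1, 2])
print(global_model[0])  # [3.5 4.5]
```

With weights [1, 1, 2] the third device contributes half of the global update, which is the kind of non-uniform aggregation the learned weights in the paper are meant to control.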
Pages: 7026-7036 (11 pages)