Controllable Visual-Tactile Synthesis

Cited by: 2
Authors
Gao, Ruihan [1 ]
Yuan, Wenzhen [1 ]
Zhu, Jun-Yan [1 ]
Affiliations
[1] Carnegie Mellon Univ, Pittsburgh, PA 15213 USA
Keywords
GRASP;
DOI
10.1109/ICCV51070.2023.00648
Chinese Library Classification (CLC)
TP18 [Theory of Artificial Intelligence]
Discipline classification codes
081104; 0812; 0835; 1405
Abstract
Deep generative models have various content creation applications such as graphic design, e-commerce, and virtual try-on. However, current works mainly focus on synthesizing realistic visual outputs, often ignoring other sensory modalities, such as touch, which limits physical interaction with users. In this work, we leverage deep generative models to create a multi-sensory experience where users can touch and see the synthesized object when sliding their fingers on a haptic surface. The main challenges lie in the significant scale discrepancy between vision and touch sensing and the lack of explicit mapping from touch sensing data to a haptic rendering device. To bridge this gap, we collect high-resolution tactile data with a GelSight sensor and create a new visuotactile clothing dataset. We then develop a conditional generative model that synthesizes both visual and tactile outputs from a single sketch. We evaluate our method regarding image quality and tactile rendering accuracy. Finally, we introduce a pipeline to render high-quality visual and tactile outputs on an electroadhesion-based haptic device for an immersive experience, allowing for challenging materials and editable sketch inputs.
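The abstract describes a conditional generative model that turns a single sketch into paired visual and tactile outputs. The PyTorch snippet below is only a minimal illustrative sketch of how such a two-headed conditional generator could be wired up; it is not the authors' architecture, and all module names, channel widths, and the choice of a single-channel height-map-style tactile output are assumptions made here for clarity. The paper's actual method, training objectives, and GelSight data representation are described in the publication itself.

# Minimal illustrative sketch (not the authors' implementation): a conditional
# generator mapping one sketch to paired visual (RGB) and tactile outputs
# via a shared encoder and two decoder heads. Shapes/names are assumptions.
import torch
import torch.nn as nn

def conv_block(in_ch, out_ch, down=True):
    # Conv -> InstanceNorm -> ReLU; strided conv downsamples, transposed conv upsamples.
    conv = (nn.Conv2d(in_ch, out_ch, 4, stride=2, padding=1) if down
            else nn.ConvTranspose2d(in_ch, out_ch, 4, stride=2, padding=1))
    return nn.Sequential(conv, nn.InstanceNorm2d(out_ch), nn.ReLU(inplace=True))

class SketchToVisuoTactile(nn.Module):
    # Shared sketch encoder with separate visual and tactile decoders.
    def __init__(self, base=64):
        super().__init__()
        self.encoder = nn.Sequential(
            conv_block(1, base),            # 1-channel sketch input
            conv_block(base, base * 2),
            conv_block(base * 2, base * 4),
        )
        # Visual head: 3-channel RGB image.
        self.visual_head = nn.Sequential(
            conv_block(base * 4, base * 2, down=False),
            conv_block(base * 2, base, down=False),
            nn.ConvTranspose2d(base, 3, 4, stride=2, padding=1),
            nn.Tanh(),
        )
        # Tactile head: 1-channel map standing in for GelSight-style geometry
        # (e.g., a height map) -- an assumption for this sketch.
        self.tactile_head = nn.Sequential(
            conv_block(base * 4, base * 2, down=False),
            conv_block(base * 2, base, down=False),
            nn.ConvTranspose2d(base, 1, 4, stride=2, padding=1),
            nn.Tanh(),
        )

    def forward(self, sketch):
        feats = self.encoder(sketch)
        return self.visual_head(feats), self.tactile_head(feats)

if __name__ == "__main__":
    model = SketchToVisuoTactile()
    sketch = torch.randn(1, 1, 256, 256)   # dummy sketch input
    rgb, tactile = model(sketch)
    print(rgb.shape, tactile.shape)         # (1, 3, 256, 256) and (1, 1, 256, 256)

A shared encoder with modality-specific decoders is just one plausible design for producing aligned visual and tactile outputs from the same conditioning input; in practice such a generator would be trained adversarially against paired sketch/photo/tactile data such as the visuotactile clothing dataset mentioned in the abstract.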
Pages: 7017-7029
Number of pages: 13
Related papers
50 entries in total
  • [21] Visual-tactile multisensory integration in primate parietal operculum
    Hihara, Sayaka
    Iriki, Atsushi
    Taoka, Miki
    Tanaka, Michio
    JOURNAL OF PHYSIOLOGICAL SCIENCES, 2013, 63 : S253 - S253
  • [22] Visual or visual-tactile examination to detect and inform the diagnosis of enamel caries
    Macey, Richard
    Walsh, Tanya
    Riley, Philip
    Glenny, Anne-Marie
    Worthington, Helen V.
    O'Malley, Lucy
    Clarkson, Janet E.
    Ricketts, David
    COCHRANE DATABASE OF SYSTEMATIC REVIEWS, 2021, (06)
  • [23] Responses to visual, tactile and visual-tactile forward collision warnings while gaze on and off the road
    Lylykangas, Jani
    Surakka, Veikko
    Salminen, Katri
    Farooq, Ahmed
    Raisamo, Roope
    TRANSPORTATION RESEARCH PART F-TRAFFIC PSYCHOLOGY AND BEHAVIOUR, 2016, 40 : 68 - 77
  • [24] Generative Partial Visual-Tactile Fused Object Clustering
    Zhang, Tao
    Cong, Yang
    Sun, Gan
    Dong, Jiahua
    Liu, Yuyang
    Ding, Zhenming
    THIRTY-FIFTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THIRTY-THIRD CONFERENCE ON INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE AND THE ELEVENTH SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2021, 35 : 6156 - 6164
  • [25] Active Visual-Tactile Cross-Modal Matching
    Liu, Huaping
    Wang, Feng
    Sun, Fuchun
    Zhang, Xinyu
    IEEE TRANSACTIONS ON COGNITIVE AND DEVELOPMENTAL SYSTEMS, 2019, 11 (02) : 176 - 187
  • [26] AvTF: A visual-tactile fingertip with active sensing and manipulation
    Zhao, Jie
    Wang, Sicheng
    Shan, Jianhua
    Sun, Yuhao
    Zhang, Shixin
    Sun, Fuchun
    Fang, Bin
    2024 WRC SYMPOSIUM ON ADVANCED ROBOTICS AND AUTOMATION, WRC SARA, 2024: 451 - 457
  • [27] Visual-tactile expectations and peripersonal space representations in infancy
    Orioli, Giulia
    Parisi, Irene
    Van Velzen, Jose
    Bremner, Andrew
    COGNITIVE PROCESSING, 2021, 22 (SUPPL 1) : 27 - 27
  • [28] Effects of Viewing Distance on Visual and Visual-Tactile Evaluation of Black Fabric
    Isami, Chiari
    Kondo, Aki
    Goto, Aya
    Sukigara, Sachiko
    JOURNAL OF FIBER SCIENCE AND TECHNOLOGY, 2021, 77 (02): 56 - 65
  • [29] Vision of a pictorial hand modulates visual-tactile interactions
    Igarashi, Yuka
    Kitagawa, Norimichi
    Ichihara, Shigeru
    COGNITIVE AFFECTIVE & BEHAVIORAL NEUROSCIENCE, 2004, 4 (02) : 182 - 192
  • [30] Generalized Visual-Tactile Transformer Network for Slip Detection
    Cui, Shaowei
    Wei, Junhang
    Li, Xiaocan
    Wang, Rui
    Wang, Yu
    Wang, Shuo
    IFAC PAPERSONLINE, 2020, 53 (02): 9529 - 9534