Controllable Visual-Tactile Synthesis

Cited by: 2
Authors
Gao, Ruihan [1 ]
Yuan, Wenzhen [1 ]
Zhu, Jun-Yan [1 ]
Affiliations
[1] Carnegie Mellon Univ, Pittsburgh, PA 15213 USA
Keywords
GRASP;
DOI
10.1109/ICCV51070.2023.00648
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Deep generative models have various content creation applications such as graphic design, e-commerce, and virtual try-on. However, current works mainly focus on synthesizing realistic visual outputs, often ignoring other sensory modalities, such as touch, which limits physical interaction with users. In this work, we leverage deep generative models to create a multi-sensory experience where users can touch and see the synthesized object when sliding their fingers on a haptic surface. The main challenges lie in the significant scale discrepancy between vision and touch sensing and the lack of explicit mapping from touch sensing data to a haptic rendering device. To bridge this gap, we collect high-resolution tactile data with a GelSight sensor and create a new visuotactile clothing dataset. We then develop a conditional generative model that synthesizes both visual and tactile outputs from a single sketch. We evaluate our method regarding image quality and tactile rendering accuracy. Finally, we introduce a pipeline to render high-quality visual and tactile outputs on an electroadhesion-based haptic device for an immersive experience, allowing for challenging materials and editable sketch inputs.
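The abstract describes a conditional generative model that maps a single sketch to paired visual and tactile outputs. As a rough illustration of that idea only, and not the authors' actual architecture, the following minimal PyTorch-style sketch shows a shared sketch encoder with two decoder heads, one producing an RGB image and one producing a GelSight-style tactile map; every module name, layer size, and output format below is an assumption made for illustration.

# Illustrative sketch only: a sketch-conditioned encoder with two decoder heads,
# one for the visual (RGB) output and one for a tactile map (e.g., a GelSight-style
# normal/height map). Names and shapes are assumptions, not the paper's model.
import torch
import torch.nn as nn

def conv_block(in_ch, out_ch):
    # Downsampling block: halves spatial resolution.
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, kernel_size=4, stride=2, padding=1),
        nn.InstanceNorm2d(out_ch),
        nn.LeakyReLU(0.2, inplace=True),
    )

def deconv_block(in_ch, out_ch):
    # Upsampling block: doubles spatial resolution.
    return nn.Sequential(
        nn.ConvTranspose2d(in_ch, out_ch, kernel_size=4, stride=2, padding=1),
        nn.InstanceNorm2d(out_ch),
        nn.ReLU(inplace=True),
    )

class SketchToVisuoTactile(nn.Module):
    """Shared sketch encoder with separate visual and tactile decoders."""
    def __init__(self, tactile_channels=3):
        super().__init__()
        # Encoder: 1-channel sketch -> latent feature map.
        self.encoder = nn.Sequential(
            conv_block(1, 64), conv_block(64, 128), conv_block(128, 256),
        )
        # Visual head: RGB image in [-1, 1].
        self.visual_decoder = nn.Sequential(
            deconv_block(256, 128), deconv_block(128, 64),
            nn.ConvTranspose2d(64, 3, 4, 2, 1), nn.Tanh(),
        )
        # Tactile head: e.g., a surface normal / height map at the same resolution.
        self.tactile_decoder = nn.Sequential(
            deconv_block(256, 128), deconv_block(128, 64),
            nn.ConvTranspose2d(64, tactile_channels, 4, 2, 1), nn.Tanh(),
        )

    def forward(self, sketch):
        feat = self.encoder(sketch)
        return self.visual_decoder(feat), self.tactile_decoder(feat)

if __name__ == "__main__":
    model = SketchToVisuoTactile()
    sketch = torch.randn(1, 1, 256, 256)   # dummy single-channel sketch input
    rgb, tactile = model(sketch)
    print(rgb.shape, tactile.shape)         # (1, 3, 256, 256), (1, 3, 256, 256)

In practice a model of this kind would be trained adversarially or with reconstruction losses against the paired visuotactile dataset the abstract describes; the two-head design above is just one simple way to tie the visual and tactile outputs to a common sketch encoding.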
Pages: 7017 - 7029
Page count: 13
Related Papers
50 items in total
  • [41] Modeling spatial effects in visual-tactile saccadic reaction time
    Diederich, Adele
    Colonius, Hans
    PERCEPTION & PSYCHOPHYSICS, 2007, 69 (01): 56 - 67
  • [42] Visual-Tactile Sensory Map Calibration of a Biomimetic Whiskered Robot
    Assaf, Tareq
    Wilson, Emma D.
    Anderson, Sean
    Dean, Paul
    Porrill, John
    Pearson, Martin J.
    2016 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA), 2016: 967 - 972
  • [43] Intracranial Cortical Responses during Visual-Tactile Integration in Humans
    Quinn, Brian T.
    Carlson, Chad
    Doyle, Werner
    Cash, Sydney S.
    Devinsky, Orrin
    Spence, Charles
    Halgren, Eric
    Thesen, Thomas
    JOURNAL OF NEUROSCIENCE, 2014, 34 (01): 171 - 181
  • [44] A mixed reality framework for microsurgery simulation with visual-tactile perception
    Xiang, Nan
    Liang, Hai-Ning
    Yu, Lingyun
    Yang, Xiaosong
    Zhang, Jian J.
    VISUAL COMPUTER, 2023, 39 (08): 3661 - 3673
  • [45] Visual-tactile shape perception in the visually restored with artificial vision
    Stiles, Noelle R. B.
    Weiland, James D.
    Patel, Vivek R.
    JOURNAL OF VISION, 2022, 22 (02)
  • [46] Lifelong Visual-Tactile Spectral Clustering for Robotic Object Perception
    Liu, Yuyang
    Cong, Yang
    Sun, Gan
    Ding, Zhengming
    IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS FOR VIDEO TECHNOLOGY, 2023, 33 (02): 818 - 829
  • [47] Seeing through touch: a conceptual framework of visual-tactile interplay
    Eklund, Andreas Aldogan
    Helmefalk, Miralem
    JOURNAL OF PRODUCT AND BRAND MANAGEMENT, 2018, 27 (05): 498 - 513
  • [48] A mixed reality framework for microsurgery simulation with visual-tactile perception
    Nan Xiang
    Hai-Ning Liang
    Lingyun Yu
    Xiaosong Yang
    Jian J. Zhang
    The Visual Computer, 2023, 39: 3661 - 3673
  • [49] Visual-Tactile Spatial Multisensory Interaction in Adults With Autism and Schizophrenia
    Noel, Jean-Paul
    Failla, Michelle D.
    Quinde-Zlibut, Jennifer M.
    Williams, Zachary J.
    Gerdes, Madison
    Tracy, John M.
    Zoltowski, Alisa R.
    Foss-Feig, Jennifer H.
    Nichols, Heathman
    Armstrong, Kristan
    Heckers, Stephan H.
    Blake, Randolph R.
    Wallace, Mark T.
    Park, Sohee
    Cascio, Carissa J.
    FRONTIERS IN PSYCHIATRY, 2020, 11
  • [50] Tactile cues are more intrinsically linked to motor timing than visual cues in visual-tactile sensorimotor synchronization
    Michelle K. Huntley
    An Nguyen
    Matthew A. Albrecht
    Welber Marinovic
    Attention, Perception, & Psychophysics, 2024, 86: 1022 - 1037