Response Modality vs. Target Modality: Sensory Transformations and Comparisons in Cross-modal Slant Matching Tasks

Cited by: 1
Authors
Liu, Juan [1 ]
Ando, Hiroshi
Affiliations
[1] National Institute of Information and Communications Technology (NICT), Center for Information and Neural Networks (CiNet), Osaka, Japan
Source
SCIENTIFIC REPORTS | 2018, Vol. 8
Funding
Japan Science and Technology Agency (JST)
Keywords
REFERENCE FRAMES; HAPTIC PERCEPTION; PROPRIOCEPTION; INFORMATION; MULTIPLE; ORIENTATION; SPACE;
DOI
10.1038/s41598-018-29375-w
Chinese Library Classification (CLC)
O [Mathematical Sciences and Chemistry]; P [Astronomy and Earth Sciences]; Q [Biological Sciences]; N [General Natural Sciences]
Subject Classification Codes
07; 0710; 09
Abstract
Humans constantly combine multi-sensory spatial information to interact successfully with objects in peripersonal space. Previous studies suggest that sensory inputs from different modalities are encoded in different reference frames. In cross-modal tasks, where the target and response modalities differ, it is unclear into which reference frame these multiple sensory signals are transformed for comparison. The current study used a slant perception and parallelity paradigm to explore this issue. Participants perceived the slant of a reference board either visually or haptically and were asked either to adjust an invisible test board by hand manipulation or to adjust a visible test board through verbal instructions so that it was physically parallel to the reference board. We examined the patterns of constant error and variability in unimodal and cross-modal tasks with various reference slant angles at different reference/test locations. The results revealed that, rather than being a mixture of the unimodal patterns, the pattern in cross-modal conditions depended almost entirely on the response modality and was not substantially affected by the target modality. Deviations in haptic response conditions could be predicted by the locations of the reference and test boards, whereas the reference slant angle was an important predictor in visual response conditions.
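Note: the abstract summarizes performance in terms of constant error and variability. As a point of reference only, and not taken from the paper itself, the sketch below shows the conventional way such measures are computed from matched reference/response slant settings; the array values and variable names are hypothetical.

    # Minimal sketch (assumption, not the authors' analysis code): constant and
    # variable error for a parallelity-setting task. "reference" and "response"
    # are hypothetical slant angles in degrees; a response equal to the
    # reference slant would be physically parallel.
    import numpy as np

    reference = np.array([20.0, 35.0, 50.0, 65.0])  # reference board slants (deg)
    response = np.array([24.5, 38.0, 47.5, 70.0])   # participant settings (deg)

    signed_error = response - reference             # per-trial signed deviation
    constant_error = signed_error.mean()            # systematic bias (CE)
    variable_error = signed_error.std(ddof=1)       # trial-to-trial variability (VE)

    print(f"CE = {constant_error:.2f} deg, VE = {variable_error:.2f} deg")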
Pages: 12