A Biologically-Inspired Visual Saliency Model to Test Different Strategies of Saccade Programming

Times Cited: 0
Authors:
Ho-Phuoc, Tien [1]
Guerin-Dugue, Anne [1]
Guyader, Nathalie [1]
Affiliation:
[1] GIPSA Lab, F-38402 Grenoble, France
Source:
BIOMEDICAL ENGINEERING SYSTEMS AND TECHNOLOGIES, 2010, Vol. 52
Keywords:
Saccade programming; Saliency map; Spatially variant retinal resolution; GLOBAL FEATURES; EYE-MOVEMENTS; ATTENTION; PERCEPTION; SEARCH
DOI:
Not available
CLC Number:
TP301 [Theory, methods]
Subject Classification Code:
081202
Abstract:
Saliency models provide a saliency map, a topographically arranged map that represents the saliency of the visual scene. The saliency map is used to sequentially select particular locations of the scene in order to predict a subject's eye scan-path when viewing that scene. A saliency map is most often computed from a single point of view, or foveated point, and few models have considered saccade programming strategies. In visual search tasks, studies have shown that people can plan, from one foveated point, the next two saccades (and hence the next two fixations); this is called concurrent saccade programming. In this paper, we tested whether such a strategy occurs during free viewing of natural scenes. We compared different saccade programming strategies that differ in the number of saccades programmed from each foveated point. The results showed that the strategy of programming one saccade at a time from the foveated point best matches the experimental data from free viewing of natural images. Because saccade programming models depend on the foveated point, we also took into account the spatially variant resolution of the retina, and we showed that eye fixations were predicted more accurately when this retinal resolution was combined with the saccade programming strategies.
Pages: 187-199
Page count: 13
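
To make the selection mechanism described in the abstract concrete, the following is a minimal NumPy sketch of how a saccade programming strategy could be simulated on top of a precomputed saliency map: the map is weighted by a spatially variant resolution centred on the current fixation (modelled here as a simple Gaussian fall-off, an assumption for illustration, since the abstract does not give the exact retinal-resolution function), and one or two saccades are then programmed from that foveated point with inhibition of return. The function names (foveated_saliency, next_fixations) and parameters (sigma, inhibition_radius) are illustrative and not taken from the paper.

import numpy as np

def foveated_saliency(saliency, fixation, sigma=64.0):
    # Weight the saliency map by a spatially variant resolution centred on
    # the current fixation. A Gaussian fall-off is used as a stand-in for
    # the retinal-resolution function (an assumption for illustration).
    h, w = saliency.shape
    ys, xs = np.mgrid[0:h, 0:w]
    fy, fx = fixation
    dist2 = (ys - fy) ** 2 + (xs - fx) ** 2
    return saliency * np.exp(-dist2 / (2.0 * sigma ** 2))

def next_fixations(saliency, fixation, n_saccades=1, visited=(), inhibition_radius=32):
    # Program n_saccades saccades from a single foveated point
    # (n_saccades = 1 or 2 corresponds to the strategies compared in the
    # abstract). Already-visited locations are suppressed (inhibition of return).
    weighted = foveated_saliency(saliency, fixation)
    h, w = weighted.shape
    ys, xs = np.mgrid[0:h, 0:w]

    def inhibit(fy, fx):
        weighted[(ys - fy) ** 2 + (xs - fx) ** 2 <= inhibition_radius ** 2] = 0.0

    for fy, fx in visited:
        inhibit(fy, fx)

    chosen = []
    for _ in range(n_saccades):
        fy, fx = np.unravel_index(np.argmax(weighted), weighted.shape)
        chosen.append((int(fy), int(fx)))
        inhibit(fy, fx)
    return chosen

# Example: simulate a short scan-path with one saccade programmed at a time.
rng = np.random.default_rng(0)
saliency_map = rng.random((240, 320))   # stand-in for a real saliency map
scanpath = [(120, 160)]                 # start at the image centre
for _ in range(5):
    scanpath += next_fixations(saliency_map, scanpath[-1], n_saccades=1, visited=scanpath)
print(scanpath)

Setting n_saccades=2 programs two fixations from the same foveated point, which mimics the concurrent saccade programming strategy that the paper compares against programming one saccade at a time.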