A Biologically-Inspired Visual Saliency Model to Test Different Strategies of Saccade Programming

Cited by: 0
Authors:
Ho-Phuoc, Tien [1 ]
Guerin-Dugue, Anne [1 ]
Guyader, Nathalie [1 ]
Affiliation:
[1] GIPSA Lab, F-38402 Grenoble, France
Source:
BIOMEDICAL ENGINEERING SYSTEMS AND TECHNOLOGIES, 2010, Vol. 52
Keywords:
Saccade programming; Saliency map; Spatially variant retinal resolution; GLOBAL FEATURES; EYE-MOVEMENTS; ATTENTION; PERCEPTION; SEARCH;
DOI: not available
Chinese Library Classification: TP301 [Theory, Methods]
Subject Classification: 081202
Abstract:
Saliency models provide a saliency map, a topographically arranged map representing the saliency of a visual scene. The saliency map is used to sequentially select particular locations in the scene and thereby predict a subject's eye scan-path when viewing that scene. A saliency map is most often computed from a single point of view, the foveated point. Few models have considered saccade programming strategies. In visual search tasks, studies have shown that people can plan, from one foveated point, the next two saccades (and hence the next two fixations): this is called concurrent saccade programming. In this paper, we tested whether such a strategy occurs during free viewing of natural scenes. We compared saccade programming strategies that differ in the number of saccades programmed at once. The results showed that the strategy of programming one saccade at a time from the foveated point best matches the experimental data from free viewing of natural images. Because saccade programming models depend on the foveated point, we also took into account the spatially variant resolution of the retina. We showed that the predicted eye fixations were more accurate when this retinal resolution was combined with the saccade programming strategies.
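The sequential-selection idea in the abstract can be sketched in code. The following is an illustrative sketch only, not the authors' implementation: from the current foveated point, the saliency map is re-weighted by a resolution function that falls off with eccentricity, the maximum is taken as the next fixation, and the visited region is then suppressed (inhibition of return). The Gaussian fall-off, its `sigma`, and the `inhibition_radius` are all assumptions for illustration.

```python
# Illustrative sketch (assumptions, not the paper's model): one saccade
# programmed at a time from the foveated point, with saliency modulated
# by a spatially variant "retinal resolution".
import numpy as np

def retinal_weight(shape, fixation, sigma=40.0):
    """Gaussian fall-off of resolution with eccentricity from the
    current fixation (an assumed stand-in for a measured resolution
    function)."""
    ys, xs = np.mgrid[0:shape[0], 0:shape[1]]
    d2 = (ys - fixation[0]) ** 2 + (xs - fixation[1]) ** 2
    return np.exp(-d2 / (2.0 * sigma ** 2))

def scan_path(saliency, start, n_fixations=5, inhibition_radius=20):
    """Select fixations sequentially: re-weight the saliency map by
    retinal resolution centered on the current fixation, pick the
    maximum, then inhibit the visited region (inhibition of return)."""
    sal = saliency.astype(float).copy()
    fixation = start
    path = [fixation]
    ys, xs = np.mgrid[0:sal.shape[0], 0:sal.shape[1]]
    for _ in range(n_fixations - 1):
        weighted = sal * retinal_weight(sal.shape, fixation)
        fixation = np.unravel_index(np.argmax(weighted), sal.shape)
        path.append(fixation)
        # Suppress an area around the chosen location so the next
        # saccade targets a new region.
        d2 = (ys - fixation[0]) ** 2 + (xs - fixation[1]) ** 2
        sal[d2 <= inhibition_radius ** 2] = 0.0
    return path
```

A "two saccades at a time" strategy would differ here by selecting two maxima from the same weighted map before updating the fixation point, which is the kind of variant the paper compares.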
Pages: 187-199 (13 pages)
Related papers (50 total):
  • [21] Biologically-Inspired Abstraction Model to Analyze Sound Signal
    bin Hamzah, Hammuzamer Irwan
    bin Abdullah, Azween
    Candrawati, Ria
    2009 IEEE STUDENT CONFERENCE ON RESEARCH AND DEVELOPMENT: SCORED 2009, PROCEEDINGS, 2009, : 180 - 183
  • [22] Biologically-inspired Visual Stabilization of a Rotorcraft UAV in Unknown Outdoor Environments
    Denuelle, Aymeric
    Thurrowgood, Saul
    Strydom, Reuben
    Kendoul, Farid
    Srinivasan, Mandyam V.
    2015 INTERNATIONAL CONFERENCE ON UNMANNED AIRCRAFT SYSTEMS (ICUAS'15), 2015, : 1084 - 1093
  • [23] A Biologically-Inspired Affective Model Based on Cognitive Situational Appraisal
    Shu, Feng
    Tan, Ah-Hwee
    2012 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2012,
  • [24] Biologically-inspired digital architecture for a cortical model of orientation selectivity
    Torres-Huitzil, Cesar
    Girau, Bernard
    Arias-Estrada, Miguel
    ARTIFICIAL NEURAL NETWORKS - ICANN 2008, PT II, 2008, 5164 : 188 - +
  • [25] A biologically inspired saliency model for color fundus images
    Rangrej, Samrudhdhi B.
    Sivaswamy, Jayanthi
    TENTH INDIAN CONFERENCE ON COMPUTER VISION, GRAPHICS AND IMAGE PROCESSING (ICVGIP 2016), 2016,
  • [26] A biologically-inspired wolf pack multiple robot hunting model
    Weitzenfeld, Alfredo
    Vallesa, Alberto
    Flores, Horacio
    2006 IEEE 3RD LATIN AMERICAN ROBOTICS SYMPOSIUM, 2006, : 90 - 97
  • [27] Biologically-Inspired Episodic Memory Model Considering the Context Information
    Park, Gyeong-Moon
    Cho, Sanghyun
    Kim, Jong-Hwan
    2016 IEEE INTERNATIONAL CONFERENCE ON SYSTEMS, MAN, AND CYBERNETICS (SMC), 2016, : 1471 - 1476
  • [28] A Biologically-Inspired Computational Model for Transformation Invariant Target Recognition
    Iftekharuddin, Khan M.
    Li, Yaqin
    2008 IEEE INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, VOLS 1-8, 2008, : 1049 - 1056
  • [30] Biologically-inspired 3D grasp synthesis based on visual exploration
    Recatala, Gabriel
    Chinellato, Eris
    del Pobil, Angel P.
    Mezouar, Youcef
    Martinet, Philippe
    AUTONOMOUS ROBOTS, 2008, 25 (1-2) : 59 - 70