Human Augmented Cognition Based on Integration of Visual and Auditory Information

Cited by: 0
Authors
Won, Woong Jae [1]
Lee, Wono [1]
Ban, Sang-Woo [2]
Kim, Minook [3]
Park, Hyung-Min [3]
Lee, Minho [1]
Affiliations
[1] Kyungpook Natl Univ, Sch Elect Engn & Comp Sci, 1370 Sankyuk Dong, Taegu 702701, South Korea
[2] Dongguk Univ, Dept Informat & Commun Engn, Gyeongbuk 780714, South Korea
[3] Sogang Univ, Dept Elect Engn, Seoul 121742, South Korea
Funding
National Research Foundation of Singapore;
Keywords
human augmented cognition; human identification; multiple sensory integration model; visual and auditory; adaptive boosting; selective attention; SELECTIVE ATTENTION; RECOGNITION;
DOI
Not available
CLC Number
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
In this paper, we propose a new multi-sensory fused human identification model for providing human augmented cognition. In the proposed model, facial features and mel-frequency cepstral coefficients (MFCCs) serve as the visual and auditory features, respectively, for identifying a person, and an AdaBoost model performs the identification from the integrated visual and auditory features. Facial form features are obtained by principal component analysis (PCA) of the face area, which is localized by an AdaBoost algorithm in conjunction with a skin-color preferable attention model, while the MFCCs are extracted from the person's speech. The proposed multi-sensory integration model thus aims to enhance human identification performance by letting the visual and auditory modalities work complementarily under partly distorted sensory environments. A human augmented cognition system incorporating the proposed identification model is implemented as a goggle-type device, which presents information such as an unknown person's profile based on the identification result. Experimental results show that the proposed model can plausibly perform human identification in an indoor meeting situation.
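The abstract describes a feature-level fusion pipeline: PCA features from a face region localized by an AdaBoost detector, MFCCs from speech, and an AdaBoost classifier over the concatenated features. The following is a minimal Python sketch of such a pipeline, not the authors' implementation: it assumes OpenCV's Haar-cascade detector as a stand-in for the AdaBoost face localizer with skin-color attention, librosa for MFCC extraction, and scikit-learn for PCA and AdaBoost; all function names, parameters, and constants are illustrative.

# Minimal sketch of a visual-auditory identification pipeline in the spirit of
# the abstract. Assumptions: OpenCV's Haar cascade replaces the AdaBoost face
# localizer with skin-color attention, librosa supplies MFCCs, and scikit-learn
# supplies PCA and AdaBoost. Names and constants are hypothetical.
import cv2
import librosa
import numpy as np
from sklearn.decomposition import PCA
from sklearn.ensemble import AdaBoostClassifier

FACE_SIZE = (32, 32)   # hypothetical normalized face-patch size
N_MFCC = 13            # common MFCC dimensionality

face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def face_vector(image_bgr):
    """Detect the largest face and return it as a flattened grayscale patch."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    faces = face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None
    x, y, w, h = max(faces, key=lambda f: f[2] * f[3])
    patch = cv2.resize(gray[y:y + h, x:x + w], FACE_SIZE)
    return patch.astype(np.float32).ravel()

def mfcc_vector(wav_path):
    """Extract MFCCs from a speech clip and average them over time."""
    y, sr = librosa.load(wav_path, sr=16000)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=N_MFCC)
    return mfcc.mean(axis=1)

def train_identifier(face_images, wav_paths, labels, n_components=20):
    """Fuse PCA face features with MFCCs and train an AdaBoost identifier."""
    # Assumes a face is detected in every training image.
    faces = np.stack([face_vector(img) for img in face_images])
    pca = PCA(n_components=n_components).fit(faces)
    visual = pca.transform(faces)
    auditory = np.stack([mfcc_vector(p) for p in wav_paths])
    fused = np.hstack([visual, auditory])   # simple feature-level fusion
    clf = AdaBoostClassifier(n_estimators=100).fit(fused, labels)
    return pca, clf

At test time, the same face and MFCC features would be extracted, projected with the stored PCA, concatenated, and passed to clf.predict to obtain the person's identity.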
Pages: 547+
Number of pages: 3