Isolated ASL sign recognition system for deaf persons

Cited by: 59
Authors
Waldron, Manjula B. [1 ]
Kim, Soowon [1 ]
Institutions
[1] Ohio State Univ, Columbus, United States
Source
IEEE Transactions on Rehabilitation Engineering
Keywords
Backpropagation; Handicapped persons; Joints (anatomy); Neural networks; Sensors; Speech production aids; Terminology
DOI
10.1109/86.413199
Abstract
In this paper, the design and evaluation of a two-stage neural network which can recognize isolated ASL signs is given. The input to this network is the hand shape and position data obtained from a DataGlove fitted with a Polhemus sensor. The first stage consists of four backpropagation neural networks which recognize the sign language phonology, namely, the 36 hand shapes, 10 locations, 11 orientations, and 11 hand movements. The recognized phonemes from the beginning, middle, and end of the sign are fed to the second stage, which recognizes the actual signs. Both backpropagation and Kohonen's self-organizing networks were used in the second stage to compare their performance and the expandability of the learned vocabulary. In the current work, six signers with differing hand sizes signed 14 signs, which included hand-shape-fragile, position-fragile, motion-fragile, and triple-robust signs. When a backpropagation network was used for the second stage, the results show that the network was able to recognize these signs with an overall accuracy of 86%. Further, the recognition results were linearly dependent on the length of the finger up to the metacarpophalangeal (MP) joint and the total length of the hand. When the second stage was a Kohonen's self-organizing network, the network could not only recognize the signs with 84% accuracy, but also expand its learned vocabulary through relabeling.
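The abstract describes a two-stage pipeline: four per-channel phoneme classifiers (hand shape, location, orientation, movement) whose outputs at the beginning, middle, and end of a sign are concatenated and fed to a sign-level classifier. The sketch below is a rough illustration of that data flow, not the authors' implementation: it uses scikit-learn MLP classifiers on synthetic data in place of the original DataGlove/Polhemus features, and the feature dimensions, hidden-layer widths, and helper names are assumptions. In the paper the second stage was also implemented as a Kohonen self-organizing network to allow vocabulary expansion through relabeling; here a second MLP stands in for both variants.

```python
# Minimal sketch of the two-stage recognizer described above (assumed
# structure, not the authors' code). Synthetic random data stands in for
# DataGlove/Polhemus measurements.
import numpy as np
from sklearn.neural_network import MLPClassifier

# Stage 1: one phoneme classifier per phonological channel, with the class
# counts quoted in the abstract.
PHONEME_CLASSES = {"handshape": 36, "location": 10, "orientation": 11, "movement": 11}

def make_stage1():
    # One small backpropagation-trained MLP per channel (widths are assumptions).
    return {name: MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000)
            for name in PHONEME_CLASSES}

def stage1_features(stage1, glove_frames):
    """Classify the beginning, middle, and end frames of a sign and
    concatenate the predicted phoneme probabilities into one vector."""
    parts = []
    for frame in glove_frames:              # rows: [begin, middle, end]
        for clf in stage1.values():
            parts.append(clf.predict_proba(frame.reshape(1, -1))[0])
    return np.concatenate(parts)

# --- toy usage ---
rng = np.random.default_rng(0)
n_train, n_glove_dims = 200, 22             # 22 raw features is an assumption
stage1 = make_stage1()
X_raw = rng.normal(size=(n_train, n_glove_dims))
for name, clf in stage1.items():
    y = rng.integers(0, PHONEME_CLASSES[name], size=n_train)
    clf.fit(X_raw, y)                        # per-channel phoneme training

# Stage 2: sign classifier over concatenated begin/middle/end phoneme codes
# (14 signs, as in the study).
signs = rng.integers(0, 14, size=50)
X_signs = np.stack([stage1_features(stage1, rng.normal(size=(3, n_glove_dims)))
                    for _ in range(50)])
stage2 = MLPClassifier(hidden_layer_sizes=(64,), max_iter=2000).fit(X_signs, signs)
print(stage2.predict(X_signs[:3]))
```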
Pages: 261-271
Related Papers
50 entries in total
  • [21] SportSign: A Service to Make Sports News Accessible to Deaf Persons in Sign Languages
    Othman, Achraf
    El Ghoul, Oussama
    Jemni, Mohamed
    COMPUTERS HELPING PEOPLE WITH SPECIAL NEEDS, PROCEEDINGS, PT 2, 2010, 6180 : 169 - 176
  • [22] A SOFM/HMM system for person-independent isolated sign language recognition
    Fang, GL
    Gao, W
    HUMAN-COMPUTER INTERACTION - INTERACT'01, 2001, : 731 - 732
  • [23] Neural organization for recognition of grammatical and emotional facial expressions in deaf ASL signers and hearing nonsigners
    McCullough, S
    Emmorey, K
    Sereno, M
    COGNITIVE BRAIN RESEARCH, 2005, 22 (02): : 193 - 203
  • [24] Higher educational attainment but not higher income is protective for cardiovascular risk in Deaf American Sign Language (ASL) users
    McKee, Michael M.
    McKee, Kimberly
    Winters, Paul
    Sutter, Erika
    Pearson, Thomas
    DISABILITY AND HEALTH JOURNAL, 2014, 7 (01) : 49 - 55
  • [25] Variable factors in the relationship between American sign language (ASL) proficiency and English literacy acquisition in deaf children
    Prinz, PM
    Kuntze, M
    Strong, M
    RESEARCH ON CHILD LANGUAGE ACQUISITION, VOLS 1 AND 2, 2001, : 1429 - 1440
  • [26] Purdue RVL-SLLL ASL database for automatic recognition of American Sign Language
    Martínez, AM
    Wilbur, RB
    Shay, R
    Kak, AC
    FOURTH IEEE INTERNATIONAL CONFERENCE ON MULTIMODAL INTERFACES, PROCEEDINGS, 2002, : 167 - 172
  • [27] Recognition of isolated words of the Polish sign language
    Kapuscinski, T
    Wysocki, M
    Computer Recognition Systems, Proceedings, 2005, : 697 - 704
  • [28] DEAF-AND-MUTE SIGN LANGUAGE GENERATION SYSTEM
    KAWAI, H
    TAMURA, S
    PROCEEDINGS OF THE SOCIETY OF PHOTO-OPTICAL INSTRUMENTATION ENGINEERS, 1984, 515 : 209 - 214
  • [29] Isolated Sign Language Recognition with Depth Cameras
    Oszust, Mariusz
    Krupski, Jakub
    KNOWLEDGE-BASED AND INTELLIGENT INFORMATION & ENGINEERING SYSTEMS (KSE 2021), 2021, 192 : 2085 - 2094
  • [30] DEAF-AND-MUTE SIGN LANGUAGE GENERATION SYSTEM
    KAWAI, H
    TAMURA, S
    PATTERN RECOGNITION, 1985, 18 (3-4) : 199 - 205